[-] thundermoose@lemmy.world 6 points 2 weeks ago

Not sure where that 200k number is from. The article you linked doesn't say that and I haven't seen a number that high reported anywhere myself. All the info I have seen bounds the estimates between 30k and 50k killed, either through active combat or through disease/malnourishment/injury.

https://www.aljazeera.com/news/longform/2023/10/9/israel-hamas-war-in-maps-and-charts-live-tracker

[-] thundermoose@lemmy.world 15 points 2 weeks ago* (last edited 2 weeks ago)

I'm sure there are plenty of Israelis that want to do this even if they won't admit it to themselves, but this isn't the final anything. The IDF has killed around 37,000 Palestinians out of ~2.3 million. That's horrible, but nowhere near the "barely any left" stage.

A genocide on the scale of millions takes industrial effort to accomplish. I'm not saying it couldn't happen, but given Israel's reliance on foreign aid, current industrial capacity, and political position, it seems unlikely. My guess is Israel will take some more territory and the conflict (kinda tough to call the IDF bombing almost exclusively civilians a war) will peter out. Foreign aid will be allowed back in and Israel will put its mask back on.

Personally, I don't see how this doesn't end with half the Middle East actively going to war with Israel if they don't stop soon. The only thing really keeping them safe is the US, and Israel has burned a lot of political capital here. Their leaders are awful, power-hungry shits, but they're not stupid. If they don't try to rebuild some of that capital, there's every chance that Israel loses its lifeline.

What comes years after things die down, I don't know. Gazan sentiment towards Israel was already overwhelmingly negative before this, but the IDF has never done anything on this scale before. I don't think Israel can allow Gaza any type of self-governance for decades after this. This is beyond even post-WW2 Japan levels of destruction, and unlike Japan every nation around them is still on their side.

[-] thundermoose@lemmy.world 4 points 3 weeks ago

I've been using Mint for about 6 months now and it works with Nvidia just fine, BUT the new-user experience isn't great. You have to boot with the nomodeset kernel option and then install the Nvidia drivers; otherwise you'll boot to a black screen. Rough outline below.

Helpful guide: https://forums.linuxmint.com/viewtopic.php?t=421550
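
The short version of the workaround looks roughly like this (the driver version below is just an example; Driver Manager will pick the right one for your card, so treat this as a sketch rather than copy-paste instructions):

    # At the GRUB menu, press 'e' on the Mint entry and add nomodeset to the
    # end of the line starting with "linux", then boot with Ctrl+X or F10:
    linux /boot/vmlinuz-... root=UUID=... ro quiet splash nomodeset

    # After booting, install the proprietary driver and reboot. Driver Manager
    # does the same thing graphically; the version number is just an example:
    sudo apt update
    sudo apt install nvidia-driver-535
    sudo reboot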

[-] thundermoose@lemmy.world 4 points 2 months ago

You're using "machine learning" interchangeably with "AI." We've been doing ML for decades, but it's not what most people would consider AI and it's definitely not what I'm referring to when I say "AI winter."

"Generative AI" is the more precise term for what most people are thinking of when they say "AI" today and it's what is driving investments right now. It's still very unclear what the actual value of this bubble is. There are tons of promises and a few clear use-cases, but not much proof on the ground of it being as wildly profitable as the industry is saying yet.

[-] thundermoose@lemmy.world 8 points 3 months ago

Updated to be specific: I'm using Cinnamon. Muffin is the built-in tiling window manager for Cinnamon and it does exactly what you're describing. The problem is that it moves tiles; it doesn't absolutely position them. You have to keep moving tiles around to get them where you want them, whereas Rectangle has hotkeys that immediately place and resize the active window into each region it supports:

  • ctrl+cmd+left: top left quadrant
  • ctrl+cmd+right: top right quadrant
  • shift+ctrl+cmd+left: bottom left quadrant
  • shift+ctrl+cmd+right: bottom right quadrant
  • alt+cmd+left: left half
  • alt+cmd+right: right half
  • alt+cmd+up: top half
  • alt+cmd+down: bottom half
  • alt+cmd+f: full screen

It's hard to express how natural that feels after using it for a bit, and I'm still using a Macbook for work so the muscle memory is not going away.

73 points, submitted 3 months ago* (last edited 2 months ago) by thundermoose@lemmy.world to c/linux@lemmy.ml

To preface this, I've used Linux from the CLI for the better part of 15 years. I'm a software engineer and my personal projects are almost always something that runs in a Linux VM or a Docker container somewhere, but I've always used a Mac to work on personal and professional projects. I have a Windows desktop that I use exclusively for gaming and my personal Macbook is finally giving out after about 10 years, so I'm trying out Linux Mint with Cinnamon on my desktop.

So far, it works shockingly well and I absolutely love being able to reach for a real Linux shell anytime I want, with no weird quirks from MacOS or WSL. The fact that Steam works at all on a Linux environment is still a little magical to me.

There are a couple things I really miss from MacOS and Rectangle is one of them. I've spent a couple hours searching and trying out various solutions, but none of them do the specific thing Rectangle did for me. You input something like ctrl+cmd+right and Rectangle fits your current window to the top right quadrant of your screen.

Before I dive into the weeds and make my own Cinnamon Spice (a rough sketch of that route is at the end of this post), I figured I should just ask: is there an app/extension that functions like Rectangle for Linux? Here are the things I can say do not work:

  • Muffin hotkeys: Muffin only supports moving tiles, not absolutely positioning them. You can kind of mimic Rectangle behavior, but only with multiple keystrokes to move the windows around on the grid.
  • gTile: This is a Cinnamon Spice that I'm pretty sure has the bones of what I want in it, but the UI is the opposite of what I want.
  • gSnap: Very similar to gTile, but for Gnome. The UI for it is actually quite a bit worse, IMO; you are expected to use a mouse to drag windows.
  • zentile: On top of this only working for XFCE, it doesn't actually let me position windows with a keystroke.

To be super clear: Rectangle is explicitly not a tiling window manager. It lets you set hotkeys to move/resize windows; it does not reflow your entire screen to a grid. There are a dozen tiling tools/window managers out there that I've found, and I've begun to think the Linux community has a weird preoccupation with them. Like, they're cool and all, but all I want is to move the current window to specific areas of my screen with a single keystroke. I don't need every window squished into frame at once or some weird artsy layout.
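
For reference, this is roughly what the dive-into-the-weeds route would look like, assuming an X11 session with xdotool and wmctrl installed (the region names and the panel allowance are placeholders I made up, not anything Cinnamon ships):

    #!/usr/bin/env python3
    # Rough sketch only: absolute-position the active window into a named
    # region, Rectangle-style. Assumes X11 with xdotool and wmctrl installed.
    import subprocess
    import sys

    PANEL = 28  # rough allowance for the Cinnamon panel, adjust to taste

    def work_area():
        # xdotool prints "WIDTH HEIGHT" for the current display
        out = subprocess.check_output(["xdotool", "getdisplaygeometry"], text=True)
        width, height = (int(n) for n in out.split())
        return width, height - PANEL

    def regions(width, height):
        half_w, half_h = width // 2, height // 2
        return {
            "top-left":     (0, 0, half_w, half_h),
            "top-right":    (half_w, 0, half_w, half_h),
            "bottom-left":  (0, half_h, half_w, half_h),
            "bottom-right": (half_w, half_h, half_w, half_h),
            "left":         (0, 0, half_w, height),
            "right":        (half_w, 0, half_w, height),
            "top":          (0, 0, width, half_h),
            "bottom":       (0, half_h, width, half_h),
            "full":         (0, 0, width, height),
        }

    def place(region):
        width, height = work_area()
        x, y, w, h = regions(width, height)[region]
        # wmctrl -e takes gravity,x,y,width,height for the targeted window
        subprocess.run(["wmctrl", "-r", ":ACTIVE:", "-e", f"0,{x},{y},{w},{h}"], check=True)

    if __name__ == "__main__":
        place(sys.argv[1] if len(sys.argv) > 1 else "left")

Each region could then be bound to its own combo under Keyboard → Shortcuts → Custom Shortcuts, which gets close to the single-keystroke behavior I'm describing, but I'd still rather use something that already exists.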

[-] thundermoose@lemmy.world 19 points 3 months ago

Maybe this comment will age poorly, but I think AGI is a long way off. LLMs are a dead-end, IMO. They are easy to improve with the tech we have today and they can be very useful, so there's a ton of hype around them. They're also easy to build tools around, so everyone in tech is trying to get their piece of AI now.

However, LLMs are chat interfaces for searching a large dataset, and that's about it. Even the image generators are doing this; the dataset just happens to be visual. All of the results you get from a prompt are just queries into that data, even when you get a result that makes it seem intelligent. The model is finding a best-fit response based on billions of parameters, like a hyperdimensional regression analysis. In other words, it's pattern-matching.

A lot of people will say that's intelligence, but it's different; the LLM isn't capable of understanding anything new, it can only generate a response from something in its training set. More parameters, better training, and larger context windows just refine the search results, they don't make the LLM smarter.
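
To make the analogy concrete, here's a toy illustration; it's nothing like a real transformer, just a made-up bigram "model" that shows what "only generating from the training set" means:

    # Toy illustration only: a bigram "model" that can never produce a word
    # pair it hasn't already seen. Real LLMs interpolate over billions of
    # parameters rather than using a lookup table, but the limitation is
    # analogous: the output is always a recombination of training patterns.
    import random
    from collections import defaultdict

    training_text = "the cat sat on the mat and the dog sat on the rug"
    words = training_text.split()

    bigrams = defaultdict(list)
    for first, second in zip(words, words[1:]):
        bigrams[first].append(second)

    def generate(start="the", length=8):
        out = [start]
        for _ in range(length):
            candidates = bigrams.get(out[-1])
            if not candidates:  # nothing in the "training set" to pattern-match against
                break
            out.append(random.choice(candidates))
        return " ".join(out)

    print(generate())  # always some recombination of the training data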

AGI needs something new, we aren't going to get there with any of the approaches used today. RemindMe! 5 years to see if this aged like wine or milk.

[-] thundermoose@lemmy.world 26 points 5 months ago

Why do you think ventilators made people worse? They only put people on ventilators when their O2 sats dropped so low they were going to die of oxygen deprivation.

[-] thundermoose@lemmy.world 44 points 6 months ago

Part of the reason these rules are similar is because AI-generated images look very dreamlike. The objects in the image are synthesized from a large corpus of real images. The synthesis is usually imperfect, but close enough that human brains can recognize it as the type of object that was intended from the prompt.

Mythical creatures are imaginary, and the descriptions obviously come from human brains rather than real life. If anyone "saw" a mythical creature, it would have been the brain's best approximation of a shape the person was expecting to see. But, just like a dream, it wouldn't be quite right. The brain would be filling in the gaps rather than correctly interpreting something in real life.

[-] thundermoose@lemmy.world 8 points 6 months ago

I can hear René Auberjonois in that line

[-] thundermoose@lemmy.world 4 points 7 months ago

In reading this thread, I get the sense that some people don't (or can't) separate gameplay and story. Saying "this is a great game" to me has nothing to do with the story; the way a game plays can exist entirely outside a story. The two can work together well and create a fantastic experience, but "game" seems like it ought to refer to the thing you do since, you know, you're playing it.

My personal favorite example of this is Outer Wilds. The thing you played was a platformer puzzle game and it was executed very well. The story drove the gameplay perfectly and was a fantastic mystery you solved as you played. As an experience, it was about perfect to me; the gameplay was fun and the story made everything you did meaningful.

I loved the story of TLoU and was thrilled when HBO adapted it. Honestly, it's hard to imagine anyone enjoying the thing TLoU had you do separately from the story it was telling. It was basically "walk here, press X" most of the time with some brief interludes of clunky shooting and quicktime events.

I get the gameplay making the story more immersive, but there's no reason the gameplay shouldn't be judged on its own merit separately from the story.

[-] thundermoose@lemmy.world 8 points 7 months ago

This is an honest question, not a troll: what makes The Last of Us groundbreaking from a technical perspective? I played it and loved the story, but the gameplay was utterly boring to me. I got through the game entirely because I wanted to see the conclusion of the story and when the HBO show came out I was thrilled because it meant I wouldn't have to play a game I hated to see the story of TLoU 2.

It's been years, but my recollection is the game was entirely on rails, mostly walking and talking with infrequent bursts of quicktime events and clunky shooting. What was groundbreaking about it?

[-] thundermoose@lemmy.world 13 points 9 months ago

People from East and Southeast Asia have been cultivating and eating soybeans as a staple food since before Babylon. I mean that literally; there is evidence of soybean cultivation in what is now China from like 7000 BC.

It's tough to take a phrase like, "Soy makes men weak," as anything other than racism when it puts down a quarter of the population of the planet. At best, it's ignorance, but in my experience the people who hold this opinion don't change their mind when you explain this to them.

