BaumGeist

joined 2 years ago
[–] BaumGeist@lemmy.ml 4 points 11 hours ago (3 children)

took me too long to figure out what the three arrows were pointing to.

[–] BaumGeist@lemmy.ml 22 points 11 hours ago

OP, what's your address? I have a "present" for you

[–] BaumGeist@lemmy.ml 8 points 15 hours ago

Debian. Because it's the best at "Just Works" (yes, even more so than Ubuntu, which I tried). It has broken on me once, and that was fixed by rolling back the kernel, then patched within the week.

BUT I'm also not a "numbers go up" geek. I don't give a shit about maxing out benchmarks and eking every last drop of performance out of the hardware; to me, that's just a marketing gimmick so people associate dopamine with marginally improved spec numbers (which say nothing about longevity or reliability).

If you wanna waste something watching numbers go up, waste time playing Cookie Clicker, not money creating more e-waste so your Nvidia 4090 can burn through half a kilowatt of power to watch YouTube in 8K.

(/soapbox)

My gpu is an nvidia 970 and my cpu is a 4th or 5th generation core i7. I just don't play the latest games anyway, I'm a PatientGamer, and I don't do multimedia stuff beyond simple meme edits in GIMP.

It has plenty of power to run VMs, which I use for my job and a hobby, and I code as another hobby in Neovim (so I don't have to deal with the performance penalty of VS Code or other big GUI IDEs).

It all works fine, but one day I'll upgrade (staying a generation or two behind to get the best deals on used parts) and still not waste a ton of money on AAA games or bleeding-edge DAWs.

[–] BaumGeist@lemmy.ml 5 points 15 hours ago (1 children)

The cool thing about Arch is that with some learning, time and effort, you can make it function just like Ubuntu

[–] BaumGeist@lemmy.ml 1 points 15 hours ago

... And Canonical...

[–] BaumGeist@lemmy.ml 15 points 15 hours ago

I'm a 30-something woman myself. I've been gaming longer than I've had a phone. Here's my two cents:

You're already into videogames. Fuck what the haters say about mobile gaming not being """true""" gaming (whatever the heck that means), they're just sour they can't game whenever, wherever, without investing a ton of time. Then again, maybe I'm just mad because I've recently invested a ton of time into YouTube's playables.

If you want to get into PC or console gaming, I recommend starting with popular E-rated games in the genres you already know you like. These games are generally more complex than mobile games, but they usually introduce their mechanics and complexity gradually along the difficulty curve, teaching you mastery without you having to look up tutorials online.

If you want to branch out, start with genre-bending/-blending games. I'm personally a fan of puzzle-platformers, as those are my two favorite genres; while I'm not big on card games, they recently had an explosion in popularity, so there's a blend of just about every genre you could want.

[–] BaumGeist@lemmy.ml 5 points 1 day ago (1 children)

here's the Programmer Readable version of that wall of text: https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpriseEdition

[–] BaumGeist@lemmy.ml 9 points 1 day ago

false dichotomy. Sometimes people justifiably dislike something for reasons beyond elitism (e.g. Canonical is a for-profit corporation that muddies the waters of FOSS), but it's also not just playful bants.

Also, as with every opinionated topic: do your own research and think critically. Don't hate Ubuntu until you have tried it and have investigated those who maintain it. Don't praise it until you do so either.

I don't care if you come to a different conclusion than me, as long as you didn't just coast on the "wisdom of the crowd".

[–] BaumGeist@lemmy.ml 22 points 1 day ago* (last edited 1 day ago) (2 children)

Implementing Equality in Haskell:

    deriving (Eq, Ord)

After learning how easy it was to implement functional programming in Rust (it's almost like the language requires it sometimes), I decided to go back and learn the one I had heard about the most.

It opened my mind. Rust takes so many cues from Haskell, I don't even know where to begin: strong typing, immutable-by-default values, derived traits, sum types. Iterators and iterables, closures, and pattern matching are all big in Haskell too.

I'm not saying Rust uses these because Graydon Hoare wanted a more C-like Haskell, but it is clear it took a lot of elements from the functional paradigm, and the implementations the designers were familiar with had descended through Haskell at some point.

Also, deriving is not the same as implementing. One is letting the compiler make an educated guess about what you want to compare, the other is telling it specifically what you want to compare. You're making, coincidentally, a bad comparison.
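The same distinction exists in Rust, which makes for a clean side-by-side. A minimal sketch with made-up types (nothing here is from the original comment): deriving compares everything structurally in declaration order, while implementing states exactly which fields count.

```rust
// Deriving: the compiler writes structural equality for us,
// comparing every field in declaration order.
#[derive(PartialEq, Eq, Debug)]
struct Tagged(u32, &'static str);

// Implementing: we decide what equality means ourselves.
struct Version {
    major: u32,
    minor: u32,
    build: String,
}

impl PartialEq for Version {
    // Compare only the fields we care about; `build` is ignored.
    fn eq(&self, other: &Self) -> bool {
        self.major == other.major && self.minor == other.minor
    }
}

fn main() {
    // Derived equality checks both fields.
    assert!(Tagged(1, "a") == Tagged(1, "a"));
    assert!(Tagged(1, "a") != Tagged(1, "b"));

    // Manual equality deliberately ignores `build`.
    let a = Version { major: 1, minor: 2, build: "abc".into() };
    let b = Version { major: 1, minor: 2, build: "xyz".into() };
    assert!(a == b);
}
```

Haskell's `deriving (Eq, Ord)` corresponds to the `#[derive(...)]` attribute, and a hand-written `instance Eq` corresponds to the manual `impl PartialEq`.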

[–] BaumGeist@lemmy.ml 27 points 3 days ago

When does Debian update a package? And how does it decide when to?

These both can be answered in depth at Debian's releases page, but the short answer is:

Debian developers work in a repo called "unstable" or "sid," and you can get those packages if you so desire. They will be the most up to date, but also the most likely to introduce breaking changes.

When the devs decide those packages are "stable enough" (breaking changes are highly unlikely), they get moved into "testing" (the release-candidate repo), where users can do QA for the community. Testing is the repo for the next version of Debian.

When the release cycle hits the ~1.5-year mark, Debian maintainers introduce a series of incremental "freezes," whereby new versions of packages slowly stop being accepted into the testing repo. You can see a table that explains each freeze milestone for Trixie (Debian 13) here.

After all the freezes have gone into effect, Debian migrates the current testing version (currently Trixie, Debian 13) into the new stable, and downgrades the current stable to oldstable. Then the cycle begins again.

As for upgrades to packages in the stable/oldstable repos: see the other comments here. The gist is that they will not accept any changes other than security patches and minor bug fixes, except for business-critical software that cannot just be patched (e.g. Firefox).
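Concretely, which of these cycles you ride comes down to which repo your apt sources point at. A sketch (illustrative lines only; the mirror and component list vary per install):

```
# /etc/apt/sources.list
deb http://deb.debian.org/debian stable main     # the current release
deb http://deb.debian.org/debian testing main    # the next release (currently Trixie)
deb http://deb.debian.org/debian unstable main   # sid, where development happens
```

In practice you'd keep only one of those lines; mixing repos without pinning makes apt prefer the newest available version of everything.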

[–] BaumGeist@lemmy.ml 2 points 3 days ago (1 children)

I keep finding tracks to love on each album, but Controlling Crowds has too many, so I just recommend the entire thing.

Also Axiom is killer too, but that might just be because I'm a sucker for narrative concept albums

[–] BaumGeist@lemmy.ml 6 points 1 week ago* (last edited 3 days ago) (3 children)

Robert Glasper - Black Radio

Sungazer - Perihelion

Unexpect - Fables of the Sleepless Empire

Frank Zappa - Civilization Phase III

Will Wood - "In case I make it,"

The Algorithm - Brute Force

Devin Townsend - Empath

Miles Davis - Bitches Brew

Oneohtrix Point Never - R Plus Seven

Panopticon - Autumn Eternal

King Capisce - Memento Mori

Cynic - Kindly Bent to Free Us

Archive - Controlling Crowds The Complete Edition Parts I-IV

Intronaut - The Direction of Last Things

SHT GHST - 1: The Creation

Dan Deacon - America

Opeth - Ghost Reveries

Steve Reich - Music for 18 Musicians

 

Finally, another web engine is being developed to compete with Chromium and Firefox (Gecko), and they're also working on a browser that will use it.

Here's the maintainer talking about the current state of the project, and a demo of the current functionality.

 

I occasionally see love for niche small distros, instead of the major ones...

And it just seems to me like there are more hurdles than help when it comes to adopting an OS whose users number in the hundreds or dozens. I can understand trying one for fun in a VM, but I prefer sticking to the bigger distros for my daily drivers, since they'll support more software and not be reliant on upstream sources, and any bugs or other issues are more likely to be documented and have workarounds/fixes.

So: What distro do you daily drive and why? What drove you to choose it?

 

It's the series finale for our friend Plague Roach. Big props to Drue for all the work he's put into this project

Here's the full series playlist on youtube

 
 

I've been using nala on my debian-based computers instead of apt, mostly for the parallel downloads, but also because the UI is nicer. I have one issue, and that's the slow completions; it's not wasting painful amounts of time, but it still takes a second or two each time I hit tab. I don't know if this is the same for all shells, but I'm using zsh.

I tried a workaround, but it seems prone to breaking something. So far it's working fine for my purposes, so I thought I'd share anyway:

  1. I backed up /usr/share/zsh/vendor-completions/_nala to my home directory
  2. I copied /usr/share/zsh/functions/Completion/Debian/_apt to /usr/share/zsh/vendor-completions/_nala
  3. I used vim to %s/apt/nala/g (replace every instance of 'apt' with 'nala') in the new /usr/share/zsh/vendor-completions/_nala
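The same three steps, sketched as a script. This assumes Debian's stock completion paths and root access; the existence check is my addition, so it no-ops anywhere else:

```shell
#!/bin/sh
# Sketch of the workaround above; assumes Debian's stock zsh paths.
set -eu

src=/usr/share/zsh/functions/Completion/Debian/_apt
dst=/usr/share/zsh/vendor-completions/_nala

if [ -f "$src" ] && [ -f "$dst" ]; then
    cp "$dst" "$HOME/_nala.bak"    # 1. back up the shipped _nala completions
    cp "$src" "$dst"               # 2. start from a copy of the _apt completions
    sed -i 's/apt/nala/g' "$dst"   # 3. replace every 'apt' with 'nala'
    changed=1
else
    # Not on a system with both files; change nothing.
    changed=0
fi
```

After swapping the file in, you may need to delete ~/.zcompdump and restart zsh so the new completions get picked up.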

Already that's sped up the completions to seemingly as fast as any other command. And already I can see some jank peeking through: zsh now thinks nala has access to apt commands that it definitely doesn't (e.g. nala build-dep, nala changelog and nala full-upgrade), and it has lost completions for nala fetch and nala history.

Once I understand the completion-file syntax better, I'll fix it to only use the commands listed in nala's manpage and submit a PR to the git repo. In the meantime, if anyone has suggestions for how to correct the existing completions file, or more ways to make the _apt completions fit nala, it'd be much appreciated.

 

As a user, the best way to handle applications is a central repository where interoperability is guaranteed. Something like what Debian does with the base repos. I just run an install and it's all taken care of for me. What's more, I don't deal with unnecessary bloat from dozens of different versions of the same library according to the needs of each separate dev/team.

So the self-contained packages must primarily benefit the devs, right? Except I was just reading through how flatpak handles dependencies: runtimes, base apps, and bundling. Runtimes and base apps supply dependencies to the whole system, so they only ever get installed once... but the documentation explicitly mentions that there are only a few of both, meaning that most devs will either have to do what repo devs do—ensure their app works with the standard libraries—or opt for bundling.

Devs being human—and humans being animals—this means the overall average tendency will be to bundle, because that's easier for them. Which means that I, the end user, now have more bloat, which incentivizes me to retreat to the disk-saving havens of repos, which incentivizes the devs to release on a repo anyway...

So again... who does this benefit? Or am I just completely misunderstanding the costs and benefits?

 

Most people are aware that gasoline sucks as a fuel and is responsible for a large portion of carbon emissions, but defenders love to trot out that "if every end consumer gave up their car, it would only remove like 10% of carbon emissions"

I can find tons of literature about the impact gasoline vehicles have, but are there any broader studies that consider other factors—like manufacturing, maintenance, and city planning—while exploring the environmental and/or economic impact of cars and car culture?

I know there are great sources that have made these critiques, but I'm looking for scientific papers that present all the data in a single holistic analysis.

 