this post was submitted on 13 Aug 2024
16 points (57.1% liked)

Linux

Maybe this is a hot take. However, a lot of the Chromebooks that schools deployed during COVID are built like tanks while being super lightweight and having great battery life. Meanwhile, the old ThinkPads are 10 years old and are probably starting to wear down. Many Chromebooks support coreboot these days, so in theory they can be made more private and secure. Some of them are also ARM, which makes them more efficient from an architecture perspective.

Edit:

I like how incredibly controversial this is. I have successfully split the votes.

[–] j4k3@lemmy.world 1 points 1 month ago

As far as I understand it, wouldn't the cell library be more like the node's equivalent of a KiCAD library of 0402 passive footprints for PCB design? Like, here is how we must do gates, buses, etc., but that has nothing to do with how the ALU is set up or the LCR aspects of a final design?

I've honestly only watched Asianometry, skimmed Intro to VLSI a few times, dabbled in FPGAs, built Ben Eater's breadboard computer, and screwed around with a CPU scheduler to learn why my last computer sucked at complex CAD assemblies. When I was looking for AI hardware to run LLMs, I went deep enough to understand the specific CPU limitation, and after learning about my phone's matrix coprocessor I tried to learn enough to understand why the thing even exists. That led me to the understanding that a model can be designed for a specific architecture and run MUCH faster and smaller.

I explain things as they sit on my roadmap of understanding, knowing I'm likely wrong on the edge cases. I am no expert. I'm trying to give anyone enough rope to pull on so that I can find out where I'm wrong and learn. I share because I want to learn; I want to be wrong, but only in a way that I can extend incrementally from my mental roadmap.
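
On the "designed for a specific architecture" point, here is a minimal sketch of the general idea using PyTorch dynamic quantization. This is not anything from the thread's hardware discussion, just a generic illustration: the toy two-layer model and the 4096 sizes are made up, and the only takeaway is that rewriting weights into the integer format the target hardware handles well makes the model smaller and usually faster on matching hardware.

```python
# Sketch only: a stand-in for a much larger network. Real LLMs are mostly
# big Linear (matrix-multiply) layers, which is what matrix coprocessors
# and NPUs are built to chew through.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4096, 4096),
    nn.ReLU(),
    nn.Linear(4096, 4096),
)

# Dynamic quantization converts the Linear weights from float32 to int8,
# roughly a 4x size reduction, trading a little accuracy for speed on
# hardware with fast integer matrix units.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 4096)
print(quantized(x).shape)  # same interface as the original model
```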