this post was submitted on 23 Aug 2023

What used to be the PPA that allowed Ubuntu users to install native .deb packages of Firefox has recently been changed to the same meta package that forces installation of Snap and the Firefox snap package.

I'm having to remove the meta package, then uninstall the Firefox snap, then uninstall Snap itself, then install and pin the latest build I could get (firefox_116.0.3+build2-0ubuntu0.22.04.1~mt1_arm64.deb) to keep a native Firefox build.
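
Concretely, the workaround looks roughly like this. The PPA name and the pin file below are the commonly documented Mozilla Team approach (which is where the ~mt1 build comes from), so adjust if your .deb comes from somewhere else:

    # remove the snap build and the transitional meta package
    sudo snap remove firefox
    sudo apt remove firefox

    # optionally remove snapd entirely (this also removes any other snaps)
    sudo apt autoremove --purge snapd

    # add the Mozilla Team PPA, which still publishes native .deb builds
    sudo add-apt-repository ppa:mozillateam/ppa

    # pin the PPA above the Ubuntu archive so apt won't pull the snap back in
    printf 'Package: firefox*\nPin: release o=LP-PPA-mozillateam\nPin-Priority: 1001\n' | sudo tee /etc/apt/preferences.d/mozillateamppa

    sudo apt update && sudo apt install firefox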

I'm so done with Ubuntu.

[–] bear@slrpnk.net 1 points 1 year ago* (last edited 1 year ago) (1 children)

But one thing I always thought should be mandatory is that, during installation of such programs, only the resources absent from the system would be added, and any other bundled resource would be automatically discarded, thus saving disk space and avoiding redundant copies of libraries already present on the system.

Do flatpaks have such a working structure?

It's possible, but it's rarely allowed, because it would cause instability. Linux programs are built against a specific version of a library. Depending on how much actually changed, you can sometimes get away with using a different version than the one the program expects, but the bigger the difference, the riskier it gets.
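
You can see that dependency on specific versions directly: every binary records the sonames of the libraries it was linked against. For example (using /usr/bin/vlc purely as a stand-in for any program on your system):

    # list the sonames the binary declares it needs, e.g. libc.so.6 or libssl.so.3
    objdump -p /usr/bin/vlc | grep NEEDED

    # show which files on this particular system satisfy those dependencies
    ldd /usr/bin/vlc

The number in the soname is an ABI version, which is why a program expecting libssl.so.1.1 simply won't load on a system that only ships libssl.so.3.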

One of the major goals of flatpaks was to create a way for developers to ship one build that was guaranteed to run the same regardless of distro or environment. The isolation is very much the point. It does use more storage space, but in most cases it's not enough to matter. When storage space is at a premium, yeah, you generally want to avoid containers. They trade space for stability.
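
You can check what the overhead actually looks like on your own machine; apps share runtimes, and the underlying OSTree store deduplicates identical files, so the raw numbers are usually smaller than people expect:

    # which runtimes are installed, and which apps use which runtime
    flatpak list --runtime
    flatpak list --app --columns=application,runtime

    # what the whole flatpak store takes on disk
    du -sh /var/lib/flatpak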

Pretty much everything in the Linux space is converging on this concept. The desktop is moving toward immutable systems with flatpak apps. The server space has been entirely taken over by containers. Even Valve has shipped a separate Linux runtime for as long as they've officially supported Linux, and they're moving toward deeper containerization. You can direct Steam to run against your native packages instead of the runtime, but it's rarely a good idea.
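
If memory serves, with the classic LD_LIBRARY_PATH-based Steam runtime that escape hatch is just an environment variable (the newer containerized runtimes are a different story):

    # make Steam prefer the host's native libraries over the bundled runtime
    STEAM_RUNTIME=0 steam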

The point is that it gives developers a single target they can all rely on, instead of having to account for 20 distros with multiple still-supported versions each. And believe me, these efforts have made Linux so much easier as a user as well. It used to be that lots of developers only targeted Ubuntu, and trying to get anything to run on another system was like pulling teeth. Now you can almost always expect to find a flatpak instead, which runs on any distro.
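
Firefox is the obvious example here; the exact same commands work on basically any distro that has flatpak installed:

    # one-time setup: add the Flathub remote
    flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo

    # install and run Firefox from Flathub, identical on any distro
    flatpak install flathub org.mozilla.firefox
    flatpak run org.mozilla.firefox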

[–] qyron@sopuli.xyz 1 points 1 year ago

Do you mind if I poke at the subject a little more? It's opening up a new understanding for me.

Please keep in mind I'm not a programmer, to any degree.

From what you're explaining, the way flatpaks work reminds me of a flower blooming on a tree: it uses resources provided by the tree and adds something to it, but doesn't alter it in any significant way.

But back to the space savings and version control.

Let's take a given flatpak that ships with 50 libraries to ensure it works properly on any given distro.

As you already said, library versions can vary wildly between distros, but would it be that difficult to have a script run before installation (I think "connection" is more adequate to describe the process at this point) to check which of the required resources are already available on the system, to avoid redundancies?
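
Something like this is what I have in mind, just to make the idea concrete (the library names are made up, and I have no idea whether this is actually feasible):

    # hypothetical pre-install check: for each library the package says it
    # needs, see whether the host already provides it
    for lib in libfoo.so.5 libbar.so.2; do
        if ldconfig -p | grep -q "$lib"; then
            echo "$lib: already on the system, could be skipped"
        else
            echo "$lib: missing, would have to be bundled"
        fi
    done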

I can understand that this sort of homeostatic environment helps ensure a given program will be able to run on any machine, but I can't shake the intuition that at some point this will backfire. It's not hard to imagine software being left to rely on older, perhaps unsafe or less streamlined versions of given libraries just because the developer isn't motivated to make whatever changes are necessary to keep up with new versions, since their software already runs as expected.

I'll risk it and try it.