this post was submitted on 07 Sep 2023
64 points (97.1% liked)

Programming


There was a time when this debate was bigger. It seems the world has shifted towards architectures and tooling that either do not allow dynamic linking or make it harder. This compromise makes life easier for the maintainers of the tools/languages, but it does take away choice from the user/developer. But maybe that's not important? What are your thoughts?

[–] 0x0@programming.dev 4 points 1 year ago (2 children)

Disk space and RAM availability have increased a lot in the last decade, which has allowed the rise of the lazy programmer, who codes without caring (or, increasingly, without knowing) about these things. Bloat is king now.

Dynamic linking allows you to save disk space and memory by ensuring all programs use the single version of a library lying around, which also means less testing. You're delegating version tracking to the distro's package maintainers.
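You can see this sharing directly on a typical Linux system: unrelated binaries resolve the same libc shared object, so one copy on disk gets mapped into memory for all of them. A quick check (library names and paths vary by distro):

```shell
# Both binaries are resolved against the same shared C library at load
# time; the dynamic loader maps a single copy for every running program.
ldd /bin/ls | grep libc
ldd /bin/cat | grep libc
```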

You can use the dl* family (dlopen, dlsym, dlclose) for finer control over what you load and when, and if the dependency is FLOSS, the world's your oyster.

Static linking can make sense if you're developing portable code for a wide variety of OSs and/or architectures, or if your dependencies are small and/or not that common or whatever.

This, of course, is my take on the matter. YMMV.

[–] unique_hemp@discuss.tchncs.de 5 points 1 year ago (1 children)

Except with dynamic linking there is essentially an infinite amount of integration testing to do. Libraries change behaviour even when they shouldn't and cause bugs all the time, so testing everything packaged together once is overall much less work.

[–] 0x0@programming.dev 0 points 1 year ago (1 children)

Which is why libraries are versioned. The same version can be compiled differently across OSs, yes, but again, unless it's an obscure closed library, in my experience dependencies tend to be stable. Then again, all dependencies I deal with are open source, so I can always recompile them if need be.

More work? Maybe. Also more control and a more efficient app. Anyway, I'm paid to work.

[–] unique_hemp@discuss.tchncs.de 2 points 1 year ago (1 children)

More control? If you're speaking from the app developer's perspective, dynamic linking very much gives you less control of what is actually executed in the end.

[–] o11c@programming.dev 2 points 1 year ago

The problem is that the application developer usually thinks they know everything about what they want from their dependencies, but they actually don't.

[–] uis@lemmy.world 1 points 1 year ago

> Static linking can make sense if you're developing portable code for a wide variety of OSs

I doubt any other OS supports Linux syscalls, so a statically linked Linux binary is still only portable across Linux distributions.