Why do we need AppImage when we can have a single statically compiled binary?
Additionally, I can’t really understand why dynamically linked libraries are so popular, or how on earth anyone who has ever hit a “.dll / .so not found” error thinks this is a good idea.
The main idea, as far as I understand it, was to be able to share code, which I think everyone knows only works in theory. Besides, wouldn’t it be easier to just recompile the executable from source with the updated dependency statically linked?
Another idea behind DLLs was that you could replace one DLL with a different one as long as the API stayed compatible, which is a very unlikely scenario for average users.
Yet another possible advantage is that the library code is shared, so it takes less space on disk. That might be true for some very common libraries, but on the other hand, static compilation only includes the parts of the library code that the program actually uses, so it takes less space anyway and is better optimized.
So the reasons to use DLLs can be easily dismissed.
As for the disadvantages: if the DLL is not present, the program will not work. It’s quite simple, and in my view, if you create a program that does not work by itself, that’s a failure on your side.
DLLs are just a nightmare most of the time; static compilation for the win. (pic for attention)
When a basic dynamic library needs to be updated because, for instance, there is a big security issue, all your statically linked binaries have to be updated too. That means every one of those developer teams needs to keep track of all the security fixes, release a new version of the binary, and push it, and every user has to download gigabytes and gigabytes of data.
Whereas if you have dynamic libs, you only have to download that one library; the fix gets pushed sooner, and all the apps benefit from it.
If users compile the program on their own computers (like with the AUR), there’s no need to download gigabytes; you just need the source code.
That route already exists today as “the web”, where the “latest” JavaScript source is downloaded and JIT-ed by browsers. That ecosystem is also not the greatest example of stable and secure software.
Mmmmmmm yes yes can I interest you in a big surprise piping hot dish of Log4J?
Gentoo builds take a long time, plus it’s hard to use the computer during that time.
(I may be presuming, as the comment above yours is about OS-wide rebuilds, and that’s how Gentoo handles things.)
I use my machine all the time while it’s updating, no problem. You can always configure Portage to leave a core or two idle anyway.
Did you… hrm… did you even take classes about this stuff? FFS, this is why this career pays well: you have to understand complicated things.
Maybe your issue is with Windows. I suggest moving away from that platform.
Dynamic libraries are essential to computing, and they allow us to partition out pieces of the code. One giant statically linked binary would have to be recompiled with every change.
I mean, yeah, dynamic libraries are great if used correctly (via something like Nix), but the unfortunate truth is that they are not used correctly most of the time (the majority of the Unix and Windows landscape is just a mess of dynamic libraries).
With modern systems programming (Rust), though, the disadvantages of static compilation slowly fade away, e.g. via incremental compilation.
That said, dynamic libraries are still a lot faster to link and can, for example, be hot-swapped.
Shared libraries are not just a theoretical good; they have been the backbone of computing for decades, and many vendors have successfully maintained stable ABIs for decades.
Modern languages take the statically compiled approach, and it has its own downsides: language bindings become hard; no stable ABI means no binary platforms exist (other than awkward C wrappers); rebuilds are slow, and OS-wide they cause a lot of churn; and reasoning about security fixes is very hard.
> Additionally, I can’t really understand why dynamically linked libraries are so popular, and how on earth anyone who has ever had a “.dll / .so not found” error thinks this is a good idea.
- You can load a DLL once and all programs can share it, saving memory. It also makes programs start faster, since the DLL might already be loaded, so there’s less to load from disk. That mattered more back in the 90s.
- You can update one file and have the patch apply to all programs.
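The memory-sharing point is visible on a Linux box (this assumes a glibc system, where the file name contains “libc”): every dynamically linked process maps the same on-disk libc, and the kernel shares the read-only code pages between them.

```shell
# Show the libc mappings of the current shell process; the same file
# backs these pages in every dynamically linked process on the system.
grep libc /proc/$$/maps | head -n 4
```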
Honestly, I love statically compiled binaries for their simplicity. I was writing a small utility in Rust today, and I wanted to share it with a colleague on Windows.
One command to cross-compile my Linux version into a Windows version, and it worked on the first attempt on his computer. To me, that kind of simplicity is worth giving up a lot of the advantages of shared libraries.
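For reference, the cross-compile step really is about one command (a sketch assuming the mingw-w64 linker is installed; “mytool” is a placeholder crate name):

```shell
# One-time setup: add the Windows target to the Rust toolchain.
rustup target add x86_64-pc-windows-gnu

# Cross-compile from Linux; the result is a self-contained .exe with no
# extra runtime dependencies to ship alongside it.
cargo build --release --target x86_64-pc-windows-gnu
# The binary lands under target/x86_64-pc-windows-gnu/release/mytool.exe
```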
There is no install. There is only run.
We should replace software repositories with the friendly person who stops by with a USB. Running “apt upgrade” pulls up an Uber-like interface that says when your software will arrive. Latency is terrible but bandwidth is phenomenal.
IIRC, apt actually does support external media (because back in the day, not everyone had fast internet); there’s even an “apt-cdrom add” command for registering a mounted disc as a package source.