"Self-contained" and "No bundled libraries" are two very important concepts that a subset of our ecosystem decided was too much work. Then they re-discovered all the problems that result, and have now coined terms like "software supply chain" to describe them.
Meanwhile Debian doesn't suffer from any of this, because it has been doing things in a way that avoids these issues all along.
If your goal is to consume software for which you need long term reliability, accepting software that bundles an unmaintainable (to you) set of dependencies does not make sense. Unless you have no better option [edit: or if you're paying to delegate your problems to someone else I suppose].
As a user, using software sources that make the same choices Debian makes is always preferable for you if that alternative is available.
Engineering tradeoffs are exactly that: tradeoffs. Neither approach is strictly better than the other from the user's perspective.
Mandating shared dependencies means that Debian often runs software against a dependency version the original author never developed or tested against. Sometimes the Debian package is effectively a fork. This results in Debian-specific bugs which get reported upstream. Distribution-specific bugs are a crappy experience for upstream developers because they waste their time, and it's a crappy experience for users to be told that their software cannot be supported upstream because it's a fork.
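To make that failure mode easier to see, here's a hedged sketch (my own invention, not from any real project; Debian does unbundle Go modules, among others) of how a Go upstream could surface which dependency versions a binary was actually built against, so a distro rebuild is distinguishable from an official release when triaging bug reports:

```go
// depreport: print the module versions this binary was built with.
// Useful in bug reports to tell a distro rebuild (different
// dependency versions) apart from an upstream release build.
package main

import (
	"fmt"
	"os"
	"runtime/debug"
)

func main() {
	info, ok := debug.ReadBuildInfo()
	if !ok {
		fmt.Fprintln(os.Stderr, "no build info embedded in this binary")
		os.Exit(1)
	}
	fmt.Println("main module:", info.Main.Path, info.Main.Version)
	for _, dep := range info.Deps {
		// Versions differing from upstream's go.sum are a strong
		// hint the binary was rebuilt against other dependencies.
		fmt.Println("dep:", dep.Path, dep.Version, dep.Sum)
	}
}
```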
Maintaining a huge repository of forked software is also an enormous undertaking. It's common for Debian users to be running fairly old versions of software. This is also not ideal, particularly for desktop users who read upstream documentation and require support when entire features are missing from their antique Debian version.
You seem to assume that Debian users are hapless and are ending up in these situations by accident. That's not true. Most users choose Debian's model because they want to use software maintained by people who care about their use cases. They use old versions of software by choice, because they want a platform that doesn't change under their feet. Others use Debian or something Debian-based because it is popular; but it is popular because of its quality, and that quality is a direct result of making these choices, not something achieved despite them.
If you're an upstream who gets frustrated by Debian users, then it's worth considering why they're using Debian in the first place.
There are some users who don't want this, and they tend to be the vocal minority. Debian is not the right distribution for them!
I was a contributor to a small Linux desktop application ages ago. We absolutely had a regular flow of hapless users who installed the Debian-provided package and reported bugs that either never existed in upstream builds or had been fixed months before. I believe the situation was that Debian had packaged an obsolete version of the software for stable because of a misunderstanding of the versioning system. When upstream discovered the mistake and contacted them, they refused to update it to a modern version. Instead, they requested that we maintain their fork and backport literal years of fixes.
This situation resulted in Debian distributing a broken version of our software for several years. I did not come away with positive impressions of their packaging processes.
> If your goal is to distribute software across multiple distros and operating systems, bundling dependencies makes sense.
Of course, an important "exception to the exception" is when you're making software that can easily be distributed by distributions, e.g. because it's end user software and open source.
I think the optimal cases for bundled dependencies are (a) large closed source binaries that never change, like games, and (b) self-deployed software, e.g. something like a server written in Go that is compiled and maintained in its running environment by a single developer or company.
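For case (b), a minimal sketch of what self-deployment looks like, using only the Go standard library (the port and endpoint are invented for illustration):

```go
// A self-contained server: all dependencies are compiled into one
// artifact, so the running binary doesn't depend on system library
// versions. Build it fully static with:
//
//	CGO_ENABLED=0 go build -o server .
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "ok")
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The single developer or company deploying it controls exactly which dependency versions get compiled in, which is the whole appeal of that model.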
> is when you're making software that can easily be distributed by distributions, e.g. because it's end user software and open source.
I have trouble understanding why this is desirable for either authors or end users. Even for open source end user applications, I want the software that I'm running to be reflective of the software that was authored and not the software that some distro maintainers think it should be.
> I want the software that I'm running to be reflective of the software that was authored
I don't. As an end-user, I couldn't care less about what the author wanted, I want to run the best possible version of the software. Often that's the version maintained by my distro, as they've put in the effort to make sure all the different software on my system works well together.
Usually the developer of an application tests it against only one specific version of each library. If you use another version of a library, you need to test carefully and fix all the bugs you find, and I am not sure Debian has the resources to do that. So we can assume they simply ship untested combinations of libraries and hope that everything will be OK (it won't).
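That "specific version" is recorded explicitly in most modern ecosystems. In Go, for instance, it lives in go.mod (module paths and versions below are invented for illustration):

```go
// go.mod — the exact dependency set the upstream developed against.
module example.com/myapp

go 1.22

require (
	example.com/somelib v1.4.2 // the only version the app was tested with
	example.com/otherlib v0.9.1
)
```

A rebuild against whichever versions a distro happens to package discards exactly this record, which is the untested-combination problem described above.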
As if this isn't also an issue with fast-moving "let's bundle everything" upstream code drops?
Distribution releases have the advantage of a large number of users sharing the same set of versions, who can shake out the issues and fix the bugs together. In practice I think this beats what most upstreams, each picking their own set of versions, can achieve on their own.
It only takes one skilled engineer to fix any given issue in a given distribution release, even at today's scale. That's not a big burden, and even those without the skills can get it through a relatively inexpensive support contract.
Corporate upstreams additionally tend to focus on what matters to paying customers; other use cases can often receive a "not supported" answer. A community of followers operating on the same set of versions can address these use cases more easily, too.
The self-contained thing isn't always true either: for a long time the open firmware inherited from the linux-firmware repo wasn't built from source; we just shipped the binaries. I expect there are other cases in the archive too, since Debian doesn't systematically strip generated files from all tarballs and regenerate them. That's especially true for AI/ML stuff, where we probably can't even get the training data, let alone afford to train the models.
Debian has a ton of embedded code copies, inherited from all our upstreams who bundle libraries for their Windows/macOS/etc. builds and sometimes fork them.