Guild Wars 2 and others solve this with progressive downloads. You may start playing after a fraction of the content is downloaded. There is effectively no downside: new players can't access most of the content anyhow. It also applies to updates: you can start playing after a big update with a partial download. The background process can be safely interrupted and resumed without glitches.
War Thunder kind of has that, except that the "progressive" bit means you need to download the full thing to progress in the game beyond a certain point.
Enlisted, which uses the same engine, is fine though.
The design of ArenaNet's GW/GW2 has always impressed me. 99.4% of the game is contained in one file, GW2.DAT, which is (currently) 66.7 GB. I/O for updates is extremely efficient.
I've seen many games create a vast file system mess, and they're forever plagued by "corruption" or something, complete with "fix" tools, "cache"-clearing push-ups, and players reinstalling after big updates that don't apply correctly for whatever reason. Somehow ArenaNet sidestepped this whole class of problems and solved it optimally.
A massive amount of engineering went into making it work, and some of the stuff we did was downright evil :) In GW1, a lot of data was compiled into the exe as C constants so that we wouldn't have to do a bunch of file management and I/O to locate and load it. We ended up hitting limitations in the C compiler.
This happens to touch upon a pain I keep getting with Steam. Games like Squad keep getting tiny patches (compared to the whole game size), which is great, until you see that Steam needs to read and write the whole file again to finish the patch. It's not as simple as their "dumb allocation wearing out SSDs" problem either -- there's no API anywhere to insert bytes into the middle of a file, unlike the common posix_fallocate(). Linux's FALLOC_FL_INSERT_RANGE comes close, but no studio will tend their files so that they come in exact filesystem-block-sized units.
I suppose there's always the approach of running the thing like a DB: you stop caring that the on-disk representation stays identical, liberally mark parts as "unused", and compact once in a while. But then the integrity-check and delta-update code of your distribution platform needs to understand that too.
You wouldn't even need Linux extensions for that. You just do it like ZIP (or multisession CD-R, whatever) and put a new directory at the end. Just don't ask about streaming.