
If I were working on such a thing I'd want the patch to be downloaded as the smallest possible delta from what's on the disc.


Call of Duty updates are 200GB regardless of pre-existing installs. It’s absurd.


Some Call of Duty discs contain basically no data at all.

>Game disc only contains 1GB of data (In some regions it has even less data on disc) forcing you to download a 40+GB patch (at launch) and another 40GB of data packs in order to play the game.

https://www.doesitplay.org/game/Call%20Of%20Duty%3A%20Modern...


That is a nice site. I was first made aware of this issue with Switch games. Some publishers will cut content from the game card and force a download so their game doesn’t require a higher-capacity card, which costs more.

These aren't even new games that it is reasonable to expect to be patched. Re-releases like "Spyro Reignited Trilogy" require a download, which is just a cost-saving exercise.


It’s also a plausible anti-leak measure: if the game card contains everything needed to play the game, the game can easily leak early while the cards are on their way to retail.

If a day-one patch is required, then it can’t leak until that patch is available?


day -1, not day 1


Actually, they’re called day-one patches. Similar to zero-day vulnerabilities (in name only), these patches are usually required to play the game on day 1…

example: https://www.gamedeveloper.com/business/why-you-should-want-d... (only linking this article because it refers to them as day-one patches)

another example: https://en.wikipedia.org/wiki/Glossary_of_video_game_terms#d...


I wonder how many publishers use S3 for this. Because, at current retail (quantity 1) prices, a bigger card looks like it will pay for itself after a whopping two downloads.

I assume that the game downloading ecosystem uses something that’s actually cost-effective. At AWS prices, it seems like it would be basically impossible to be a profitable publisher of multi-gigabyte games at any scale.
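Back of the envelope, the break-even looks roughly like this. All numbers below are assumptions for illustration (public S3 egress list pricing and a guessed card-cost premium), not real figures from any publisher:

```python
# Back-of-envelope sketch: every number here is an assumption, not a quote.
GAME_SIZE_GB = 40           # assumed full-download size
S3_EGRESS_PER_GB = 0.09     # assumed AWS S3 egress list price, USD/GB
CARD_PREMIUM_USD = 7.00     # assumed retail premium for the next-larger card

cost_per_download = GAME_SIZE_GB * S3_EGRESS_PER_GB
downloads_to_break_even = CARD_PREMIUM_USD / cost_per_download

print(f"each full download costs ~${cost_per_download:.2f}")
print(f"bigger card breaks even after ~{downloads_to_break_even:.1f} downloads")
```

At these assumed prices each copy costs a few dollars to serve, which is why the parent expects publishers to use something much cheaper than list-price S3.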


Each of the console manufacturers operates their own CDNs, typically. Valve do too on PC.

They often have some kind of proof-of-ownership (like a license ticket) required to download game data or updates, too.
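A proof-of-ownership gate can be sketched roughly like this. All names, fields, and the HMAC scheme are invented for illustration; no console's real ticket format is this simple:

```python
# Hypothetical sketch of a proof-of-ownership check: the CDN serves content
# only if the client presents a ticket signed by the platform's license
# server. Everything here is invented for illustration.
import hashlib
import hmac
import json

SERVER_KEY = b"platform-secret"  # held by the license server and CDN

def issue_ticket(account_id, title_id):
    payload = json.dumps({"account": account_id, "title": title_id},
                         sort_keys=True)
    sig = hmac.new(SERVER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def cdn_authorizes(ticket, title_id):
    expected = hmac.new(SERVER_KEY, ticket["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, ticket["sig"]):
        return False  # forged or tampered ticket
    return json.loads(ticket["payload"])["title"] == title_id

ticket = issue_ticket("user-123", "TITLE-001")
print(cdn_authorizes(ticket, "TITLE-001"))  # True: valid owner of this title
print(cdn_authorizes(ticket, "TITLE-999"))  # False: ticket is for another title
```

The point of the scheme is just that the CDN can check entitlement without a round trip to the license server for every request.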


That also has the effect of preventing pre-release leaks, though since we've seen some of Nintendo's own games shared on the internet weeks before release, I don't imagine it's a big part of the reason for requiring a download.


Tears of the Kingdom leaked a week or so early from the cart, full game.


Yes, that's one of the games I'm referring to.


if the cart/disc isn’t playable as-is, you only bought a license, and eventually the game itself won’t be available.

but that’s ok, the next console will likely have a re-re-release.


Since figuring THAT out might require reading the whole disc and doing byte-by-byte comparisons (or a whole-disc checksum), it’s easier to just download the whole thing.

Unless they track literally every single DVD variant perfectly, which ain’t happening.

And that is ignoring that many discs are basically just a hardware license dongle and don’t actually have a full playable version of the game.
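The "whole disc checksum" mentioned above can be sketched as a streamed hash, reading the image chunk by chunk so memory stays flat. Purely illustrative; the in-memory image stands in for a real drive:

```python
# Sketch of a whole-disc checksum: stream the image and hash it in chunks
# so the whole thing never needs to fit in memory.
import hashlib
import io

def disc_fingerprint(stream, chunk_size=1 << 20):
    """One SHA-256 over the entire image, read chunk by chunk."""
    h = hashlib.sha256()
    while chunk := stream.read(chunk_size):
        h.update(chunk)
    return h.hexdigest()

# Stand-in for a real disc image:
fake_image = io.BytesIO(b"pressed disc contents" * 1000)
print(disc_fingerprint(fake_image))
```

The catch the comment identifies is not the hashing itself but that optical media is slow to read, so even this single pass is costly.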


You can't imagine a patch format that avoids this? It sounds like you lack some creativity.

If you need to lookup patches by pre-patched hash of the source file, you could, for example, precompute hashes and store them on the disc.
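The manifest idea above can be sketched like this: per-file hashes are computed once at mastering time and pressed onto the disc, so the updater can ask the server for deltas keyed by each file's pre-patch hash without re-reading the slow disc. All names here are invented:

```python
# Sketch of a pressed-on-disc hash manifest used to look up delta patches
# by pre-patch hash. Names and structures are hypothetical.
import hashlib

def build_manifest(files):
    """files: {name: bytes}. Run once at mastering time; shipped on the disc."""
    return {name: hashlib.sha256(data).hexdigest()
            for name, data in files.items()}

def patches_to_request(manifest, server_index):
    """server_index maps a pre-patch file hash -> a patch blob ID.
    Returns the patch IDs the client should download."""
    return [server_index[h] for h in manifest.values() if h in server_index]
```

The client's only disc read is the small manifest itself; everything else is hash lookups against the server's index.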


It’s an info theory thing. But nice try. Why don’t you propose something better? Besides what I already proposed anyway.

The ways to optimize/‘solve it’ all require degrees of rigor in information control and tracking that aren’t realistic given the market conditions and supply chains.

At least unless people stop being okay with paying $40+ for games that download 40GB patches anyway. Which would require a severe change in the trajectory of bandwidth availability, which isn’t likely.


No sir, excuse my bluntness, but you are full of shit trying to claim that information theory blocks this from being practical. I've worked on stuff like this. I worked on the format that Windows setup uses for installation media, for example. It has delta patches too. I think the exact same idea used there would work: it has a hash of every file precomputed and only stores what's unique.


HN rate limiting sucks. And client crashes suck more.

You’re not reading my comment, or thinking about the game distribution problem.

MS can get benefit from reading all the files, verifying hashes, etc. Games typically can’t, especially those on a DVD or Blu-ray, because those discs are slow and have terrible seek times.

And in a typical OS update scenario, MS can trust that a file’s contents haven’t changed since the hash was checked. When reading from a DVD/Blu-ray, scratches are a problem and it isn’t that simple.

So, like I pointed out, it doesn’t make sense to do the work information theory requires to actually apply a delta patch in these scenarios.


The file format I was describing is used in the Windows DVD. I'm sure you know more about it than I do.


And as noted, Windows has a lot of use cases games don’t. Which is why essentially nobody else does what it does.

Also, Windows updates take forever, as you’ve probably noticed.


I mean, zsync? Performing simple hashes of blocks of data isn’t exactly hard for the console. On the CDN side, just add a caching layer for the resulting chunks and it should sort itself out, since there are only so many variants of the source disc. It won’t get you the best compression ratios, but it’s flexible. We were considering this for firmware updates of an IoT product. It’s not like differential updates are unheard of.
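The zsync idea can be sketched in a few lines: hash fixed-size blocks of the local data and fetch only the blocks that are missing. Real zsync uses a rolling hash so matches can occur at any offset; this fixed-block version is a simplification for illustration only:

```python
# Minimal zsync-flavoured sketch: reuse local blocks whose hashes match the
# target manifest, fetch the rest. Real zsync uses a rolling checksum; this
# fixed-block variant is a deliberate simplification.
import hashlib

BLOCK = 4  # toy block size for the demo; real systems use large blocks

def block_hashes(data):
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def reconstruct(local, target_hashes, fetch_block):
    """Build the target: reuse local blocks when hashes match, else fetch."""
    have = {h: local[i * BLOCK:(i + 1) * BLOCK]
            for i, h in enumerate(block_hashes(local))}
    parts = [have[h] if h in have else fetch_block(h) for h in target_hashes]
    return b"".join(parts)
```

With zero matching blocks this degrades to a plain full download, and on the CDN side the fetched chunks are ordinary cacheable objects.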


For that sync to work, you have to read the whole disc, compute hashes, then compare them to the hashes for a given target version. Which requires reading the DVD/Blu-ray (slow) and comparing against the versions on the server.

And preferably, if you’re just reading precomputed hashes off the disc, you need none of the actual data on the disc to be corrupted or out of sync with those hashes.

And then, due to the reality of the way games are distributed, you end up downloading 90%+ of the game anyway.

Or just download the full version, which is simpler, likely the same amount of bandwidth, and faster, since you’re not having to read and check the slow discs for more than a basic ‘is it this game’ check.


I understand the kinds of considerations that may make console developers not bother, but it is also a bit ridiculous on its face when they are selling physical media. But for patch-heavy games like CoD, maybe it’s a lost battle anyway, archival be damned.

But I will push back a little and say that a zsync-like differential update scheme would still be totally feasible. BD read speeds are in the hundreds of megabits per second, and the compute for the hash is essentially free (i.e., it’s I/O-bound). You can parallelize and start downloading blocks before you’re done hashing every block on the disc. It seems likely to me that you’d still end up better off with this scheme if you have slower-than-gigabit download speeds (which is true for the vast majority of the US). Zsync is fairly coarse and flexible, and essentially looks like BitTorrent between two peers over HTTP if you squint. If you assume that download speed and network reliability are the bottleneck, outweighing things like disc I/O and compute, it degrades gracefully to just downloading the entire update in the case that there are zero matching blocks.

Edit: I should mention that another key aspect of the setup is that there are a small number of pressed disc revisions and a small number of download targets (most people will get the same game files for a given region). This means that a CDN cache will quickly find the hot blocks to serve, even without any precomputation of the diff between source and target.
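The timing argument above can be made concrete with a rough model: since hashing the disc and fetching changed blocks can overlap, the delta path costs roughly the slower of the two pipelines. The speeds and change fraction below are illustrative assumptions, not measurements:

```python
# Rough time model for delta vs full download; all inputs are assumptions.
def full_download_s(size_gb, net_mbps):
    """Seconds to download the whole game."""
    return size_gb * 8000 / net_mbps

def delta_update_s(size_gb, disc_mbps, net_mbps, changed_fraction):
    """Disc scanning and block fetching can overlap, so total time is
    approximately the slower of the two pipelines."""
    scan = size_gb * 8000 / disc_mbps
    fetch = size_gb * changed_fraction * 8000 / net_mbps
    return max(scan, fetch)

# 40 GB game, 300 Mbps BD read, 100 Mbps connection, 30% of blocks changed:
print(full_download_s(40, 100))           # 3200.0 seconds
print(delta_update_s(40, 300, 100, 0.3))  # ~1067 seconds
```

Under these assumed numbers the delta path wins by roughly 3x; push the changed fraction toward 100% and it converges to the full download, which is the graceful degradation the comment describes.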


For most games, near as I can tell, the version on the DVD/Blu-ray at the initial release is pretty much never finished, and often barely playable. Even at the 'release' date, the initial update is often at least as large as the data on the DVD/Blu-ray, or larger.

So possible? Sure, almost anything is possible with enough work and tradeoffs. It just isn't economical or likely actually faster given current bandwidth constraints and how they're distributed.

Especially if you consider that if there is network play, they'll have to be up to date anyway or most games won't let them connect, so 'offline' play is going to be a relatively rare situation. So why optimize for it?


Are delta patches still viable given the current sizes of games? I'm not sure if this is the state of the art, but according to https://www.daemonology.net/bsdiff/, bspatch would require more memory than most systems can offer.


I'd expect the patch generation to be memory-hungry, not the patch application, which should be only data and offsets. If it uses maximum compression it might generate a huge data dictionary, but since that has to be distributed too, it would be counterproductive for patch size.


You would probably do the deltas in batches, not feed the entire disc in as a single chunk.
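The batching idea can be sketched like this: diff fixed-size segments independently so peak memory is bounded by the segment size, not the disc size. A real tool would run something like bsdiff per segment; this toy version stores changed segments verbatim, purely for illustration:

```python
# Toy sketch of batched delta patching: each fixed-size segment is diffed
# independently, bounding memory use. Changed segments are stored verbatim
# here; a real implementation would emit a bsdiff-style delta per segment.
SEGMENT = 8  # toy size for the demo; real batches might be tens of MB

def make_patches(old, new):
    """One entry per segment of `new`; None marks an unchanged segment."""
    return [None if old[i:i + SEGMENT] == new[i:i + SEGMENT]
            else new[i:i + SEGMENT]
            for i in range(0, len(new), SEGMENT)]

def apply_patches(old, patches):
    return b"".join(old[i * SEGMENT:(i + 1) * SEGMENT] if p is None else p
                    for i, p in enumerate(patches))
```

The tradeoff is that segment boundaries prevent matches that span segments, so compression is worse than a whole-file diff, in exchange for bounded memory.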





