@scalzi (Dec 23, 2018):
« Honestly, why the fuck is every video game "update" something like 15 goddamned gigabytes? Did you guys fuck up the original release so badly I have to download the equivalent of a new game every time? Signed, a person in the boonies for whom your "update" takes six goddamn hours »
@codemonkey_uk (Dec 23, 2018):
« Since the 90s, games have been working around slow OS file systems by basically writing their own.
File systems are complicated and slow, so game developers started putting most of the game assets (sprites/textures/meshes/etc.) into one huge file, keeping that file open to seek and read from when loading stuff. It’s much, much faster than opening and reading individual files. It also lets you do compression across assets.
Now, big-file compression is interesting. A consequence of how it works is that a small, isolated change to the source content can result in a radically different compressed file out the other end. So game builds were optimised for fast loading, patching wasn’t a thing in the 90s, and we are still building games like that.
Add to that, building huge asset sets into game-ready content can be very slow, so asset build pipelines tend to use concurrency heavily to spread the load, and that often makes the pipeline non-deterministic. That means that even with zero actual content changes, two concurrent builds of the same game could produce radically different WAD files (just based on the order in which chunks complete).
DLC/patching is usually implemented as whole-file replacement. But even on PCs, where you can patch bytes within files, the diff for a small change (or even no change) could be almost as big as the whole game.
And that’s how you end up with a 15-gig download to fix a spelling mistake. »
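The compression point is easy to demonstrate. Here is a minimal sketch (the asset names, pack layout, and contents are all made up for illustration, not any real engine’s format) using Python’s stdlib `bz2`: pack a tiny text asset and a large unchanging asset into one blob, fix a one-letter spelling mistake in the text, and compare the two compressed pack files byte for byte.

```python
import bz2
from os.path import commonprefix

def build_pack(assets):
    """Concatenate named assets into one blob: a toy WAD-style pack file."""
    blob, index = bytearray(), {}
    for name, data in assets:
        index[name] = (len(blob), len(data))  # (offset, size) -- the game seeks here
        blob += data
    return index, bytes(blob)

# One large asset that is identical in both versions.
mesh = b"".join(b"vertex %06d %08x\n" % (i, (i * 2654435761) % 2**32)
                for i in range(10000))

# v1 ships a one-letter spelling mistake; v2 fixes it.
_, blob_v1 = build_pack([("ui/strings.txt", b"Helo, world!\n"),
                         ("models/level.bin", mesh)])
_, blob_v2 = build_pack([("ui/strings.txt", b"Hello, world!\n"),
                         ("models/level.bin", mesh)])

pack_v1 = bz2.compress(blob_v1, 9)
pack_v2 = bz2.compress(blob_v2, 9)

# The source change is a single character, but the compressed packs diverge
# almost immediately and differ in far more than one place.
prefix = len(commonprefix([pack_v1, pack_v2]))
differing = sum(a != b for a, b in zip(pack_v1, pack_v2))
print(f"compressed sizes: {len(pack_v1)} vs {len(pack_v2)} bytes")
print(f"identical prefix: {prefix} bytes; differing bytes: {differing}")
```

A byte-level patcher would have to ship a large chunk of the pack to deliver a one-character fix, and console patch systems typically just replace the whole file.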
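The build non-determinism can be simulated deterministically (again with made-up asset names and contents): the same assets concatenated in two different chunk-completion orders produce two different pack files, even though no content changed.

```python
import zlib

# Three finished asset chunks; in a concurrent build, which worker
# delivers its chunk first can vary from run to run.
chunks = {"a.tex": b"A" * 1000, "b.tex": b"B" * 1000, "c.tex": b"C" * 1000}

def pack_in_completion_order(order):
    """Concatenate chunks in the order workers delivered them, then compress."""
    return zlib.compress(b"".join(chunks[name] for name in order), 9)

run_1 = pack_in_completion_order(["a.tex", "b.tex", "c.tex"])
run_2 = pack_in_completion_order(["b.tex", "a.tex", "c.tex"])

# Identical content, different bytes on disk: a naive differ sees a big patch.
print("packs identical:", run_1 == run_2)
```

So even a rebuild with zero content changes can look like an almost entirely new file to whatever produces the patch.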