
People take performance for granted. Even within gzip (and similarly .png), you can set the compression level to 4 (the default is 6) and get ~15-20% faster compression at the cost of ~5% larger files.

No one ever tweaks that one setting even though they should; file sizes are a significantly smaller bottleneck than they were in the days of megabyte hard drives and dial-up modems.

If your justification for not serving up larger .png files is that not everyone has fast internet, then you should be detecting and handling that case separately, downscaling the images, or serving .jpeg instead.
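If you want to see what that trade-off looks like on your own data, here is a minimal sketch using Python's zlib (its levels map onto gzip's 1-9 scale; the file path is just a placeholder):

    import time
    import zlib

    # Placeholder path; point this at a representative file of your own.
    data = open("sample.bin", "rb").read()

    for level in (4, 6, 9):  # 6 is the zlib/gzip default
        start = time.perf_counter()
        blob = zlib.compress(data, level)
        elapsed = time.perf_counter() - start
        print(f"level {level}: {elapsed:.3f}s, ratio {len(blob) / len(data):.3f}")

The exact speed/size numbers depend heavily on the data, which is why the ~15-20% / ~5% figures above are only ballpark.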

One time I was using Topaz AI to upscale video; I spliced that into their ffmpeg filter and took a whole day off of a week-long encode. Low-hanging fruit.



In video games, long level loading times are a serious pain point, so video game developers put a lot of effort into tuning compression algorithms to get the best wall-clock time, considering both the time to fetch content from storage and the time to decompress it.

If the target is a console, you may know exactly what hardware is there, so you can justify the effort of tuning. (It's more complex today because you have a choice of what kind of storage to use with your Xbox.) With a PC or phone, your results may vary a lot more.


Don't most games/game engines use the TGA format for their textures? Those are all RLE-encoded if I'm not mistaken (which is very fast but very inefficient space-wise). Or perhaps that is just at creation time, and they get baked into some other image format for distribution?
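For anyone unfamiliar with why RLE is fast but space-hungry: it just collapses runs of identical bytes into (count, value) pairs. A toy sketch in Python (not TGA's actual packet layout, which mixes raw and run packets):

    def rle_encode(data: bytes) -> list[tuple[int, int]]:
        """Collapse runs of identical bytes into (count, value) pairs."""
        runs: list[tuple[int, int]] = []
        for b in data:
            if runs and runs[-1][1] == b:
                runs[-1] = (runs[-1][0] + 1, b)
            else:
                runs.append((1, b))
        return runs

    def rle_decode(runs: list[tuple[int, int]]) -> bytes:
        return b"".join(bytes([value]) * count for count, value in runs)

    # Flat regions compress extremely well; noisy textures barely shrink at all.
    sample = b"\x00" * 8 + b"\xff"
    assert rle_decode(rle_encode(sample)) == sample
    print(rle_encode(sample))  # [(8, 0), (1, 255)]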


People use all kinds of compression schemes for textures.

https://aras-p.info/blog/2020/12/08/Texture-Compression-in-2...


The most important thing for modern texture compression is that the GPU supports it without ever having to decompress, which saves VRAM and memory bandwidth. So it's usually specialized formats like ASTC.


Economies of scale come into effect as well. Gzip decompression speed is actually slightly better at higher compression levels. A one-time higher cost of compression can pay off pretty quickly when you are decompressing the data a lot of times, or serving it to enough people.


I'm not so sure about this. Generally speaking, there will be more work done on the CPU to decompress at higher levels (e.g. 6 through 9). It is possible (although unlikely) that you will get higher decompression speed, but only if the bottleneck wasn't the CPU to begin with (e.g. network or disk).

My gut feeling is that if you are pulling down data faster than 40 megabits per second and have a CPU made within the past 7 years (possibly including mobile), you generally won't be bottlenecked by I/O.


Most compression algorithms don't take more work to decompress at higher levels, and they actually perform better because there is less data to work through. Gzip consistently benchmarks better at decompression for higher compression levels.

It's not just about bottlenecks, but about the aggregate energy expenditure from millions of decompressions. On the whole, it can make a real, measurable difference. My point was only that it's not so cut and dried that it's a good trade-off to take a 5% file-size penalty for ~20% faster compression. You'd have to benchmark and actually estimate the total number of decompressions to find the tipping point.
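If anyone wants to test that on their own corpus, here is a quick sketch with Python's zlib (levels correspond to gzip's 1-9; the file path is a placeholder and the decompression count is arbitrary):

    import time
    import zlib

    def bench(data: bytes, level: int, decompressions: int = 1000) -> None:
        t0 = time.perf_counter()
        blob = zlib.compress(data, level)
        compress_s = time.perf_counter() - t0

        t0 = time.perf_counter()
        for _ in range(decompressions):
            zlib.decompress(blob)
        decompress_s = time.perf_counter() - t0

        print(f"level {level}: {len(blob)} bytes, "
              f"compress {compress_s:.3f}s, "
              f"{decompressions}x decompress {decompress_s:.3f}s")

    data = open("corpus.bin", "rb").read()  # placeholder test file
    for level in (1, 4, 6, 9):
        bench(data, level)

The tipping point is simply the number of decompressions at which the per-decompression savings outweigh the one-time extra compression cost.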


That is not the case with zstd.

According to this benchmark [1], zstd's decompression speed does not drop as the compression ratio increases; it stays at about the same level.

[1] https://www.truenas.com/community/threads/zstd-speed-ratio-b...
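It's easy to reproduce that with the zstandard Python bindings if you'd rather measure than take a forum thread's word for it (the level is purely a compressor-side knob; the decompressor doesn't take one):

    import time
    import zstandard as zstd  # pip install zstandard

    data = open("corpus.bin", "rb").read()  # placeholder test file
    dctx = zstd.ZstdDecompressor()

    for level in (3, 9, 19):  # 3 is zstd's default
        blob = zstd.ZstdCompressor(level=level).compress(data)
        t0 = time.perf_counter()
        for _ in range(100):
            dctx.decompress(blob)
        elapsed = time.perf_counter() - t0
        print(f"level {level}: {len(blob)} bytes, 100x decompress in {elapsed:.3f}s")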


> No one ever tweaks that one setting even though they should

That entirely depends on the use case. Most people running FFmpeg do it as a one-off thing, and if those people are like me, when I rip a movie I want the highest quality and lowest size I can get, and I'm happy that the default sacrifices speed for quality and size. The processing can be slow because I'm doing it only once. If you're in the business of encoding video and do it all day every day, your calculus will be different and you won't be using the defaults regardless.


I agree, but to be pedantic, the cost of storage may work out to be lower than the cost of energy to encode even in that use case.
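A rough back-of-envelope with made-up numbers (the wattage, electricity price, and storage cost below are all assumptions; plug in your own):

    # All inputs are assumptions for illustration only.
    extra_encode_hours = 8      # additional wall-clock time for the slower preset
    cpu_watts          = 150    # package power while encoding
    price_per_kwh      = 0.30   # local electricity price, $/kWh
    gb_saved           = 3.0    # file-size reduction from the slower preset
    price_per_gb       = 0.02   # bulk HDD capacity, $/GB, one-time

    energy_cost  = extra_encode_hours * cpu_watts / 1000 * price_per_kwh
    storage_cost = gb_saved * price_per_gb
    print(f"extra encode energy ≈ ${energy_cost:.2f}")   # ≈ $0.36
    print(f"storage it saves    ≈ ${storage_cost:.2f}")  # ≈ $0.06

With those inputs the extra encode energy costs several times more than the storage it saves, which supports the point above; different assumptions can flip it.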


That depends on the conditions under which the energy is used. In a place where you need extra energy for cooling, yes. In my home, no. I live in a cold place, so I need heating for the bigger part of the year, and I heat with electricity (which might be stupid, but that's how the house was built 30 years ago). So whatever energy my computer wastes, I save on my heating bill. Computer energy is effectively free, except during a few warm summer weeks.



