Is there a compression level which will provide an optimal balance between file size reduction and client decompression time/cost?
I was wondering which GZIP compression level provides the best balance between file size reduction and decompression time. Compression time doesn't really matter, since each file will only be compressed once and then cached. However, I'm concerned that if I set the compression level too high, any gains from the smaller file size will be wiped out by the time and cost required to decompress the file on the client side.
The compression level has little to no effect on decompression time. The tradeoff is entirely on the compression side: higher levels spend more time and memory searching for the shortest way to encode the input data. Higher compression is not extra layers of processing applied on both ends (stacking passes like that would be a good way to make the data larger, not smaller).
Once the data is compressed, decompressing it is just following the instructions in the compressed file; there's no guesswork, and a better-compressed file might well be faster to decompress since less input data needs to be read.
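You can sanity-check this yourself. Here's a rough sketch using Python's standard `zlib` module (which implements the same DEFLATE algorithm gzip uses); the payload and iteration counts are arbitrary choices for illustration:

```python
import time
import zlib

# Repetitive sample payload, roughly like a typical text/HTML/JS asset.
data = b"The quick brown fox jumps over the lazy dog. " * 20000

for level in (1, 6, 9):
    compressed = zlib.compress(data, level)
    # Time decompression only; compression happens once and is cached.
    start = time.perf_counter()
    for _ in range(50):
        assert zlib.decompress(compressed) == data
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(compressed):>8} bytes, "
          f"50 decompressions in {elapsed:.3f}s")
```

In runs like this you'll typically see the compressed size shrink as the level rises while decompression time stays flat or even drops slightly, since less input has to be read and inflated.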
- You're right - for some reason I had the idea in my mind that it would take longer to decompress. It's actually faster to decompress a file compressed with a higher setting (with gzip): tukaani.org/lzma/benchmarks.html - Thanks for clearing it up!