Hacker News

We are talking about decompression speed and not encryption. Decompression is necessarily deterministic.


The compression speed is also an issue for developers. In many cases the compression step takes longer than the rest of the build.


Maybe the point is that the compressed package can change every time, which is a problem for the reproducible-builds approach many distros are now using. Though I'm not sure why parallelized xz can't behave in a predictable fashion.


No, I mean you don’t need to compress in parallel. Compression speed doesn’t matter here, and single-threaded compression is compatible with both single- and multi-threaded decompression.


Compression speed can matter in general (to improve build times).

For xz, you need to compress with chunking (and perhaps indexing for more benefit) to allow parallel decompression in the first place. Otherwise xz produces a single blob that you can't split into independent parts during decompression, which makes using many decompression threads pointless.

But yes, if parallel compression is the source of the non-determinism, you can do all the compression work with chunking but without parallelism, still allowing parallel decompression. Though I'm not sure why it has to create non-determinism in the first place.
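For what it's worth, this is roughly what the xz CLI already supports: a sketch (assuming xz-utils is installed, and using a hypothetical /tmp/demo.txt input) of single-threaded compression with explicit block splitting, so the output is produced deterministically by one thread but can still be decompressed with many:

```shell
# Create a small hypothetical input file for the demo.
printf 'hello xz blocks\n' > /tmp/demo.txt

# -T1: one compression thread (output produced sequentially);
# --block-size=1MiB: split the stream into independent blocks,
# recorded in the .xz index, which is what lets a decompressor
# work on blocks in parallel later.
xz -T1 --block-size=1MiB --keep --force /tmp/demo.txt

# Parallel decompression (-T0 = use all cores; trivial for one
# block here, but it scales with larger multi-block inputs).
xz -d -T0 --keep --force --stdout /tmp/demo.txt.xz
```

Whether the compressed bytes are also bit-identical across different xz versions and settings is a separate question that reproducible-builds efforts still have to pin down.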



