spaceducks · yesterday at 8:15 PM

> Nonsense. It has a lossy mode (which is its primary mode so to speak), so of course it has banding. Only lossless codecs can plausibly be claimed to be "immune to banding".

color banding is not a result of lossy compression*; it results from not having enough precision in the color channels to represent slow gradients. VarDCT, JPEG XL's lossy mode, encodes values as 32-bit floats. in fact, image bit depth in VarDCT is just a single value telling the decoder what bit depth to output, not what bit depth the image is encoded at internally. optionally, the decoder can even blue-noise dither the output for you if the image's bit depth is higher than your display or software supports
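
to make that output stage concrete, here's a toy sketch in python (the emit function and its parameters are made up for illustration, and plain white noise stands in for the blue-noise mask a real decoder like libjxl uses):

    import numpy as np

    # toy decoder output stage: internal samples are floats in [0, 1];
    # "bits" is only the depth the caller asked the decoder for
    def emit(samples, bits, dither=False):
        scale = (1 << bits) - 1
        v = samples * scale
        if dither:
            # white noise as a stand-in for a proper blue-noise mask
            v = v + np.random.uniform(-0.5, 0.5, samples.shape)
        return np.clip(np.round(v), 0, scale).astype(np.uint16)

    grad = np.linspace(0.40, 0.45, 4096)   # slow gradient, kept as floats
    print(len(np.unique(emit(grad, 16))))  # ~3300 levels: smooth
    print(len(np.unique(emit(grad, 8))))   # 14 levels: visible bands
    # still 8-bit, but dithering smears the hard band edges into noise
    print(len(np.unique(emit(grad, 8, dither=True))))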

this is more than enough precision to prevent any color banding (assuming, of course, that the source data encoded into the JXL didn't have banding already). if you still want more precision for whatever reason, the spec simply defines the values in the XYB color channels as real numbers between 0 and 1, and the header supports signaling an internal depth of up to 64 bits per channel
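
quick back-of-envelope on that, nothing JXL-specific, just float32 arithmetic:

    import numpy as np

    # worst-case float32 step anywhere in [0, 1] sits just below 1.0;
    # compare it with the step size of an 8-bit channel
    print(np.spacing(np.float32(0.9999999)))  # ~6e-8, i.e. ~2**24 levels
    print(1 / 255)                            # ~3.9e-3, i.e. 256 levels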

* technically color banding could result from "lossy compression" if high-bit-depth values are quantized to lower bit depths. with sophisticated compression, however, higher bit depths often compress better, because transitions are less harsh and therefore need fewer high-frequency coefficients to represent. even in lossless images, slow gradients can compress better at high bit depth, because frequent, consistent changes in pixel values are easier to predict than sudden occasional ones (like the jump from one color band to the next)
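
a small sketch of that prediction argument, assuming a trivial previous-sample predictor and an ideal entropy coder (real lossless codecs use far smarter predictors, but the effect points the same way):

    import numpy as np

    def residual_entropy(x):
        # bits per sample an ideal coder needs for the prediction
        # residuals of a previous-sample predictor
        d = np.diff(x.astype(np.int64))
        _, counts = np.unique(d, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    ramp = np.linspace(0.0, 1.0, 1 << 16)
    smooth16 = np.round(ramp * 65535).astype(np.uint16)  # smooth 16-bit ramp
    banded8  = np.round(ramp * 255).astype(np.uint8)     # same ramp at 8 bits

    print(residual_entropy(smooth16))  # ~0.0: every step is exactly +1
    print(residual_entropy(banded8))   # ~0.04: must code where band edges fall

the smooth 16-bit ramp codes to essentially nothing, while the banded 8-bit version still has to spend bits on where each band edge falls, despite carrying less information per sample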