Hacker News

krackers yesterday at 11:18 PM (4 replies)

> if the linear data is displayed directly, it will appear much darker than it should be.

This seems more like a limitation of monitors. If you had a very large bit depth, couldn't you just display images in linear light without gamma correction?


Replies

Sharlin today at 12:37 AM

No. It's about the shape of the curve. Human light intensity perception is not linear. You have to nonlinearize at some point in the pipeline, but yes, typically you should use high-resolution (>=16 bits per channel) linear color in calculations and apply the gamma curve just before display. The fact that this traditionally wasn't done, and that linear operations like blending were applied to nonlinear RGB values, resulted in ugly, dark, muddy bands of intermediate colors even in high-end applications like Photoshop.
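
(A minimal numpy sketch of the blending point, not from the comment itself; the two helper functions below are the standard piecewise sRGB transfer curves.)

    import numpy as np

    # Standard piecewise sRGB transfer functions.
    def srgb_to_linear(c):
        c = np.asarray(c, dtype=np.float64)
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(c):
        c = np.asarray(c, dtype=np.float64)
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

    red, green = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])

    # Blend the encoded values directly vs. blend in linear light and re-encode.
    naive = (red + green) / 2
    correct = linear_to_srgb((srgb_to_linear(red) + srgb_to_linear(green)) / 2)

    print(naive)    # [0.5 0.5 0. ]        -> the dark, muddy in-between color
    print(correct)  # ~[0.735 0.735 0. ]   -> noticeably brighter, like real light mixing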

AlotOfReading yesterday at 11:39 PM

Correction is useful for a bunch of different reasons, not all of them related to monitors. Even ISP pipelines with no display involved will usually do it, to allocate more bits to the highlights and shadows than to the relatively distinguishable middle of the range. Old CRTs did it because the electron gun had a non-linear response and the gamma curve actually linearized the output. Film processing and logarithmic CMOS sensors do it because the sensing medium has a nonlinear sensitivity to the light level.
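
(To make the bit-allocation point concrete, a rough sketch that counts how many 8-bit codes land in the darkest 1% of linear light; a simple 1/2.2 power stands in for a real encoding curve.)

    import numpy as np

    gamma = 1 / 2.2                            # power-law encode, stand-in for the sRGB curve
    linear = np.linspace(0.0, 1.0, 100001)     # dense sweep of linear light levels

    # Quantize to 8 bits two ways: directly, and after gamma encoding.
    codes_linear = np.round(linear * 255)
    codes_gamma = np.round((linear ** gamma) * 255)

    # Distinct 8-bit codes available for the darkest 1% of linear light:
    dark = linear <= 0.01
    print(len(np.unique(codes_linear[dark])))  # ~4 codes  -> shadow detail crushed
    print(len(np.unique(codes_gamma[dark])))   # ~32 codes -> much finer shadow steps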

tobyhinloopen today at 8:14 AM

The problem with their example is that you can display linear image data just fine, just not with JPEG. Mapping linear data onto a 0-255 RGB range that expects gamma-corrected values is just wrong. They could have used an image format that supports linear data, like JPEG-XL, AVIF, or HEIC. No conversion to 0-255 required; just throw in the data as-is.
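
(A rough sketch of the distinction, assuming a float array of linear-light values in [0, 1]; again a 1/2.2 power stands in for the exact sRGB encode.)

    import numpy as np

    # Toy "image": linear-light floats in [0, 1].
    linear = np.random.default_rng(0).random((4, 4, 3))

    # Wrong for an 8-bit sRGB container: viewers will decode these values as if
    # they were gamma-encoded, so the image renders far too dark.
    wrong = np.round(linear * 255).astype(np.uint8)

    # For an sRGB container, gamma-encode first; for a container that really
    # stores linear data, skip this and write the values at high bit depth as-is.
    right = np.round((linear ** (1 / 2.2)) * 255).astype(np.uint8)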

dheera today at 12:23 AM

If we're talking about a sunset, then we're talking about your monitor shooting out blinding, eye-hurting light wherever the sun is in the image. That wouldn't be very pleasant.
