
That's a great step, but isn't it still not ideal because at 5k resolution, you could still get banding on a continuous gradient covering the screen without dithering?


Theoretically, yes — 10 bpp increases the number of distinct shades per channel from 256 to 1024. So on a 5K display at 10 bpp, a full-width gradient would still repeat each shade across five physical pixels.

In reality, no — the differences in brightness between each shade are going to be too subtle to notice.
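
To put rough numbers on the "theoretically" part (assuming a 5120-pixel-wide panel and a gradient spanning the full tonal range):

    width = 5120            # horizontal pixels on a typical 5K panel
    print(width / 2 ** 8)   # 20.0 px per band at 8 bpp
    print(width / 2 ** 10)  # 5.0 px per band at 10 bpp

Four times as many bands, each a quarter as wide, with a quarter of the brightness step between neighbours.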

Most egregious banding comes from 8-bit source material that gets manipulated or "colour managed" into an 8-bit output. If the source material were 10+ bits, the banding would be far less noticeable, even if the final output is still 8 bits.
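
A toy illustration of why (a hedged numpy sketch, not any real colour-management code — the 1.2x brightness tweak just stands in for whatever manipulation happens in between):

    import numpy as np

    # A subtle grey ramp across a 5K-wide frame.
    width = 5120
    ramp = np.linspace(0.40, 0.45, width)

    def process(ramp, src_levels):
        src = np.round(ramp * src_levels) / src_levels   # quantise the source material
        adjusted = np.clip(src * 1.2, 0.0, 1.0)          # stand-in for the manipulation
        return np.round(adjusted * 255).astype(int)      # final 8-bit output

    out8, out10 = process(ramp, 255), process(ramp, 1023)

    # The 8-bit source skips output codes, so some band edges jump by 2;
    # the 10-bit source never steps by more than 1 code.
    print(np.abs(np.diff(out8)).max())    # -> 2
    print(np.abs(np.diff(out10)).max())   # -> 1

Same 8-bit output either way, but with the 10-bit source every band edge is a single code high, which is about as gentle as an 8-bit edge can get.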


I find that the most significant banding comes from a large gradient between two very similar shades (e.g., a full-screen gradient that goes from grey to slightly-darker-grey).

I fear that 10 bpp would hide the banding from you, so your users on 8 bpp displays would see banding that you never saw and never got a chance to dither out with noise.
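
Something like this is what I mean by dithering it out (a minimal numpy sketch, plain uniform ±0.5 LSB noise, nothing fancy):

    import numpy as np

    # The same sort of subtle ramp, quantised to 8 bits with and without noise.
    width = 5120
    ramp = np.linspace(0.40, 0.45, width)

    hard = np.round(ramp * 255).astype(int)
    noise = np.random.default_rng(0).uniform(-0.5, 0.5, width) / 255
    dithered = np.round((ramp + noise) * 255).astype(int)

    def longest_run(x):
        # longest stretch of identical neighbouring pixels, i.e. widest band
        edges = np.flatnonzero(np.concatenate(([True], np.diff(x) != 0, [True])))
        return np.diff(edges).max()

    print(longest_run(hard))      # hundreds of pixels per band: visible steps
    print(longest_run(dithered))  # much shorter runs: edges dissolve into grain

On a 10 bpp display you might never notice you needed that step, because the undithered version already looks smooth to you.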


I notice this a lot with web media. Designers have really nice setups, so they have no problem designing sites with tiny thin gray text on gray backgrounds and little gray icons, not realizing that it looks like indecipherable mush on other people's hardware.



