I do prefer #000 over #FFF (or vice versa) and generally agree with your sentiment, but it's worth noting that small over-bright portions of a screen don't generally trigger the "seared eyeballs" sensation. There are scenarios where letting a scant 1% of a photo (for example) go over-bright buys much more range in the rest of the image. Personally, I'd prefer a display that simply matches the static contrast ratio of the human eye. I use ambient lighting to keep my pupils from dilating when viewing a dark screen. (Which has the side benefit of avoiding the dark-to-light transition that people hate so much.)
Unrelated fun idea for hardware engineers: develop a GPU that outputs each frame's average brightness ahead of scanout. That would let the monitor (or the GPU itself, though then you'd suffer quantization issues) slowly ramp during dark-to-bright transitions. Consumers would love the feature: people like to stare at devices in dark rooms (in bed, etc.) but melt their eyeballs every time they navigate from a dark page to a light one. This would prevent that pain.
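A minimal sketch of what that ramp might look like in software, assuming the average-luminance signal is available per frame. Every name and constant here (`RampingBacklight`, the 0.05 step) is an illustrative assumption, not an existing API:

```python
# Hypothetical sketch of the "slow ramp" idea: something upstream knows each
# frame's average luminance before scanout, and the backlight is slew-rate
# limited so a dark-to-bright page transition fades in instead of flashing.

def frame_avg_luminance(pixels):
    """Average luminance of a frame; pixels are 0.0-1.0 grey values."""
    return sum(pixels) / len(pixels)

class RampingBacklight:
    def __init__(self, max_step_per_frame=0.05):
        self.level = 0.0                    # current backlight, 0.0-1.0
        self.max_step = max_step_per_frame  # upward slew limit per frame

    def update(self, target):
        """Move toward the frame's average luminance, but ramp up slowly."""
        delta = target - self.level
        # Limit how fast we brighten; dimming is immediate, since a sudden
        # darkening doesn't hurt the way a sudden flash of white does.
        if delta > self.max_step:
            delta = self.max_step
        self.level += delta
        return self.level
```

At 60 Hz a 0.05-per-frame step turns an instant black-to-white jump into a roughly one-third-second fade, which is the whole point.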
> There are scenarios where allowing a scant 1% of a photo (for example) to be over-bright means much more range in the rest of the image
I think that is why HDR was invented in the first place. SDR was never meant to represent "the sun shining directly on you." In an HDR setting, the SDR #fff is called "paper white." Looking at paper shouldn't hurt your eyes in reality, and neither should it on screen.