> You should avoid using extremes like #000000 and #ffffff
If it's too much contrast for you, maybe your screen is not set up correctly (too bright), or you don't zoom enough. Or your room is not correctly lit. Black on white should be comfortable.
Don't force me to squint, or to crank up brightness (drawing more power) to compensate for poor contrast.
People are fine with office suites despite (thanks to?) them defaulting to black on white. We would have known by now if it was an issue.
I disagree. If you have a screen that is capable of showing high enough contrast that you can have true highlights and true blacks, then why should you make image quality worse just because someone else is unable to reproduce that?
I guess the real problem is that there's no real standard for what #fff means – it has to be interpreted through a colour space, which in turn does not talk about brightness, I think.
Unless you are running an OLED, reproducing #000 is really hard anyway. Just like unless you have e-ink, actively lit #fff is going to be basically a light bulb.
Reproducing #ffffff and #000000 is quite possible on higher-end monitors that have HDR support and full-array local dimming or a dual-layer LCD. While these monitors shine in HDR mode, most of them can also deliver an excellent SDR image.
> So, because I'm poor and can't afford such a monitor/screen,
Moving away from #fff/#000 can be done in such a way that it looks excellent on better monitors and is usable, or at the very least not entirely unpleasant to read, on less capable ones.
> I will have a harder time reading your site
It not looking as nice for you is fine, as long as it is readable. If we held everything back to make sure it worked on all screens, there would be a lot more plain-text-only sites/apps out there. A perfectly good option IMO, but try getting the general screen-using public to agree with that!
Even with a bad screen there are usually adjustments you can make to help.
There are of course many pages out there that are too low-contrast, or otherwise unpleasant, unless you are using a particularly good monitor set up in a particular way, for which…
> or just give up and go somewhere else.
… is a perfectly valid option. You don't owe them your attention, though likewise they generally don't owe you anything either.
How about you modify your screen's gamma, calibrate it or install an extension to increase contrast? Why do I have to have my eyes blasted with insanely high-contrast text because my device follows the specs better?
> I use a color-calibrated screen and #000000 to #ffffff is a painful amount of contrast.
This seems highly unlikely if your brightness isn't completely off. What cd/m2 are you at? I'm on an Eizo CG277 with built-in hardware self-calibration. It's set to 120 cd/m2 and I have a hard time thinking it could be painful to anyone. I much prefer #000 on #fff over lower-contrast alternatives.
In pre-press and photography we're used to ISO 3664:2009[0], which I believe establishes a whitepoint brightness level between 80 and 120 cd/m2. It should be added that it requires isolation from daylight and an ambient illumination level near 80 lux (and a good monitor)... But even at 150 cd/m2 in a room with controlled amounts of daylight, you should not be blinded by #000 on #fff.
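For a sense of what those luminance figures mean in contrast terms, here's a tiny sketch; the black level is an assumed typical-LCD figure, not something the standard specifies:

```python
# Simultaneous contrast implied by absolute luminances.
# 120 cd/m2 whitepoint: top of the ISO 3664 range quoted above.
# 0.3 cd/m2 black level: an assumed figure for a typical LCD panel,
# not part of the standard.
white_luminance = 120.0  # cd/m2
black_luminance = 0.3    # cd/m2

print(f"contrast {white_luminance / black_luminance:.0f}:1")  # contrast 400:1
```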
My screen is fine, probably as good as or better than what most people have. We can't optimize for the almost 0% of people who have calibrated screens to the detriment of the big majority of people.
The brightness of a screen is also well-defined in the respective DIN norms for workspaces, which I’m following exactly.
Those norms also set a limit for the maximum amount of contrast allowed at a workspace, which #000000 on #ffffff violates if it’s shown on a color-calibrated screen with standard brightness.
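I can't speak to the exact metric the DIN norms use, but the usual web-side measure is the WCAG contrast ratio; a small sketch of that formula, purely for illustration:

```python
def srgb_to_linear(channel_8bit: int) -> float:
    """Undo the sRGB transfer curve for one 8-bit channel (WCAG 2.x formula)."""
    c = channel_8bit / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    """WCAG relative luminance of an sRGB color given as 'rrggbb'."""
    r, g, b = (srgb_to_linear(int(hex_color[i:i + 2], 16)) for i in (0, 2, 4))
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio, always >= 1."""
    brighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (brighter + 0.05) / (darker + 0.05)

print(contrast_ratio("000000", "ffffff"))  # 21.0 -- the maximum possible in sRGB
print(contrast_ratio("222222", "eeeeee"))  # ~13.7 -- softer, still above the 7:1 AAA floor
```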
> We can't optimize for the almost 0% of people who have calibrated screens to the detriment of the big majority of people.
We can’t just change what definitions and terms mean just because it’s easier than making everyone follow the standard.
Create a color profile for your monitor, so your OS can compress contrast if your monitor isn’t powerful enough to reach the SDR sRGB specs.
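Roughly speaking, that compensation amounts to the following. This is a toy linear squeeze with arbitrary endpoints, not what a real ICC/color-management pipeline does:

```python
def compress_contrast(v: float, out_black: float = 0.05, out_white: float = 0.95) -> float:
    """Map a 0..1 SDR value into a narrower output range: the crude
    equivalent of a profile compensating for a panel that can't reach
    full sRGB contrast. The 0.05/0.95 endpoints are illustration values."""
    return out_black + v * (out_white - out_black)

print(compress_contrast(0.0), compress_contrast(1.0))  # 0.05 0.95: extremes pulled inward
```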
To help with "making everyone follow the standard", why don't you start by describing what those "standard" screen settings are, instead of only arguing from authority?
Still, any standard prescribing "standard brightness" for office environments is bound to be nothing but a decent average or default. Even if there were a biologically optimal display luminance, it would still depend on the environmental lighting (which changes during the day if there's a single window in the room), on the state of the viewer's eyes (age, any visual problems, or even temporary issues like tiredness), on the type of display...
> Still, any standard prescribing "standard brightness" for office environments is bound to be nothing but a decent average or default
The purpose of a standard isn't to be most preferred by any given person or "biologically optimal", it's to be...standard. Without a standard, a designer has no ability to pick a particular color and expect anything like that color to be what the end user sees. Without a standard, there's almost no point in discussing what text colors websites are or aren't using.
You might be taking this to an extreme. There are standard definitions of colours, and we've got reasonable ways to compare screens side by side to see how many colours they can reproduce. While humans can differentiate between a large number of colours, we usually think of them only in terms of named ones (and no, I am not talking of "aubergine").
The GP was talking about following whatever the "DIN norms for workspaces" say about screen brightness, so I don't see how that relates to colour calibration.
You also sound as if any one end user really cares whether they are looking at exactly the right hue of blue when looking at the "hp" logo. If anything, the company and their marketing department care (for subliminal messaging?), but users really don't. I don't think anyone could pick McDonald's red or yellow out of a table of similar red/yellow hues.
But even so, if we were looking at any non-light-emitting content displaying colour (eg. a printout), it will look different to your eye depending on whether you are viewing it in dim light or in a brightly lit setting, so the designer can't do anything about that.
Anyone who knows even the most basic stuff about colour perception understands that it can't be decoupled from your environment (enter a darkroom and every colour is suddenly... black).
Colour calibration and standards are done so that you can guarantee that some things will look identical when displayed in exactly the same environment (so your two screens sitting next to each other show the same hues) or to reproducibly transfer between different mediums (eg. screen and paper), but there is no way you can guarantee that it will look identical for another user in a different environment.
Colour calibration is of limited use in general (eg. even different types of paper [think glossy vs matte] will produce different visual results), and while there's more stability with light-emitting mediums (like LCD screens), because you essentially control some of the lighting as well, that breaks down for the average user, because they do need to adjust the brightness according to their environment. Those who absolutely require reproducibility control both their environment and their equipment.
Indeed, it reads like that, but the visually impaired minority is actually mostly part of the majority of people without calibrated screens I was mentioning. So I did include them.
I'm sure people with calibrated screens can find a setting where black on white is comfortable, even temporarily, while most people just can't fix contrast that was lowered at the source. That includes people who aren't visually impaired but are in less-than-ideal environments.
But that’s the wrong approach. You can always tonemap low-contrast values into a higher-contrast color range, but not the other way around. Once you have something at #000000, it’s clipped; there’s no way to recover any distinction between those values, and the same goes for #ffffff.
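A toy illustration of that asymmetry (real tonemapping uses smooth curves rather than this linear stretch, but the clipping problem is the same):

```python
def clip(v: float) -> float:
    """Hard clipping: everything outside 0..1 collapses onto the endpoints."""
    return max(0.0, min(1.0, v))

def expand(v: float, low: float = 0.1, high: float = 0.9) -> float:
    """Stretch a low-contrast range [low, high] to the full 0..1 range.
    Invertible: distinct inputs stay distinct."""
    return (v - low) / (high - low)

# Expanding low-contrast values keeps them distinguishable...
print(expand(0.10), expand(0.12))  # 0.0, ~0.025
# ...but values already clipped to #000 are irrecoverably identical.
print(clip(-0.02), clip(0.0))      # 0.0, 0.0
```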
That’s the real issue, isn’t it? The tooling for this is shitty. You can create an accurate color profile for your monitor, apply it in software and your OS will automatically map the values correctly. This will improve contrast quite a bit, but is complicated, expensive, and you’ll lose a lot of definition in other brightness ranges.
But the same issues apply in reverse if you try to reduce contrast because some idiot thought a 200:1 contrast for text was healthy.
"Screen configuration" means more things. You can optimize for readability, or you can optimize for faithfulness (and more). You should switch according to use.
I don't think this is entirely right, although it sounds logical at first. Modern displays can be extremely powerful in terms of backlight intensity and contrast; you realize that when you switch on an average model from years ago. They need that power so users can watch videos with intentionally dark scenes and still perceive details in less-than-ideal situations. It does not automatically follow that the maximum contrast is intended or ideal for other purposes, such as reading text.
I might add that office-type programs were developed at a time when 14-to-18-inch color CRTs were the latest and greatest. The image that modern flat screens deliver is much more dynamic and has far sharper edges, to the degree that maybe we should consider blurring the entire display ever so slightly, just so the higher Fourier frequencies get a bit less dominant.
Then unleash the full brightness when watching videos, and scale it back when reading text.
Until we get HDR on the web, one sRGB size must fit all. At this point, accessibility and readability are top priority. You can always tone down your backlight, while someone else might have it at full blast and still not be able to read without squinting in the sunlight on a cheap display. I know who I'm more sympathetic to.
You can't just tone down your backlight without having to measure the color profile of your monitor again, which takes hours.
As I’ve said countless times before, we’ve got OS support for tonemapping HDR content onto SDR displays, now we need OS support for tonemapping SDR content onto shitty displays.
I know nothing about color profiles, I'm surprised you can't just turn down brightness to read something comfortably and revert to the original level when you need accurate colors again. Or you want accurate colors all the time? But what for, if you are just reading an average document?
I don't need accurate colors when reading text, but I need it to be comfortable. It seems having color calibration would actually suck for me if I can't adjust brightness depending on the weather, on the hour of the day, on the season, on whether I'm tired or not, or on the location I currently am, or on what I'm doing (reading, programming, or watching a video). I need to be able to adjust brightness without fuss.
I guess I would buy dedicated hardware and put it in a light-controlled room if I needed accurate colors.
> I know nothing about color profiles, I'm surprised you can't just turn down brightness to read something comfortably and revert to the original level when you need accurate colors again. Or you want accurate colors all the time? But what for, if you are just reading an average document?
I want accurate colors because setting it up is a pain, and switching between accurate and inaccurate colors takes your eyes quite a while to readjust.
It’s much easier to just have everything in an accurate color mode and have content designed for that than to switch it around.
Proper color profiles are actually even more of a benefit for cheap displays than for good ones, since cheap displays deviate further from accurate color to begin with.
Imagine what colors devs would choose if, instead of defining white as "the brightest thing any given monitor can spit out", they could just specify an absolute luminance (e.g. in cd/m2) and the screen tried to actually reach or maintain that value.
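Nothing like this exists in CSS or any OS API today, so the following is purely hypothetical, but a sketch of what "ask for an absolute luminance" could mean on the display side (assuming a simple gamma-2.2 panel):

```python
def drive_level(requested_cd_m2: float, panel_peak_cd_m2: float, gamma: float = 2.2) -> float:
    """Hypothetical mapping from a requested absolute luminance to a
    0..1 panel drive signal. Assumes a simple gamma-2.2 display model;
    no real web or OS API works this way today."""
    linear = min(requested_cd_m2, panel_peak_cd_m2) / panel_peak_cd_m2
    return linear ** (1.0 / gamma)

# A "paper white" request of 120 cd/m2 on two different panels:
print(drive_level(120, panel_peak_cd_m2=300))  # ~0.66: white well below the panel's max
print(drive_level(120, panel_peak_cd_m2=100))  # 1.0: the panel simply can't reach it
```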
The issue with color on screens is a bit like the loudness wars in audio mixing: everybody tries to use the available bitspace to the max, because this made some sense in the days of 16 bit (or lower) audio. Nowadays much more dynamic delivery would be possible (with dynamic reduction as a user preference), yet there is no real, well designed system for such things in place.
The root cause could be that not all displays are capable of proper contrast in dark mode while set to a brightness that is also comfortable for displaying expanses of white.
You would wear out the brightness controls in no time unless you go out of your way to mandate dark mode on everything.
Another factor might be the dark adaptation of the iris, which could be helped by desk lights.
I have never owned a monitor or phone without a brightness control. I challenge you to find any screen capable of displaying a paragraph that does not have a brightness adjustment. Any form factor (other than backlight-free stuff like ePaper and cheap calculators) will do, as long as it had top 100 market share in its segment at some point in this century.
(Brightness is the adjustment that decreases the max white level without also making blacks brighter. So, a display that only has a contrast knob would count as missing a brightness adjustment.)
In my experience with cost-optimized low-end LCD displays, changing brightness settings will more or less alter absolute contrast at the lower end as well.
Even if the ratio stays the same, the darker shades suddenly are a lot more distinct as they just aren't consumed by artifacts and color shifts or even just glare anymore.