Since it was linked to from Apple’s Hot News section, I read with interest Galbraith’s article on the color accuracy of the new MacBook Pros, even though I plan on skipping this iteration of the MacBook.
When I came across this article titled, “New MBP offers top display quality, but some beg to differ,” I thought, “Oh, Galbraith was wrong about something.”
Not everyone is satisfied with the MBP screens, however. Designer Louie Mantia of the Iconfactory has a bone to pick with the screen quality of his new 13″ unit; it’s sporting a 6-bit display, which has been an issue with color-sensitive professionals for years now.
No such luck, just some moron talking out of his ass. Ahh, the old canard about how their 8-bit panels are really 6-bit. Let me spare these people two bits of wisdom.
Bits to color
A pixel on your screen is actually composed of three subpixels: a red, a green, and a blue one. The shape and ordering of the subpixels was pretty much laid down in stone with the Sony Trinitron patent. If each of these subpixels receives 8 bits of digital information to set its intensity level, then the number of colors is 2^8 × 2^8 × 2^8 = 2^24 ≈ 16.8 million colors. But if you only have 6 bits, then you get only 2^6 × 2^6 × 2^6 = 262,144 colors.
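The arithmetic is simple enough to sketch in a few lines of Python (the function name is mine, just for illustration):

```python
def color_count(bits_per_subpixel):
    """Total colors a display can show given the bit depth of each subpixel."""
    levels = 2 ** bits_per_subpixel  # intensity levels per subpixel
    return levels ** 3               # red x green x blue combinations

print(color_count(8))  # → 16777216 ("16.8 million colors")
print(color_count(6))  # → 262144
```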
FYI, just so you don’t become part of the moronic commentariat with gems like, “Who can see a million colors anyway?”: the human eye seems to be able to distinguish around 10 million colors, so you’d certainly notice the difference.
Not good, right?
Why lose two bits?
The problem is that many LCD monitors are used to watch video or motion on the screen. With this 25ms time frame for transition from on to on states, pixels that should have transitioned to the new color levels trail the signal and result in an effect known as motion blurring…
Since consumers were demanding faster screens, something needed to be done to improve response times. To facilitate this, many manufacturers turned to reducing the number of levels each color pixel renders. This reduction in the number of intensity levels allows the response times to drop, but has the drawback of reducing the overall number of colors that can be rendered.
In other words, if you want animation or video not to look like ass on your LCD panel, then you have to go to 6-bits.
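To put a number on the ghosting: assuming a standard 60Hz refresh (the 60Hz is my assumption; the 25ms figure is from the quoted passage above), a rough back-of-the-envelope in Python:

```python
# Why a 25ms response time looks like ass: at 60Hz, each frame lasts
# about 16.7ms, so a pixel that needs 25ms to settle is still
# mid-transition when the next frame arrives.
frame_ms = 1000 / 60     # duration of one frame at 60Hz (assumed refresh)
response_ms = 25         # response time from the quoted passage

print(frame_ms)                # ≈ 16.67 ms per frame
print(response_ms / frame_ms)  # ≈ 1.5 frames of trailing per transition
```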
“Well I’m a professional X, I don’t need the smooth video.”
Whatever, monkey boy.
There’s a lot of animation in Mac OS X and ghosting is pretty jarring. I remember in the late ’80s when Apple introduced an “active matrix LCD” on a 16-pound “portable” and people would stare at a HyperCard animation of a rabbit for hours admiring how less-crappy-but-still-annoying the LCD was. *sigh*
But I digress.
The reality is that pretty much every manufacturer uses 6-bit LCD panels in their laptops. They all advertise 16 million color displays too.
But… two people sued Apple
They dropped the class action, not because us plebs lack their sooper-elite color-sensitivity skills. They dropped the lawsuit because the two people behind it were batshit insane and their lawyers found a nicer ambulance to chase.
Don’t believe the bullshit. These “professional” photographers who sued Apple also believe the next Apple laptop will have a 30″ touchscreen, weigh less than two pounds, cost less than $1000 and fit in their pants pockets. Fuck idiots like that.
Aside: When people say, “Wow, your photographs look professional!,” I have to ask, “What sort of professional?” Just to make sure they’re not mindfucking me. Because that’s exactly the sort of mindfuck I like to do…
I know Apple Legal is where the disembodied Lord Voldemort quietly plots his return, but I challenge people to notice the difference between 8-bit and 6-bit color.
It’s the dithering thing.
As Tweedledum pointed out in his post, the way you get 8-bit color out of 6-bit pixels is through a technique known as dithering. By alternating two different tones, you can achieve the average tone. Here is an example of how it works for light pixels: a red and a blue one make a magenta tone. I’ve explained this too many times before.
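A minimal sketch of that averaging trick, using the red-plus-blue example above (variable names are mine):

```python
# Dithering in a nutshell: two neighboring tones average out to an
# intermediate one. A red pixel next to a blue one reads as magenta.
red = (255, 0, 0)
blue = (0, 0, 255)

# Average each RGB channel of the two tones.
avg = tuple((a + b) // 2 for a, b in zip(red, blue))
print(avg)  # → (127, 0, 127), a magenta tone
```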
See, this icon designer knows this because in the old days of making icons on the crummy computers we had back then, that’s exactly how we’d make intermediate colors and shit.
The problem is that this dithering occurs, not in space, but in time. How fast?
The rate is around a couple hundred times per second. Theoretically, you could see it (your internal frame rate is about 24fps, but you can notice large changes that occur over 10ms). Dithering algorithms are quite advanced now, though, and improvements in display technology make it unnoticeable. Also, you’re trying to find dithering in the last digit, which is really hard.
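Here’s a rough sketch of how the temporal version of this (usually called frame-rate control, or FRC) could pick which levels to alternate. The mapping of 6-bit levels onto multiples of 4 on an 8-bit scale is my simplifying assumption, not how any particular panel actually does it:

```python
def frc_pattern(target_8bit, frames=4):
    """Return a sequence of 6-bit levels (expressed on an 8-bit scale)
    whose time-average approximates the 8-bit target level."""
    low = (target_8bit // 4) * 4   # nearest 6-bit level at or below target
    high = min(low + 4, 252)       # next 6-bit level up, clamped at the top
    # How many of the frames should flash the higher level.
    n_high = round((target_8bit - low) / 4 * frames)
    return [high] * n_high + [low] * (frames - n_high)

# 130 sits between the 6-bit levels 128 and 132, so FRC alternates them.
pattern = frc_pattern(130)
print(pattern)                      # → [132, 132, 128, 128]
print(sum(pattern) / len(pattern))  # → 130.0, the average hits the target
```

Flip those four frames at a couple hundred hertz and the eye integrates them into the in-between tone.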
You probably won’t notice it.
At minimum, if it bothers you, then you should have no trouble being driven up the wall by the flicker caused by CRT monitors as they refresh the display, because that’s both a larger gamut change and slower than a new laptop LCD panel’s dithering system.
The maxim “this only matters to professionals who work on graphics” is outdated, left over from when displays were slow, had shitty gamut, crappy backlights, and the dithering algorithms were ass.
The “truth in advertising” schtick conflates two uses of the same phrase: among LCD panel manufacturers, “16.7 million colors” means 8-bit subpixels, while computer vendors use “16 million colors” to mean a 6-bit display rendering 16.2 million color levels from a 24-bit graphics card. This is complicated by the fact that LCD manufacturers have started to use the “16.7 million color” terminology even for 6-bit displays!
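Where the “16.2 million” figure comes from: 6-bit-plus-dithering panels are commonly credited with about 253 usable levels per subpixel, versus 256 for a true 8-bit panel. (The 253 is the commonly quoted industry figure, not something from the lawsuit itself.)

```python
# 6-bit + dithering: roughly 253 usable levels per subpixel.
print(253 ** 3)  # → 16194277, i.e. "16.2 million colors"

# True 8-bit: the full 256 levels per subpixel.
print(256 ** 3)  # → 16777216, i.e. "16.7 million colors"
```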
How to look up the stats on your panel
So I typed…
brick:~ tychay$ ioreg -lw0 | grep IODisplayEDID | sed "/[^< ]*</s///" | xxd -p -r | strings -6
LP154WP3-TLA1
Color LCD
then googled that. This last year’s MacBook Pro has an LG Philips display: 6-bit, 16ms response time, 330cd/m² luminance, 800:1 contrast, 60/80 degree viewing angle.
6-bit???? Time to get my rage on. 🙂