Retinal burn

I suppose musings like this are very common among Apple haters. Basically the complaint boils down to:

“325dpi? Bah! Even a 1986-era laser printer does 300dpi and my newspaper does at least 600dpi. Until you get there, the print is smudgy and causes eye-strain.”

Apple - iPhone 4 - Learn about the high-resolution Retina display

The facts in Apple’s advertising blurb are 100% correct. If you have a beef, don’t take the advertising head-on. The whole thing is essentially a misdirection in all but a few cases.

What a crock of shit.

First of all, hats off to Apple marketing for coming up with the name “retinal display.” It really sounds very high-tech—or, biotech.

Second of all, it isn’t 325dpi, it’s 325ppi. There are 3 subpixels in a single full-color dot. With subpixel rendering the two can be treated as equivalent, but remember that there will be color fringing, which in certain cases is noticeable enough to make the distinction matter. For instance, you might not notice the pixelation in a photo of a natural scene, but on a diagonal line or very small text, you will.

What is the significance of 325ppi? Pretty much that it’s a bigger number than 300dpi. And what is 300dpi? It’s a benchmark from the print world.

As I’ve explained ad nauseam, 300dpi is a well-known number in printing circles. It’s the acuity limit of most eyes at a viewing distance of 1 foot. This is physics and biology here.

What is going on is that your eye is a digital sensor in that it has individual elements (cells called photoreceptors) that sample light at discrete points in space. The closer packed these photoreceptors are, the more detail you can pick out. Now, biology is no slouch, so in the center of your eye these things are packed really close. They’d be packed even closer except for one thing: light is a wave, and its wave nature will dominate at a certain size scale when it passes through your pupil—this is what is meant by the word diffraction. Your eye’s biology is that good: it nearly reaches the acuity limit dictated by this law of physics.

That’s the limit—you can’t do better than that without processing tricks and still be human.

This means that if the dots are close enough, or your viewing distance is far enough away, you aren’t going to see the dots.

At a viewing distance of 1 foot, that’s around 300dpi. Printers know this number well.
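If you want to check the arithmetic yourself, here’s a rough back-of-the-envelope sketch in Python (it assumes the conventional 20/20 figure of about one arcminute of resolvable detail):

```python
import math

# 20/20 vision is conventionally taken as resolving about 1 arcminute of detail.
one_arcminute = math.radians(1 / 60)

# At a 12-inch viewing distance, one arcminute subtends roughly this many inches:
viewing_distance = 12  # inches
smallest_dot = viewing_distance * math.tan(one_arcminute)

print(f"smallest resolvable dot: {smallest_dot:.5f} in")  # ~0.00349 in
print(f"dots per inch: {1 / smallest_dot:.0f}")           # ~286, i.e. roughly 300
```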

Actually, it’s worse

In reality, most people’s viewing conditions and acuity are much worse than the computed value. Some people are farsighted or very nearsighted or don’t have perfectly corrected optics. Some people squint (which increases the diffraction). The lighting isn’t perfect, there may be reflections and refraction, and the image may be distorted. Or maybe it’s just too dark and those acute cones in the center of your eye just can’t get enough data to make a decision. The dots in the test may not have enough contrast. Some people simply don’t have the time to stare at things forever looking for differences in detail. That’s why the megapixel myth is a “myth” and why you can do tests like this to prove it.

About that viewing distance

And really, is your average iPhone usage viewing distance 1 foot anyway?

Let’s put it another way. Do you notice the pixelation on your color monitor right now? Can you see the individual pixels that form the font in this text? I can’t on my computer as I type this. Well, if it’s a desktop LCD panel like mine, you’re only looking at around 80ppi. My viewing distance is about 3 feet.

Perhaps you are reading this on a laptop. The resolution is much higher (up to 130ppi in some cases) and you are a little closer than three feet away. But you still can’t see the pixels.

Now think of your High Definition TV set. If it’s anything like mine, you have less than 30ppi when it’s showing a 1080p Blu-Ray, but it appears sharp as a tack. And, can you even see the difference between 720p and 1080p from your couch? I can’t.

After a while you start to notice that the dpi of your displays and the typical viewing distance form a relationship.
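Here’s a rough sketch of that relationship, using the same one-arcminute rule of thumb (the distances are just ballpark figures for the examples above):

```python
import math

def acuity_limit_ppi(distance_inches, arcminutes=1.0):
    """Pixel density beyond which a 20/20 eye stops resolving individual pixels."""
    return 1 / (distance_inches * math.tan(math.radians(arcminutes / 60)))

for label, inches in [("phone at 1 ft", 12), ("laptop at 2 ft", 24),
                      ("desktop at 3 ft", 36), ("HDTV at 10 ft", 120)]:
    print(f"{label}: ~{acuity_limit_ppi(inches):.0f} ppi")
# phone at 1 ft: ~286 ppi, laptop at 2 ft: ~143 ppi,
# desktop at 3 ft: ~95 ppi, HDTV at 10 ft: ~29 ppi
```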

This is why I think the laptop example is the best one. That’s about the viewing distance at which I use my phone most of the time, which means 140dpi should be more than enough. The best complaint one can make about the Retinal Display is not that it isn’t as good as print, but that you probably won’t notice the difference unless you’re looking for it. Eyestrain relief may be minimal, or may only show up after you’ve finished reading War and Peace on your iPhone.

Why we like print

Print is inefficient; newsprint is even worse. In a newspaper, you’d be lucky to get 86ppi effective resolution on the photos, and in many cases it’s much worse because ink is messy and smears and the paper it’s printed on is barely fit to line your birdcage with.

Not only that, there is more to image quality than acuity: there is also contrast and color. Print doesn’t do well in either regard, because its contrast is passive (it depends on how bright the ambient light is and how white the paper is; newsprint is not very white because white paper costs a lot) and ink doesn’t have a large gamut.

600dpi? Puh-leez. Any mention of dpi or ppi for “newsprint resolution” is the same old canard of using digital measures to misrepresent analog as better simply because it’s analog. The effective resolution of most newspapers (for print) isn’t any better than my first laser printer: a 300dpi LaserWriter in 1985.

The big difference with eyestrain is obvious if you’ve ever owned a Kindle. It’s active lighting vs. passive lighting. All the displays we’ve talked about are backlit—a form of active lighting. Newspapers are passively lit, which is very different.

Think of it this way: merge the red, green, and blue lights of your display together and what color do you get? White. But merge cyan, magenta, and yellow paint on paper together and what color do you get? Black. It’s additive vs. subtractive color.
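Here’s a toy illustration of the two models (just a sketch; real inks and phosphors are nowhere near these ideal primaries):

```python
# Additive: start from darkness and add the light of the red, green, and blue subpixels.
red, green, blue = (255, 0, 0), (0, 255, 0), (0, 0, 255)
white = tuple(sum(channel) for channel in zip(red, green, blue))
print(white)  # (255, 255, 255): all three lights together make white

# Subtractive: start from white paper and let each ink absorb part of the light.
paper = (255, 255, 255)
for ink in [(0, 255, 255), (255, 0, 255), (255, 255, 0)]:  # cyan, magenta, yellow
    paper = tuple(min(p, i) for p, i in zip(paper, ink))
print(paper)  # (0, 0, 0): all three inks together make black
```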

The light that gives newsprint its contrast is the ambient light itself! The light needed behind your computer’s LCD to provide adequate contrast is much brighter than the ambient light. Don’t believe me? Take a photo of your desk with your computer on. You’ll notice that the monitor is blown out to pure highlights. Or think of it this way: the blackest black on your monitor is no blacker than your monitor when it’s off (it can’t be), but it appears much blacker!

Your eye is constantly scanning, and your iris is instinctively adjusting for the local brightness of what you’re looking at: as your gaze shifts from your monitor to your office, the muscles in your iris open your pupil; as it shifts from your office back to work, those muscles stop down. Doing that all day is tiring.

For book print and eInk, the ambient light and the light providing contrast are the same light. Your iris doesn’t need to adjust and you have no problem. A corollary is that you’ll have the exact same eyestrain with an itty-bitty booklight on a Kindle (or a paper book) at night as you would with an iPad at night. Factor in the inconvenience of those booklights and you start to see why their sales seem inextricably tied to Christmas—when you’re giving them to someone else.


Kindle 2 cover and MightyBright
North Beach, San Francisco, California

Nikon D3, Nikkor 24-70mm f/2.8G
1/60sec @ ƒ3.2, ISO2200, 29mm

I lent my MightyBright to someone and they lost it. I wasn’t in a rush to replace it.

In summary, the argument against the new display should be: you’re not going to notice the difference (unless you look for it), so why didn’t they put in an OLED display and give us even more battery life instead? Or just sell an option without the new display.

Aside: The Kindle

Yesterday, I asked Matt if he still uses his Kindle now that he has an iPad. He said he does, because he reads a lot outside. The iPad has no contrast outside. (You’ll also notice the same contrast problem with your computer in an office with a lot of window lighting in morning or evening light because of the glare or reflections. Or try to use your laptop outdoors.)

As for me, my landlord was amused to find out I’m hardly using my Kindle anymore (I still use the Kindle app). I am surprised at how great the iPad’s battery life is and how important fast pagination on a non-eInk display is for reference material. If my New Yorker subscription were available on the Kindle app, I’d seriously consider getting rid of my Kindle 2. Well, that and an ability to export my marks as text would be nice. 🙂

Aside: OLED

At lunch yesterday, Mike mentioned that he was surprised that Apple didn’t go with OLED because of the longer battery life.

I said that, while it should halve the display’s battery drain in theory, in practice it isn’t that great, so Apple probably opted for the thing that’s going to sell better and be unique (resolution) over something as nebulous as that.

He said that in testing, he found that the display eats more power than nearly anything else on the phone, and that OLED-equipped Android phones lasted much longer on a single charge. I postulated that they may just have a bigger battery in them because OLED technology hadn’t yet reached that point of efficiency.

In any case, worried that I had been spewing bullshit, I looked it up:

While an OLED will consume around 40% of the power of an LCD displaying an image which is primarily black, for the majority of images, it will consume 60–80% of the power of an LCD – however it can use over three times as much power to display an image with a white background such as a document or website. This can lead to disappointing real-world battery life in mobile devices.

(Clearly, there will be a point where Apple will switch, but they’re not there yet.)
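To get a feel for what those percentages mean over a day of use, here’s an illustrative sketch. The power ratios come straight from the quote above; the usage mix is entirely made up:

```python
# Relative OLED power draw as a fraction of LCD power (illustrative figures from the quote).
oled_vs_lcd = {"mostly black": 0.4, "typical image": 0.7, "white page (text/web)": 3.0}

# Hypothetical fraction of screen-on time spent on each kind of content.
usage_mix = {"mostly black": 0.1, "typical image": 0.4, "white page (text/web)": 0.5}

relative_power = sum(oled_vs_lcd[k] * usage_mix[k] for k in usage_mix)
print(f"OLED uses ~{relative_power:.1f}x the LCD's power for this mix")  # ~1.8x
```

Skew the mix toward reading and browsing (lots of white) and the OLED ends up worse than the LCD, which is exactly the “disappointing real-world battery life” the quote is talking about.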

Update:

In talking to friends, I realized that I left a lot of things for the reader to infer that I should have summarized better:

Most of the time, you won’t notice the difference between the new display and the old one. The exceptions are

  • You have the two side by side. In that case you’ll naturally pull up very close to both to make the comparison.
  • You are examining details in a photo or artwork. In that case people often bring the device very close, even after they’ve pinch-zoomed.
  • You are reading a book on the iPhone. In that case your viewing distance will be about 1 foot.
  • You are texting. Some people text at about a foot and a half, which is just close enough to notice a difference between the current iPhone and this one (but not compared with the Android).

The reason your eyes get tired while reading has very little to do with resolution and almost everything to do with active vs. passive lighting.

  • Almost all displays are actively lit (this means they start out “black” and provide contrast by lighting up). This includes CRTs, plasma TVs, color LCDs, LED-backlit LCDs, and OLEDs.
  • To provide proper contrast, actively lit displays must be much brighter than the ambient lighting. You look away from your monitor and the iris muscles work to deal with the differing lighting conditions, which tires out your eyes.
  • Books, B&W LCDs (digital watch faces), and eInk displays (Kindles) are passively lit. This means they are lit by the ambient light and provide contrast by subtracting from it (with paint).
  • Your eyes don’t have to adjust to a change in lighting because the lighting is the same.
  • This also explains why Matt still uses his Kindle: In outdoor or daytime lighting, actively-lit displays can’t overpower the sun and have little to no contrast. (Those new digital billboards have very bright LEDs that can.)

What I think happened with the Retinal Display was the following. From a business perspective, Apple needed to increase the resolution of the iPhone to compete with Android phones that have high pixel densities—high enough to cover the texting case above. Since a computer is fundamentally binary, it simplifies computation and design (allowing almost no changes for the developer and no loss in performance) if they simply double the pixels of the display in both directions (4x the pixels in the same-sized display). Doing so caused them to greatly overshoot what is necessary (and probably forced the choice of LCD over OLED). However, they noticed that the number they reached (325ppi) happens to be greater than a known reference in printing circles (300ppi)—the maximum resolution that is effectively necessary at reading distances. Since their competition has not reached that resolution, they marketed it as the “retinal display” and used that catchy term to educate consumers into demanding something they probably don’t need.

If you are an Apple hater, it may sound evil. But then again, I don’t need to pay $3 for coffee either.

Need and want are two very different things. 🙂

11 thoughts on “Retinal burn”

  1. Print resolution for halftones is quite low (80+dpi), but for type, the "resolution" is the resolution of the particles in the ink. Pigment-based inks have a particle size of around 0.1–2 µm. Dye-based inks are better than that, since, as they soak into paper, they actually change the color of the paper.

    1. Yes, I suppose I didn’t make this clear. Basically print is a very analog process. Your example also explains how inkjets (and the like) can have rated resolutions of 14,400dpi and above when a single dot is much larger than that. It’s really hard to tell what the “practical resolution” of an analog device is. That’s why so many people recommend you print it out and see for yourself. 🙂

      When one states numbers like 600dpi (or beyond) for newsprint, it is using a digital measure (dpi) to talk about an analog process. It’s like in the article I wrote, “When did you go digital?”, where people ask, “What is the megapixel count of film?” The answer can theoretically be as high as hundreds of megapixels, but effectively it ends up being around 6 (for most 35mm prints).

      OTOH, when one states 80dpi for newsprint, one really means 80+lpi, which is the halftone screen typically used. That’s a special case in the opposite direction.

      If you meant to call into question the "newspaper isn't better than my 300dpi LaserWriter" statement, then point taken. What I meant was that text print quality on newspaper isn’t any better. However, if you take any sort of halftoning, the difference (and the advantages of offset newsprint over laser printing) is stark. I remember that on a laser printer I couldn’t print 10% screens on my offset masters that would render, and 20% screens would make the text illegible (but offset prints have no trouble with either extreme). Too bad I didn’t have access to a Letraset connected to a Macintosh. Such things, even as a service, were just unavailable to a high school newspaper until around 1992.

  2. George Ou writes about resolution here and here.

    A lot of what he mentions either agrees with me or is semantics (comparing the iPhone to some higher-end Androids), but he did point out that the marketing image above comparing the two "a"s is not correct—it’s showing 3x the ppi instead of 2x. I think the video is fine, but they should change the static image on the page. I should have caught that. (If you look at the video the image comes from, you can clearly see the transition where they go from an actual comparison to, umm… a perceptual one?)

    1. Yes, the author is correct that at a viewing distance of 1 foot, 477ppi would be the maximal resolution of the eye, both as computed from the measured minimum cell spacing of photoreceptors in the dead center (fovea) and as estimated from the rules of diffraction by eyeballing your eye. I alluded to the exact number (but didn’t bother to compute it, because my college vision and neurobiology texts are in storage, and the latter is only an estimate).

      He is also correct in saying that the resolution of the display needs to be much higher than the computed resolution for the image to be "perfect." What he is alluding to is a computational effect known as hyperacuity (which I linked above). Basically, photoreceptors are not spaced evenly (in space or in time), so the processing, which needs to throw out information before encoding it for the optic nerve, can actually pull this extra information out. It turns out the former is usually noise, but the latter has an evolutionary imperative (i.e., picking out a striped predator in a field of grass, or determining whether it is running toward or away from you).

      However, the author is talking about the theoretical maximum. In practice, it doesn’t work exactly that way. There is processing in the eye (done by translucent cells that refract light, no less, due to an accident of evolution) that needs to encode the image (think of it as de-noising during acquisition) before it goes down the optic nerve to the back of the brain (the visual cortex; the optic nerve has a maximum bit rate, so some processing needs to be done at the eye), viewing conditions are less than perfect, viewing samples don’t reveal it, etc.

      In other words, there is a lot of variance from eye to eye. Most people's eyes trade off absolute acuity for something that's actually useful in the real world.

      (As for the author’s argument about the “retina” vs. the “eye,” he goes a bit too far. The distinction is bullshit because the retina is more than the layer of photoreceptors. Without the “eye” portion moving the eyeball around, that resolution can only be achieved in the dead center of the retina (the fovea)—the rest of the retina is much worse—and the eye only gets the full resolution by scanning the scene, as mentioned above. Also, last I checked, the processing done in the other three layers of cells in the retina is called the “retina” too. My guess is that because the author is a theoretical physicist, not a biologist, he uses the values stated in a textbook instead of using his common sense—see below about 20/20 vision.)

      In other words, where the author makes the misstep is that he translates the theoretical values without accounting for biological limitations or imperatives (other than photoreceptor spacing). In practice, very few eyes out there achieve those theoretical values, even when corrected.

      Or, to put it this way: I have pretty sharp eyes (when corrected), but I’ve never tested better than 20/20 vision. However, a friend of mine, when we were kids, had 20/15 vision, and I’ve heard claims as high as 20/10 (but no higher). 20/20 ≈ 300 ppi @ 1 foot; 20/10 would be nearly 600 ppi @ 1 foot.
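      The conversion is just the one-arcminute (20/20) figure scaled by the Snellen fraction; here’s a quick sketch with the same assumptions as in the post:

      ```python
      import math

      def snellen_to_ppi(denominator, distance_inches=12):
          """20/denominator vision resolves roughly (denominator / 20) arcminutes of detail."""
          arcminutes = denominator / 20
          return 1 / (distance_inches * math.tan(math.radians(arcminutes / 60)))

      for d in (20, 15, 10):
          print(f"20/{d}: ~{snellen_to_ppi(d):.0f} ppi at 1 foot")
      # 20/20: ~286 ppi, 20/15: ~382 ppi, 20/10: ~573 ppi
      ```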

      At the end of the day, how many people do you know who still test better than 20/20 vision? Those are the people who could theoretically see pixelation at a viewing distance of 1 foot.

      (By the way, even they will have trouble seeing the pixels. The acuity test is computed for a 100% contrast situation (black-white-black), not a natural scene. The closest approximation to this would be text on the display, and then asking them not to resolve black-white-black pixels but to correctly identify a subpixel (it should be red or blue). The contrast of those subpixels is much less than 100% (green comes close, but not those two), so there is no test you can do on the retinal display at such a close viewing distance that would actually test those theoretical values. If you do have better than 20/20 vision, you can try this on yourself with an Android held at 15" instead of an iPhone at 12". You’ll see what I mean immediately… Or just paint the screen red or blue and then see how close you have to hold the iPhone to your eye before you see the subpixels!)

