I’ve spent the better part of the last 12 years in the imaging and display world. In all that time, one thing has been constantly misunderstood and debated: how exactly to measure the quality of cameras and displays, and how to cut through the marketing-speak.
Myth 1: “HD-quality”
“HD-Quality” is a marketing tool, plain and simple.
The technical definition of HD (high definition) refers only to the pixel resolution of the display. Examples are 1280 x 720 (AKA 720p) or 1920 x 1080 (AKA 1080i, 1080p, or Full HD). However, let’s be clear about something: the HD label carries no formal image quality requirements. Image quality can be measured by a number of specifications, including contrast ratio, dynamic range, # of available colors, etc… In no spec, in no document, do you find an agreed-upon set of rules dictating the minimum contrast ratio that a display must have to be ‘HD-quality’. There is no third-party test house, or governing body, attaching an ‘HD-approved’ label to the box.
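Since the HD label pins down nothing beyond the pixel grid, the entire claim reduces to simple arithmetic. A quick sketch (the resolution names are the standard ones; everything else here is just counting):

```python
# The only thing 'HD' formally specifies: a minimum pixel grid.
resolutions = {
    "720p (HD)": (1280, 720),
    "1080p (Full HD)": (1920, 1080),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels")
# 720p (HD): 1280 x 720 = 921,600 pixels
# 1080p (Full HD): 1920 x 1080 = 2,073,600 pixels
```

Nothing in those counts says anything about contrast, color, or noise, which is exactly the point.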
There are wildly varying performance levels among HD televisions, as any shopper will tell you. LCD and plasma perform differently; I believe the picture produced by my plasma is better than that of LCDs. However, if you saw the OLED TV at CES this year…well, that thing was great. Even within the same technology, the actual silicon display panels used by different manufacturers differ. If you examine a Samsung TV versus a Vizio, or a Sony versus an LG, you’re likely to notice differences in quality once you really start to look (certainly not taking sides about which is better). Further, this comparison assumes you’ve actually calibrated your screens.
What’s more, even the ‘scientific’ ways to measure things like dynamic range and contrast ratio are up for debate. Not all manufacturers use the same formulas for determining contrast ratio, and often this term is frankly nothing more than specsmanship. When you start with two different ways of measuring the baseline, your results are no more valid a comparison than debating who would win a Super Bowl between the New York Yankees and the Los Angeles Lakers.
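To make the specsmanship concrete, here is a toy sketch contrasting two common measurement conventions: a ‘static’ ratio measured within a single frame, versus a ‘dynamic’ ratio where the black level is measured with the backlight dimmed. The luminance values are made-up numbers for illustration, not measurements of any real panel:

```python
# Illustrative (made-up) luminance readings in cd/m^2 for one panel.
white_same_frame = 250.0    # peak white in a checkerboard test pattern
black_same_frame = 0.25     # black squares in that same pattern

white_full_screen = 300.0   # full-screen white, backlight at maximum
black_full_screen = 0.03    # full-screen black, backlight dimmed way down

# Same panel, two very different 'contrast ratios' depending on convention.
static_contrast = white_same_frame / black_same_frame      # 1000:1
dynamic_contrast = white_full_screen / black_full_screen   # ~10000:1
print(f"static: {static_contrast:.0f}:1, dynamic: {dynamic_contrast:.0f}:1")
```

A spec sheet quoting the second number isn’t lying, exactly, but it isn’t comparable to a sheet quoting the first.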
So, in short… HD-quality is nothing more than a marketing gimmick, and doesn’t correspond to anything that can be measured. As long as you have the minimum number of pixels, you can call yourself HD-quality. As such, I would argue the term means absolutely nothing when referring to display quality. The next time you hear that a product has an ‘HD-quality’ display, or produces ‘HD-quality’ pictures, put as much credence in that as you do promises from the local used car dealer whose office is a mobile home…with the engine running.
Myth 2: “More pixels equal better pictures”
Contrary to what the salesperson at your local electronics store will tell you, more pixels don’t necessarily equal better pictures. This is true for both cameras and televisions.
Let’s take the camera example, and assume we have 12 Mp (megapixel) and 10 Mp cameras.
How good an image will look is really dependent on a number of factors: the quality of the image sensor (SNR, shot noise, others), the type of lens (construction, number of elements, F#, others), how the image is pre-processed inside the camera, and more. If the noise floor (the amount of electronic noise generated during operation) of the image sensor in the 12 Mp camera is much higher than that of the corresponding sensor in the 10 Mp camera, the image, as it is captured, will be significantly degraded. Image processing can correct for noise, but only by so much. If the difference is more than that correction, the 10 Mp camera will have a native image capture that is better than the 12 Mp’s. Even with identically good lenses and electronics, the 10 Mp will produce a ‘cleaner’ image. And, in my experience, lenses for higher-resolution cameras are more difficult to make, leading to more chances for issues. This isn’t to say that lower-resolution cameras are always better—not at all. It’s just to say that resolution is NOT the predominant factor in quality.
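The noise-floor argument can be sketched numerically. The electron counts below are hypothetical, chosen only to show how a higher-resolution sensor whose smaller pixels collect less light and sit on a higher noise floor can end up with a worse signal-to-noise ratio:

```python
import math

def snr_db(signal_electrons: float, noise_electrons: float) -> float:
    """Signal-to-noise ratio in decibels, using the 20*log10 convention."""
    return 20 * math.log10(signal_electrons / noise_electrons)

# Hypothetical per-pixel numbers: the 12 Mp sensor's smaller pixels
# capture fewer photoelectrons and generate more electronic noise.
snr_10mp = snr_db(signal_electrons=10_000, noise_electrons=8)
snr_12mp = snr_db(signal_electrons=7_000, noise_electrons=15)

print(f"10 Mp: {snr_10mp:.1f} dB, 12 Mp: {snr_12mp:.1f} dB")
```

In this contrived case the lower-resolution camera starts more than 8 dB ahead before any image processing even runs.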
So let’s apply that same concept to displays. Just because your tablet has “1 million more pixels than an HDTV” doesn’t make it inherently any better at displaying content. What if your 1 million more pixels are of inferior quality? What if your 1 million more pixels can’t be seen in direct light? What if those 1 million extra pixels consume so much more power that it forces you to adopt a battery that is much more expensive, larger, and heavier?
Certainly not making any judgments on products that sell today—I am simply putting forth the idea that more pixels do not necessarily equal ‘better’.
Myth 3: The performance of displays at a retailer is ideal and real-world.
One quiet little ‘secret’ of the television industry is that showroom televisions are purposely tuned with blown-out colors and brightness. This is done because the average consumer has been conditioned to believe that the brighter the video and the more vivid the color, the better the television. Here’s a test for our readers: if your television has preset modes (most have ‘theatre’, ‘vivid’, etc… presets), turn it to vivid for a few minutes. Then turn it to theatre. After watching a few minutes in vivid, you’ll likely find that your eyes start to be bothered by what you are seeing. ‘Vivid’ settings aren’t real-world or natural (unless you happen to live in a Tim Burton movie), and your eyes will begin to notice that. Since you probably don’t stare at displays at retailers for more than a few minutes, you aren’t likely to notice this in the store.
The same caution about judging displays at the retailer applies to mobile phones. You probably haven’t noticed (and why would you?) that mobile phone stores almost never have working products staged by the front window of the store, unlike many other retailers. This is because of the effect of sunlight in washing out the displays. Retailers control the lighting in the store to provide the best possible viewing experience – at the time of purchase. If we all knew how bad our phones would be at showing video in sunlight, would we be so enthusiastic about their displays? OEMs can call the display whatever they want—name it after a part of the human eye, attach a superlative to LCD/OLED, etc… The simple fact is that what you see at the retailer is NOT what you’ll see in the real world.
So what’s my ideal display? Something that doesn’t consume 75% of my battery, is viewable in sunlight, provides good contrast and dynamic range, and shows real-world colors. Perhaps that’s a pipe dream, but I know two technologies that can get us there!