CCFL Vs. LED: A Downside To Going Green?
William Van Winkle
William Van Winkle has been a full-time tech writer and author since 1998. He specializes in a wide range of coverage areas, including unified communications, virtualization, Cloud Computing, storage solutions and more. William lives in Hillsboro, Oregon with his wife and 2.4 kids, and—when not scrambling to meet article deadlines—he enjoys reading, travel, and writing fiction.
The transition to LED backlighting delivers richer contrast and color than traditional cold cathode fluorescent lamp technology, and it's supposed to be greener, too. But is there a downside? Read on to find out.
This article originally appeared here on Tom’s Hardware. It has been edited for the busy IT pro. Please see the original for details excluded here and for information on how to buy products featured herein.
Out with the old, in with the new. LED backlighting is now all the rage in monitor design—and why not? Apple made LED technology the golden child of green tech when the company announced in 2007 that it would move to LED backlights and drop traditional cold cathode fluorescent lamp (CCFL) backlighting in its products. The target was mercury, a key ingredient in fluorescent lighting tubes.
The other side of going green is consuming less power, and monitor vendors practically trip over themselves to make lofty claims of electricity savings. ViewSonic, for example, notes a 50% power savings for its VX2250wm-LED. Moreover, those who value their investments could point to NEC, which proclaimed that LED technology would double the longevity of a monitor backlight, from 25,000 hours with fluorescent to 50,000 hours.
Without question, there’s a lot about LED to commend it as a greener technology than fluorescent. Surprisingly, though, few (if any) people have stopped to ask what the relationship is between power savings and image quality. Is there a relationship? We always hear that LED monitors have far better contrast and better color, but is this true, and is there a price to be paid for that superior image?
The widespread transition from CCFL to LED got under way in earnest in 2007. Today, LED pricing has virtually reached parity with CCFL, and the trend is clearly in LED's long-term favor. We're not saying this is a bad thing. We only wonder whether, while you still have a choice between the two backlight technologies, there is still a compelling reason to opt for the receding one.
Most of you probably know that monitors benefit from calibration, and what you get out of the retail box is not calibrated. Instead, displays generally arrive cranked to maximum settings so they look bright, vibrant, and ready to rip your face off with radiant awesomeness. This may make for a great first impression, especially at a distance, but these are almost always sub-optimal settings. The red, green, and blue color channels may benefit from tweaking, and the target output brightness is almost always less than what you see at first. The general "temperature" of the display (in the color sense of the word, rather than heat) may also need adjusting. The one spec that vendors tend to nail at the factory is gamma. The optimal gamma setting is 2.2, and this is almost invariably what you get.
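To put a number on that gamma spec: a display gamma of 2.2 means output luminance follows the input signal raised to the power 2.2. A minimal Python sketch (the function name is ours, purely for illustration):

```python
def apply_gamma(signal: float, gamma: float = 2.2) -> float:
    """Relative output luminance for a normalized (0.0-1.0) input signal."""
    return signal ** gamma

# A mid-gray input of 0.5 produces only about 22% of full luminance,
# which roughly matches how the eye perceives brightness.
print(round(apply_gamma(0.5), 3))  # 0.218
```

This is why a gamma that drifts away from 2.2 shows up most visibly in the midtones rather than at black or white.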
We know that some, if not most, users resist calibrating their screens for whatever reason. A proper calibration requires a proper colorimeter, which in our case looks a lot like a corded mouse that you place over the display screen. A sensor inside the puck takes readings from the monitor, and software running on the PC converts these readings into various values. We started testing with a Monaco Optix XR colorimeter, graciously sent to us by Asus, along with ColorEyes Display Pro. However, this is a somewhat older colorimeter, and there's some debate about whether it's still suitable for testing given more current options. We suspect it is, but we opted to compromise and go with X-Rite's i1Display 2 bundle, which includes the i1Display 2 colorimeter along with X-Rite's i1Match software. After speaking at length with X-Rite engineers, we were convinced that this package, plus ColorEyes Display Pro and Chromix's ColorThink Pro, would be sufficient for a reliable quality analysis suitable for a consumer-level audience. In a perfect world, we would have five figures to drop on Minolta colorimeters and luminance meters, but then you wouldn't have had scantily clad elves in our holiday hardware roundup until 2017. Faced with that dilemma, we opted to accept X-Rite's generous help.
Right out of the box, we noted the factory brightness, contrast, and color settings for each display. We then ran X-Rite's Eye-One Match Easy test to determine color temperature, gamma, and luminance at these stock settings. Next, we used the Eye-One Match Advanced test to calibrate the screen and create a calibrated profile. After discussions with X-Rite and Chromix, we agreed on targets of 6500K for temperature and 120 cd/m² for luminance. We see 120 cd/m² noted repeatedly throughout the professional display world as optimal for desktop monitor use, although it may look surprisingly dim at first because you're accustomed to an overdriven screen brightness.
With the screen calibrated, we then returned to Eye-One Match Easy and used it to take readings at nine positions around the display: top-left, top-center, top-right, middle-left, middle-center, middle-right, bottom-left, bottom-center, and bottom-right. Taken together, these readings would reveal any significant variances in luminance and color across the display surface.
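To illustrate what those nine readings can tell you, here is a simple Python sketch that summarizes uniformity as each point's percentage deviation from the center reading. The luminance values below are invented for the example, not measurements from our testing:

```python
# Hypothetical nine-point luminance readings in cd/m2, keyed by position.
readings = {
    "top-left": 118.2, "top-center": 121.0, "top-right": 117.5,
    "middle-left": 119.8, "middle-center": 120.4, "middle-right": 119.1,
    "bottom-left": 116.9, "bottom-center": 118.8, "bottom-right": 117.2,
}

center = readings["middle-center"]
# Express each point's luminance as a percentage deviation from center.
deviation = {pos: 100.0 * (lum - center) / center
             for pos, lum in readings.items()}
worst = max(deviation, key=lambda pos: abs(deviation[pos]))
print(f"Worst deviation: {worst} at {deviation[worst]:+.1f}%")
```

A panel with backlight bleed or edge-lit LED hot spots typically shows its largest deviations in the corners, exactly as the nine-point grid is designed to catch.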
Next, we used ColorThink Pro to measure gamut, comparing the measured output from the calibrated screen against the sRGB baseline gamut. Essentially, this shows the amount of perceptible color the monitor is displaying compared to the industry-standard sRGB profile. There are many alternatives to sRGB, but we opted to use it for its ubiquity and simplicity.
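For a rough intuition of what a gamut comparison measures, here is a simplified sketch comparing the area of a display's red/green/blue primary triangle in CIE 1931 xy chromaticity space against the sRGB triangle. Be aware that ColorThink Pro compares full 3D gamut volumes in Lab space; this 2D version, with invented "measured" primaries, only illustrates the idea:

```python
def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given (x, y) points."""
    return abs(p1[0] * (p2[1] - p3[1]) +
               p2[0] * (p3[1] - p1[1]) +
               p3[0] * (p1[1] - p2[1])) / 2.0

# sRGB red, green, and blue primaries in CIE 1931 xy coordinates.
srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
# Hypothetical measured primaries for a display (invented values).
measured = [(0.63, 0.34), (0.31, 0.58), (0.16, 0.07)]

coverage = triangle_area(*measured) / triangle_area(*srgb)
print(f"Relative gamut area: {coverage:.0%}")
```

A display whose triangle covers most of sRGB reproduces most of the colors that standard web and photo content assumes; anything well under 100% means some saturated colors simply cannot be shown.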
Finally, we used ColorEyes Display Pro to run Delta-E analysis on the calibrated screen. As the ColorWiki page on Delta-E states: “Delta-E (dE) is a single number that represents the 'distance' between two colors. The idea is that a dE of 1.0 is the smallest color difference the human eye can see. So any dE less than 1.0 is imperceptible (as in turn the lights off and head to the pub) and it stands to reason that any dE greater than 1.0 is noticeable (as in put the coffee on, we're going to be here a while). Unfortunately (and probably not surprisingly), it's not that simple. Some color differences greater than 1 are perfectly acceptable, maybe even unnoticeable. Also, the same dE color difference between two yellows and two blues may not look like the same difference to the eye and there are other places where it can fall down.”
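For the curious, the original CIE76 form of Delta-E is simply the Euclidean distance between two colors in CIELAB space; later formulas such as CIE94 and CIEDE2000 add perceptual weighting, which is part of why the simple "1.0 threshold" breaks down. A quick sketch of the basic version:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 Delta-E: Euclidean distance between two CIELAB colors,
    each given as an (L, a, b) tuple."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Two near-identical grays: dE under 1.0, so nominally imperceptible.
print(round(delta_e_76((50.0, 0.0, 0.0), (50.5, 0.2, -0.3)), 2))  # 0.62
```

The Delta-E numbers reported by ColorEyes Display Pro summarize exactly this kind of distance between what the software asked the monitor to display and what the colorimeter actually measured.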
We ran all of this software on an Intel Core 2 Extreme X9100-based 15" notebook, recording results via screen captures and Notepad.