LED manufacturers have embraced the Color Rendering Index (CRI) because it allows a direct comparison of how faithfully products reproduce color compared to conventional lighting. The ability to compare light sources in this way is useful because it helps to convince consumers that solid-state lighting is a viable alternative for mainstream lighting.
However, some engineers question the validity of the current CRI test because it is limited to just eight pastel tones (R1 to R8). Detractors point out that manufacturers have spent many research dollars to make LEDs perform well in a test that may not reflect consumers’ expectations.
For example, under the current test, an LED can be classified with a CRI in excess of 80, yet render red poorly. Poor red reproduction may not matter so much in a domestic setting (although it will cause people to look pale), but it could cause LEDs to fall out of favor with food stores, art galleries, and hospitals, among others, particularly as some competing halogen bulbs boast CRIs above 90 in addition to excellent red rendering.
Moreover, several academic studies report that white-light LEDs with low CRI values are nonetheless perceived by observers to render colors well. Provided consumers are satisfied with solid-state lighting’s performance, strict color fidelity matters little for non-critical applications.
This article defines and reviews the current CRI test, explains its limitations, and describes how regulators and manufacturers alike are dealing with these weaknesses.
A brief introduction to CRI
CRI is a quantitative measure of the ability of a light source to reproduce the colors of various objects faithfully in comparison with an ideal or natural light source. The index was formalized by the Commission Internationale de l'Éclairage (CIE, or “International Commission on Illumination” in English).
Today, CRI is calculated from the differences in the chromaticities (or “color appearance”) of eight CIE standard color samples when illuminated by the light source under test and then by a reference illuminant of the same Correlated Color Temperature (CCT) [see the TechZone article “Defining the Color Characteristics of White LEDs”].
The smaller the average difference in chromaticities for the eight samples between the reference illuminant and the light under test, the higher the CRI (see the TechZone article “What Is the Color Rendering Index and Why Is It Important?”). Note that differences in CRI values of less than five points are not significant. For example, light sources with CRIs of 80 and 84 are essentially identical.
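The arithmetic behind these figures is straightforward once the chromaticity differences are in hand: each sample gets a special index R_i = 100 - 4.6 * dE_i, and Ra is the average over the eight samples. The minimal Python sketch below assumes the eight per-sample color differences have already been computed (the full CIE method also specifies the color space, CIE 1964 U*V*W*, and a chromatic-adaptation step, both omitted here).

```python
def special_index(delta_e):
    """Special color rendering index R_i for one test sample.

    delta_e: chromaticity difference between the sample lit by the
    source under test and by the reference illuminant.
    """
    return 100.0 - 4.6 * delta_e

def general_cri(delta_es):
    """General CRI (Ra): the average of R1 through R8."""
    if len(delta_es) != 8:
        raise ValueError("Ra is defined over exactly eight samples")
    return sum(special_index(de) for de in delta_es) / 8.0

# A perfect match (zero difference on every sample) scores 100.
print(general_cri([0.0] * 8))  # 100.0
```

Note that a special index can go negative for a badly rendered sample, which is why an extended index such as R9 can come out at or below zero even while Ra remains high.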
Natural light is classified as having a CRI of 100, the best possible. Incandescent lamps have a CRI above 95 while cool-white fluorescent lamps have a CRI of 62. Fluorescent lamps containing rare-earth phosphors are available with CRI values of 80 and above, but mercury-vapor lamps are poor performers with a CRI of 45. Halogen lamps work well with a CRI of 90 or better, while compact fluorescent lights (CFL) measure around 80.
An updated test
The CIE issued a technical report in 1995 detailing how to perform the CRI “test-color method Ra.”¹ The technical report defines the eight color reference samples (R1 through R8), which are derived from the Munsell color system. This system specifies colors based on hue, value (lightness), and chroma (color purity) and was created by Professor Albert H. Munsell in the early 20th century. The reference colors cover the hue circle, are moderate in saturation, and are approximately the same in lightness.
In the mid-1990s, the CIE’s Color Rendering Technical Committee assembled to work on updating the test-color method, and as a result the “R96a” method was developed. Enhancements to the test included adding six more colors: four saturated solids (R9 through R12) and two earth tones (R13 and R14). Figure 1 shows the reference color palette for the R96a test, which comprises the original eight colors plus the six new ones starting with R9.
The committee was dissolved in 1999, releasing a report but offering no firm recommendations, partly due to disagreements between researchers and manufacturers. Consequently, R96a was not adopted and the 1995 Ra version remained the standard test.
LED lights that use a mix of red, green, and blue (RGB) LEDs score poorly in the CRI Ra test, often recording results in the 20s. The reason for poor scores is that the spectral power distribution (SPD) of RGB LEDs exhibits spikes that correspond to the output of the three LEDs and few other wavelengths (which is hardly surprising considering LEDs are designed to produce light within a very narrow wavelength band [Figure 2]). In comparison, a high-CRI-scoring incandescent bulb emits a full range of wavelengths.
The more common type of commercial white LED comprises a royal-blue LED “photon pump” allied to a yttrium aluminum garnet (YAG) phosphor. The phosphor contributes to the LED’s output by Stokes shifting the blue photons from the LED to other wavelengths (see the TechZone article “Whiter, Brighter LEDs”). Makers of these types of LEDs have addressed CRI challenges to some extent by “tuning” the YAG phosphor to spread the SPD of their devices and enhance CRI. Substituting an ultraviolet LED for the blue one can produce further improvements. However, a major drawback of changing the phosphor to improve CRI is that it compromises LED efficacy, and efficacy is something of a Holy Grail for LED makers (see the TechZone article “Phosphor Development Addresses Lower Efficacy of Warm-White LEDs”).
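One reason phosphor conversion caps efficacy is the Stokes shift itself: a converted photon necessarily carries less energy than the blue pump photon that produced it. The short sketch below illustrates that unavoidable loss; the 450 nm pump and 560 nm emission wavelengths are assumed, typical values rather than figures from any particular datasheet.

```python
# Minimum energy lost when a phosphor Stokes-shifts a blue pump photon
# to a longer wavelength. Photon energy scales as 1/wavelength, so the
# conversion efficiency ceiling is pump_nm / emit_nm.
pump_nm = 450.0  # assumed royal-blue pump wavelength
emit_nm = 560.0  # assumed yellow YAG phosphor emission peak
stokes_loss = 1.0 - pump_nm / emit_nm
print(f"Stokes-shift energy loss: {stokes_loss:.1%}")  # about 19.6%
```

Spreading the phosphor’s emission further into the red, as high-CRI tuning requires, pushes the average emitted wavelength longer and therefore increases this loss.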
Nonetheless, commercial products balance CRI and efficacy satisfactorily. Philips Lumileds LUXEON 3535 series LEDs (which exhibit an efficacy of 103 lm/W at 100 mA/3.1 V) have a CRI of 82. Cree’s XLamp XM-L2 chips (153 lm/W at 700 mA/2.9 V) have a CRI of 80, and OSRAM SSL 150 White LEDs (106 lm/W at 350 mA/2.95 V) come in with a CRI of 83. Lower-efficacy devices with CRIs around 90 are available from all three manufacturers.
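The efficacy figures quoted above relate directly to luminous flux via the electrical power at the stated drive point (flux = efficacy × voltage × current). A quick sketch using the article’s own numbers makes the comparison concrete:

```python
def flux_lm(efficacy_lm_per_w, voltage_v, current_a):
    """Luminous flux implied by a datasheet efficacy at a drive point."""
    return efficacy_lm_per_w * voltage_v * current_a

# Drive points quoted in the text above:
print(round(flux_lm(103, 3.1, 0.100), 1))  # LUXEON 3535: ~31.9 lm
print(round(flux_lm(153, 2.9, 0.700), 1))  # XLamp XM-L2: ~310.6 lm
```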
Red in the face
However, all is not well. There is a groundswell of opinion in the solid-state lighting industry that a high CRI for an LED-based lighting product does not necessarily mean the product is a good light source. These detractors point out that some products score highly in the Ra test-color method but poorly when the test is extended to include, for instance, R9.
The problem is that while R9 does track CRI, it is only at very high CRIs that it correlates closely. An LED-based lighting product can score a CRI as high as 80 in the standard test but record a result of 0 for R9. Yet R9, a vibrant red (see, again, Figure 1), is important in lighting applications, such as food stores (for example fruit and meat sellers), art galleries (Figure 3), and surgical operating theaters. In a domestic environment a light source with a poor R9 value will cause people to appear pale.
The regulators have taken notice of the weakness of the current test. ENERGY STAR, for example, now defines an LED as being acceptable if the CRI (Ra) is greater than 80 and R9 is greater than zero. It is a modest requirement, but at least it recognizes the significance of the measurement. Also to their credit, manufacturers are embracing R9 testing, even though it is not part of the industry-standard CIE CRI test-color method, and chips with high values are becoming available.
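The ENERGY STAR criterion amounts to a simple two-part check. The sketch below encodes the rule as paraphrased in this article (it is not quoted from the current ENERGY STAR specification text, which should be consulted for the authoritative thresholds):

```python
def meets_energy_star_color(ra, r9):
    """Acceptance rule as described in the text: CRI (Ra) above 80
    and R9 above zero. (Paraphrased, not quoted from the current
    ENERGY STAR specification.)"""
    return ra > 80 and r9 > 0

print(meets_energy_star_color(82, 0))   # False: R9 must exceed zero
print(meets_energy_star_color(82, 25))  # True
```

The first case captures exactly the failure mode discussed above: a respectable Ra paired with an R9 of zero is not good enough.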
OSRAM manufactures LEDs with a CRI (Ra) of 95 and R9 above 90, in 1 to 5 W package options for CCTs between 2700 K and 4000 K, in its OSLON SSL range. Similarly, Cree offers a range of products with high CRI (Ra) and R9 above 40; one example is the XLamp XP-E2 range, offered with CRI (Ra) options of 80, 85, and 90. Philips Lumileds offers a range of high-CRI, high-R9 devices, including a LUXEON T part with a CCT of 3000 K that delivers 82 lm/W (2.51 V, 700 mA) with a CRI (Ra) of 95 and an R9 of 90 (Figure 4).
The voluntary addition of R9 to the standard CRI test-color method is a welcome start by LED makers. Better yet, some manufacturers are also focusing on boosting their devices’ R14 values; R14 is a skin tone and is claimed to be important for domestic lighting.
Color performance as important as fidelity
Despite their lowly CRI scores, RGB LEDs are reportedly visually appealing to consumers. It turns out that domestic consumers, in particular, are less concerned with precise color rendering than with lighting that produces aesthetically pleasing results. RGB LEDs are reportedly popular because their light tends to increase the perceived saturation of most colors without producing objectionable hue shifts.
Although blue LED/YAG phosphor solid-state lighting produces better CRI results, this comes at the cost of efficacy, and high efficacy is one of the key reasons why LEDs are promoted as a “green” light source. If manufacturers were not under pressure to meet CRI constraints, white LED efficacy could be even higher.
Regulatory authorities are taking notice of these anomalies. For example, the U.S. Department of Energy (DOE) advises consideration of both color fidelity and color performance when comparing light sources.² The DOE notes: “The CIE Color Rendering Index is based on the idea that any color shifts between the test and reference illuminants are undesirable. While this is true for applications requiring critical color [fidelity] comparisons, it does not consider whether the color shifts are visually appealing.”
The DOE document also cites CIE Technical Report 177:2007, “Color Rendering of White LED Light Sources,” in which the CIE Technical Committee concludes that the CIE CRI is generally not applicable for predicting the color rendering “rank order” of a set of light sources when white LED sources are included in that set.
This recommendation is based on a survey of academic studies that consider RGB LEDs and blue LED/YAG phosphor devices. During these studies observers ranked the appearance of illuminated scenes using lamps with different CRIs. In general, there was poor correlation between these rankings and the calculated CRI values. In particular, observers perceived that RGB LEDs with CRI (Ra) in the mid-20s appeared to render colors well— a conclusion already reached by the public.
Limitations of the test
Lighting engineers tasked with selecting LEDs for their next fixture should take into consideration the application for which the product is targeted. While manufacturers helpfully provide CRIs for their LEDs to allow comparison with other light sources, designers should be aware that the CIE’s 1995 test-color method has its limitations. For example, an application may require good performance for the R9-to-R14 reference colors, which are not included in published CRIs. The good news is that, where good rendering of red and skin tones is required, several companies already manufacture LEDs with high R9 and R14 values.
Alternatively, the application may not require high color fidelity at all; what may matter more is how well the solid-state light reproduces color in consumers’ perception. Studies have shown that LEDs with low CRIs nonetheless cast a pleasing light in the eye of the public.
For its part, the DOE recommends that if color fidelity is important to the application, then CRI (Ra) is a useful metric for rating LED products. However, if color appearance is more important than color fidelity, the organization suggests that white-light LEDs should not be excluded on the basis of low CRI (Ra) values alone.
For more information about the devices described in this article, use the links provided to access component details on the Digi-Key website.
¹ “Method of Measuring and Specifying Colour Rendering Properties of Light Sources,” CIE, 1995.
² “Color Rendering Index and LEDs,” U.S. Department of Energy, January 2008.