How to Dim an LED Without Compromising Light Quality
Contributed By DigiKey's North American Editors
2016-10-11
As the move from traditional lighting to light emitting diodes (LEDs) accelerates and a wider range of solutions hits the market, consumers have become more selective about the products they choose. In particular, linear dimming with a wide contrast ratio and no chromaticity variation or perceptible flicker is considered the mark of a quality product.
The difficulty for the designer is that, unlike traditional incandescent or fluorescent lighting, dimming LEDs while retaining light quality is not trivial. Analog dimming is possible but can result in a noticeable shift in the chromaticity and “temperature” of the emitted light.
An established technique is to dim the LED using pulse width modulation (PWM) of the forward current powering the LED. The basic premise is that during the “ON” cycle of the PWM train, the LED is operating under optimum forward current/forward voltage conditions. As a result, the quality of light is high and the perceived brightness is linearly proportional to the PWM train’s duty cycle.
The challenge for the designer is designing a PWM circuit that works in harmony with the modular switching voltage converters that are typically used as LED power supplies or “drivers”. Without this complementary approach it is all too easy to introduce problems such as electromagnetic interference (EMI), limited contrast ratio (maximum luminosity/minimum luminosity) and perceptible flicker (which is linked to detrimental health effects).
This article considers the design of PWM LED dimming circuits based around a selection of contemporary LED drivers and highlights the design steps necessary to come up with a solution that doesn’t compromise light quality.
The drawbacks of analog dimming
LEDs require a constant current/constant voltage power supply to maintain efficient operation with good light quality. (Light quality has become a key product differentiator and one that major vendors are at pains to promote for their high-end products. See library article, “Manufacturers Shift Attention to Light Quality to Further LED Market Share Gains”.)
There is some flexibility on the choice of operating point depending on the end-product specification. For example, the LED’s luminous flux is proportional to the forward current, so the designer might choose to power the LED at a higher forward current to boost luminosity, thereby reducing the number of LEDs required for a given design specification. (See library article, “Lighting Design for Optimum Luminosity”.)
Figure 1 shows the forward current vs. luminosity characteristic for an OSRAM Opto Semiconductors Duris S5E white LED. The OSRAM device is based on proven technology and is a popular choice for mainstream lighting applications. The LED produces 118 lm at 6.35 V/150 mA and has a claimed efficiency of 123 lm/W at that operating point. Reducing the forward current to 100 mA, for example, attenuates the luminosity by 30 percent, compared to that generated at 150 mA.
Figure 1: The OSRAM Duris S5E white LED demonstrates a nearly linear relationship between forward current and luminosity. (Source: OSRAM Opto Semiconductors)
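The nearly linear relationship in Figure 1 can be sketched numerically. The snippet below is a simplified model, anchored at the Duris S5E datasheet point (118 lm at 150 mA), that assumes luminous flux scales exactly linearly with forward current; real devices deviate slightly from linearity, as Figure 1 shows.

```python
# Idealized linear current-to-luminosity model for a white LED,
# anchored at the Duris S5E datasheet point (118 lm at 150 mA).
# Real devices deviate slightly from this straight line.

def luminosity_lm(i_forward_ma, i_ref_ma=150.0, lum_ref_lm=118.0):
    """Estimate luminous flux assuming flux scales linearly with current."""
    return lum_ref_lm * (i_forward_ma / i_ref_ma)

# Dropping from 150 mA to 100 mA cuts luminosity by roughly one third
print(round(luminosity_lm(100.0), 1))  # ~78.7 lm
```

Under this idealized model the reduction at 100 mA is about 33 percent; the article's 30 percent figure reflects the slight curvature of the real characteristic.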
Consumers, familiar with dimming for incandescent lighting, naturally demand similar capabilities for LED replacements. Chief among these capabilities is fine-resolution dimming across a wide luminosity range. An apparently simple way to address this demand would be to design an analog dimmer circuit that (via the LED power supply or “driver”) reduces the forward voltage/forward current powering the LED.
Unfortunately, analog dimming introduces some major drawbacks. Key among these are the impact on efficacy (luminous output in lumens divided by input power in watts), a restricted contrast ratio caused by a minimum forward current threshold, the added design complexity of precisely controlling the output current of a typical LED driver over a wide range, and, most pertinently, variations in the LED’s correlated color temperature (CCT) as the forward voltage/forward current changes.
The CCT determines the LED’s apparent warmth and is a key measure of light quality. Lowering the forward voltage/forward current has a subtle effect on the wavelength of the light emitted by the blue LED at the heart of most of today’s ‘white’ LED products. Modern high-brightness LEDs for lighting applications combine a royal blue LED with an yttrium-aluminum-garnet (YAG) phosphor. Some of the LED’s blue photons escape directly from the device while most combine with the phosphor resulting in (primarily) yellow emissions. The combination of blue and yellow light is a good approximation of white light.
The LED manufacturer then makes subtle changes to the phosphor to alter the white light’s “temperature” from cool (bluish) tints to warm (yellow) shades, allowing the maker to offer a choice of colors to suit individual tastes. The CCT quantitatively defines the LED light temperature. (See the library article “Defining the Color Characteristics of White LEDs”.)
Manufacturers specify an LED’s CCT at a specific forward voltage/forward current operating point. Designers choose a set of LEDs from a particular CCT “bin” safe in the knowledge that all products selected from that bin will emit a virtually identical CCT. While the leading makers do also typically include information about how the CCT varies against forward voltage/forward current, they don’t guarantee the performance of a specific product at operating points beyond the recommended parameters. In particular, the LED maker offers no guarantees about devices from the same bin producing the same CCT at any point other than the recommended operating point. Figure 2 illustrates how an OSRAM LED’s chromaticity coordinates (which determine its CCT) vary with forward current.
Figure 2: The chromaticity and CCT of an LED change with forward current. Across a wide range of forward currents these changes can be detected by the eye. (Source: OSRAM)
Worse yet, while the eye isn’t that good at detecting subtle color changes (for example, the wavelength of photons emitted by a pure red, green or blue LED can change markedly before being noticed), it is very sensitive to CCT changes. As a result, it’s entirely possible that a consumer would notice that two fixtures powered by LEDs from the same bin vary considerably in color under an identical degree of analog dimming. (See library article, “Digital Dimming Solves LED Color Dilemma” for a more detailed technical explanation of this topic.)
Addressing CCT challenges with PWM dimming
In recent years, PWM has been adopted as the preferred dimming technique for high-quality LED lighting. During the ON cycle of the PWM train, the LED is powered at the recommended forward voltage/forward current operating point – ensuring that the CCT is within the datasheet parameters. The duty cycle (ratio of the pulse duration (tP) to the signal period (T)) of the PWM train then determines the average current and therefore the perceived luminosity.
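The duty-cycle relationship described above can be sketched with a short calculation: the average forward current is simply the ON-state current scaled by D = tP/T, while the ON-state current itself (and hence the CCT) never leaves the recommended operating point. Values below are illustrative only.

```python
# PWM dimming keeps the ON-state current at the recommended operating
# point; perceived brightness tracks the duty cycle D = t_p / T.

def average_current_ma(i_on_ma, t_pulse_us, period_us):
    """Average LED current under PWM dimming (same time units for both)."""
    duty = t_pulse_us / period_us
    return duty * i_on_ma

# 150 mA ON current at a 25 percent duty cycle -> 37.5 mA average
print(average_current_ma(150.0, 250.0, 1000.0))
```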
Figure 3 shows three different pulse trains, all operating at a constant forward current. The top example shows mid-level illumination, the center example is dimmer, and the bottom is brighter. Figure 4 shows the linear characteristic between duty cycle and luminosity.
Figure 3: Varying the duty cycle of the PWM pulse train changes the average forward current of the LED and hence luminosity (from top: medium, low and high brightness) while maintaining the specified operating current during the ON phase. (Source: OSRAM)
Figure 4: Duty cycle is linearly related to LED luminosity. (Source: OSRAM)
Modern LED drivers from major vendors have typically been designed with PWM dimming in mind. Many chips incorporate a PWM or DIM pin enabling a direct input from a PWM generator to determine the driver’s ON and OFF cycle. However, it still pays to carefully consider the choice of LED driver because there are some key factors that mark a good LED digital dimming design from a bad one.
A key consideration is the PWM train’s frequency (fDIM). The minimum value of fDIM is determined by the eye’s sensitivity to flicker. Recent lighting design guidelines suggest fDIM should exceed 80 to 100 Hz to avoid perceptible flicker and any associated long-term health effects. (See library article “How New Flicker Recommendations Will Influence LED Lighting Design”.)
The designer is facing something of a trade-off though, because the higher the frequency, the greater the impact on contrast ratio. This is because even the best LED driver takes a finite time to respond to a PWM input. Figure 5 illustrates where these time delays occur.
Figure 5: An LED driver exhibits delays in its response to a dimming PWM signal. These delays determine the dimming system’s maximum contrast ratio. (Source: Texas Instruments)
In Figure 5, tD represents the propagation delay from when the PWM signal (VDIM) goes high to when the forward current driving the LED responds. (tSU and tSD are the LED forward current slew up time and slew down time, respectively.) The slew rate limits the minimum and maximum duty cycle (DMIN and DMAX) and, in turn, the contrast ratio.
Lowering fDIM generally facilitates a higher contrast ratio, as an LED driver with a fixed slew rate has sufficient time to reach the required forward current/forward voltage and then drop back to zero even for low duty cycles because T is relatively long.
(Note that for any choice of PWM dimming frequency, it’s a good idea to select an LED driver with limited slew because an LED’s switch-on time is such that it can illuminate “early” on the PWM signal’s leading edge (and hence at a forward voltage/forward current outside of the specification) exposing the consumer to the same CCT variations that plague analog dimming.)
Contrast ratio (CR) is typically expressed as the inverse of the minimum duty cycle, i.e. the inverse of the product of the minimum on-time and the dimming frequency:

CR = 1 / (tON(MIN) × fDIM)
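The trade-off between dimming frequency and contrast ratio can be made concrete with a quick calculation. The minimum on-time below (2 µs, set by the propagation delay and slew times shown in Figure 5) is an illustrative figure, not a value from any specific driver datasheet.

```python
# Maximum achievable contrast ratio given the driver's minimum usable
# on-time: CR = 1 / (f_DIM * t_ON(min)). The 2 us on-time is illustrative.

def contrast_ratio(f_dim_hz, t_on_min_s):
    """Best-case contrast ratio for a given PWM frequency and min on-time."""
    return 1.0 / (f_dim_hz * t_on_min_s)

print(contrast_ratio(100.0, 2e-6))   # roughly 5000:1 at 100 Hz
print(contrast_ratio(1000.0, 2e-6))  # roughly 500:1 at 1 kHz
```

Note how raising fDIM by a factor of ten cuts the achievable contrast ratio by the same factor, which is exactly the trade-off the article describes.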
Standard switching voltage regulators for general usage aren’t designed to be repeatedly turned on and off, so manufacturers pay scant attention to slew. In many cases these regulators even feature so-called soft-start and soft-shutdown modes (to prevent voltage spikes), which extend slew times. In contrast, LED drivers for dimming applications are designed with short slew times.
LED drivers based on switching step-down (“buck”) regulators have the shortest slew times of all for two distinct reasons. First, the buck regulator delivers power to the output while the control switch is ON, making the control loops faster than step-up (“boost”) or buck-boost topologies. Second, the buck regulator’s inductor is connected to the output during the entire switching cycle, ensuring continuous output current and allowing an output capacitor to be eliminated. Eliminating the capacitor allows the driver’s output voltage/current to be slewed very quickly.[1] A careful choice of buck regulator can allow for PWM dimming frequencies in the kHz range which, while perhaps not necessary for mainstream lighting, can be useful for applications such as high-speed strobing for industrial image recognition tasks.
Designing PWM dimming LED power supplies
There are three approaches to designing an LED power supply with PWM dimming: develop a circuit from scratch using discrete components; pair an LED driver that accepts a PWM input with external PWM-generating circuitry; or use a dedicated PWM generator IC in place of that circuitry.
The first approach is not for the faint-hearted, but if budget and space are at a premium it can be the way to go. However, here we’ll look at the other two approaches based around some of the many proven, integrated, modular power management devices from a wide range of major suppliers.
A simple and relatively inexpensive PWM dimmable solution which integrates the control functions of an LED driver but allows the designer flexibility in the choice of the external MOSFET used to drive the LED comes from Texas Instruments. The LM3421 is a high-voltage N-channel MOSFET controller for LED power. The chip can be configured in buck, boost, buck-boost and single-ended primary inductor converter (SEPIC) topologies.
Of particular interest in this context, the LM3421 incorporates an nDIM pin which can be used for dimming. TI suggests two approaches for dimming, the first using an inverted PWM pulse train via a Schottky diode (DDIM), and the second using a standard PWM signal applied via a dimming MOSFET (QDIM). The second approach is useful if the application demands a high PWM frequency with good contrast ratio because it accelerates the LED driver controller’s slew rate. Figure 6 shows the PWM dimming options for the LM3421.
Figure 6: TI suggests two PWM dimming techniques for use with its LM3421 LED driver controller, either using a Schottky diode or a MOSFET for applications that require higher PWM frequencies.
For its part, Maxim Integrated has recently introduced an LED driver with built-in dimming capability that requires no external components, bar the PWM signal generator. The MAX16819 is a buck LED driver that operates from a 4.5 V to 28 V input range and features a 5 V/10 mA on-board regulator. As with the TI device described above, the DRV output of the chip is designed to supply an external MOSFET, which is connected to the LEDs and helps to reduce slew.
A notable feature of the chip is its hysteretic control algorithm, which the company claims ensures fast response during the PWM dimming operation and enables a PWM frequency of up to 20 kHz for applications requiring such a rate. The devices have a switching frequency of up to 2 MHz, thus enabling the designer to select compact external components. Figure 7 shows how quickly the forward current driving the LED responds to dimming-voltage changes.
Figure 7: Maxim Integrated’s MAX16819 employs a hysteretic control algorithm that accelerates the response to PWM dimming inputs. The figure illustrates the system’s response at a 50 percent duty cycle, with an LED current of 400 mA.
For a high-end (but obviously more expensive) solution, Linear Technology offers the LT8500 48-channel LED PWM generator. The chip can be teamed with three of the company’s LT3595 16-channel buck-mode LED drivers for a PWM-dimmable lighting solution that can power up to 480 LEDs at currents up to 50 mA.
The LT3595A is a buck LED driver designed to drive 16 independent channels of up to ten LEDs each. The chip integrates switches, Schottky diodes, and compensation components to reduce the circuit footprint and lower component cost. It runs from a 4.5 V to 45 V input and operates at a 2 MHz switching frequency (permitting the use of small inductors and capacitors).
Dimming is controlled for each channel by applying a PWM input to the 16 individual PWM pins. The device features a rapid slew up and down rate for a maximum contrast ratio of 5000:1.
The LT8500 LED PWM generator operates from a 3 V to 5.5 V input and features 48 independent channels, which allows it to be used to directly control three of the LED drivers. Each channel has an individually adjustable PWM register.
The LT8500 can adjust the brightness of each channel independently. The 12-bit PWM registers – programmable via a simple serial data interface – enable 4095 different brightness steps from 0 to 99.98 percent of maximum LED output. Figure 8 shows how the LT8500 can be configured to drive the three LT3595A buck LED drivers. Note that the RSET resistor sets the LED current for all 16 channels on the respective LED driver.
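The mapping from a 12-bit register value to a duty cycle can be sketched as follows. This is an illustration of the arithmetic (value/4096), not the LT8500's serial programming interface.

```python
# A 12-bit PWM register maps values 0..4095 onto duty cycles from
# 0 to 4095/4096 (~99.98 percent) of maximum LED output.

def duty_from_register(value, bits=12):
    """Convert a PWM register value to a fractional duty cycle."""
    steps = 2 ** bits
    if not 0 <= value < steps:
        raise ValueError("register value out of range")
    return value / steps

print(round(duty_from_register(4095) * 100, 2))  # 99.98 (percent, full scale)
print(duty_from_register(2048))                  # 0.5 (half brightness)
```

This also shows why the maximum output is 99.98 percent rather than 100: the top register value of 4095 yields a duty cycle of 4095/4096.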
Figure 8: Linear Technology’s LT8500 can provide the PWM dimming input for three LT3595 buck LED drivers. In turn, each driver can power up to 160 LEDs. (Diagram drawn using DigiKey Scheme-it, based on an original image courtesy of Linear Technology.)
Important design considerations
While a linear characteristic between forward voltage/forward current and luminosity exists for LEDs, implementing dimming by analog techniques that vary this voltage/current, although cheap and easy to design, can detrimentally affect the LED’s CCT. This makes the technique a poor choice for high-end consumer lighting products.
PWM dimming overcomes CCT variation because when switched on, the LEDs are driven at the recommended operating point. Dimming is linearly proportional to the duty cycle of the PWM pulse train. Silicon vendors supply a wide range of PWM dimmable (typically buck) LED drivers which take advantage of the technique and some PWM generators are designed to work specifically in LED applications.
The chips are capable devices which can lead to good end products providing the designer considers two key points: Ensure the LED driver has a rapid slew rate to increase contrast ratio, and carefully select the PWM frequency to firstly ensure no EMI problems (bearing in mind the LED driver itself will be switching at high frequency independently of any PWM dimming input) and secondly, to avoid perceptible flicker in the light output.
Conclusion
Consumers are pressing designers to deliver linear LED dimming with a wide contrast ratio and no color variation or perceptible flicker. However, as we have shown, the best way to meet these requirements is to operate the LEDs at the forward current/forward voltage recommended by the manufacturer for optimum light quality, and dim the LED using pulse width modulation (PWM) of the forward current.
We have reviewed some techniques and described some practical circuits to get you started: Good luck!
Reference:
- “Light Matters Part 2: Boosting, Buck-Boosting and Dimming,” Sameh Sarhan & Chris Richardson, National Semiconductor (now part of Texas Instruments), 2008.
