I want to power an LED from 120 VAC. I set up a circuit like the one in the attached file and sized the resistor with Ohm's law: I wanted 20 mA, which works out to a 6.8 kΩ resistor dissipating a minimum of 2.4 W. So I used a 3 W metal film resistor, but even so it gets very hot. Is this normal? Where am I going wrong? Do people use a driver at that voltage? It would look goofy to use a 5 W part. Please help.
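The sizing in the post can be checked with a quick calculation. This is just a sketch of the same Ohm's-law estimate, ignoring the LED's small forward drop as the post does:

```python
# Resistor sizing for a 120 VAC LED indicator (Ohm's-law estimate,
# ignoring the LED's few-volt forward drop, as in the post above).
V_RMS = 120.0   # supply voltage, RMS
I_LED = 0.020   # target LED current, 20 mA

r = V_RMS / I_LED   # required series resistance
p = V_RMS * I_LED   # power the resistor must dissipate

print(f"R = {r:.0f} ohms (nearest standard value: 6.8 k)")
print(f"P = {p:.1f} W minimum")
```

This gives R = 6000 Ω and P = 2.4 W, matching the figures in the post; a 3 W part run at 2.4 W has very little margin, which is why it runs hot.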
Your best bet is to use a driver designed for 120 V and the correct wattage, unless you want to string together many, many LEDs like they do with Christmas lights.
Well, I'm talking about a simple panel-mount LED indicator. At 12 or 24 volts it works great to simply use a resistor. If I've got a row of indicators, it doesn't look cost-effective to use a driver for each and every one. Maybe the resistor is designed to get hot, but my rule of thumb is: if touching it for more than a second will burn you, there's a circuit design flaw. Does anybody know how the industry does it?
D1 seems to be in the wrong place; it needs to be in series with the LED to block the reverse current. Also, in that configuration it conducts every half cycle, loading the resistor.
I would try about a 10 kΩ resistor to limit the total current to about 12 mA (you can still use the 6.8 k, but it will get somewhat warm). That will cut the power, and properly placing the diode will also help.
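A rough sketch of why this helps, assuming the diode is rearranged so the LED branch conducts only on alternate half cycles (half-wave operation roughly halves the average resistor dissipation; LED forward drop ignored):

```python
# Rough resistor dissipation, full-wave vs half-wave conduction
# (illustrative comparison only; LED forward drop ignored).
V_RMS = 120.0

for r in (6800.0, 10000.0):
    p_full = V_RMS**2 / r  # resistor carries current on both half cycles
    p_half = p_full / 2    # conduction on one half cycle only
    print(f"{r/1000:.1f} k: full-wave {p_full:.2f} W, half-wave {p_half:.2f} W")
```

So the 6.8 k drops from about 2.1 W to about 1.1 W, and the 10 k runs at roughly 0.7 W in half-wave operation, which is consistent with the barely warm 1 W resistor reported later in the thread.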
Put the diode in series with the LED in reverse.
You’re probably lucky the reverse voltage didn’t blow the LED.
You've got good points. I got the circuit idea from my NTE reference catalog, but it could easily be that it's incorrect. The LED would never blow, because the resistor is a current limiter (a simple version of the high-tech LED constant-current drivers), but its lifetime could be shortened. I think I'll try out a 5 W wirewound resistor; they're not that expensive.
If you mean putting the diode in series with the LED but in the opposite direction, the circuit would not conduct at all.
Yes, you are correct: the diode belongs in series, with the same polarity as the LED, and it's there to drop most of the reverse voltage that some LEDs cannot tolerate. Sorry about that.
In your diagram the LED conducts for half a cycle, and then the diode conducts for the other half, shunting the current around the LED.
I just breadboarded this circuit (diode in series) with a fairly new LED, a 4003 diode, and a 10 kΩ, 1 W resistor.
The resistor measured 80-85 °F and the LED was very bright.
I really don't know why the diode is used as a shunt in that diagram, but it will burn more watts. Actually, the diode may not even be needed for modern LEDs.
Also worth considering: running an LED from AC without some kind of filter switches it on and off 60 times a second, which may reduce its lifetime.
No, I use neon indicators for 120 V AC panels.
I would have to guess they just use high-brightness LEDs and a resistor; modern high-brightness LEDs need very little current to light up brightly.
Better idea: how do you know most of the PIV will go across the series diode instead of the LED? You don't! Better plan: place the diode in parallel across the LED to shunt the reverse current around the LED, i.e. connect the diode's anode to the LED's cathode. Still keep the series resistor, 22 kΩ, to limit the current to about 7 mA peak (120 V × 1.414 / 22,000).
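The peak-current figure in that plan works out as follows (a quick check of the arithmetic above; LED and diode forward drops are ignored except for the ~0.7 V clamp):

```python
import math

# Peak current through a 22 k series resistor on 120 VAC, with an
# anti-parallel diode clamping the LED's reverse voltage.
V_RMS = 120.0
R = 22_000.0

v_peak = V_RMS * math.sqrt(2)  # peak of the 120 V RMS sine, ~170 V
i_peak = v_peak / R            # ~7.7 mA peak through the resistor

print(f"V_peak = {v_peak:.0f} V, I_peak = {i_peak*1000:.1f} mA")
# With the anti-parallel diode in place, the LED's reverse voltage is
# clamped to one forward diode drop, roughly 0.7 V.
```

The anti-parallel arrangement sidesteps the PIV-sharing question entirely: on the reverse half cycle the clamp diode conducts, so the LED never sees more than about 0.7 V in reverse.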
Matt43, sounds good to me. Remember that, although an LED is a diode itself, they aren't made to take the peak inverse voltage (PIV) of a 120 VAC supply; without protection the LED would have to block pretty much the full peak voltage, 120 V × 1.414, or about 170 V. (Also remember a series diode has to be connected to the LED anode-to-cathode, so they both pass current in the same direction.) The 60 Hz flicker you're powering with should be too fast for the eye to notice.