If you search Google for “light bulb current limiter” you will come across several articles, newsgroup postings, and forum posts about the details of building one.
Essentially, you connect an incandescent light bulb in series with an outlet so you can safely test a device that might be prone to blow a breaker, or draw so much current as to release magic smoke.
One common feature of many of these articles is the idea that the light bulb will “blow” if the device draws too much current. This is totally and completely false.
When a 60-watt light bulb is connected directly to mains, it will draw approximately 60 watts of power, or 500 milliamps at 120 volts. That is the maximum amount of power it will permit to flow, and at that point it burns at full brightness.
When you connect a device in series with the light bulb, the device effectively acts as a resistance that sets how much current flows through the bulb. With a 60-watt bulb, a device that drew 250 milliamps wired in series would cause the bulb to burn at about 50% brightness (light bulb brightness as a function of current is non-linear, so this is not entirely accurate; it is just an example…)
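To put rough numbers on that, here is a minimal Python sketch (the function names are just for illustration) that estimates a bulb’s current limit and the approximate brightness fraction for a given series current, assuming 120 V mains and treating brightness as simply proportional to current:

```python
# Rough illustration of the series light-bulb limiter arithmetic.
# Assumes 120 V mains and treats brightness as proportional to
# current, which (as noted above) is not really linear.

MAINS_VOLTS = 120.0

def bulb_max_amps(bulb_watts, volts=MAINS_VOLTS):
    """Approximate maximum current a bulb of this rating will pass."""
    return bulb_watts / volts

def approx_brightness(device_amps, bulb_watts, volts=MAINS_VOLTS):
    """Very rough fraction of full brightness for a given series current."""
    return min(device_amps / bulb_max_amps(bulb_watts, volts), 1.0)

print(bulb_max_amps(60))             # ~0.5 A: the most a 60 W bulb will allow
print(approx_brightness(0.250, 60))  # ~0.5: about half brightness at 250 mA
```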
Now let’s imagine a worst-case scenario – what could be wrong with an electrical appliance that would cause it to draw the maximum amount of current? Why, shorting the power cord, of course.
If you short the power cord of a device attached in series with a light bulb, the circuit is 100% IDENTICAL to plugging the light bulb directly into the wall. Nothing blows up, nothing catches fire – the bulb limits the circuit to 60 watts.
So if you are testing an electrical device using a light bulb current limiter, please be aware: the bulb will not stop the power if too much is used. It is up to you to know how much power your device should be using, to select a light bulb of a wattage that prevents it from going above (or much above) this, and to turn off the power if the bulb is burning at full brightness when it should not be.
Great explanation. So how do we determine how much power the device under test should be using?
The label on the device should give you some indication, although it is usually stated in amps (or milliamps). Amps are a measure of current, while watts are a measure of power, and voltage times current equals power: Volts x Amps = Watts. The typical voltage in the United States is 120 volts, but to get a truly accurate conversion you would need to measure the voltage at your receptacle (it can be as low as 110 or as high as 125).
If the rating is in amps or milliamps, multiply the amps (or the milliamps divided by 1,000) by the voltage to get the watts. (Thus a 1-amp rating = 1 amp x 120 volts = 120 watts.)
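As a quick sketch of that label-to-watts conversion (illustrative only; the label_watts helper is a made-up name, and 120 V is assumed as the nominal mains voltage):

```python
# Convert a nameplate current rating to an approximate power figure.
# 120 V is the nominal US mains voltage; the actual value at your
# receptacle may be anywhere from roughly 110 V to 125 V.

def label_watts(amps=None, milliamps=None, volts=120.0):
    """Watts drawn by a device rated in amps (or milliamps)."""
    if milliamps is not None:
        amps = milliamps / 1000.0   # milliamps are thousandths of an amp
    return amps * volts

print(label_watts(amps=1))          # 120 W
print(label_watts(milliamps=500))   # 60 W -- about what a 60 W bulb passes
```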
A 60-watt incandescent light bulb on US mains (120 volts) will permit around 0.5 amps (500 milliamps).
In the US, the typical maximum for 120-volt appliances is 1,650 watts. This is because most household electrical circuits in the US are 15 amps, and 15 amps x 110 volts = 1,650 watts. (The lower the voltage, the more current is required to deliver a given amount of power, and 110 volts is about the lowest even poor-quality mains should sag to. Anything drawing more than 1,650 watts would trip a 15-amp breaker if the voltage were 110 or lower.)
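For example, a few lines of Python (again purely illustrative) show how a 15-amp circuit’s capacity shifts with the actual mains voltage:

```python
# Capacity of a 15 A branch circuit at a few plausible mains voltages.
# Power = volts * amps, so the same 15 A breaker protects fewer watts
# when the voltage sags.

BREAKER_AMPS = 15

for volts in (110, 120, 125):
    print(f"{volts} V x {BREAKER_AMPS} A = {volts * BREAKER_AMPS} W")
# 110 V x 15 A = 1650 W
# 120 V x 15 A = 1800 W
# 125 V x 15 A = 1875 W
```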
Your device should have a label listing the specs.
Slightly complicated by the fact that the bulb’s resistance when cold is about 1/10th of its resistance when hot. So a 240 V / 60 W bulb draws about 250 mA and thus has a hot resistance of about 1,000 ohms, but it has a cold resistance of roughly 100 ohms (which is what you will measure with a meter), and that will drop very little voltage at, say, 25 mA.
The idea is that the bulb will not light when the circuit functions normally, which means it might as well be a bit of wire. But if there is a fault, it limits current just as described above.
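To check those figures, here is a short Python sketch of the cold-versus-hot arithmetic for the 240 V / 60 W bulb mentioned above, assuming the rough 10:1 hot-to-cold resistance ratio:

```python
# Hot vs. cold resistance of a 240 V / 60 W bulb, per the figures above.
# The 10:1 hot-to-cold ratio is a rough rule of thumb, not an exact value.

volts, watts = 240.0, 60.0

hot_amps = watts / volts            # ~0.25 A (250 mA) at full brightness
hot_ohms = volts / hot_amps         # ~960 ohms when glowing
cold_ohms = hot_ohms / 10           # ~96 ohms, roughly what a meter reads

# With only 25 mA flowing (a normally working, lightly loaded device),
# the cold filament drops very little of the mains voltage:
drop_volts = 0.025 * cold_ohms      # ~2.4 V out of 240 V

print(hot_amps, hot_ohms, cold_ohms, drop_volts)
```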