View Full Version : Cost of heating 110v vs. 220


trainer
01-05-2009, 12:05 PM
My question is on how power is metered.

I understand that a 220 volt heating circuit uses half the current to produce the same wattage as a 110v heater. For example, a 1000 watt heater draws 9.1 amps for a 110v model but only 4.5 amps on a 220v unit.

Since I'm drawing power through both hot legs of my service on the 220 heater, would the meter see less consumption than the same wattage on only one leg of the service as with 110?

Stuart in MN
01-05-2009, 12:12 PM
The voltage won't make a difference on what the power company charges you. They charge based on kilowatt-hours. If your heater draws 1000 watts (one kilowatt), it will cost the same to run it for an hour, no matter what the voltage is.

Watts = volts x amps. If you increase the voltage the amperage goes down, but the watts will be a constant.

rinny_tin_tin
01-05-2009, 01:17 PM
My question is on how power is metered.

I understand that a 220 volt heating circuit uses half the current to produce the same wattage as a 110v heater. For example, a 1000 watt heater draws 9.1 amps for a 110v model but only 4.5 amps on a 220v unit.

Since I'm drawing power through both hot legs of my service on the 220 heater, would the meter see less consumption than the same wattage on only one leg of the service as with 110?

Power is indeed the volt-amp product, or I*E. So what Stuart says is true - but the advantage of running 220 over 110 is what are called reduced I^2R losses (read: I squared R), where R represents the resistance of the cable feeding the load - that is, the heat generated in the cable and in everything else between the power source and the load. Because the current term is squared, if you halve the current by doubling the voltage you cut those losses to a quarter. For example, consider a 10 kW load at 120 V. Your current (I) is 10,000/120 = 83.3 amperes. If you push that through a cable with a resistance of, say, 0.02 ohms, your I-squared-R losses are 83.3^2 * 0.02 = 139 W. Now double the voltage to 240 V and your current becomes half of 83.3, or 41.7 amperes. Your I-squared-R losses become 41.7^2 * 0.02 = 35 W. This is wasted energy that you have to pay for - energy that is doing nothing but heating up your cable. This is why power transmission voltages are very high, like 220 kV.

So technically, running higher voltage is cheaper than lower voltage.

QED
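For anyone who wants to play with the numbers, here is a minimal Python sketch of the I-squared-R comparison. The function name and the 0.02-ohm cable figure are just for illustration; the point is that halving the current quarters the cable loss.

```python
def i2r_loss(power_w, voltage_v, cable_ohms):
    """Return the current drawn by a load and the heat wasted in the cable.

    I = P / E, and the cable loss is I^2 * R.
    """
    current = power_w / voltage_v
    loss = current ** 2 * cable_ohms
    return current, loss

# Same 10 kW load, same hypothetical 0.02-ohm cable, two supply voltages.
for volts in (120, 240):
    amps, loss = i2r_loss(10_000, volts, 0.02)
    print(f"{volts} V: {amps:.1f} A drawn, {loss:.1f} W lost in the cable")
```

Doubling the voltage halves the current, and since the loss goes as the square of the current, the wasted wattage drops by a factor of four.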

Stuart in MN
01-05-2009, 02:25 PM
For example, consider a 10 kW load at 120 V. Your current (I) is 10,000/120 = 83.3 amperes. If you push that through a cable with a resistance of, say, 0.02 ohms, your I-squared-R losses are 83.3^2 * 0.02 = 139 W. Now double the voltage to 240 V and your current becomes half of 83.3, or 41.7 amperes. Your I-squared-R losses become 41.7^2 * 0.02 = 35 W. This is wasted energy that you have to pay for - energy that is doing nothing but heating up your cable. This is why power transmission voltages are very high, like 220 kV.

That is true, but for residential purposes it's not as big an issue. As an example, let's assume a 1500 watt heater, which is typically the biggest you can get that's designed to run on 120vac.

1500/120 = 12.5 amps. If 12 gauge wire is used, and if the circuit is 50 feet long, the resistance of the wire is about 0.08 ohms and the I-squared-R value is just under 13 watts.

Now, if the heater was set up for 240vac, 1500/240 = 6.25 amps. You could get away with 14 gauge wire for that amount of current, and for a 50 foot long circuit it would have a resistance of about 0.13 ohms. The I-squared-R value is about 5 watts.

So, the difference is only 13-5 = 8 watts. Certainly, it is good design practice to use higher voltage where you can, but for the relatively small heaters and relatively short lengths of the circuits used in a residential application it won't amount to much. If it were me, I'd wire a 1500 watt heater at 120vac. For anything bigger than that I'd use 240vac, mainly because it will save on wire size more than anything else.
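The comparison above can be sketched in a few lines of Python. The 0.08 and 0.13 ohm figures are the approximate wire resistances quoted in the post, not looked-up values:

```python
def cable_loss(power_w, voltage_v, wire_ohms):
    """Current through the circuit and the I^2 * R heat lost in the wire."""
    amps = power_w / voltage_v
    return amps, amps ** 2 * wire_ohms

# 1500 W heater: 120 V on 12 gauge vs. 240 V on 14 gauge, 50 ft circuits.
a120, loss120 = cable_loss(1500, 120, 0.08)
a240, loss240 = cable_loss(1500, 240, 0.13)
print(f"120 V: {a120:.2f} A, about {loss120:.1f} W lost in the wire")
print(f"240 V: {a240:.2f} A, about {loss240:.1f} W lost in the wire")
print(f"difference: about {loss120 - loss240:.1f} W")
```

A handful of watts either way, which is the point: at residential heater sizes and cable lengths the wiring losses are noise.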

Gary S
01-05-2009, 03:08 PM
But remember that any voltage loss in the cable is heat, and heat is what you are trying to create here. The end cost will still be the same whether you use 110v or 220v heaters. 110v heaters are limited to 1500 watts, so you aren't going to lose any heat on the large incoming feeder lines. You might lose some in the small power cord, but that is heat in the intended location anyway.

ddawg16
01-05-2009, 03:53 PM
One other factor is line loading.

When you go 220, all the power is going down the two 110V lines. If you go 110v, then one side of your 220 is carrying all that load along with the neutral.

If you put enough loads on one side, you could start to see other items connected on the same side affected.

When I was planning out the power in my panel, I made a considerable effort to have the load as balanced as possible.

rinny_tin_tin
01-05-2009, 05:29 PM
But remember that any voltage loss in the cable is heat, and heat is what you are trying to create here. The end cost will still be the same whether you use 110v or 220v heaters. 110v heaters are limited to 1500 watts, so you aren't going to lose any heat on the large incoming feeder lines. You might lose some in the small power cord, but that is heat in the intended location anyway.


Well - what if I was trying to cool the room and not heat it? What if I was trying to turn a motor?

rinny_tin_tin
01-05-2009, 05:38 PM
That is true, but for residential purposes it's not as big an issue. As an example, let's assume a 1500 watt heater, which is typically the biggest you can get that's designed to run on 120vac.

1500/120 = 12.5 amps. If 12 gauge wire is used, and if the circuit is 50 feet long, the resistance of the wire is about 0.08 ohms and the I-squared-R value is just under 13 watts.

Now, if the heater was set up for 240vac, 1500/240 = 6.25 amps. You could get away with 14 gauge wire for that amount of current, and for a 50 foot long circuit it would have a resistance of about 0.13 ohms. The I-squared-R value is about 5 watts.

So, the difference is only 13-5 = 8 watts. Certainly, it is good design practice to use higher voltage where you can, but for the relatively small heaters and relatively short lengths of the circuits used in a residential application it won't amount to much. If it were me, I'd wire a 1500 watt heater at 120vac. For anything bigger than that I'd use 240vac, mainly because it will save on wire size more than anything else.

In a limited small-scale residential setting the differences may be negligible, but add up that many household loads, that many households, that many neighborhoods, that many cities, and the extra electric power demanded by the aggregate of these seemingly small losses would be enough to eliminate several power stations while also driving the cost of power down substantially. Other benefits must also be weighed against cost: higher voltage and lower current permit a smaller conductor cross-section, so less material cost, less weight, less space, etc. Plus, it is also a greener solution (less CO2, Hg, CO, H2SO4, etc.).

So, to the original subject question of which is cheaper, the answer is incontrovertibly the higher voltage.

2LTim
01-05-2009, 08:16 PM
I have to believe there are at least 4 or 5 more hairs out there that are yet unsplit!!!!!!

Charles (in GA)
01-05-2009, 08:33 PM
This falls in the same category as Obama telling us that if we all had tire pressure gauges and used them to keep our tire pressures set properly, we could solve the oil/gasoline crisis.

It would help, and you should never look a gift horse in the mouth, but........

Charles

rinny_tin_tin
01-05-2009, 11:23 PM
This falls in the same category as Obama telling us that if we all had tire pressure gauges and used them to keep our tire pressures set properly, we could solve the oil/gasoline crisis.

It would help, and you should never look a gift horse in the mouth, but........

Charles

Well...what would you rather have him tell us? Take public transportation, or go sell your gas hog and buy an electric car instead? On second thought, that may not be a bad idea considering the plight of the Big Three...'cept only Honda and the other import manufacturers have product, while the others only have willy in hand.


Gary S
01-05-2009, 11:46 PM
Well - what if I was trying to cool the room and not heat it? What if I was trying to turn a motor?

The original question was about heating, not air conditioning or motors. Voltage drop in a properly designed heater won't be noticeable, and like I already said, if there is any drop in the cord, you still get heat.

trainer
01-06-2009, 08:25 AM
That is true, but for residential purposes it's not as big an issue. As an example, let's assume a 1500 watt heater, which is typically the biggest you can get that's designed to run on 120vac.

1500/120 = 12.5 amps. If 12 gauge wire is used, and if the circuit is 50 feet long, the resistance of the wire is about 0.08 ohms and the I-squared-R value is just under 13 watts.

Now, if the heater was set up for 240vac, 1500/240 = 6.25 amps. You could get away with 14 gauge wire for that amount of current, and for a 50 foot long circuit it would have a resistance of about 0.13 ohms. The I-squared-R value is about 5 watts.

So, the difference is only 13-5 = 8 watts. Certainly, it is good design practice to use higher voltage where you can, but for the relatively small heaters and relatively short lengths of the circuits used in a residential application it won't amount to much. If it were me, I'd wire a 1500 watt heater at 120vac. For anything bigger than that I'd use 240vac, mainly because it will save on wire size more than anything else.

Code here requires a minimum of 12 ga wire for 220v heating circuits. Every hardwired heater I've seen for sale is 220, so I don't think there would be any savings in wire size most of the time.

What I've done is permanently install a 2000 W / 220 V heater in my old shop, which I maintain at 50. I've been using a 1500 W / 120 V portable unit to help bring the temp up more quickly on really cold days.
I'm uncertain whether I'll add another permanent heater or run a gas line to that building in the future.

Stuart in MN
01-06-2009, 08:42 AM
Code here requires a minimum of 12 ga wire for 220v heating circuits.
That must be a Canada thing; as far as I know it's not the case in the US. At any rate, if #12 wire is used in the example I gave earlier, the I-squared-R value for the 240 circuit changes from 5 to 3 watts. So, the difference is still pretty small.
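As a quick sanity check of that revised figure (6.25 A on #12 wire, using the ~0.08 ohm resistance from the earlier example):

```python
# 1500 W heater at 240 V on #12 wire (~0.08 ohm for the 50 ft circuit).
amps = 1500 / 240       # 6.25 A
loss = amps ** 2 * 0.08  # I^2 * R, in watts
print(round(loss, 1))    # prints 3.1
```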

rinny_tin_tin
01-06-2009, 10:24 AM
The original question was about heating, not air conditioning or motors. Voltage drop in a properly designed heater won't be noticeable, and like I already said, if there is any drop in the cord, you still get heat.

So...what I'm hearing from you now is that had he asked about cooling instead, your answer would have been different?

rinny_tin_tin
01-06-2009, 10:26 AM
That must be a Canada thing; as far as I know it's not the case in the US. At any rate, if #12 wire is used in the example I gave earlier, the I-squared-R value for the 240 circuit changes from 5 to 3 watts. So, the difference is still pretty small.

Small or smaller or much smaller - the answer still does not change. Why do you think they offer the 240V option to begin with? For the cheapskates? LOL