110V instead of 230V, less W consumption?

Message boards : Number crunching : 110V instead of 230V, less W consumption?
Profile Dirk Sadowski
Volunteer tester
Joined: 6 Apr 07
Posts: 7105
Credit: 147,663,825
RAC: 5
Germany
Message 1117739 - Posted: 16 Jun 2011, 6:42:54 UTC
Last modified: 16 Jun 2011, 6:43:56 UTC

Hello community!


Europe/Germany has 230 V mains.
I saw a report on TV that someone installed a voltage transformer to reduce the 230 V to 200 V.
This hardware sits directly after the house's electric meter, so every electrical consumer gets only 200 V.
For all newer hardware this is enough, and it then consumes fewer watts.
But this is not cheap.

How about reducing only the voltage for the PCs?
IIRC, all the PCs I have have PSUs which switch automatically between 110 and 230 V.

If I buy a small voltage transformer (does anyone have experience and can recommend a manufacturer?) for the PCs and they run at 110 V instead of 230 V, would the power consumption be lower?

By how much, in percent?
Or is this different with PCs/PSUs, so there is no advantage?

Do the voltage transformers themselves also consume watts?


Thanks!


- Best regards! - Sutaru Tsureku, team seti.international founder. - Optimize your PC for higher RAC. - SETI@home needs your help. -
ID: 1117739
Profile Donald L. Johnson
Joined: 5 Aug 02
Posts: 8240
Credit: 14,654,533
RAC: 20
United States
Message 1117741 - Posted: 16 Jun 2011, 6:54:25 UTC
Last modified: 16 Jun 2011, 7:03:09 UTC

The power consumed by the computer is determined by the components in the computer. It makes little difference if the input to the PSU is 230, 200, or 110.

For the same power draw, a higher line voltage means a lower current, which will reduce (somewhat) the heat losses in the PSU and cables, but it should not make much difference in your power bill unless you have a LOT of electrical equipment running.

Having an extra transformer in the circuit may actually draw MORE power from your mains, since the 230/110 step-down transformer will itself consume some power and give off heat.
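The resistive-loss arithmetic behind this can be sketched quickly; the 300 W draw and 0.5 ohms of cable resistance below are made-up illustrative numbers, not measurements:

```python
# Rough I^2*R loss comparison for the same constant-power load at two
# mains voltages. Load power and cable resistance are assumptions.

def cable_loss(power_w, volts, cable_ohms=0.5):
    """Heat dissipated in the supply cabling for a constant-power load."""
    current = power_w / volts          # I = P / V
    return current ** 2 * cable_ohms  # P_loss = I^2 * R

loss_230 = cable_loss(300, 230)   # ~0.85 W wasted in the cable
loss_110 = cable_loss(300, 110)   # ~3.72 W wasted in the cable
print(f"230 V: {loss_230:.2f} W lost, 110 V: {loss_110:.2f} W lost")
```

Either way the loss is a fraction of a percent of the load, which is why it barely shows up on the power bill.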
Donald
Infernal Optimist / Submariner, retired
ID: 1117741
Profile Wiggo
Joined: 24 Jan 00
Posts: 38734
Credit: 261,360,520
RAC: 489
Australia
Message 1117755 - Posted: 16 Jun 2011, 7:47:23 UTC
Last modified: 16 Jun 2011, 7:50:58 UTC

In actual fact, if you read a lot of good PSU reviews, you'll find that 110V operation is less efficient to start with than 220-240V, so stick with what you have rather than adding the extra losses of a step-down transformer on top. ;)

Cheers.
ID: 1117755
Profile Link
Joined: 18 Sep 03
Posts: 834
Credit: 1,807,369
RAC: 0
Germany
Message 1117756 - Posted: 16 Jun 2011, 7:48:08 UTC - in response to Message 1117739.  
Last modified: 16 Jun 2011, 7:48:41 UTC

I saw a report on TV that someone installed a voltage transformer to reduce the 230 V to 200 V.

That might save energy for a light bulb, but certainly not for any modern electronics using a switching PSU. The generated low voltages (12V, 5V etc.) will not change, and so the power usage won't change either.



If I buy a small voltage transformer (does anyone have experience and can recommend a manufacturer?) for the PCs and they run at 110 V instead of 230 V, would the power consumption be lower?

No, it would be more. For a start, the transformer itself will use power; even with good ones I would expect at least 10% additional power consumption (just think how warm these things tend to get). And then, computer PSUs tend to be slightly less efficient at 110V compared to 230V; not by much, but they are. I saw a graph for one Corsair PSU, which could take anything between 90-265V, and at the lower voltages it drew a little more. Even if that did not apply to your PSUs, the power usage of the transformer would still be there.
ID: 1117756
Profile TOM
Volunteer tester
Joined: 5 Apr 01
Posts: 53
Credit: 65,422,234
RAC: 86
Germany
Message 1117776 - Posted: 16 Jun 2011, 10:28:09 UTC
Last modified: 16 Jun 2011, 10:52:56 UTC

The efficiency of a PSU depends on the load (current) and on the input voltage/frequency.
My 1200W PSU has an efficiency of:

PSU load     20%    50%   100%
230V/50Hz  88.6%  91.8%  90.2%
115V/60Hz  87.5%  90.1%  87.4%

In simple terms:

  • Power is the product of voltage and current, but the current is the main source of loss. That's one of the reasons why we use very high voltages to transport power over long distances. If you halve the voltage you must double the current to get the same power; that's a law of nature.
  • If you use extra components to reduce the voltage, these components consume extra energy. For example: if you combine a transformer with an efficiency of 90% and a PSU with 90%, you get a resulting efficiency of 81%. So there is no gain in doing so. There are transformers to reduce the voltage from 230V to 115V, but they are only used for special (old) equipment that can't switch between 115V and 230V. Modern equipment switches automatically between different voltages.
  • Another reason not to reduce the voltage is stability. Electronic devices are sensitive to reductions or increases of the input voltage. Of course they have a tolerance, but it is usually about +/- 5%. If you leave this tolerance band, there is no guarantee of proper function.
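The cascaded-efficiency point is easy to sanity-check numerically. A small sketch, using the 90% figures from above and an arbitrary 500 W DC load as the example:

```python
# Cascading a 90%-efficient step-down transformer with a 90%-efficient PSU.
transformer_eff = 0.90
psu_eff = 0.90

combined_eff = transformer_eff * psu_eff      # 0.81, i.e. 81% overall

# Wall-socket draw for a PC whose components need 500 W of DC:
dc_load_w = 500
direct_wall_w = dc_load_w / psu_eff           # PSU alone, ~555.6 W
via_transformer_w = dc_load_w / combined_eff  # transformer + PSU, ~617.3 W

print(f"combined efficiency: {combined_eff:.0%}")
print(f"extra draw caused by the transformer: "
      f"{via_transformer_w - direct_wall_w:.1f} W")
```

So the "energy saver" in this scenario would add roughly 60 W to the wall draw instead of removing anything.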


ID: 1117776
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1117821 - Posted: 16 Jun 2011, 13:21:24 UTC - in response to Message 1117755.  

In actual fact, if you read a lot of good PSU reviews, you'll find that 110V operation is less efficient to start with than 220-240V, so stick with what you have rather than adding the extra losses of a step-down transformer on top. ;)

Cheers.

I have seen this on all of the PSU makers' labels as well. However, I wonder if the 240v rating is for 50Hz and the 120v rating is for 60Hz. As all of my PSUs are 100-240v by design, I plan to do some efficiency tests by switching only the input voltage.


As for the transformers that reduce the voltage I have seen these before. In airbase RADAR rooms they are used to maintain a constant voltage to vital equipment at all times. So when a brownout occurs, where the line voltage drops 5-20v, there is no change on their end and their backup power systems don't have to kick in.

Our company uses a system to maintain a constant 100v, instead of our normal 120v, in our environmental test lab.
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url]
ID: 1117821
rob smith Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer moderator
Volunteer tester
Joined: 7 Mar 03
Posts: 22985
Credit: 416,307,556
RAC: 380
United Kingdom
Message 1117871 - Posted: 16 Jun 2011, 15:17:15 UTC

Dropping line voltage can be a bit of "snake oil" to some folks. It doesn't really work with most electronic equipment, since they are constant-power devices. For some things that are roughly constant-resistance devices (light bulbs are the most obvious) it will work, but not as well as you would expect. There are also problems with running at a reduced voltage (200 instead of 230, or 100 instead of 110): many power supplies are optimised to run most efficiently at the designed supply voltage, and moving away from these design points can increase the power demand due to increased losses.
Of course some folks get confused by electricity, and have no grasp of the fact that a low-voltage lamp need not be a low-energy lamp - I've had some fun in local chain DIY stores with staff not understanding that a 50W/12V lamp uses the same amount of energy as a 50W/230V one....
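The lamp comparison is just P = V x I rearranged; a quick sketch:

```python
# Same rated power, very different current: I = P / V.
lamp_power_w = 50

current_12v = lamp_power_w / 12    # ~4.17 A through the 12 V lamp
current_230v = lamp_power_w / 230  # ~0.22 A through the 230 V lamp

# Both lamps turn the same 50 W into light and heat; only the current
# (and therefore the required wire gauge) differs.
print(f"12 V lamp: {current_12v:.2f} A, 230 V lamp: {current_230v:.2f} A")
```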
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
ID: 1117871
Profile ML1
Volunteer moderator
Volunteer tester
Joined: 25 Nov 01
Posts: 21985
Credit: 7,508,002
RAC: 20
United Kingdom
Message 1117877 - Posted: 16 Jun 2011, 15:27:09 UTC - in response to Message 1117739.  
Last modified: 16 Jun 2011, 15:28:20 UTC

I saw a report on TV that someone installed a voltage transformer to reduce the 230 V to 200 V.
This hardware sits directly after the house's electric meter, so every electrical consumer gets only 200 V.
For all newer hardware this is enough, and it then consumes fewer watts.
But this is not cheap.

How about reducing only the voltage for the PCs? ...

For PCs, the lower mains supply voltage usually causes the PSUs to "work harder" and slightly less efficiently. At a lower input voltage, you need more pulses of current for a given power output, and hence more wasted heat from the switching power transistors.

For other house loads such as heaters and lights, the lower voltage means you will get less power dissipated. Hence, your (incandescent) lights will be a little dimmer, and it will take longer to boil water in an electric kettle.

The 'compact fluorescent' bulbs will grab higher current for a lower input voltage to maintain their output brightness.

So... all of that adds up to a big loss: inefficiency plus the extra power lost to the higher currents required at the lower voltage.

Worse still... Power companies will deliberately lower the mains voltage a little (brown out) to reduce peak power consumption from unregulated loads such as heaters and motors. If you've already 'browned out' your supply with a step-down transformer, then you could well drive your equipment with too low a voltage... For motors such as for your freezer, aircon, heating and so on, you could have the motors stall and damage their windings with a high stall current.


So, for my summary:

All bad and a bit of a con-trick.


There's far far better and more efficient ways to save energy.

Good luck,
Martin
See new freedom: Mageia Linux
Take a look for yourself: Linux Format
The Future is what We all make IT (GPLv3)
ID: 1117877
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1117915 - Posted: 16 Jun 2011, 16:12:19 UTC - in response to Message 1117877.  

I saw a report on TV that someone installed a voltage transformer to reduce the 230 V to 200 V.
This hardware sits directly after the house's electric meter, so every electrical consumer gets only 200 V.
For all newer hardware this is enough, and it then consumes fewer watts.
But this is not cheap.

How about reducing only the voltage for the PCs? ...

For PCs, the lower mains supply voltage usually causes the PSUs to "work harder" and slightly less efficiently. At a lower input voltage, you need more pulses of current for a given power output, and hence more wasted heat from the switching power transistors.

For other house loads such as heaters and lights, the lower voltage means you will get less power dissipated. Hence, your (incandescent) lights will be a little dimmer, and it will take longer to boil water in an electric kettle.

The 'compact fluorescent' bulbs will grab higher current for a lower input voltage to maintain their output brightness.

So... all of that adds up to a big loss: inefficiency plus the extra power lost to the higher currents required at the lower voltage.

Worse still... Power companies will deliberately lower the mains voltage a little (brown out) to reduce peak power consumption from unregulated loads such as heaters and motors. If you've already 'browned out' your supply with a step-down transformer, then you could well drive your equipment with too low a voltage... For motors such as for your freezer, aircon, heating and so on, you could have the motors stall and damage their windings with a high stall current.


So, for my summary:

All bad and a bit of a con-trick.


There's far far better and more efficient ways to save energy.

Good luck,
Martin

Hopefully the device Sutaru saw was a constant voltage transformer, or power conditioner, instead of just a step-down transformer.
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url]
ID: 1117915
kittyman Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 9 Jul 00
Posts: 51586
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1117918 - Posted: 16 Jun 2011, 16:18:36 UTC
Last modified: 16 Jun 2011, 16:20:48 UTC

In short......
This is a no-gainer.
There may be very small differences in efficiency, but by and large, today's switching power supplies simply auto-adjust their switching circuits to adapt to any mains voltage within their operating range.
If the mains voltage goes down, they simply adjust input current up....the wattage used remains the same. Reducing the input voltage is not going to change the amount of power that the computer components require on the low voltage DC output of the PSU.
And even if the PSU was a tad more efficient at a lower input range, I doubt it would ever offset the conversion losses in a step down transformer from 220v to 110v.

The only way to actually reduce the power consumption of the computer would be to install more efficient CPU/GPU/HD components or undervolt and/or underclock the CPU or GPU.
"Time is simply the mechanism that keeps everything from happening all at once."

ID: 1117918
justsomeguy
Joined: 27 May 99
Posts: 84
Credit: 6,084,595
RAC: 11
United States
Message 1117940 - Posted: 16 Jun 2011, 16:56:13 UTC

Not to mention that, for starters, you are already using a transformer.
Your PSU supplies 12/5 VDC to your components. I'd have to agree with a
couple of folks up above: this is a con. While it might use less energy
to run your stove, it will take longer to boil water for your mac and cheese.

Also take the frequency into account. Dropping from 60 to 50 Hz will make
mains-synchronous clocks run slow; a clock counting 60 cycles per second
runs at 5/6 speed on 50 Hz, losing about 4 hours every day. You'd have
to set the clock every day or be late to work.
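The drift of a mains-synchronous clock can be worked out exactly:

```python
# A mains-synchronous clock counts line cycles and assumes its design
# frequency. Fed 50 Hz instead of 60 Hz, it advances at 50/60 of real time.
design_hz = 60
actual_hz = 50
seconds_per_day = 24 * 3600

displayed = seconds_per_day * actual_hz / design_hz  # seconds shown per real day
lost_hours = (seconds_per_day - displayed) / 3600    # hours lost per day

print(f"clock loses {lost_hours:.1f} hours per real day")
```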

Again, as far as electronics go, there is no gain in this. Almost all have
a power adapter in them that takes care of the voltage reduction.

230v, as stated above, is more efficient for stepping down. You don't have to
bring it down as far after transmission to be usable by customers. Most
residential distribution (in the US) runs at 2400v, or 4800v for longer runs,
and your primary transmission lines (long-haul lines from the point of origin)
are anywhere from 35kV to 130kV depending on distance.

....here's your 98 cents back, thanks!
:)
Kevin

"Two things are infinite: The universe and human stupidity; and I'm not sure about the universe." - Albert Einstein

ID: 1117940
Cosmic_Ocean
Joined: 23 Dec 00
Posts: 3027
Credit: 13,516,867
RAC: 13
United States
Message 1117946 - Posted: 16 Jun 2011, 16:59:03 UTC

I wondered about this years ago, and then got educated on how it actually works. Some people used to say that for motors, running them on 220-240 would "cut the amperage in half", but that's only part of the story.

On 120VAC, with 3 conductors, you have hot, neutral and ground. Here in the US, neutral and ground are almost always tied together in the breaker panel, so all of the power draw goes through the one hot conductor and then returns on neutral/ground.

On 240VAC, still with 3 conductors, you have hot, hot, ground. The two hots are on opposite phases, so there is no need for a neutral because the opposite phases essentially cancel each other out. Now, half the amperage is drawn on each hot leg, but at twice the voltage, so the power consumption is nearly identical. What 240VAC will allow you to do is run a smaller gauge wire, since each conductor carries only half the current.

For motors and devices with very heavy start-up loads, 240VAC is more efficient, because it gives a bigger kick to get them up to speed, but once past the start-up load, the wattage consumption is nearly identical. There may be something like <1% of a gain, but it is negligible.

Electronics don't really care what voltage you feed them; higher voltages do improve efficiency, but by a very, very small margin.
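The half-the-current point can be put in numbers; the 1500 W load below is an arbitrary example:

```python
# The same 1500 W load wired for 120 V vs 240 V split-phase.
load_w = 1500

current_120v = load_w / 120   # A through the single 120 V hot conductor
current_240v = load_w / 240   # A through each of the two 240 V hot legs

# Power delivered is identical; the 240 V wiring just carries half the
# current per conductor, which is why smaller-gauge wire is allowed.
print(f"120 V: {current_120v} A, 240 V: {current_240v} A per leg")
```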
Linux laptop:
record uptime: 1511d 20h 19m (ended due to the power brick giving-up)
ID: 1117946
justsomeguy
Joined: 27 May 99
Posts: 84
Credit: 6,084,595
RAC: 11
United States
Message 1117953 - Posted: 16 Jun 2011, 17:00:56 UTC

Okay, now that I think about it...110v really only serves two purposes...

1) cheaper to manufacture appliances

2) appliances are cheaper, have higher resistance and therefore burn out
faster - hmmm, anyone remember the phrase "planned obsolescence" from
high school economics?

"Two things are infinite: The universe and human stupidity; and I'm not sure about the universe." - Albert Einstein

ID: 1117953
justsomeguy
Joined: 27 May 99
Posts: 84
Credit: 6,084,595
RAC: 11
United States
Message 1117959 - Posted: 16 Jun 2011, 17:09:11 UTC - in response to Message 1117946.  



On 240VAC, still with 3 conductors, you have hot hot ground. The two hots are on opposite phases, so there is no need for a neutral because the opposite-phases end up canceling each other out and makes a flat line, essentially. Now, half the amperage is drawn on each hot leg, but then there are two legs, so the power consumption is nearly identical. But what 240VAC will allow you to do is run a smaller gauge wire since only half the load is going to be on each conductor.



Not supposed to be three...should be four conductors...hot, hot, neutral
and ground. Each hot pulls off of a different post in the breaker panel
and as you also noted, neutral and ground are often (but shouldn't be)
connected together at the panel. I hate electricians that don't do this
right!

I would expect better efficiency from the higher voltage though. As someone
else above noted, there is less loss in the higher voltage. I know with water
the friction loss is exponential, I would expect similar results with
power...after all, wiring and plumbing are very similar! :)


"Two things are infinite: The universe and human stupidity; and I'm not sure about the universe." - Albert Einstein

ID: 1117959
Richard Haselgrove Project Donor
Volunteer tester
Joined: 4 Jul 99
Posts: 14690
Credit: 200,643,578
RAC: 874
United Kingdom
Message 1117962 - Posted: 16 Jun 2011, 17:11:12 UTC - in response to Message 1117953.  

Okay, now that I think about it...110v really only serves two purposes...

1) cheaper to manufacture appliances

2) appliances are cheaper, have higher resistance and therefore burn out
faster - hmmm, anyone remember the phrase "planned obsolescence" from
high school economics?

And a third. In the UK, we only see 110V used via step-down transformers, usually on building sites.

3) Safety. 110V is less likely to kill you if you cut through a cable.
ID: 1117962
kittyman Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 9 Jul 00
Posts: 51586
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1117963 - Posted: 16 Jun 2011, 17:12:20 UTC - in response to Message 1117959.  



On 240VAC, still with 3 conductors, you have hot hot ground. The two hots are on opposite phases, so there is no need for a neutral because the opposite-phases end up canceling each other out and makes a flat line, essentially. Now, half the amperage is drawn on each hot leg, but then there are two legs, so the power consumption is nearly identical. But what 240VAC will allow you to do is run a smaller gauge wire since only half the load is going to be on each conductor.



Not supposed to be three...should be four conductors...hot, hot, neutral
and ground. Each hot pulls off of a different post in the breaker panel
and as you also noted, neutral and ground are often (but shouldn't be)
connected together at the panel. I hate electricians that don't do this
right!

I would expect better efficiency from the higher voltage though. As someone
else above noted, there is less loss in the higher voltage. I know with water
the friction loss is exponential, I would expect similar results with
power...after all, wiring and plumbing are very similar! :)


Uhh...
If I am not mistaken, code here requires the neutral and earth ground to be bonded together at the main panel, but not at a sub-feed panel.
"Time is simply the mechanism that keeps everything from happening all at once."

ID: 1117963
justsomeguy
Joined: 27 May 99
Posts: 84
Credit: 6,084,595
RAC: 11
United States
Message 1117969 - Posted: 16 Jun 2011, 17:30:37 UTC - in response to Message 1117962.  

Okay, now that I think about it...110v really only serves two purposes...

1) cheaper to manufacture appliances

2) appliances are cheaper, have higher resistance and therefore burn out
faster - hmmm, anyone remember the phrase "planned obsolescence" from
high school economics?

And a third. In the UK, we only see 110V used via step-down transformers, usually on building sites.

3) Safety. 110V is less likely to kill you if you cut through a cable.



Actually, that's a misconception. It takes only 0.1 amps through the heart to
stop it. The higher voltage drives more current and simply adds the risk that
it will COOK your heart before you can get away from it...easier to pull away
from 110v, IMO.

I've had my fair share of shocks. With 110v, I stick my finger on it to
see if it's hot...get a slight buzzing feeling. 220v will burn the cutting
edge out of a pair of pliers in the blink of an eye, but the most painful
shock I ever got was from a neon sign, stepped up to 9000v and limited to
0.095 amps. No way it could have killed me, but it sure as heck felt like it
was trying!

I guess my point is that the chances that a certain voltage will kill you are
about even, but the available current will make the difference. Mind you,
on a work site the 230v is going to be carrying high amps as well for all the
tools they run. A typical breaker box in a house is 240v at 200 amps, whereas
your average wall outlet is 110v at 15 amps.
"Two things are infinite: The universe and human stupidity; and I'm not sure about the universe." - Albert Einstein

ID: 1117969
Kevin Olley
Joined: 3 Aug 99
Posts: 906
Credit: 261,085,289
RAC: 572
United Kingdom
Message 1117970 - Posted: 16 Jun 2011, 17:37:00 UTC - in response to Message 1117969.  

Okay, now that I think about it...110v really only serves two purposes...

1) cheaper to manufacture appliances

2) appliances are cheaper, have higher resistance and therefore burn out
faster - hmmm, anyone remember the phrase "planned obsolescence" from
high school economics?

And a third. In the UK, we only see 110V used via step-down transformers, usually on building sites.

3) Safety. 110V is less likely to kill you if you cut through a cable.



Actually, that's a misconception. It takes only 0.1 amps through the heart to
stop it. The higher voltage drives more current and simply adds the risk that
it will COOK your heart before you can get away from it...easier to pull away
from 110v, IMO.

I've had my fair share of shocks. With 110v, I stick my finger on it to
see if it's hot...get a slight buzzing feeling. 220v will burn the cutting
edge out of a pair of pliers in the blink of an eye, but the most painful
shock I ever got was from a neon sign, stepped up to 9000v and limited to
0.095 amps. No way it could have killed me, but it sure as heck felt like it
was trying!

I guess my point is that the chances that a certain voltage will kill you are
about even, but the available current will make the difference. Mind you,
on a work site the 230v is going to be carrying high amps as well for all the
tools they run. A typical breaker box in a house is 240v at 200 amps, whereas
your average wall outlet is 110v at 15 amps.



110v is misleading; it might say that on the tin, but the site transformer is center-tapped to earth, i.e. +55v to -55v, so there is only the risk of a 55v shock to ground.


Kevin


ID: 1117970
kittyman Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 9 Jul 00
Posts: 51586
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1117971 - Posted: 16 Jun 2011, 17:40:26 UTC - in response to Message 1117970.  



110v is misleading; it might say that on the tin, but the site transformer is center-tapped to earth, i.e. +55v to -55v, so there is only the risk of a 55v shock to ground.


Not so.....
At least here in the US....
Normal residential service is 110/220 or 120/240...
Each leg is 120v to ground or neutral, or 240v between the two for appliances.
Nowhere can you get 55 or 60 volts.
"Time is simply the mechanism that keeps everything from happening all at once."

ID: 1117971
Profile Geek@Play
Volunteer tester
Joined: 31 Jul 01
Posts: 2467
Credit: 86,146,931
RAC: 0
United States
Message 1117975 - Posted: 16 Jun 2011, 17:46:28 UTC - in response to Message 1117970.  

110v is misleading; it might say that on the tin, but the site transformer is center-tapped to earth, i.e. +55v to -55v, so there is only the risk of a 55v shock to ground.


Common on board ships with delta-connected lighting transformers, but not on the shore side.

Boinc....Boinc....Boinc....Boinc....
ID: 1117975
©2026 University of California
 
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.