
The power cost of running a video game in Illinois


shardian:

--- Quote from: ahofle on October 02, 2006, 03:48:07 pm ---Does a 400 watt PC power supply really consume 400 watts at all times?  I thought that was just the maximum if you happened to have a ton of devices in it.

--- End quote ---

Maybe an electrical guru can prove or disprove me, but my understanding is that AC/DC converters convert the full 400 watts, and what is not used is converted to heat - thus the fins and cooling fan on a power supply. It's the same thing a voltage regulator chip does in DC circuitry, except on a much smaller scale, where it gets by with passive cooling.

Gambit:
It's not really a lot.  Let's say it uses 400 watts; then all you need to do to break even is turn off four light bulbs around the house.

shardian:

--- Quote from: Gambit on October 02, 2006, 04:06:37 pm ---It's not really a lot.  Let's say it uses 400 watts; then all you need to do to break even is turn off four light bulbs around the house.

--- End quote ---

400 watts, 24 hours a day, adds up quickly.
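
To put rough numbers on it (a sketch only - the constant 400 W draw and the $0.10/kWh rate are assumptions, so plug in whatever your Illinois bill actually charges):

--- Code: ---
# Rough monthly cost of a constant 400 W draw, left on 24/7.
# The $0.10/kWh rate is an assumed figure for illustration;
# use the rate from your own electric bill.
WATTS = 400
HOURS_PER_DAY = 24
DAYS_PER_MONTH = 30
RATE_PER_KWH = 0.10  # dollars, assumed

kwh_per_month = WATTS / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
cost_per_month = kwh_per_month * RATE_PER_KWH

print(f"{kwh_per_month:.0f} kWh/month -> ${cost_per_month:.2f}/month")
# 288 kWh/month -> $28.80/month
--- End code ---

That's assuming the supply really pulls its full rating around the clock, which is the open question here.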

ahofle:
http://ask.metafilter.com/mefi/20850
This seems to imply the power supply only draws what the load needs (not its maximum rating).

MonMotha:
Most switching power supplies (the kind used in computers and arcades) are rated in terms of maximum power input (above which they'll either shut down or blow up).  The output is less than that by the efficiency of the conversion process (usually around 75-90% or so).  However, the output power is determined by the load you have hooked up to it.  The input power then scales accordingly.  In other words, if you have a 10W load on a 500W PC power supply, it'll probably only be drawing 15-25W (efficiency drops at lower power loads due to non-varying overhead).
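
As a back-of-the-envelope sketch of that relationship (input power = output power / efficiency; the efficiency figures below are illustrative guesses, not measurements):

--- Code: ---
# Wall draw of a switching supply for a given DC load, assuming
# input_power = output_power / efficiency.
# Efficiency values are illustrative guesses, not measurements.
def input_watts(load_watts, efficiency):
    return load_watts / efficiency

# Light load on a big supply: efficiency sags, so a 10 W load on a
# 500 W PSU lands around the 15-25 W range quoted above.
print(input_watts(10, 0.5))    # 20.0
print(input_watts(10, 0.67))   # ~14.9

# Heavier load at a more typical 80% efficiency.
print(input_watts(300, 0.8))   # 375.0
--- End code ---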

Linear power supplies behave the same way, but the efficiency is lower, and they're usually rated in power output, rather than input.
