The NEW Build Your Own Arcade Controls

Topic started by: leapinlew on October 01, 2006, 03:36:24 pm

Title: The power cost of running a video game in Illinois
Post by: leapinlew on October 01, 2006, 03:36:24 pm
In Illinois they are raising the electric rates. Anyone have an idea how high and how much it'll cost to run a video game 24/7?
Title: Re: The power cost of running a video game in Illinois
Post by: Texasmame on October 01, 2006, 03:49:40 pm
I always heard it was about the same as running a fridge.  No biggie.  Even with all the games I had, I never noticed any real spike in the electric bills.  Then again, I didn't run 'em all the time.  You probably shouldn't, either, if it is a classic with all original components.

Title: Re: The power cost of running a video game in Illinois
Post by: quarterback on October 01, 2006, 04:59:20 pm
This will tell you:
http://www.amazon.com/P3-International-Kill-a-Watt-Electricity-Monitor/dp/B00009MDBU
Title: Re: The power cost of running a video game in Illinois
Post by: meany on October 02, 2006, 12:42:16 pm
Quote from: leapinlew on October 01, 2006, 03:36:24 pm
In Illinois they are raising the electric rates. Anyone have an idea how high and how much it'll cost to run a video game 24/7?

The rates are going up 25% in January '07.
Title: Re: The power cost of running a video game in Illinois
Post by: leapinlew on October 02, 2006, 12:46:34 pm
I've heard numbers from 10% to double. I guess we'll see.
Title: Re: The power cost of running a video game in Illinois
Post by: shardian on October 02, 2006, 01:26:22 pm
Assuming a 200-watt arcade power supply running 24 hours a day:

200W = 0.2kW
0.2kW * 24 hours = 4.8kWh

Your machine will consume 4.8 kilowatt-hours each day. Multiply that by the amount you pay per kilowatt-hour to figure the running cost. You can find your kWh rate on a previous power bill.
In my area, I pay about 10.5 cents per kWh.

Assuming 10.5 cents:
4.8 * $0.105 = $0.504

It would cost me about 50 cents a day to run an arcade game with a 200-watt power supply. Substitute your own rate to determine your cost.
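
If you want to plug in your own numbers, here's a quick Python sketch of the same arithmetic (the wattage and rate below are just placeholders, not anyone's actual figures):

Code:
# Back-of-the-envelope cost of running a cab 24/7.
# WATTS and RATE are placeholders -- substitute your own numbers.
WATTS = 200        # power draw of the cabinet, in watts
RATE = 0.105       # electricity rate, in dollars per kWh

kwh_per_day = (WATTS / 1000) * 24        # 0.2 kW * 24 h = 4.8 kWh
cost_per_day = kwh_per_day * RATE        # 4.8 * $0.105 = $0.504
cost_per_month = cost_per_day * 30

print(f"{kwh_per_day:.1f} kWh/day -> ${cost_per_day:.2f}/day, "
      f"about ${cost_per_month:.2f}/month")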
Title: Re: The power cost of running a video game in Illinois
Post by: johnm160 on October 02, 2006, 03:04:11 pm
Quote from: shardian on October 02, 2006, 01:26:22 pm
Assuming a 200-watt arcade power supply running 24 hours a day:

200W = 0.2kW
0.2kW * 24 hours = 4.8kWh

Your machine will consume 4.8 kilowatt-hours each day. Multiply that by the amount you pay per kilowatt-hour to figure the running cost. You can find your kWh rate on a previous power bill.
In my area, I pay about 10.5 cents per kWh.

Assuming 10.5 cents:
4.8 * $0.105 = $0.504

It would cost me about 50 cents a day to run an arcade game with a 200-watt power supply. Substitute your own rate to determine your cost.


About how much more does the monitor use?
Title: Re: The power cost of running a video game in Illinois
Post by: shardian on October 02, 2006, 03:12:15 pm
An actual arcade cabinet runs everything, monitor included, off the same power supply. You never specified whether you are running a MAME cab or an original cabinet. I assumed an actual arcade cabinet, since you wanted to run it 24 hours a day. I would not recommend running a MAME cab 24 hours a day. I can't think of any reason someone would want to do that...unless they were putting it in a commercial location, which I'll assume you are not doing.  ;)

Either way, you would just add up the power consumption of the computer power supply (probably 400 watts), the monitor (which could be a little trickier, since it runs straight off AC), and the marquee light. It will probably end up being around 500-600 watts.
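
To make that tally concrete, here's a short Python sketch (every wattage below is an assumption for illustration; a meter like the Kill-A-Watt linked above will give you real numbers):

Code:
# Rough total draw for a MAME cab, summing each component.
# All of these wattages are guesses -- measure your own hardware.
components = {
    "PC power supply": 400,  # nameplate rating; actual draw is usually lower
    "arcade monitor": 120,   # assumed figure for a 19" CRT
    "marquee light": 40,     # assumed fluorescent tube
}

total_watts = sum(components.values())
kwh_per_day = total_watts / 1000 * 24
print(f"~{total_watts} W total, about {kwh_per_day:.1f} kWh per day")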
Title: Re: The power cost of running a video game in Illinois
Post by: johnm160 on October 02, 2006, 03:42:26 pm
Quote from: shardian on October 02, 2006, 03:12:15 pm
An actual arcade cabinet runs everything, monitor included, off the same power supply. You never specified whether you are running a MAME cab or an original cabinet. I assumed an actual arcade cabinet, since you wanted to run it 24 hours a day. I would not recommend running a MAME cab 24 hours a day. I can't think of any reason someone would want to do that...unless they were putting it in a commercial location, which I'll assume you are not doing.  ;)

Either way, you would just add up the power consumption of the computer power supply (probably 400 watts), the monitor (which could be a little trickier, since it runs straight off AC), and the marquee light. It will probably end up being around 500-600 watts.

You are correct, I was thinking of a PC power supply plus a monitor. I did not realize that everything ran off the one power supply in an original cab.

Title: Re: The power cost of running a video game in Illinois
Post by: ahofle on October 02, 2006, 03:48:07 pm
Does a 400 watt PC power supply really consume 400 watts at all times?  I thought that was just the maximum if you happened to have a ton of devices in it.
Title: Re: The power cost of running a video game in Illinois
Post by: shardian on October 02, 2006, 04:05:33 pm
Quote from: ahofle on October 02, 2006, 03:48:07 pm
Does a 400 watt PC power supply really consume 400 watts at all times?  I thought that was just the maximum if you happened to have a ton of devices in it.

Maybe an electrical guru can prove or disprove me, but my understanding is that AC/DC converters convert a full 400 watts, and what is not used is converted to heat - thus the fins and cooling fan on a power supply. This is the same thing a voltage regulator chip does in DC circuitry...except it is on a much smaller scale and gets by with passive cooling.
Title: Re: The power cost of running a video game in Illinois
Post by: Gambit on October 02, 2006, 04:06:37 pm
It's not really a lot.  Let's say it uses 400 watts; then all you need to do to break even is turn off four 100-watt light bulbs around the house.
Title: Re: The power cost of running a video game in Illinois
Post by: shardian on October 02, 2006, 04:11:12 pm
Quote from: Gambit on October 02, 2006, 04:06:37 pm
It's not really a lot.  Let's say it uses 400 watts; then all you need to do to break even is turn off four 100-watt light bulbs around the house.

400 watts 24 hours a day adds up quickly - that's 9.6 kWh a day, about a dollar a day at the rate I quoted above.
Title: Re: The power cost of running a video game in Illinois
Post by: ahofle on October 02, 2006, 04:43:12 pm
http://ask.metafilter.com/mefi/20850
This seems to imply it will only draw what it needs (not the maximum rating of the PSU).
Title: Re: The power cost of running a video game in Illinois
Post by: MonMotha on October 02, 2006, 04:54:34 pm
Most switching power supplies (the kind used in computers and arcades) are rated in terms of maximum power input (above which they'll either shut down or blow up).  The output is less than that by the efficiency of the conversion process (usually around 75-90% or so).  However, the output power is determined by the load you have hooked up to it.  The input power then scales accordingly.  In other words, if you have a 10W load on a 500W PC power supply, it'll probably only be drawing 15-25W (efficiency drops at lower power loads due to non-varying overhead).

Linear power supplies behave the same way, but the efficiency is lower, and they're usually rated in power output, rather than input.
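
To put rough numbers on that, here's a small Python sketch (the efficiency and overhead figures are illustrative assumptions, not measurements of any particular supply):

Code:
# Wall-socket draw of a switching supply: load divided by efficiency,
# plus a fixed overhead term so that light loads still cost something.
def input_watts(load_watts, efficiency=0.80, overhead_watts=10):
    """Estimate input draw for a given output load."""
    return load_watts / efficiency + overhead_watts

# A 10 W load on a 500 W supply draws nowhere near 500 W at the wall:
print(input_watts(10))    # 22.5 -- in the 15-25 W ballpark above
print(input_watts(300))   # 385.0 -- closer to nameplate under heavy load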
Title: Re: The power cost of running a video game in Illinois
Post by: torez on October 03, 2006, 04:13:03 am
Yes, the PSU has a maximum rating, but it usually runs at a lower value.  Another way to look at it: if the PSU always ran at its nominal rating, there would be no need for a hibernate/sleep mode on a computer. ;)
Title: Re: The power cost of running a video game in Illinois
Post by: Gambit on October 03, 2006, 04:46:11 pm
I read somewhere that about 80% of the power an idle machine uses goes to the monitor.  So having the monitor power down after a certain amount of time will save a lot of money.

Though if it's being used in a commercial application, that won't work.