The NEW Build Your Own Arcade Controls
Main => Everything Else => Topic started by: RetroJames on November 08, 2003, 01:31:36 pm
-
Hello all, I am curious whether those of you who have vintage or new coin-operated arcade games have ever tracked approximately how much electricity one uses over a period of time?
Thanks!
-
The average game with a 19" monitor consumes around 200 watts.
-
Ken, as always thanks for the info. Can you take a look at my notes below and tell me if my conclusion is correct? I did a quick calculation out of curiosity to see how much it costs in GA to have an arcade game running for 24x7x365. I had to look some things up to get my units straight:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
The average game with a 19" monitor consumes around 200 watts.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
watt
n : a unit of power equal to 1 joule per second; the power dissipated by a current of 1 ampere flowing across a resistance of 1 ohm [syn: {W}]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
joule
n : a unit of energy equal to the work done by a power of 1 watt acting for 1 second
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
kilowatt-hour
n : a unit of energy equal to 1 kilowatt of power expended for 1 hour
-
Conclusion: [REVISED] A device that draws 200 watts consumes 200 watt-hours of energy every hour, so it consumes one kilowatt-hour of energy every 5 hours. Assuming the device is on 24 hours a day, 365 days a year, it will consume (24*365)/5 = 1752 kWh of energy a year, or 146 kWh of energy a month.
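For anyone who wants to double-check it, here is the same arithmetic as a small Python sketch (nothing is assumed beyond Ken's 200 W figure):

# Quick check of the watts -> kWh conversion above.
WATTS = 200                  # Ken's estimate for a game with a 19" monitor
HOURS_PER_YEAR = 24 * 365    # running around the clock

kwh_per_year = WATTS * HOURS_PER_YEAR / 1000   # 200 W = 0.2 kWh every hour
print(kwh_per_year)        # 1752.0 kWh per year
print(kwh_per_year / 12)   # 146.0 kWh per month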
Ga Residential Rates
WINTER - October through May
Base Charge ............ $7.00
First 650 kWh .......... 4.385 cents per kWh
-
I didn't do the math, and I don't have an EE degree, but something seems wrong here. It just doesn't seem reasonable that a 200W device can cost $640/mo. to run. My monthly electric bill is nothing close to that (multiple computers running continuously, plus all appliances, etc).
Not trying to be a know-it-all (because I don't), but something's amiss.
-
1752 kWh * $4.385 = $7,682.52 per year, or about $640.00 a month.
That can't be right.
-
I thought it was very high as well. I have only had one cup of coffee so please check my math and conversions.
-
I think it would be the 146 kWh a month at 4.385 cents, or 146 x 0.04385, which is $6.40 a month or $76.80 a year. You just needed to move your decimal point by 2 places for the cents.
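The same correction written out as a short Python sketch, using only the 146 kWh/month and 4.385 cents/kWh figures from above:

# The rate is in cents per kWh, so divide by 100 to get dollars.
kwh_per_month = 146
rate_cents_per_kwh = 4.385

dollars_per_month = kwh_per_month * rate_cents_per_kwh / 100
print(round(dollars_per_month, 2))        # 6.4   -> about $6.40 a month
print(round(dollars_per_month * 12, 2))   # 76.83 -> roughly the $76.80 a year above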
-
That makes sense, I would accept that. We've calculated power consumption for older computer server equipment and figured some of it as high as $40 a month, but you're talking about lots of fans and spinning disks there.
-
UNITS UNITS UNITS
Arrg...Cents....I was thinking it was dollars...Thanks! I am fixing the notes in the post.
-
Posted by: Ken Layton Posted on: Yesterday at 11:09:58pm
The average game with a 19" monitor consumes around 200 watts.
Conclusion: [REVISED] ...the device will consume (24*365)/5 = 1752 kWh of energy a year, or 146 kWh a month.
-
I think that sounds right. I've seen calculations/discussions where a computer left on 24/7 came out to $5-$6 monthly.
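As a rough cross-check, assuming (purely a guess) that such a computer draws somewhere around 150-200 W and is billed at the 4.385 cents/kWh rate quoted earlier, the numbers land in the same range:

# Cross-check of the $5-$6/month figure for an always-on computer.
# The 150-200 W draw is an assumption, not a measured number.
for watts in (150, 200):
    kwh_per_month = watts * 24 * 365 / 12 / 1000
    dollars_per_month = kwh_per_month * 4.385 / 100   # cents -> dollars
    print(watts, round(dollars_per_month, 2))          # 150 W -> 4.8, 200 W -> 6.4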
-
That's what happens when you don't drink your coffee...;)
Thanks team OBryan for the proofread.