Any idea how much electricity a computer uses in a month?
Just heard a tech guy on talk radio claim he was running four servers at home, all AMD and running 24/7. He alleges that when he switched to Intel CPUs his power bill dropped by $100 a month...
I find this very hard to believe...
-
A quick google found this info....
http://maximumpcguides.com/windows-vista/how-much-electricity-does-my-computer-use/ -
Too vague to make any conclusions. It depends on what you are running; a server can draw anywhere from ~100W to ~1000W.
e.g.
Was he running multiple GPUs concurrently, or headless?
What kinds of workloads? 24/7 doesn't necessarily mean 100% load, e.g. if they were web servers sitting idle waiting for requests.
How old were the AMDs? 130nm or 90nm chips? Then $100/month is easy to believe.
On larger farms, 20-40% of the cost goes into cooling.
Electricity costs vary a lot by region. I pay 8 cents/kWh; some people pay 30. So if I saved $100 in a month, they would have saved $375 (a quick sanity check is sketched after this post).
An easy way to determine usage is to plug your computer into a Kill A Watt meter. Many UPSes have LCD displays showing usage as well. -
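For a quick sanity check on the radio claim, here's a minimal sketch, assuming 24/7 operation, a flat per-kWh rate, and a 30-day month; the rates are just the ones mentioned in this thread:

```python
# Rough sanity check: how big a continuous power reduction does a
# $100/month saving imply at various electricity rates?
# Assumes 24/7 operation and flat per-kWh billing.

HOURS_PER_MONTH = 24 * 30  # ~720 h in a 30-day month

for rate_per_kwh in (0.08, 0.12, 0.30):  # $/kWh, example rates from the thread
    kwh_saved = 100.0 / rate_per_kwh                 # kWh needed for a $100 saving
    avg_watts = kwh_saved / HOURS_PER_MONTH * 1000   # continuous reduction in W
    print(f"${rate_per_kwh:.2f}/kWh -> ~{avg_watts:,.0f} W less, 24/7")
```

At 8 cents/kWh the four boxes would have to drop roughly 1.7 kW between them; the claim only starts to look plausible toward the expensive end of the rate range. -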
Originally Posted by kenmo
Buncha crap. Have a good one,
neomaine -
Swapping just the CPUs can make a large difference in power consumption. Going from 95W thermal-spec chips down to their LV counterparts at 40W will be noticeable. Combine that with the fact that there are 2-4 of these chips in a single enclosure, and there are real savings to be had.
I did a cost justification on something similar to this; I'll see if I can find some numbers on it. The big OEMs probably already have something to compare to.
FB-DIMM are the real cause of global warming -
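A back-of-the-envelope version of that swap. Treating the TDP delta as average wall draw is an assumption (real draw depends on load and is usually below TDP), and the 12 cents/kWh rate is just an example:

```python
# Estimate monthly savings from swapping 95W chips for 40W LV parts.
# Assumes the TDP difference shows up 1:1 at the wall, which is optimistic.

TDP_STANDARD_W = 95
TDP_LV_W = 40
RATE = 0.12        # $/kWh, example rate
HOURS = 24 * 30    # one month of 24/7 operation

for sockets in (2, 4):
    delta_w = (TDP_STANDARD_W - TDP_LV_W) * sockets
    dollars = delta_w / 1000 * HOURS * RATE
    print(f"{sockets} sockets: ~{delta_w} W less, ~${dollars:.2f}/month")
```

That works out to roughly $9-19 per month per box under these generous assumptions; multiply by the number of boxes.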
I have one of those Kill A Watt meters I can plug the computers into. My HTPC uses about 140W on average, and 3 watts when off (the 5V standby rail keeping the RAM, LAN, etc. powered). Start-up is about 160W for ten seconds as the HDDs spin up and the optical drives check for discs.
I pay $0.12 per kilowatt-hour, including taxes and other charges. 140W / 1000 = 0.14 kW × 24 hours a day = 3.36 kWh per day, which is $0.12 × 3.36 kWh ≈ $0.40 per day, or about $12 for 30 days. That assumes 24/7 operation, but you could figure it out closer. Your electric bill should give your kWh rate. (Hopefully my math is right.) -
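Same arithmetic wrapped up so you can plug in your own Kill A Watt reading and rate; the numbers in the calls are just the ones from this post:

```python
# Monthly electricity cost of a load, given a measured wattage and a kWh rate.

def monthly_cost(watts: float, rate_per_kwh: float, hours_per_day: float = 24.0) -> float:
    """Return the cost of running a load over a 30-day month."""
    kwh_per_day = watts / 1000 * hours_per_day
    return kwh_per_day * rate_per_kwh * 30

print(f"${monthly_cost(140, 0.12):.2f}")      # 24/7 HTPC: ~$12.10/month
print(f"${monthly_cost(140, 0.12, 6):.2f}")   # same box, 6 h/day: ~$3.02/month
```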
Originally Posted by johns0
There are quite a few ways to reduce power consumption on server- and workstation-class gear compared to consumer desktops: lower-voltage chips with a lower thermal spec, a high-efficiency PSU, green HDDs with a controller that can spin them down, and SpeedStep-style frequency scaling to underclock your CPUs when they're idle. And of course just shutting the computer off works for consumer gear.
FB-DIMM are the real cause of global warming
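A rough sketch of how those techniques might stack up on one box. Every wattage delta here is an assumed, illustrative number, not a measurement:

```python
# Stack up assumed per-technique savings for one server and price them out.
# All wattage deltas below are illustrative guesses, not measured values.

RATE = 0.12        # $/kWh, example rate
HOURS = 24 * 30    # 30-day month of 24/7 operation

savings_w = {
    "LV CPUs (2 sockets)": 110,    # 2 x (95 W - 40 W), per the earlier post
    "high-efficiency PSU": 30,     # assumed gain at typical load
    "green HDDs spun down": 15,    # assumed, a few idle drives
    "SpeedStep at idle": 25,       # assumed idle underclocking gain
}

total = 0.0
for name, watts in savings_w.items():
    dollars = watts / 1000 * HOURS * RATE
    total += dollars
    print(f"{name}: ~{watts} W, ~${dollars:.2f}/month")
print(f"Total: ~${total:.2f}/month per box")
```

Even stacking all of these only gets one box into the ~$20/month range at 12 cents/kWh, which again makes the $100/month CPU-swap story look like more than just CPUs.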