VideoHelp Forum
  1. Any idea how much electricity a computer uses in a month?

    Just heard a tech guy on talk radio claim he was running four servers at home. All were AMD and ran 24/7. He alleges that when he switched to Intel CPUs his power bill was reduced by $100 a month...

    I find this very hard to believe...
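    For scale, here is a rough back-of-the-envelope sketch (in Python) of what a $100/month saving implies as a continuous wattage reduction. The $0.12/kWh rate and the 30-day month are assumptions; rates vary a lot by region, as discussed below.

    Code:
        # Rough sketch: what does saving $100/month imply in continuous watts?
        # Assumptions (not from the thread): $0.12/kWh, 30-day month, 24/7 operation.
        RATE_USD_PER_KWH = 0.12
        HOURS_PER_MONTH = 24 * 30

        savings_usd = 100.0
        kwh_saved = savings_usd / RATE_USD_PER_KWH        # ~833 kWh/month
        watts_saved = kwh_saved / HOURS_PER_MONTH * 1000  # ~1157 W of continuous draw

        print(f"~{kwh_saved:.0f} kWh/month, i.e. ~{watts_saved:.0f} W of continuous reduction")
        print(f"Spread over four servers: ~{watts_saved / 4:.0f} W less per server")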
  2. Too vague to draw any conclusions.

    It depends on what you are running. Usage can run anywhere from ~100 W to 1000 W per server.

    e.g.
    Was he running multiple GPUs concurrently, or were the boxes headless?

    What kinds of workloads? 24/7 doesn't necessarily mean 100% load, e.g. if they were web servers sitting idle waiting for requests.

    How old were the AMDs? 130 nm or 90 nm chips? If so, $100/month is easy to believe.

    On larger farms, 20-40% of the cost goes into cooling.

    Electricity costs vary a lot by region. I pay 8 cents/kWh; some people pay 30. So if I saved $100 in a month, they would have saved $375.

    An easy way to determine usage is to plug your computer into a Kill A Watt meter. Many UPSes also have LCD displays that show usage.
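    A minimal sketch of the estimate above, using the figures from this post (~100-1000 W per server, 8 vs 30 cents/kWh); the numbers are illustrative, not measurements.

    Code:
        # Monthly cost of a constant load for the wattage range and rates quoted above.
        def monthly_cost(watts, usd_per_kwh, hours=24 * 30):
            """Dollars to run a constant load 24/7 for a 30-day month."""
            return watts / 1000 * hours * usd_per_kwh

        for watts in (100, 500, 1000):    # rough idle-to-loaded server range
            for rate in (0.08, 0.30):     # 8 cents/kWh vs 30 cents/kWh
                print(f"{watts:4d} W at ${rate:.2f}/kWh: ${monthly_cost(watts, rate):7.2f}/month")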
  3. Member bendixG15 (Join Date: Aug 2004, Location: United States)
    Originally Posted by kenmo
    Just heard a tech guy on talk radio .... I find this very hard to believe...
    I find most things on talk radio are hard to believe..... gave it up eons ago...
  4. But this wasn't on Coast to Coast AM or Art Bell????
  5. Member (Join Date: Jul 2001, Location: United States)
    Originally Posted by kenmo
    Any idea how much electricity a computer uses in a month?

    Just heard a tech guy on talk radio claim he was running four servers at home. All were AMD and ran 24/7. He alleges that when he switched to Intel CPUs his power bill was reduced by $100 a month...

    I find this very hard to believe...
    Obviously, you can't just switch CPUs. He switched CPUs, motherboards (and their various chipsets), and probably memory. Who knows what he did for drives, PSUs, GPUs (if any), and NICs. If he'd switched to the latest AMDs, he'd have found at least the same difference. Both AMD and Intel have step-down technology to save power when the CPU isn't taxed.

    Buncha crap.
    Have a good one,

    neomaine

    NEW! VideoHelp.com F@H team 166011!
    http://fah-web.stanford.edu/cgi-bin/main.py?qtype=teampage&teamnum=166011

    Folding@Home FAQ and download: http://folding.stanford.edu/
  6. contrarian rallynavvie (Join Date: Sep 2002, Location: Minnesotan in Texas)
    You can switch just the CPUs and it can make a large difference in power consumption. Going from 95 W thermal-spec chips down to their LV counterparts at 40 W will be noticeable. Combine that with the fact that there are 2-4 of these chips in a single enclosure, and there are some savings to be had.

    I did a cost justification on something similar to this; I'll see if I can find some numbers on it. The big OEMs probably already have something to compare against.
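    A minimal sketch of that kind of cost justification, using the 95 W vs 40 W thermal-spec figures above; the chip count, server count, and electricity rate are assumptions, and TDP is only a rough proxy for actual draw.

    Code:
        # CPU-swap savings estimate: 95 W chips replaced by 40 W LV parts.
        TDP_OLD_W = 95
        TDP_LV_W = 40
        CHIPS_PER_SERVER = 2        # post says 2-4 chips per enclosure (assume 2)
        SERVERS = 4                 # the four-server setup from the original claim
        RATE_USD_PER_KWH = 0.12     # assumed rate
        HOURS_PER_MONTH = 24 * 30

        delta_w = (TDP_OLD_W - TDP_LV_W) * CHIPS_PER_SERVER * SERVERS
        savings = delta_w / 1000 * HOURS_PER_MONTH * RATE_USD_PER_KWH
        print(f"CPU swap saves ~{delta_w} W, roughly ${savings:.2f}/month at this rate")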
    FB-DIMM are the real cause of global warming
  7. Mod Neophyte Super Moderator redwudz (Join Date: Sep 2002, Location: USA)
    I have one of those Kill A Watt meters I can plug the computers into. My HTPC uses about 140 W on average, and 3 W when off (the 5 V rail running the RAM, LAN, etc.). Startup is about 160 W for ten seconds as the HDDs spin up and the optical drives check for discs.

    I pay $0.12 per kilowatt-hour, including taxes and other charges. 140 W / 1000 = 0.14 kW, × 24 hours a day = 3.36 kWh per day, which would be $0.12 × 3.36 kWh, or about $0.40 for 24 hrs, or × 30 days = about $12. That assumes 24/7 operation, but you could figure it more closely. Your electric bill should give your kWh rate. (Hopefully my math is right.)
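    The same arithmetic as a small Python script, using the 140 W and $0.12/kWh figures from this post and assuming a 30-day month of 24/7 operation.

    Code:
        # HTPC example: 140 W average draw at $0.12/kWh, running 24/7.
        watts = 140
        rate = 0.12                          # dollars per kWh, taxes included

        kwh_per_day = watts / 1000 * 24      # 3.36 kWh/day
        cost_per_day = kwh_per_day * rate    # ~$0.40/day
        cost_per_month = cost_per_day * 30   # ~$12/month

        print(f"{kwh_per_day:.2f} kWh/day, ${cost_per_day:.2f}/day, ${cost_per_month:.2f}/month")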
  8. I'm a Super Moderator johns0 (Join Date: Jun 2002, Location: Canada)
    Originally Posted by kenmo
    Any idea how much electricity a computer uses in a month?

    Just heard a tech guy on talk radio claim he was running four servers at home. All were AMD and ran 24/7. He alleges that when he switched to Intel CPUs his power bill was reduced by $100 a month...

    I find this very hard to believe...
    Sounds like a typical "Intel is better than AMD" fanboy with inflated stats.
    I think, therefore I am a hamster.
  9. contrarian rallynavvie (Join Date: Sep 2002, Location: Minnesotan in Texas)
    Originally Posted by johns0
    Sounds like a typical "Intel is better than AMD" fanboy with inflated stats.
    No, it could very well be true, but I doubt we're hearing the entire story. If he was running Santa Anas or some of the recalled 252s prior to the switch and then changed to low-voltage Xeons, there could easily be a reduction in power consumption, but it's hardly a fair comparison. The same could be said going to those LV Xeons from the older Noconas.

    There are quite a few ways to reduce power consumption on server- and workstation-class gear compared to consumer desktops: lower-voltage chips with a lower thermal spec, a high-efficiency PSU, green HDDs with a controller that can spin them down, and speed-step technology to underclock your CPUs when they're idle. And for consumer gear, of course, just shutting the computer off works.
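    A rough sketch of the last two points (idle speed-stepping and simply shutting the machine off); the wattages and the rate here are illustrative assumptions, not measurements.

    Code:
        # Compare 24/7 at full load, 24/7 idling (speed-stepped), and 8 h/day use.
        RATE = 0.12                           # assumed $/kWh

        def monthly_cost(watts, hours_per_day):
            return watts / 1000 * hours_per_day * 30 * RATE

        print(f"24/7 at ~300 W load:    ${monthly_cost(300, 24):.2f}/month")
        print(f"24/7 idling at ~120 W:  ${monthly_cost(120, 24):.2f}/month")
        print(f"8 h/day at ~300 W load: ${monthly_cost(300, 8):.2f}/month")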
    FB-DIMM are the real cause of global warming


