Wondering if anyone has a feel for the power efficiency of older server hardware. I’m reading posts from people who say their R710s idle at 160W with 8 hard drives installed. So if you take the hard drives out of the equation, it’s probably still around 120W. Is that just how inefficient old computers are? Kinda like incandescent bulbs being less efficient than LED bulbs? And how efficient is the R730 compared to the R710?

My 6-year-old desktop computer idles at 60W with a GPU, and 30W without the GPU. Seems like a huge difference. By my math it’s something like $70 more per year to run an R710 than my old desktop with a GPU. Is that correct?
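For anyone checking my math, here’s a rough sketch of how I got there. The ~5W idle draw per drive and the $0.13/kWh electricity rate are my assumptions, not measured values, so adjust for your own drives and local rates:

```python
# Rough idle-power cost comparison, R710 (drives removed) vs. my desktop.
# ASSUMPTIONS: ~5 W idle per 3.5" drive, $0.13/kWh electricity.

HOURS_PER_YEAR = 24 * 365            # 8760 h
RATE_USD_PER_KWH = 0.13              # assumed electricity rate

r710_idle_w = 160                    # reported R710 idle with 8 drives
drive_idle_w = 5                     # assumed idle draw per drive
drives = 8

# Estimated R710 idle with the drives taken out of the equation.
r710_bare_w = r710_idle_w - drives * drive_idle_w   # ~120 W

desktop_idle_w = 60                  # my desktop idling with GPU

delta_w = r710_bare_w - desktop_idle_w              # 60 W difference
delta_kwh = delta_w * HOURS_PER_YEAR / 1000         # ~526 kWh/year
delta_usd = delta_kwh * RATE_USD_PER_KWH            # ~$68/year

print(f"Extra draw: {delta_w} W -> {delta_kwh:.0f} kWh/yr -> ${delta_usd:.0f}/yr")
```

At those assumed numbers it comes out to roughly $68/year, which is where my “about $70” figure comes from.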
