The Energy Star program had been around for roughly 17 years at that point (it launched in 1992), and, for computers, it was almost exclusively limited to monitors. In 2009, the current Energy Star specification was version 4.0, released in 2006. In that specification, the EPA's objective was to get 40% of the computers on the market to have power management capabilities by 2010, the year after Bitcoin was introduced. Intel's 2009 TCO-driven upgrade-cycle document mentions power management, but power use isn't included in any of the TCO metrics.
All of the focus on low-power processing units in 2009 was for mobile devices and DSPs. Computer-oriented energy savings at the time centered on practices rather than hardware, e.g. manually powering down computers or using suspend and hibernate. There was very little CPU clock scaling available for desktop computers; you turned them off to save power. DVFS didn't become widely available, or effective, until 2006, and a study published in 2009 (again, the same year Bitcoin was introduced) found that "only 20% of initiatives had measurable targets."
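To see why DVFS mattered once it arrived, it helps to look at the standard first-order model of CMOS switching power, which scales linearly with clock frequency but with the square of supply voltage. A minimal sketch (the capacitance, voltage, and frequency numbers below are illustrative, not taken from any particular CPU):

```python
# Sketch of the dynamic (switching) power of a CMOS chip:
#   P = C * V^2 * f
# where C is switched capacitance, V is supply voltage, f is clock frequency.
# DVFS lowers V along with f, so halving the clock cuts power by much
# more than half. The figures below are made up for illustration.

def dynamic_power(c_farads, v_volts, f_hz):
    """Approximate dynamic power draw, in watts."""
    return c_farads * v_volts**2 * f_hz

full = dynamic_power(1e-9, 1.2, 3.0e9)  # full speed: 1.2 V at 3 GHz
slow = dynamic_power(1e-9, 0.9, 1.5e9)  # scaled down: 0.9 V at 1.5 GHz

print(f"{full:.2f} W at full speed")   # 4.32 W
print(f"{slow:.2f} W scaled down")     # 1.22 W, about 28% of full
```

Half the frequency at three-quarters the voltage gives a bit over a quarter of the power, which is the whole appeal: without voltage scaling, a desktop's draw was essentially binary, on or off, exactly as described above.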
So, yes: technically, there were people thinking about these sorts of things, but it wasn't a common consumer consideration, and the tools for power management were crude: your desktop was either on and drawing power (always roughly the same amount) or it was off. And people did power down their computers to save energy. But, like I said, if your desktop was on, it was consuming about the same amount of energy whether or not you were running a miner. There was a motto bandied about by SETI@home at the time: your computer was using energy anyway, so you might as well do science with the spare CPU cycles. That was the mindset of most people who had computers at the time.