There's been a lot of interest recently in new forms of data centre and in the large public cloud providers investing in power production.
Ok, just a quick note; this is all obvious stuff, so I'll keep it to a simple back-of-a-fag-packet style calculation:
- The number of transistors increased from 10^11 in 1974 to 10^19 in 2004. That's a 100 million fold increase.
- During the same time, the energy efficiency of transistors improved by two million percent! Wow ... alas, that's only a 20,000 fold improvement.
- Hence our energy consumption increased about 5,000 fold (100 million divided by 20,000) over those 30 years, which approximates to consumption doubling every 2.5 yrs.
- Depending upon who you talk to, electricity consumption by computing represents around 3% of total supply.
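For the sceptical, the arithmetic in the bullets above can be checked in a few lines of Python (a quick sketch of my own sums, nothing more):

```python
# Back-of-a-fag-packet check of the historical figures above.
# Inputs come straight from the bullets: transistor counts for
# 1974 and 2004, and the "two million percent" efficiency claim.
import math

transistors_1974 = 1e11
transistors_2004 = 1e19
years = 2004 - 1974                            # 30 years

growth = transistors_2004 / transistors_1974   # 100 million fold
efficiency_gain = 2_000_000 / 100              # 2,000,000% = 20,000 fold

consumption_growth = growth / efficiency_gain  # ~5,000 fold
doublings = math.log2(consumption_growth)      # ~12.3 doublings in 30 years
doubling_period = years / doublings            # ~2.4 years, call it 2.5

print(f"transistor growth: {growth:.0e}x")
print(f"efficiency gain: {efficiency_gain:,.0f}x")
print(f"consumption growth: {consumption_growth:,.0f}x")
print(f"consumption doubles every {doubling_period:.1f} years")
```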
So, assume that over the next 10 years we continue with the same level of efficiency improvement (i.e. a 97% reduction in energy use for the same amount of computing power). A doubling rate of 2.5 years means four doublings in demand over those 10 years, a 16 fold increase, which takes computing from 3% to roughly 48% of today's total supply. In other words, we will need around 50% more power stations than we have today just to cope with the increase in demand. Doubling rates of 2.5 years, Jevons Paradox and all that malarky can create real headaches if you're not careful.
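The 10-year projection is the same sort of sketch, using the 2.5 year doubling rate and the 3% starting share from the bullets above:

```python
# A sketch of the 10-year projection: computing's share of
# electricity supply after four more doublings in demand.
current_share = 0.03        # computing's share of total supply today
doubling_period = 2.5       # years, from the historical figures
horizon = 10                # years ahead

doublings = horizon / doubling_period          # 4 doublings
future_share = current_share * 2 ** doublings  # 0.03 * 16 = 0.48
extra_capacity = future_share - current_share  # ~45% of today's supply

print(f"computing share in {horizon} yrs: {future_share:.0%}")
print(f"extra generation needed: {extra_capacity:.0%} of today's supply")
```

That extra ~45% of today's total supply is where the "50% more power stations" figure comes from.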
Of course, there are all sorts of complications, substitution effects etc. but as I said, this is just a quick note and many of those effects also have counter effects.
However, the above is why I've often asked people building private clouds whether they've thought about how they're going to secure the electricity supply to power them. My first question on any private cloud effort is usually: where are you building the power plant and what fuel are you intending to use? Depending upon how mean I'm feeling, I might quip "have you considered nuclear?" just to rub the point home.
It's also part of why back in 2008 I made the prediction that by 2016, the private cloud market would be in decline ... in fact, I expect it to be a bloodbath.