Sunday, September 06, 2009

Is Cloud Computing Green?

The short answer is yes, the long answer is no.

The short answer deals with the better utilisation of computer resources within the data centre, the potential for allocating virtual resources to more energy efficient services and for reducing the massive sprawl of physical resources (and hence all the energy consumed in manufacturing, cooling and distribution). Overall, cloud computing offers the potential for more energy efficient virtual resources.

The long answer concerns componentisation. The shift of common and well defined I.T. activities from products to standard components provided as online services should lead to a dramatic explosion in innovation.

Standardisation always creates this potential.

If you consider writing an application today, the reason why it's a relatively quick process is because we have relatively standardised and stable components such as frameworks, databases, operating systems, CPUs, memory etc. Imagine how long it would take to write a new application if you first had to start by designing the CPU. This is componentisation in action: the rate of evolution of a system is directly related to the organisation of its subsystems.
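To make the point concrete, here's a minimal sketch (the particular language and libraries are purely my illustrative choice): a working web service in a dozen lines, possible only because everything beneath it - the CPU, the operating system, the network stack, the HTTP framing - is a standardised, stable component that someone else has already built.

# A toy HTTP service built entirely from standard components.
# We supply only the application-specific behaviour; the parsing,
# sockets and protocol handling are all inherited.
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello, world\n")

if __name__ == "__main__":
    # Everything below this layer is someone else's solved problem.
    HTTPServer(("localhost", 8080), HelloHandler).serve_forever()

None of this would be quick if we first had to design the processor, the operating system or the protocol.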

Cloud computing is all about providing standard components as services (it's pure volume operations). The problem, of course, is that we will end up consuming more of these standard components because it's so easy to do so (in old speak, there is less yak shaving) and because it becomes easier to build new and more exciting services on top of them (standing on the shoulders of giants).

We might end up providing more efficient virtual resources but we will end up consuming vastly more of them.
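A rough back-of-the-envelope sketch makes the tension clear (every number here is invented for illustration): even a large per-unit efficiency gain is wiped out if consumption grows faster.

# Illustrative figures only - the rebound (Jevons) effect in miniature.
energy_per_unit_old = 1.0     # relative energy cost of a traditional server
energy_per_unit_cloud = 0.4   # suppose cloud provision is 2.5x more efficient
units_consumed_old = 100      # baseline consumption
units_consumed_cloud = 500    # suppose easier access drives 5x consumption

total_old = energy_per_unit_old * units_consumed_old        # 100.0
total_cloud = energy_per_unit_cloud * units_consumed_cloud  # 200.0

# Each unit is far more efficient, yet total energy use has doubled.
print(total_old, total_cloud)

Whether consumption really does grow faster than efficiency improves is the open question; history suggests it does.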

In the short term, cloud computing will appear to be more green; in the long term, it will turn out not to be. However, that's to be expected: our entire history of industrial progress continues to be about the constant creation of ever more complex and ordered systems, and the use of stable subsystems simply accelerates this process, whether they be bricks, pipes, resistors, capacitors, databases or whatever.

Whichever way you cut it, our constantly accelerating process of creating greater "perceived" order and the constant reduction of entropy (within these systems and the future higher-ordered systems that will be created) ultimately requires one external factor - a greater energy input.

Cloud computing will be no different and our focus will have to be on the source of that energy.

6 comments:

Roland Judas said...

Great article. Interesting thoughts, though I would not completely agree with your long answer. It might be correct that more componentized IT services will be consumed, probably on a larger array of devices than we use today. But existing systems are way too inefficient, and the siloed approach especially provides us with loads of redundant stuff, which keeps procurement managers and admins busy.

steve said...

I don't get it. I agree that if something is cheaper, more of it is consumed.

But computing resources are an ingredient; you use them to produce some other good. So if you produce X goods over ten years, vs X/2 over those same ten years (in a non-cloud world, with more expensive and less efficient ingredients), then on the face of it you've consumed less (the resources to make X/2 were smaller). But if you'll eventually produce all X worth of goods anyway, then over the next ten years you'll even it up, and your total resource bill will be higher. So it was better to do it with the cloud.

Or to take a different tack, if we're working to avoid environmental disaster (not primarily driven by computers), then I'd much rather have more, more-efficient cycles to use to make progress than fewer.

swardley said...

@Roland: I absolutely agree with you, in the short-term. There are huge inefficiencies which can be solved.

However, even today you hear people saying that they couldn't have built their service without cloud computing because they couldn't afford the infrastructure.

In short, easier access to computing resources will increase consumption.

It's a straight question of whether the energy consumed by the increased use of more efficient resources exceeds the energy consumed by today's lower use of less efficiently provided resources.

History suggests that more energy will be consumed in the longer term.

swardley said...

@steve: in the short term, on a micro scale, you're absolutely correct: the use of cloud computing allows for more efficient resource provision.

However, in the long term, it allows for much greater consumption, whether this is short lived (for example, the long tail of applications with high but short-lived processor consumption) or whether this is simply through competitive markets making cloud computing even more affordable (pricing competition through switching).

The long term, macro scale would suggest a much larger consumption of more efficient virtual resources. The overall effect will almost inevitably be more energy consumption.

This is why our focus, for greening cloud computing, will have to turn to the source of the energy.

To assume that cloud will ultimately end up with less resource consumption is like assuming that cloud will end up with everyone using a few major providers.

It's a catchy idea, but take a look at the electricity industry; that's not how it has worked out. Multiple different models of provision and consumption, and overall more consumption.

Good points though, time will tell.

Peter Jenkins said...

I disagree with your long term answer. You need to separate Computing and Cloud Computing.

Clearly the vast economies of scale available to cloud providers will enable very good utilisation of computing resources. These providers will have a very clear incentive to squeeze every last drop out of their hardware - it directly affects their bottom line and it will use less energy.

Public Cloud Computing also provides far, far better accounting and transparency for resources used than any internal IT department delivers today.

This will drive software vendors and communities to be more efficient with system resources. I can imagine the conversation:

"What do you mean I need two times as many Oracle VMs as MySQL VMs AND Oracle is more expensive to license?!"

In the longer term, assuming most applications are deployed in the cloud, it should mean that the cost of the legacy junk that almost every organisation is sitting on will become much more visible. Companies that don't get a handle on these costs won't be competitive.

This killing off of old inefficient applications and (hopefully) companies will, I think, make a huge difference to the total energy use of IT.

If, separately, we continue to use computing to do more stuff as a society then yes, we might use more energy, but that doesn't mean it's a bad thing or that it has much to do with Cloud Computing ... you need to understand what the new applications are doing to make that call ... maybe the application saves energy elsewhere?

swardley said...

Tut tut @Peter, you put a perfectly good logical structure in place and then you ignore it.

You're absolutely correct that when considering green you need to separate computing from cloud computing. This is why what applications are built isn't part of the equation.

The only things that you can include in the comparison are the distinctions between computing and cloud computing. These include higher rates of utilisation, greater efficiency per machine and componentisation.

It is the last of these which will lead to a rapid explosion of innovation in the field and a much greater consumption of more efficient units. Componentisation is a consequence of cloud computing and can't simply be dismissed. It is also the single reason why, in the long run, cloud computing won't turn out to be green.

So I beg to differ.