Friday, November 30, 2012

The History of Amazon EC2

In 1976, Space Aliens worried about future business synergy created Amazon EC2 out of spare non-existent and infinite computer resources using quantum time dilation tunnelling effects ... or was it?

There are many myths about EC2; my favourite is the claim that it was built to sell Amazon's spare capacity. It's always a good idea to talk to the people involved.

For the origins of Amazon EC2 - read here

For how it got built - read here

'Nuff said.

Wednesday, November 28, 2012

Competition, Strategy and Execution ... an OCI question.

I was asked a question recently, why did the OCI (Open Cloud Initiative) not demand an open source reference model? The answer is ... it does.

What OCI doesn't demand is that all implementations of a "standard" be open sourced; it allows for operational improvements and hence service competition between providers. For example, I might take an open source model for IaaS, make it work better somehow for my own IaaS and decide to keep those improvements proprietary.

Such competition based upon operational efficiency (as opposed to feature differentiation) is fine in a utility market; in fact it's even highly desirable in terms of reducing the probability of common failures. However, the market needs to ensure that semantic interoperability (a necessity for switching) between the providers is maintained. For this you need an assurance mechanism around a core model.

If the core model is open (i.e. the open version is a full, faithful and interoperable reference model) and the assurance system is built around this then you should get a free market (as in unconstrained). This is in contrast to a captured market which is based upon a proprietary reference model and hence is under the influence of a single vendor.

For example, take Cloud Foundry which provides an open source PaaS implemented by a number of providers. This is on the way to creating a competitive free market of providers based around an open source reference model. However, you still need a mechanism of assurance that semantic interoperability is maintained (i.e. innovation is constrained in some manner to operational improvements rather than differentiation, which itself limits switching between providers). Hence things like Cloud Foundry Core, which provide such an assurance mechanism, are critically important to the game.
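
To make the role of an assurance mechanism concrete, here's a hand-wavy sketch in Python (everything in it - the Provider class, its methods, the runtime names - is hypothetical; this is not how Cloud Foundry Core actually works, merely an illustration of the principle): test every provider against the same core scenarios, fail any that deviate and leave them free to make operational improvements on top.

```python
# A hand-wavy sketch of an assurance check for semantic interoperability.
# Everything here (Provider, deploy, the runtime names) is hypothetical -
# it illustrates the principle, not Cloud Foundry Core itself.
from dataclasses import dataclass, field

CORE_RUNTIMES = {"ruby", "node", "java"}  # the agreed core reference model

@dataclass
class Provider:
    name: str
    supported_runtimes: set = field(default_factory=set)

    def deploy(self, app: str, runtime: str) -> str:
        # A real harness would push an actual app and poll its behaviour.
        if runtime not in self.supported_runtimes:
            raise RuntimeError(f"{self.name}: runtime {runtime!r} unsupported")
        return "running"

def assure(provider: Provider) -> bool:
    """A provider passes only if every core scenario behaves identically."""
    try:
        return all(provider.deploy("hello-world", r) == "running"
                   for r in CORE_RUNTIMES)
    except RuntimeError:
        return False

# Provider A adds an extra runtime (operational improvement - fine).
# Provider B drops a core one (breaks switching - fails assurance).
for p in [Provider("A", {"ruby", "node", "java", "go"}),
          Provider("B", {"ruby", "java"})]:
    print(p.name, "passes core" if assure(p) else "fails core")
```

The point being that assurance constrains the interface (the core model), not the implementation behind it.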

Alas, knowing how to play the game (e.g. create a market based on an open reference model, allow operational competition and create assurance) is merely necessary but not sufficient to create a functioning market. There's also the thorny issue of execution. 

Whereas commoditisation is a consequence of the competitive action (user and supply competition) of ALL actors and does not depend upon the execution of specific actors, the question of centralisation (few players) or decentralisation (a broad market) is commonly a consequence of the play BETWEEN actors and it does depend upon execution by those different actors.

Hence whilst OCI embodies the principles of creating an unconstrained competitive market based on an open source reference model with operational competition between the players - that's only part of the battle. Whether such a competitive market will actually form, or instead a more centralised environment emerges which does not espouse any of the OCI values, depends upon how well the game is played.

In other words, any strategy or principle no matter how good or benign becomes relatively meaningless without good execution and good game play.

Monday, November 19, 2012

Monopolies, Commoditisation and Centralisation

Just read a good article by Mark Thiele on Why Monopolies and Commoditization would pollute the cloud. It reminds me of my 2007 talk when I said "open source would be needed to counter the threat of monopolies in this space".

However, that was then and this is now; there's a world of difference between how the game should be played and how it is actually executed. The article has a number of assumptions which need challenging. So let's go through each ...


"Cars have been around for over 100 years now, they must be commodity by now, right?"

Here the article is assuming that the process of commoditisation is time based like diffusion i.e. adoption over time. In reality it depends upon the actors in the market and competition. Hence evolution from the genesis of something to commodity is governed by ubiquity (how widespread something is, driven by user / demand competition) and certainty (how feature complete something is, driven by supplier / supply competition). See graph below.



The nut and bolt took almost 2,000 years to go from genesis (first creation) to late product and commodity (starting with Maudslay's screw cutting lathe). Electricity took 1,400 years from the Parthian Battery to Westinghouse / Tesla and A/C utility provision. Telephony took about 100 years and computing infrastructure about 65+ years. So, you cannot assume some linear relationship with time.
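
If it helps, here's that point as a toy sketch in Python (purely illustrative - the thresholds are made up by me and are not the model itself, which rests on historical data): the stage of evolution is a function of ubiquity and certainty, and time appears nowhere in the signature.

```python
# Toy sketch only: stage of evolution as a function of ubiquity and
# certainty. The thresholds are invented for illustration - the real
# argument rests on historical data, not this function.
def stage(ubiquity: float, certainty: float) -> str:
    """Map ubiquity and certainty (each 0..1) to a rough stage label."""
    score = (ubiquity + certainty) / 2
    if score < 0.25:
        return "genesis"
    if score < 0.5:
        return "custom built"
    if score < 0.75:
        return "product (+ rental services)"
    return "commodity (+ utility services)"

# Activities of wildly different ages can sit at the same stage:
print(stage(0.9, 0.9))    # nuts and bolts, ~2,000 years old
print(stage(0.85, 0.9))   # computing infrastructure, ~65 years old
print(stage(0.1, 0.15))   # something newly created -> genesis
```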

Secondly, in the above examples (Nuts and Bolts, Electricity, Computing Infrastructure, Telephony) these systems have become components of something else (e.g. Machines, Consumer Electronics, Big Data Systems, Smart Phones).

In industries where the car is a component of some other value chain (e.g. Taxis, Fleets, On Demand hire services etc) then it would be worth looking at whether standardisation has occurred or is occurring.

As the article points out, the car itself has many commodity components but if the system you're examining is the top level system then you're always going to have branding and perception of other values which impact it. Gary Dahl's "Pet Rock" is probably the best loved example of associating a branding value with what remained a commodity ... it's not a rock (which it was) but a "Pet Rock".

A comparison of computing infrastructure to cars is one of comparing a component to a higher order system which has other values (i.e. personal branding, status symbol etc). If the article were to compare like for like then it would probably end up with a different conclusion i.e. how many users of Amazon EC2 care or even know what hardware Amazon EC2 runs on - it's an invisible component, far down the value chain. The same can be asked of cars: how many users know or care what specification of nuts and bolts are used to build their car?


"If we allow a few companies to push the technology to a true commodity business model"

First, it's not companies that drive things towards a commodity model but the interaction of users (demand competition) and suppliers (supply competition).  The question of evolution (which for activities we call commoditisation) is separate from the question of centralisation / decentralisation and the two shouldn't be conflated.

It would have been relatively trivial for the hardware manufacturers to create a price war in the IaaS space around 2008-2010 in order to fragment the market by increasing demand (computing infrastructure is elastic) beyond the capability of one vendor to supply.  The fact they didn't is their own fault and also one of the major factors why we might see centralisation.

In general:-
  • The process of evolution (driven by demand and supply competition) is not time based but depends upon the overall interactions of ALL actors (users and suppliers). It is an inevitable process of competition. 
  • The question of centralisation / decentralisation varies with a number of different economic forces but it usually depends upon the actions and gameplay of SPECIFIC actors (suppliers). Often you will find that companies are suffering from inertia to change (due to past success) and hence new entrants into a market that is commoditising can quickly capture it. This doesn't need to be the case though; the issue is usually one of executive failure by past giants and an inability to react.

Let me be absolutely clear here, commoditisation does not mean centralisation. There's a myth that it does, often used to create strawman arguments. These two issues of commoditisation and centralisation are entirely different things. 

However, it's probable that commoditisation of various IT activities (nee cloud) will lead to centralisation due to failure of competitors within this space. You can't assume that commoditisation and centralisation occur hand in hand but in the case of IaaS it's likely.

Whilst "open source would be needed to counter the threat of monopolies in this space" still holds true, the actions since 2007 (particularly on differentiation) means the industry didn't counter the threat in the IaaS space. This didn't have to be the case, learning those lessons from HTTP and a more focused and earlier attack on becoming the open source AWS clone would have changed this.  Unfortunately rather than a strong play, competitors have played a weak game and a period of monopoly / oligopoly looks destined to be the result.

Hence the shift towards utility services is occurring (driven by the actions of all players) and open source is the route to creating competitive markets (in the final state), but due to Amazon playing the game well and most competitors having poor gameplay (with the possible exception of Google), we're likely to see a period of monopoly / oligopoly in IaaS (of Amazon and Google) for far longer than was necessary.

Fortunately, some (such as CloudStack, Eucalyptus and OpenStack groups like CloudScaling) appear to know how to play the game.  So, I take the view that this won't be a permanent state of affairs and it will eventually work itself out in the IaaS space.  We will eventually get those competitive markets based around open source IaaS or in the worst case scenario Government regulation.  The downside, it'll take far longer than it needed to, by about 5-15 years (unless of course Amazon or Google decide to open up first or CloudStack suddenly becomes the focal point of intense competition e.g. VMware making a huge IaaS play based upon it).

We will eventually get there but as I've said, centralisation / decentralisation all depends upon how well the actors played the game and let's be frank - most haven't played it well. Luckily for all of us, CloudFoundry is playing a very smart game in the platform space, so that area seems less of a problem.


"Innovation would be stifled"

Quite the opposite. There are two aspects of innovation here - operational efficiency and the creation of higher order systems. Innovation is one of those awful words which means many different things; however, the two are not the same.

The commoditisation of any pre-existing activity which acts as a component leads to rapid increases in the genesis of higher order systems :-
  • From Nuts and Bolts to Machines
  • Electricity to Radio / Hollywood / Consumer Electronics / Computing
  • Computing to Big Data and all the changes we're seeing today. 
Hence commoditisation always increases innovation (as in genesis of higher order systems); this effect is known as componentisation (Herbert Simon).

Equally, innovation behind the interface doesn't stop i.e. electricity was standardised to an interface but innovation still occurs in the means of production (e.g. operational improvements and efficiency or new means of generation). This is the same with all manner of industries from finance to manufacturing (see the float glass method of producing glass etc).

You cannot therefore state that commoditisation inhibits, limits or stifles innovation when historical evidence shows it not only allows for innovation in production but accelerates and enables innovation of higher order systems. Each of the major ages - industrial, mechanical, electrical, internet - is associated with commoditisation of pre-existing activities.


"Drivers that make unique IT solutions critical"

The article is absolutely spot on that even in a mass market of commodity components there are always niches - same with electricity, finance, most manufacturing etc. You'd expect the same with computing. There will be highly profitable but small niches.


"There are just too many ways to (in this case) build that car"

One of the most important parts of any of the common cycles of changes (i.e. ages) is the disruption of past giants stuck behind inertia barriers due to past success.  Disruption comes in two forms - there's the unpredictable, unforeseeable (e.g. a change in characteristic of a product such as disk drives) and then there's the predictable (e.g. a shift of an activity from product & rental services to commodity and utility services). 

Both cause disruption, the former because it's unseen and hard to defend against, the latter because it's seen but the company is unable to react due to inertia (which is often embedded in the culture of the organisation).  

We've seen this in every major age and there is nothing which suggests that cloud, which was predicted by Douglas Parkhill in his 1966 book The Challenge of the Computer Utility, will be any different.

The list of companies who believed that their industry would not be commoditised is long and glorious from the gas lamp companies of the past to almost certainly modern infrastructure companies today. 

The problem for all is that their offering is just a component.


In summary

Mark's article is interesting, adds to the debate but makes some hefty assumptions in my view and may even fall victim to many of the spreading myths. However, it's definitely well worth a read and a welcome change from some of the strawman arguments and schoolyard diatribe that I've been exposed to of late. It's a refreshingly sensible article.

Its general premise on the danger of monopolies is one I wholeheartedly agree with. The reason this danger exists though is not one of commoditisation itself but instead executive failure of competitors - from differentiation strategies to failure to effectively execute and in many cases failure to act.

Big Data

"Big Data" is occurring due to the increase in new activities producing un-modelled data combined with the cost of thinking about whether to store data exceeding the cost of storing. To be put it simply, it's cheap (or should be) to store everything.

It's like my "stuff" box at home but on steroids.  I throw all sorts of bits and pieces into my "stuff" box because it's cheaper than taking the time to sort out what I should keep. I also do so on the assumption that "there's value or maybe there will be value in that stuff".  In general this turns out to be a delusion, it's mainly junk but I can't help thinking that "there's a pony in that field".

Eventually the box becomes full and I'm faced with a choice. Buy a bigger box (scale-up), buy another box (scale-out) or just bin it anyway. I tend to do the latter. What I don't need is a bigger or better or more distributed "box" but instead a better "algorithm" for sorting out what has value and what doesn't.

A lot of "Big Data" seems to be about better "boxes" where in my honest opinion it should be focused on better "algorithms / models". I'm not against storing everything, especially when it's cheap to store data (i.e. distributed system built with commodity components etc) as you never know what you might find. However, that shouldn't be the emphasis.

Oh, as for my "stuff" box, StorageBod humorously raised the idea of using the attic. Look's like I've got an even bigger "stuff" box now, though I'm not sure that helps me? I'll have to decide whether I fill the attic with lots of "stuff" boxes or use it as a free for all? Maybe I'll need a cataloguing system?

Of course if I fill up my attic with stuff then I'll probably end up with some salesman telling me stories about how "Mr Jones found a lost lottery ticket" or "Ms Jones found an old master" in their attics. I'll probably end up spending a shed load of cash on the "Attic Drone Detection, Investigation, Cataloguing and Treasure Seeking (ADDICTS)" system.

I know there's a pony in that field somewhere, I'm sure of it. Otherwise I wouldn't just put this stuff in a box marked "stuff" - would I?

Wednesday, November 14, 2012

Hi Ho Silver Lining

Having a plan to create a federated market based upon a common open source reference model is not something new - it's a pretty common idea (the presentation linked above I gave in 2007, and it wasn't a new concept then). But having a plan is not enough; execution matters.

In the platform space, Cloud Foundry seems to be leading that charge. They've even recently released a Cloud Foundry Core which enables them to provide users with some level of assurance between the different Cloud Foundry providers. This is all good stuff and yes there is competition with Google App Engine combined with the open source AppScale. However the approach of VMware towards creating a marketplace is enlightened in my view. They've got a good shot at making a large competitive market happen.

In the infrastructure space, the road has been more rocky with lots of mis-steps. However, I'm fairly bullish about CloudStack. Their focus on not differentiating from AWS but instead co-opting its APIs (which is fairly uniformly what I hear customers ask for, an AWS clone) combined with membership of the ASF and the release of CloudStack 4.0 are all positives in my view. It bodes well for the future, assuming they can grow and build a vibrant community.

The technology is also used in production in various places (30,000+ servers cited in one case), ISPs are coming online ... again, all good stuff. By not differentiating they also buy themselves time in the expected AWS vs GCE war as they can grow a large and vibrant ecosystem around the AWS ecosystem. Ultimately, if they can grow fast enough they might even exceed it. They have competition (Eucalyptus, OpenNebula, OpenStack etc) in this space but CloudStack seem to be taking a good shot at making this market happen.

Back in OSCON 2007 when I gave my keynote on federated markets and open source, I was concerned that these future industries would be dominated by proprietary environments. Companies had to pick up the idea of building federated markets around open source, they had to start working together and they had to execute. With projects like CloudFoundry and CloudStack, I'm less concerned these days for the long term. Both projects seem to understand where they need to go and neither has made a serious mis-step (e.g. failing to open source, going down an open core route, differentiating when they shouldn't, trying to be all things to all people, confused messaging on focus, major losses in community, unchecked issues around a collective prisoner dilemma).

They're both playing a reasonably good game, backed by well funded companies and are executing with a vision of creating a market in my opinion. For me, they are the silver lining in a cloud that at one point threatened a dark future for open source and consumers alike. This makes me happy - as in closer to 3 rules happy.

Thursday, November 08, 2012

On myths ..

Oh, I'm hearing lots of myths being spread about cloud ... again. Let me see if I can't nail a few.

On Amazon
AWS wasn't about selling spare capacity, it was built as a stand alone service.

On Utility
All activities would appear to evolve; the end state is one of commodity and utility provision. What is happening today is simply a shift from products to utility services. We happen to call this "Cloud".

On Centralisation
Utility does not mean centralisation (or consolidation to a few providers), the two are entirely different and governed by different economic forces. A utility service can be provided by a large number of different providers in a market.

On Open Source
A market of utility compute providers needs to solve the issue of semantic interoperability. In practice, to create a "free" (as in unconstrained) rather than captured market, the technology must be provided through an open source reference model.

OpenStack is guaranteed to win
So, can we assume that OpenStack will win because it plans to create a federated market based upon a common open source reference model ... err No. There's that little issue of execution.

Whilst an open source approach has all the right advantages, you cannot assume that this will be OpenStack for many reasons :-
  1. There are multiple competitors to OpenStack - CloudStack and Eucalyptus to name two. Each has the potential to build a market and competing ecosystem.
  2. The rate of innovation, customer focus and efficiency grows with the size of AWS's ecosystem, so critical to competition is building a bigger ecosystem. Without visibly co-opting the EC2 / S3 / EBS APIs, OpenStack will put itself at a disadvantage.
  3. It is likely that a price war will develop between AWS vs GCE which will only increase demand (Jevons' Paradox - see the sketch after this list). If a competing ecosystem is not in place at this time, it will get left further behind.
  4. The scale advantage is not about efficiency alone but rate of innovation and customer focus. With big enough ecosystems then Amazon and Google can continually outstrip new competitors i.e. they will continually be further ahead and increasingly so. 
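On point 3, Jevons' Paradox is easy to see with a toy constant-elasticity demand curve (the elasticity and prices below are made-up numbers, purely for illustration): when demand is elastic enough, every price cut grows both consumption and total spend, rewarding whoever already has the ecosystem in place.

```python
# Toy constant-elasticity demand curve: Q = Q0 * (P / P0) ** -e
# With elasticity e > 1, cutting the price increases total spend (Jevons).
# All numbers are invented for illustration.
def demand(price, base_price=1.0, base_demand=100.0, elasticity=1.5):
    return base_demand * (price / base_price) ** -elasticity

for price in (1.0, 0.8, 0.5, 0.25):
    q = demand(price)
    print(f"price {price:.2f} -> demand {q:6.1f} units, total spend {price * q:6.1f}")
```
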
If you had asked me back in 2010 whether OpenStack would win this game - I would have said that in all probability ... yes. Rick Clark was key in the project and he knew the game. However, ask me that same question today and you'll get a different answer.

This isn't because the approach of building a competitive market around an open source reference model is wrong but because of the execution of this project. Which is why I say they desperately need a Benevolent Dictator to get things sorted fast. Someone like Lew Tucker.

Rackspace going ALL - IN with OpenStack

I was alerted by a good friend of mine, Benjamin Black, that Rackspace had announced it was going "ALL - IN" with OpenStack and that it was going to compete with Amazon on service not scale.

Ok, this is potentially great news for OpenStack but that depends upon the play and intention at hand.

If Rackspace believes that there are enough companies setting up or wanting to setup as utility providers of infrastructure around OpenStack then the move can make a great deal of sense. By enabling other companies to set-up, Rackspace's focus would be on growing the entire ecosystem without being a dominant player in that market. This is actually essential if you want to try and become the exchange and / or marketplace for a broad utility market.

So let us assume that the focus in Rackspace has become :-
  • build OpenStack into the reference model for a huge ecosystem (i.e. bigger than AWS)
  • manoeuvre to become the exchange / marketplace / assurance body for that ecosystem 
... then that's grand. It's a bold move but a good play.

By doing so, it would also make it easier for Rackspace to co-opt OpenStack competitors where such action is beneficial as it removes the whole differentiation and on ramp to Rackspace argument. It may also mean that Rackspace will push the technology even faster as they increasingly depend upon a broad ecosystem of utility providers. It will also enable them to introduce some form of certification for OpenStack (much as Google has done with CTS) in order to overcome the collective prisoner dilemma (everyone within the group differentiating). This latter part is required for assurance reporting across multiple providers (and an essential part of an exchange).

So the models for Rackspace would become :-
  • Focus on growing the ecosystem rapidly
  • Build a certification and ratings agency (e.g. a Moody's model) to ensure compliance of members' offerings to OpenStack (essential for switching)
  • Build a marketplace for a market of OpenStack providers (e.g. a uSwitch model)
  • Build a computing exchange (e.g. where the real money will be)

Add into this some service and technical support revenue (through helping companies get going with OpenStack) and this would all be very reasonable. By also growing OpenStack in the enterprise and helping companies build their own private OpenStack clouds (whether sensible or not), there is the potential to grow future customers of this market by providing a natural route for transition.

Whilst the play is obvious and has been often repeated umpteen times over the years (in 2007 we were talking federated markets etc), it's potentially a good move because no-one has yet effectively built that market, marketplace, exchange and assurance body. Of course, it'll bring them straight into a collision course with RightScale, Enstratus, Canonical and others who have been gearing up for this space.

It's going to be a huge uphill battle - you've got AWS vs GCE to contend with, you'll need to move fast, you'll need to encourage OpenStack members to bet big as utility providers, you'll need to co-opt competitors, you'll need to manage the potential conflicts and you'll need to get that market setup within the next 12 months. 

However, it gives some hope.

Of course, I'm assuming that this is what they're actually planning. If instead their plan is to get enterprises building OpenStack as an on ramp to Rackspace services which they'll "differentiate" on some spurious basis, rather than competing on scale, with little or no focus on the ecosystem, marketplace, exchange etc ... then ... oh well.

So, the dead duck might just have made a murmur or alternatively it was gas escaping ... can't tell which yet. Will they successfully achieve this? Will they be able to climb that mountain?

Well, if you want my personal opinion ... no. Looking at what has happened and the choices they've made, I take the view that they lack the force of leadership necessary to play this game. Of course, that's assuming they're even playing the right game.

Wednesday, November 07, 2012

These US elections are more complex than I realised.

The internet is all agog with talk of Nate Silver and how he got the election right. So, I went to have a look and he seems to have called the race at 313 (Obama) / 225 (Romney). That seems very impressive to me.

However, I hate to be picky but whilst the prediction was close it doesn't seem to be actually right. It seems the result will end up 332 / 206 when Florida calls (assuming Obama wins). I've been told that actually Nate predicted a broader range and that 313 / 225 was the average - so he was hedging.

That's ok then. Still, it's very impressive and yes the twitter verse is flowing with #natesilverfacts

Now, as impressive as Nate Silver's prediction was, it seems that Drew Linzer, who has predicted an Obama win with 90%+ certainty and the right range since June, called it at 332 / 206 - the same numbers he has been predicting all along.

Hang on - 332 / 206 - that's what seems to be happening. That's no hedge, that's just oh wow. Has Drew Linzer really nailed it? Since June?

Every state, every forecast - on the money. That's real wow. That's mega mega wow with wow sauce on.

That's more than just impressive that's so impressive that there must be ... wait ... 

Where's the #drewlinzerfacts?

Hint : There aren't any. 

Now, both Nate Silver and Drew Linzer have certainly made exceptional predictions here and despite the hedging on the overall count on Nate's part, his predictions on % vote for each candidate squeaked past Drew's i.e. Nate Silver was more accurate in 26 States whereas Drew was more accurate in 24 States.

But why the silence on Drew Linzer? If Florida goes the way expected then :-

#NateSilver can beat the sun in a staring contest but only Drew Linzer can make it run and hide #drewlinzerfacts

OK, this must be some sort of special US Election thing that I'm not getting, seeing that I'm a Brit. I'm a huge fan of people who stick their necks out, don't hedge and use data. Linzer is a star.

Tuesday, November 06, 2012

On OpenStack and Dead Ducks ...

I received a message recently saying that I only referred to OpenStack as a dead duck because it disagreed with my hypothesis on evolution, which was unscientific and quasi religious.

This is a very misguided view, so I thought I'd better respond.

On the hypothesis of evolution.

Back between 2005-2007, I collected a reasonable amount of data (around 6,000 data points from telephones to televisions to engineering components to electricity to banking instruments to ... a long list) covering a hundred+ years and discovered a process of how things evolve (as opposed to how things diffuse). This process, which is driven by user and supply competition, is described in the diagram below, which covers both the correlation and causation of the change.

During 2009-2011, I used a variety of prediction tests to confirm the consequences of the model. I'm happy to now call this a weak hypothesis, being supported by historical data and prediction tests, and even published in an article as part of a peer reviewed journal.

Does this mean it is true? Of course not. It means it's a weak hypothesis. The model describes the journey of any activity (a thing we do) from genesis to custom built examples to product (and rental services) to commodity (and utility services), such as the evolution of computing infrastructure from the Z3 to EC2 today (and its equivalents).

Graph of Evolution



For those still in confusion, the above is NOT a diffusion curve (there is no time axis) though it happens to have an S-Curve shape. When the Z3 was created, the act of using computing infrastructure was rare and poorly understood. For Amazon EC2 to appear, the act of using computing infrastructure had to be both widespread (ubiquitous) and well understood (certain) in order to support the volume operations that utility provision requires.

Of course both the genesis of an activity and utility provision of a pre-existing activity diffuse but diffusion and evolution are not the same.

The Market Today

When we talk about activities (as opposed to practices and data), we commonly refer to this process of evolution by the term "commoditisation". This is exactly what is happening with computing infrastructure today, it is being commoditised to utility services as exemplified by Amazon EC2.

To counter the hypothesis, you'd have to demonstrate that somehow infrastructure was becoming less of a commodity and that, rather than growing, utility services would suddenly decline and we would return to a world governed by individually bought products. I have yet to find an example of this throughout history and no reason to suspect that this model will not hold today i.e. utility services for computing infrastructure are here to stay.

I should caveat that there are certain marketing mechanisms and abuses of market power (i.e. oligopoly and reduced competition) which can give an appearance of something "de-commoditising" along with an issue of substitution, but before blindly accepting the opinion that something that has never happened before will now happen, I would ask for one iota of evidence of this.  Simply demonstrate to me that utility services in infrastructure are not growing. Simply explain to me why utility services for infrastructure are not the future and how this transition towards cloud is a mere blip which will soon reverse.  The market says otherwise, history says otherwise and I see no data which supports the alternative opinion. 

Instead I see 6,000 data points plus today's market which say they are wrong. Now, to call me religious for basing a hypothesis on data (both current and past), modelling and prediction tests is farcical. To do so because I won't accept their unsupported, un-evidenced opinion that the entire history of human development is wrong ... well.

The hypothesis states that utility services will grow and dominate computing infrastructure, I see no evidence that this will not happen.

On players

Now, if you happen to agree that the market for computing infrastructure is shifting towards utility services then it becomes a question of who will win that market, will anyone win, what standards will develop and if so how many?

I say standards develop because the natural end state of utility services is provision of fairly standardised components, commonly through de facto standards followed later by de jure. This is an essential part of creating a competitive utility market and appears common in all utility services. The answers to these questions depend upon the actions of the players in the market as it forms.

Currently Amazon dominates the infrastructure as a service market and the APIs it provides can be considered de facto. This is not a permanent situation; it is possible for other players to build a larger ecosystem and to supplant Amazon as the dominant player. An example threat to Amazon may well be Google Compute Engine.

At this moment in time however, Amazon is leading the pack and appears to be visibly growing faster than those around it.  The APIs it provides have been adopted by others (Eucalyptus, CloudStack and even OpenStack) and so whilst this part of the race is not over, it looks like Amazon is a good bet.

On OpenStack

Obviously Amazon has multiple zones and regions but to counter it you could attack its weakness of being a single point of failure (a single company) by playing a competitive market game with multiple providers. In order to do so, you would have to solve the issue of semantic interoperability and in practice this can only be done with an open source reference model.

However, the ecosystem around Amazon provides it with extraordinary benefits in terms of innovation, customer focus and efficiency. Hence a smart play would be to co-opt the ecosystem rather than attempt to build from scratch (i.e. to differentiate away from it). You could differentiate later once your ecosystem was big enough but it seems naive to do this early.

When Rick Clark left Canonical, joined Rackspace and was instrumental in the creation of OpenStack - I fully supported this move of building a competitive marketplace around an open source reference model which co-opted the biggest ecosystem.  However that idea appeared to be short lived as the ideas of differentiation quickly surfaced.

Today, I see considerable problems with OpenStack which I've listed before. My opinion is the project has been hampered by poor leadership, poor strategic play, a lack of major investment by the companies involved, slow development, weak technology and an unnecessary focus on differentiation. It does however have a visible and engaged community.

With the coming price war likely to be initiated between Google Compute Engine and AWS, OpenStack needs to have in place, within the next year, a massive scale competitive market of providers with strong technology. I hope that they achieve this but I see little evidence of what is needed. Instead I see further potential issues around a collective prisoner dilemma (where members differentiate within the group itself).

So do I believe that in the next year a benevolent dictator will emerge and resolve these issues, that the technology will become rock solid, that members will invest the billions needed to build a competitive market at scale ... er no. Which is why I hold the opinion that OpenStack is a wasted opportunity and hence a dead duck.

So, what if OpenStack fails, will the shift towards utility provision of infrastructure continue? Yes,  well, at least that's the hypothesis.

But what if OpenStack does manage to create a competitive utility market at scale, will the shift towards utility provision of infrastructure continue? Yes, well, at least that's the hypothesis.

This is why the comment that I called OpenStack a dead duck because it "disagreed with my hypothesis on evolution which was unscientific and quasi religious" is misguided whether deliberately so or not. Oh, I'm being too kind ...

The process of evolution is independent of any individual player's success or any particular person's opinion. It is simply a consequence of competition.

A final few words 

I realise that people have their own pet opinions (often they call them "theories" when they really shouldn't) and despite scant or more commonly no supporting evidence they argue vociferously that their idea will happen.  If you're one of those then "bully for you", you're obviously omnipotent though that's not the word that comes to mind.

Yes, I have opinions (e.g. on OpenStack) which I state clearly as opinions. No-one can predict the interactions of actors in this space; you can only make informed guesses and yes, my opinions are often wrong. For more on the unpredictability of actors' actions, Hayek is worth a read.

Yes, I also have areas of research (e.g. evolution) and this depends upon collection of data, causation, correlation and prediction tests. Evolution is not time based (i.e. no crystal ball again) and it doesn't depend upon specific actors' actions but instead competition between actors. No, it isn't "right" or "true" or "absolute", it's just the best model I have to explain market phenomena that are clearly visible for everyone to see.

I would happily dump the model of evolution if someone could finally provide me with a better model and no, I don't count hand wavy concepts with no data, cause, correlation, test or historical examples even if you do believe you're omnipotent.  I'm a skeptic.