Saturday, September 29, 2012

A dangerous intervention ...

A few years ago, I warned about the dangers of Government intervention and the creation of certified cloud providers. After reading the EC communication on Cloud Computing, my concern is that we may be starting down this path.

First, some background :-

Cloud computing represents the evolution of a range of IT related activities from a product and product service model to one of commodity and utility provision. As such it has impact in terms of efficiency, rapid generation of higher order systems and new sources of value creation, in much the same way that the utility provision of electricity spurred on new activities such as radio, telephone and computing, and the new industries around them.

However, it's not just activities (what we do) that evolve but practices (how we do things). Hence architectural practices related to computing are co-evolving with the activity of computer infrastructure and we see the rise of what is known as "DevOps". This is all normal fare and these cycles of change repeat throughout history.

Each time we undergo such a change both consumers and vendors can have inertia to it. 

In the case of business consumers, some of this inertia is due to the changing architectural practice and the issue that legacy applications are not designed for the cloud. Hence there is often a desire on their part to have the new model operate like the old, i.e. to have clouds which actually have the characteristics of more old-fashioned data centres, a concept commonly dubbed the "Enterprise Cloud". This inertia can be exploited to extend old business models.

In the case of vendors, much of their inertia is due to past success, i.e. they have become habituated to selling customers products and product rental models and are encumbered by their existing business. This is why the period of change is often initiated by new, unencumbered entrants (in the case of computing infrastructure, this would be Amazon). The past giants often attempt to continue or extend their business models, believing the change is just a fad. History is not on their side and the result is disruption of the past models.

Today, we have all these issues with cloud - new entrants providing utility services for computing, vendors and customers often suffering inertia, and attempts to extend past models.


EC communication on Cloud Computing

The EC communication recognises many of these changes and talks of the industrialisation of IT. It notes the importance of forming competitive markets of utility providers with easy switching, in order to avoid lock-in and address other concerns consumers have. This is all reasonable.

The EC communication highlights its intention to help standards form in the industry in order to achieve these goals of competitive markets and choice for consumers. This is also reasonable if the approach is one of adopting de facto standards (i.e. market chosen) and ensuring their provision both as open standards (i.e. royalty free) and through reference implementations in order to ensure semantic interoperability. These reference implementations (i.e. an expression of the standard as code) would need to be open sourced in order to ensure a competitive market which is free from the constraints or control of specific vendors.

Already, in the infrastructure space, such candidate reference implementations exist, with 850+ companies now involved in the OpenStack effort, an entirely open sourced IaaS which is at the heart of an industry movement towards creating competitive, service based markets. It's one of many such potential reference implementations, others including CloudStack, Eucalyptus and OpenNebula. In light of these developments, an EC approach of supporting any de facto standard, provided as an open standard with an open source reference implementation, should suffice in encouraging competitive markets.

Another part of the communication discusses the need for voluntary certification and the EC's intention to enable this. If a competitive market of multiple providers forms then, as with other utility industries, an assurance / compliance industry should be expected to form. My objection to the communication is that, given the development of the market, there is no need for EC intervention in certification. The danger is that the EC risks "blessing" the certification programs it involves itself with.

Such an act can have negative consequences because certification is fundamentally about approval of certain characteristics; however, the market has not yet come to terms with which characteristics are essential and many business consumers in the market are suffering inertia to change. In these circumstances, certification could be manipulated to "bless" past business models, i.e. to create a preference for "Enterprise Clouds" and thereby play upon the inertia that some companies have. For example, a focus on SLAs can be used to strengthen the case for "Enterprise Clouds" over "Commodity Clouds" despite the latter being the more evolved state that industry will head towards.

The effect of this can be detrimental to industry, as companies adopt services that they wish to believe represent the future and are "blessed" by the EC through supported certifications, only to discover the services represent a continuation of old models which take advantage of their inertia to change, i.e. the quick fix to adopting cloud is simply to rebadge your existing enterprise hosting service as "Cloud".

A counter argument to this is that, though there is a risk, the trade-off is that certification encourages adoption. However, this assumes that the EC is more capable of deciding what certification is required than the market. Given a situation where there are already significant market efforts to create interoperable systems and competitive markets (such as OpenStack and its 850+ member companies), this seems premature.

I would agree with the EC effort to support standards, where those standards are designed to encourage a free (as in free from constraints) and competitive market and hence are based upon de facto (i.e. market chosen) open standards with open source reference implementations.

I would not agree with the EC effort to support certification programs, as these would naturally form in a competitive market. The EC runs the risk of interfering with the process before the market has chosen, and it cannot be sure that such interference will be beneficial rather than manipulated.

Thursday, September 27, 2012

The Standards Jungle ...

The EC has released a communication on cloud computing.

I've had a brief read (I'm in the middle of other work), but I'll note that the general comments about the importance to future industry and competitiveness, along with the historical comparisons (industrialisation of IT, comparison to utility) etc., are all fairly standard fare. Though there's nothing particularly interesting, it's good for them to repeat it.

There are also the usual wobbles about cloud reducing cost; alas, Jevons' paradox applies ... any reductions will be temporary as we will end up doing more.
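
As a rough, illustrative sketch of why (with entirely made-up numbers): if the unit cost of compute halves but the cheaper price means we end up consuming three times as much, total spend still rises.

```python
# Illustrative only: hypothetical numbers in the spirit of Jevons' paradox.
old_unit_cost = 1.00   # assumed cost per compute-hour before
new_unit_cost = 0.50   # assumed cost per compute-hour after (efficiency gain)
old_usage = 1000       # compute-hours consumed before
new_usage = 3000       # compute-hours consumed after - we end up doing more

print("spend before:", old_unit_cost * old_usage)   # 1000.0
print("spend after :", new_unit_cost * new_usage)   # 1500.0
# Cheaper per unit, yet more spend in total.
```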

However, regarding the Fragmentation of the digital single market and the Standards Jungle, I'll make a couple of observations.

In the value chain associated with cloud, one of the components is obviously enough the Internet, i.e. the availability of this commoditised means of mass communication has enabled the commoditisation of discrete IT activities towards utility services (e.g. IaaS). Now, the Internet is as much a social as a technological phenomenon, inheriting from the hacker ethos - anonymous, decentralised, egalitarian, interactive, neutral to the end user etc.

The Internet's existence also allowed for the easy formation of communities, such as the various open technology communities (e.g. open source), which themselves have inherited the hacker traits. There's a symbiotic relationship here between the Internet enabling the communities and the communities in effect sabotaging mechanisms to control the Internet (either corporate or Gov) through the technology they build.

Today, open technology is huge business and most of this is unaccounted for (see the clothesline paradox). For example, the economic contribution of Apache is enormous but not normally reported - O'Reilly has a fantastic report on this.

So, onto this symbiotic relationship between the Internet and the open technology communities - environments which by their very origins are anonymous, decentralised and neutral, and which have generated huge economic value, including spurring the whole cloud industry, most of which also depends upon open technology - we want to apply more centralised legislative frameworks, with concerns over areas such as security (in principle due to anonymity)?

Hmmm, this feels like it could be a repeat of how the radio spectrum was divided up, due to security concerns, for the benefit of a few companies - a process which managed to destroy a much wider ecosystem. However, this time the economic value at risk is huge. I'd be naturally cautious about any such measure.

So, unsurprisingly, I have concerns over areas such as the certified cloud providers / standards approach. I'd want to look closely at the details of this, i.e. could this end up being a rehash of the OSI vs TCP/IP debacle?

Overall: my view on this is 50/50. In some parts it is probably unnecessary and, if badly handled, potentially downright dangerous. I'll write something more on this when I've had the chance to read it properly.

Tuesday, September 25, 2012

Something I'll be coming back to ...

I'm just finishing my latest research and I thought I'd share a graph with you as I'm going to be using it from time to time.

The graph is a comparison of a reasonable sample of companies across two axes - the level of strategic play vs the use of open as a means of competing against and outmanoeuvring others.

The bubble sizes represent the percentage of two populations - those that consider themselves Traditional and those that consider themselves more Web 2.0.

Basically, companies break down into four groups :-
  • Players : companies which think strategically about IT and are willing to use open as a means of competing. (STRATEGY + ACTION)
  • Thinkers : companies which think strategically about IT but don't tend to use open as a means of competing. (STRATEGY + DECISION NOT TO ACT)
  • Believers : companies which don't think strategically about IT but do use open as a means of competing. (NO STRATEGY + ACTION)
  • Chancers : companies which neither think strategically about IT nor tend to use open as a means of competing. (NO STRATEGY + NO ACTION)
There's a whole underlying model (all to do with evolution) of why these groups exist, their impact, their success etc. However, I'm not going to go through this now.
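
Purely as an illustration of the two axes behind these groups (and not the underlying model itself), a minimal sketch of the classification might look like this; the function and its arguments are hypothetical:

```python
def classify(thinks_strategically: bool, uses_open_to_compete: bool) -> str:
    """Hypothetical sketch of the four groups along the two axes above."""
    if thinks_strategically and uses_open_to_compete:
        return "Player"    # strategy + action
    if thinks_strategically:
        return "Thinker"   # strategy + decision not to act
    if uses_open_to_compete:
        return "Believer"  # no strategy + action
    return "Chancer"       # no strategy + no action

# e.g. a company using open source widely but with no strategic reasoning behind it
print(classify(thinks_strategically=False, uses_open_to_compete=True))  # Believer
```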

I just wanted to say that being in the Chancers or the Believers group doesn't seem to be very healthy, which is why I've recently been posting about "What's my IT Strategy" and "Open by Default ... No Thanks".

You see, there is method in my madness and there are reasons why I raise these questions.

Monday, September 24, 2012

Open by Default ... No Thanks

Despite being involved in the open source community for over a decade, I am not a fan of open by default. My preferred route for deciding whether to deploy an open technology approach (as in open sourcing a project, or providing it as open hardware, open data or via an open API) is :-
  1. Map out the value chain of a revenue stream within the organisation. The value chain should contain all the systems (activities, practices and data) needed to create the high level product or service.
  2. Apply evolution to the value chain. For example, determine whether the activities are in a state of genesis or are more of a commodity. Care should be taken to ask whether an activity is widespread and well defined in the industry, and not how any specific company considers it (as the company may be suffering from inertia to change). Where a difference exists, this should be marked and an inertia barrier added.
  3. Apply forcing functions to the map for where open technology already exists and is driving a system to a more evolved state. 
  4. Mark barriers to entry (whether regulatory or requiring high amounts of financial, physical or human capital) and impacts on supplier and buyer strengths.
  5. Determine your intent (from efficiency to standards game to recruitment) and use this map to determine whether to open something or not.
Wow, this sounds like a lot of work just to answer the simple question of whether to open something or not. Actually, it isn't. Mapping out the value chain of an organisation should take a few hours at most.

The point of the map is that it gives you some situational awareness. It can be used to anticipate changes and competitors' actions. It also forces you to think about what you're trying to achieve with an open technology approach like open source, how you want to manipulate the market, what opportunity exists and what strategy you are following.
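
To make the steps above a little more concrete, here is a minimal sketch of what such a map might look like as code. The component names, evolution stages and the crude rules in candidates_to_open are assumptions purely for illustration, not a prescription:

```python
from dataclasses import dataclass, field

# Rough evolution scale: genesis -> custom built -> product -> commodity/utility
STAGES = ["genesis", "custom", "product", "commodity"]

@dataclass
class Component:
    name: str
    stage: str                      # where the industry (not just you) places it
    open_pressure: bool = False     # open technology already driving evolution
    barriers_to_entry: list = field(default_factory=list)
    inertia: bool = False           # gap between the industry view and your own

@dataclass
class ValueChainMap:
    revenue_stream: str
    components: list

    def candidates_to_open(self, intent):
        """Crude, illustrative filter: which components might be worth opening.

        e.g. open things that are already commoditising if playing a standards
        game, or open novel work if the intent is recruitment.
        """
        picks = []
        for c in self.components:
            if intent == "standards game" and STAGES.index(c.stage) >= STAGES.index("product"):
                picks.append(c.name)
            elif intent == "recruitment" and c.stage in ("genesis", "custom"):
                picks.append(c.name)
        return picks

# Hypothetical example map for an imaginary online photo service
example = ValueChainMap(
    revenue_stream="online photo service",
    components=[
        Component("image recognition", "genesis"),
        Component("web platform", "product", open_pressure=True),
        Component("compute infrastructure", "commodity", open_pressure=True,
                  barriers_to_entry=["capital"]),
    ],
)
print(example.candidates_to_open("standards game"))
# ['web platform', 'compute infrastructure']
```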

What I don't like about "Open by Default" is that it often rapidly becomes "Open without Thinking" and, alas, this degenerates into "Open without Value". It's the act of thinking about how to manipulate a market which makes it more likely that a business will value an open technology effort and put resources, effort and focus into it.

If it turns out that you can't perceive any value whatsoever in the project then by all means throw it out there with the hundreds of thousands of other open "meh" projects. Someone might find it useful, it might have some form of beneficial effect you hadn't realised or it might help you hire that skilled engineer / data scientist. But at least, you've thought about it.

Thinking about something generally translates into "we value this". Open technology approaches are powerful tactical weapons in the competition between companies and are far too valuable to be simply ignored or launched without thinking.

Tuesday, September 18, 2012

Unstructured vs Structured

There are many terms I dislike, from the misuse of the term "innovation" to the whole computer utility vs cloud debate. Another example of this is the whole unstructured vs structured data argument.

The terms unstructured vs structured imply there is an inherent difference in the data sets, i.e. unstructured data is by its very nature unstructured and unlike structured data. The terms imply this is a permanent state of affairs: a set of data which has no structure and therefore cannot be modelled.

However, our entire history of scientific endeavour can be broken down into the discovery of data we didn't understand, the creation of models to explain that data and, finally, data we now understand. In other words, we constantly move from unstructured to structured via the creation of a model.

Hence I prefer the terms un-modelled data vs modelled data. Inherently this implies there is no difference in the data sets, simply a difference in our ability to model them. It also implies that there will be a movement from one to the other.
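
As a small illustration, consider a raw web server log line: before we write a model for it, tooling treats it as "unstructured"; once we apply even a crude parser (the model), exactly the same data becomes "structured". The log line and field names below are assumptions purely for illustration:

```python
import re

# The same piece of data, before and after we apply a model to it.
raw_line = '127.0.0.1 - - [18/Sep/2012:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 2326'

# The "model": a crude, assumed pattern for a common web server log format.
pattern = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]+)" (?P<status>\d+) (?P<size>\d+)'
)

match = pattern.match(raw_line)
record = match.groupdict() if match else None
print(record)
# {'ip': '127.0.0.1', 'time': '18/Sep/2012:10:15:32 +0000',
#  'request': 'GET /index.html HTTP/1.1', 'status': '200', 'size': '2326'}
# The data hasn't changed; only our ability to model it has.
```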

What is your view? Am I the only person who dislikes this framing of unstructured vs structured?

Thursday, September 13, 2012

The Real Company syndrome

Ten years ago, I went on the Canon Corporate Executive development program at INSEAD and other institutions.

I remember it well because on the first day, in the first meeting we examined the question of the impact of the internet and what Canon's involvement should be - which was the overall theme of the course.

In a group of executives, there was a massive majority opinion, very strongly voiced, that it was just a channel and Canon shouldn't get directly involved. Then there was me, a single voice (ok, I was CEO of a subsidiary) up against this solid wall of opinion from SVPs and VPs from all over the globe.

Fortunately I've never been shy of a fight. Opinions changed by the end of the six month course but that first meeting always stuck in my mind because of one phrase.

The phrase I heard at that meeting was how this or that company wasn't a "real" company - in this case, how Amazon wasn't a real company, unlike Kodak.

Today of course, no-one would dream of saying that Amazon isn't a real company, but I keep on hearing that phrase. In the cloud space it was "but are there any examples of 'real' companies using cloud?" etc.

I'd say "how about Netflix?" and the response would be "no, 'real' companies, not these tech companies". Cue the argument over what defines a company, how all media companies are tech companies etc.

Where does this awful term 'real' company come from? Is it simply an artefact of inertia and a mechanism for manifesting denial over change?

What's my IT strategy?

I was recently asked this question about a company's IT strategy: which bit was actually strategy? This is fairly easy to work out.

1. Take a company's IT strategy.

2. Now remove any and all references to a choice of a specific vendor as these are purchasing decisions e.g. we will use SAP to ...

3. Now remove any and all references to implementation details e.g. we will build a private cloud to ...

4. Now remove any and all references to operational details e.g. we'll improve our SLAs and reporting times to ...

5. Now remove any and all references to tactical choices e.g. we will invest in big data, BYOT (bring your own technology) and open source ...

What is left is the IT strategy. This should give you an idea of where the company is heading and what is governing those purchasing, implementation, operational and tactical choices.
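
Purely to illustrate the stripping exercise above, a minimal sketch might look something like this; the example statements and their tags are hypothetical, and in practice the exercise is done by reading the document rather than by code:

```python
# Hypothetical sketch: tag each statement for what it really is, then strip.
statements = [
    ("we will use SAP to consolidate our ERP estate", "purchasing decision"),
    ("we will build a private cloud for internal workloads", "implementation detail"),
    ("we'll improve our SLAs and reporting times", "operational detail"),
    ("we will invest in big data, BYOT and open source", "tactical choice"),
    ("we will shift commodity IT activities to utility providers and focus our "
     "investment on what differentiates us", None),  # no tag: this is strategy
]

strategy = [text for text, tag in statements if tag is None]
print(strategy)
# Only the untagged statement remains - that's the actual strategy.
```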