Friday, May 24, 2013

The pure brilliance and horror of Bitcoin

A chance discussion with James Duncan, in which he raised the potential impact of Bitcoin on taxation, made me go and map out the value chains around money and look at some things in more detail.

Bitcoin is truly marvellous as a concept and it is far more than just another currency. If we examine the value chain associated with money, we have higher-order systems for transfer and storage (banks) along with taxation (Government funding).

Bitcoin's effect is not just to act like a form of cash but to decentralise the transfer systems. By doing so, it impacts our ability to tax. If we can only see what is transferred but not to whom, then we cannot in effect tax without the voluntary participation of those involved. But our market economic system is based upon greed, and without adequate means of detection a flourishing alternative market will grow. Hence our taxation systems would have to adapt to some other commodity whose transfer mechanisms cannot be hidden so easily. But to what? Land ownership? Residency?

If the burden of the entire governance system is placed on land ownership then some will be forced to sell - but to whom, and what of the subsequent burden of social housing, and how will that be funded? But what of residency, can't we use that?

Alas, greed again plays its role, because if wealth is hidden then everyone can say they earn nothing and have no wealth. You cannot provide for the exception as all will claim it. Government has to assume everyone has wealth, but some genuinely won't and you will not be able to distinguish between them. So what do you do when someone claims they cannot afford residency? Force them to work? Evict them? Gaol them?

It's greed that will drive the growth of Bitcoin. It's greed which will prevent the ability to tax. The beauty and horror of Bitcoin is that, through the very thing that drives it (greed), it will ultimately bring about the destruction of the capitalist system by dismantling the very things that capitalists tend to dislike - taxation and the state. Bitcoin will become the cancer of the capitalist system. It is truly stunning in its simplicity, it is unstoppable and, alas, Pandora's box has been opened. What a marvel of ingenuity it is.

As you can guess, I'm not a fan of Bitcoin. Left unchecked, I believe it will undermine the importance of Government, which is actually not good for competition or the market. However, don't confuse my disdain for Bitcoin with opposition to the technology behind it. The Blockchain has huge and positive potential in many industries. I'm a fan of the Blockchain, I just can't stand Bitcoin.

Wednesday, May 22, 2013

Could IBM buy $RAX?

It has been an interesting few weeks in the history of cloud. Dell swooped on Enstratius and then announced it was pulling out of the public cloud battle. This is a pretty bold set of moves by Dell, with a strong whiff of realism. It's bold because it requires facing up to the fact that they don't have what they need to compete in the infrastructure space and instead need to play a broker / platform play. Dell have pockets of strength, such as the Dell DCS team, but they're unlikely to come off winners against AMZN and GOOG. It has a whiff of realism because making such a play also means accepting a much smaller role in the infrastructure space in the future, and it's unlikely that someone who had not realised this was a fight for survival would have made that move. Tough decision, good call IMHO.

Some of the gloss on OpenStack is starting to crack, as Dana Blankenhorn pointed out in a woefully inaccurate article. The old myth of AWS being a low margin business - where do people come up with this stuff? Amazon almost certainly controls the pace of its price drops because demand for computing infrastructure is elastic whilst building data centres is constrained by time, resources and capital. If you drop the price too quickly, you can increase demand beyond your ability to supply, which is especially dangerous if you're already on an exponential growth path. However, despite the lamentable state of the post, it does pose an interesting question. Would IBM buy $RAX?
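To make the supply-side argument concrete, here's a toy model. Every number in it is invented (the elasticity, the size of the price cuts, the capacity growth rate) - this is not Amazon's actual economics, just a sketch of why the pace of price drops has to be matched to the pace at which data centres can be built.

```python
# Toy model: price-elastic demand vs build-constrained supply.
# All figures are invented for illustration only.

def demand(price, elasticity=1.5, base=100.0):
    """Constant-elasticity demand curve: lower price, more demand."""
    return base * price ** -elasticity

capacity = 120.0        # units of compute you can serve today
capacity_growth = 1.3   # capacity grows ~30% a year (time/capital constrained)
price = 1.0

for year in range(1, 6):
    price *= 0.7        # an aggressive 30% annual price cut
    capacity *= capacity_growth
    d = demand(price)
    status = "SHORTFALL" if d > capacity else "ok"
    print(f"year {year}: price {price:.2f}, demand {d:.0f}, capacity {capacity:.0f} ... {status}")
```

With these made-up numbers, demand outruns capacity in the very first year; soften the cut to 15% a year and supply keeps pace. That pacing is the control knob, not margin.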

In the cloud infrastructure business, IBM is being thoroughly whooped by AMZN and is facing the threat of GOOG. You can spin virtual data centres as private clouds, or claim that private clouds are the thing, as much as you want. The future is a hybrid model of multiple public commodity providers, and currently that list is AMZN + AWS clones, GOOG, possibly MSFT and maybe some players in China / India (though I expect them to be AWS clones too).

IBM will have massive inertia to this change due to the past success of its server and hosting models. That inertia will happily try to hold IBM back from changing, well past the time that its infrastructure business is lost. But lost it will be, and it's already incredibly late in the day to do anything. However, there is a glimmer.

If (and it's a massive if) IBM bought $RAX then could this bring them some direct capabilities in the commodity space at scale? It'll come with some legacy of an older hosting model, it's a minnow compared to AMZN and it comes with baggage such as the differentiation from AMZN nonsense (though it's perfectly true that many in the OpenStack community are trying to give OpenStack 'behavioural fidelity' with AWS). But it could be a good move to bolster OpenStack and establish a competitive market, given certain conditions.

If (a HUGE if) IBM bought $RAX, focused its server and hosting business on this space and made it clear that the past was dead, committed significant yearly investment especially in the first three years - say around $3-5 billion p.a. - and nullified AMZN's ecosystem advantage by becoming a better AWS clone through embrace and extend, then yes, they could become a player in the infrastructure space despite previously saying it wasn't their focus. They could rally a competitive market around OpenStack and get it up and running fast.

BUT it would mean making all the right moves, and that would take strategic wit and a CEO with Captain Albert Ball's courage, able to spend up to $20 billion over three years in the face of strong internal resistance and determined to take the fight to the market. Meg Whitman seems to be trying to shake HP into life; maybe Rometty could do the same for IBM, because her predecessors have singularly failed to do so.

Would I personally take the gamble if I were IBM? Yes, but then I would have flooded the market in 2008 and not let myself get into such a position. And that's the problem: given IBM's current positioning and their past timid actions in this space, I find it unlikely that they'll change unless things become desperately clear, by which time they may acquire but it'll be too late.

It's an interesting question, but I do see this idea of acquisition as more of a flight of fancy. Based upon past actions, I don't believe IBM has what it takes. I expect IBM to focus higher up the stack, to let this infrastructure business slide, and my son to grow up in a world where the idea that IBM built infrastructure is a fading memory.

Maybe Rometty will change that.  Let's hope so, it'll certainly make things more interesting.


--- Final Notes.

1. I'm an advisor to CloudScaling. I think Randy Bias and the crew are spot on in trying to create 'behavioural fidelity' between CloudScaling's OpenStack system and AWS.

2. Whilst I know many engineers involved in OpenStack and I firmly agree with the approach of using open source to create a competitive market, I don't think the execution and the game play have been particularly sound.

3. I have long held the view that there is a transitional role for private clouds that are AWS clones but I expect that to become increasingly niche in the very near future. The key word is 'transitional'. I do hold the view that a public market of AWS clones is possible.

4. Despite the general media interest, OpenStack is not the only contender. I have a stronger preference towards Apache CloudStack. I know Peder Ulander and several of the CloudStack team and I think that Citrix is playing a sound game. I also know many of the people behind Eucalyptus and OpenNebula and they shouldn't be underestimated either.

5. I was an advisor to Enstratius.

Monday, May 20, 2013

Good news, the laggards are catching up ... again

Just been reading the articles on the New Work and Thinkers, Builders, Improvers and Producers.

Interesting. 

Ok, we all know that organisations consist of value chains (figure 1), which consist of components that are evolving (figure 2), and that you can map this (figure 3). As those components evolve, their characteristics change (figure 4) and the methods you need to apply vary (figure 5), hence you need to treat your value chains as units (figure 6). You can manipulate this environment through a whole host of techniques. Furthermore, if you count up the frequency of activities at different stages of evolution you can build a profile for an organisation (figure 7) and you can apply a structure based upon evolution rather than type (figure 8) with pioneers, settlers and town planners.

Pioneers roughly equal their Builders, Settlers equate to their Improvers and Town Planners equal their Producers. All three groups need Thinkers. All good stuff, cutting edge thinking if you happen to be in 2008 or earlier (we started implementing the structure in 2004/5).
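As a rough sketch of how that counting exercise works (figures 7 and 8), assuming a toy component list I've invented for illustration - this is not the original method, just the shape of it:

```python
# Build an organisational profile: count components at each stage of
# evolution, then attach the group best suited to each stage.

from collections import Counter

STAGES = ["genesis", "custom-built", "product", "commodity"]

# Pioneers ~ Builders, Settlers ~ Improvers, Town Planners ~ Producers.
TEAM = {
    "genesis": "pioneers",
    "custom-built": "pioneers / settlers",
    "product": "settlers",
    "commodity": "town planners",
}

# Hypothetical value-chain components: (name, stage of evolution).
components = [
    ("novel recommendation engine", "genesis"),
    ("in-house CRM glue code", "custom-built"),
    ("CRM", "product"),
    ("compute", "commodity"),
    ("power", "commodity"),
]

profile = Counter(stage for _, stage in components)
for stage in STAGES:
    print(f"{stage:13} x {profile[stage]}  ->  {TEAM[stage]}")
```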

Unfortunately it's 2013 and now it looks like the laggards are finally starting to catch up. Fortunately, a four party system based upon the characterisations described in the post will create its own problems, and in order to manage these they'll need a good grasp of mapping. Even better news is that without a clue about how evolution actually works, their strategies will still be endless tyrannies of "what and when" over "why", they'll talk about ecosystems but won't understand them and they will probably continue to describe many predictable things as disruptive innovation. They will almost certainly continue to jump between one fad and another - "we're an agile company", "we're a six sigma company" - and many probably won't understand, let alone have adopted, those next generation practices (figure 10) despite having years to do so.

Word to the wise though: you have to accept that they will eventually cotton on, and so the heyday of just walking into an industry and helping yourself may soon be over. The Chancers (figure 11) might actually start to learn how to fight rather than collapse as their predecessors have - Blockbuster, Kodak etc. Things might soon start to become a whole lot tougher, but it'll take time.

Those delightful days of walking in and dominating an entire industry like cloud computing for less than a few hundred thousand dollars are gone.  Pity but don't worry, they will come again and we've still got a few years left to play the game before it settles down. By then the new worlds of 3D printing and intelligent agents should allow us to have fun again with new practices, new ways of dealing with evolutionary flow and new concepts to experiment with.

To describe the last decade of industrial competition as being like "stealing candy from a kid" is, in some quarters, reasonably fair. The biggest problem for the players has been deciding which industry to own next. However, it was always unreasonable to expect it to last. From what I'm reading, people are slowly starting to stumble upon what many of us now take for granted. We should expect companies to become a bit tougher to walk over in the next decade.

Your future competitors will map their environments, they will know how to play the various strategic games, they will build effective ecosystems and organise around evolution and they will have all those next generation characteristics ... get used to it.

So expect lots of books on these topics, invariably described as "breakthrough thinking" and "gaining competitive advantage through structure", and try not to laugh. Because the real joke may end up on us if we're not careful. Easy street is over and we have to move on, to find the new mechanism of exploitation that gives us another good 5-10 year run of trampling once again. We need to get even tougher.

We should never ever forget that those models of understanding that create an advantage will eventually end up in a book somewhere. By which time any advantage has long gone and we need to have moved on as well.  But then, life just wouldn't be exciting if this wasn't the case.  If it wasn't for the laggards eventually catching up then we would never have to progress. Nothing is permanent, all practices and activities evolve and diffuse. Embrace this.

So celebrate this awakening, know that new mechanisms will be found as old mechanisms become common. It'll take a decade for the practices to spread, by which time we will be chowing down on industries with new techniques that we haven't even yet considered ... just don't rest on your laurels. That's the way you become a laggard.

Happy hunting.

Figure 1 - Needs to Value Chain (circa 2005)

Figure 2 - Understand Evolution (circa 2005)

Figure 3 - Map (circa 2005)

Figure 4 - Characteristics Change (circa 2005)

Figure 5 - Different Methods (circa 2006)

Figure 6 - Treat as Units (circa 2006)

Figure 7 - Profile (circa 2008)

Figure 8 - Build a structure which reflects evolution (circa 2008)

Figure 9 - Implement (circa 2008)

Figure 10 - Next Generation Practices (circa 2011)

Figure 11 - Classification of companies by level of strategic play vs use of open (circa 2012)

Saturday, May 18, 2013

Two Scenarios

As an exercise, read the following two scenarios and give each a score out of ten for the plausibility of it occurring. Did you notice anything?


Scenario 1

CIO: “All systems are operational this morning”

CEO: “Excellent news. Apparently the latest thing is cloud. According to this business magazine report, 67% of successful companies are using it. We should look into that”

CIO: “Certainly. The latest research I have says that private clouds are the thing to build as they’ve now entered the plateau of performance and provide an extremely efficient mechanism of infrastructure provision over our existing technology.”

CEO: “Excellent work. Well, let’s look at getting a private cloud up and running. We don’t want to be left behind in this technology war.”


Scenario 2

Corporal: “New cannons arrived; as per orders we installed them and fired them this morning”

General: “Excellent news. According to this article in General’s Weekly, over 67% of successful generals bombard hills. Let’s make sure we’re doing that as well.”

Corporal: “Certainly. Our research says the latest thing to bombard hills with is the mortar; it’s now a mature technology entering the plateau of performance and an extremely efficient mechanism of killing compared to existing technology”

General: “Excellent work. Let's bombard a few hills with mortars then! Don’t want to be left behind in this technology war.”

From Strategy to Mapping to Pioneers

A short and rough set of slides on some very basics of strategy, mapping, evolution and structure.


Steps from Simon Wardley

I put this up because of +Edd Dumbill's recent post on a "Speaker for the Crazy". At the heart of this is mapping; if you don't map, you won't get the whole process, so don't worry. Just remind yourself that "67% of successful generals are bombarding hills".

Wednesday, May 15, 2013

Why I'm a fan of Bromium

I've been a big fan of Bromium for some time. No, I'm not on the advisory board, nor do I have shares in the company and yes, I do know the founders. However, just because I know someone doesn't mean I'm going to agree with what they are doing.

The reason why I'm a BIG fan of Bromium is because of the approach they have taken to dealing with security.

I used to work in the security industry and I can happily say that a chunk of it is based upon snake oil and fear. The general principle of creating a secure but functionally useful system rests upon solving an impossible problem, and there are good commercial reasons for continuing to attempt it. Let me explain why.

A basic understanding of mathematics, and a realisation that a computer system is nothing more than a mathematical model, would lead anyone to Gödel's incompleteness theorem and to why trying to build a consistent and complete model is impossible. There is no such thing as a provably, universally secure system which is also useful. A system can only be described as provably secure within a set of given conditions and assumptions, one of which is that someone doesn't find a new attack vector which hasn't been catered for.

The bits of the security industry I worked in know that the approach of building a perfectly secure system is impossible, but it's actually in their commercial interests to attempt to continuously solve the impossible. All these new attack vectors create constant revenue streams - a painting of the Forth Bridge, so to speak - and a constant need for new virus signatures, protection upgrades against malware etc. Oh, and if you don't keep up you might be exposed to new zero day exploits, viruses, malware ... fear, fear, fear ... give us your money. Of course, this carefully skips over the other issue: that protection is always post-event, i.e. after the new vector is in the wild, discovered or written by security testers.

What I most like about Bromium is that they don't try to solve the impossible. Bromium doesn't try to stop you from ever being hacked, receiving malware or being hit by a zero day exploit, because to do so would be impossible. Instead, they accept that you will be hacked and will receive zero day exploits. What Bromium focuses on is limitation of the damage, and this (unlike the incompleteness theorem) is a solvable problem.

The principle here is rather simple. If you accept that you will receive a zero day exploit that you've never seen before (and therefore cannot protect against), then what you do is limit the damage to as small and as temporary an environment as possible. In the case of Bromium, every untrusted task on the machine (i.e. every browser tab, every individual email) runs in a hardware-isolated micro-VM.

Let us suppose an attacker has sent you some never-before-seen zero day exploit in an email, one which would bypass most standard security protection. Under Bromium, a successful attack gains control of the hardware-isolated micro-VM in which that email runs. The micro-VM consists of an empty machine plus the attacker's email, all isolated away from everything else (i.e. all your other emails, files etc). Of course, as soon as the user closes the browser tab or the email, the micro-VM including the attacker's malware disappears, returning the machine to its previous "secure" state.
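For those who like to see the shape of a pattern, here's a toy sketch of that lifecycle. Bromium does this with hardware-isolated micro-VMs; an ordinary OS child process (used below) is a far weaker boundary, so treat this purely as an illustration of the one-disposable-sandbox-per-item idea and not as their implementation.

```python
# Toy illustration: one disposable sandbox per untrusted item,
# destroyed when the item is closed. A child process stands in for
# what is, in Bromium's case, a hardware-isolated micro-VM.

from multiprocessing import Process

def render(item: bytes) -> None:
    # Stand-in for a renderer that might be exploited by the item.
    print(f"rendering {len(item)} bytes in isolation")

def view(item: bytes, lifetime: int = 60) -> None:
    sandbox = Process(target=render, args=(item,))
    sandbox.start()                 # sandbox created for this item alone
    sandbox.join(timeout=lifetime)
    if sandbox.is_alive():
        sandbox.terminate()         # closing the item destroys the sandbox,
    sandbox.join()                  # taking any implanted malware with it

if __name__ == "__main__":
    view(b"suspicious email attachment")
```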

What Bromium has neatly done is not try to solve the impossible (preventing you from ever being attacked) but instead limit any damage to as small and as temporary a space as possible. Hence whilst Bromium does not prevent any zero day exploit from being run, it reduces the impact to practically negligible. The fear is gone. Just because one email has been compromised doesn't mean all the other emails, applications and environments on my machine are impacted. It's all isolated, and to get rid of the problem I just close that email.

Now, this doesn't solve all the issues of security by a long shot; it's no magic bullet. There are numerous other attack vectors, such as wetware and social engineering attacks. But what it does do is threaten to disrupt an entire branch of the security industry which, in my view, needs to be disrupted and has become an unwelcome leech.

This is why I am a BIG fan. Oh, forget that ... I'm a HUGE fan. Go, Go Bromium.

I like people who find difficult problems and attack the solvable bit (e.g. limitation of damage) rather than trying to solve what is knowingly impossible - especially when the impossible also happens to generate continual revenue streams.

Sunday, May 12, 2013

In search of two better terms - Chaotic vs Linear

Many years ago (getting on for a decade now), when dealing with the issues of diffusion and how changes not only diffused but evolved, I developed the following graph to describe evolution (see figure 1).

Figure 1 - Evolution


The graph is based upon two axes: one of ubiquity, which measured how widespread and commonplace the notion of something was, and one of certainty, which measured how well defined and well understood something was.

By measuring the change in the style of publications related to the act, different domains were highlighted, i.e. for activities (things we do) we have :-
  • Genesis : where publications tend to refer to the wonder of the thing.
  • Custom built : where publications refer to the building, construction and awareness.
  • Product : where publications refer to operation, maintenance and feature differentiation.
  • Commodity : where publications are dominated by use, i.e. what is built with or on, guides for maximising use etc.
This process of evolution covers activities (what we do), practices (how we do things) and data. Each of these evolves through similar states in terms of characteristics, though the terms we use differ, i.e. activities evolve from genesis to commodity, practices from novel to best.
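As a crude illustration of that measurement - the real analysis used thousands of hand-categorised publications, and the marker lists below are my own invention - keyword matching gestures at the idea:

```python
# Toy classifier: guess the stage of evolution from the language a
# publication uses. Marker lists are invented for illustration.

STAGE_MARKERS = {
    "genesis":      ["wonder of", "breakthrough", "first ever"],
    "custom built": ["how to build", "construction of", "awareness"],
    "product":      ["operation", "maintenance", "feature comparison"],
    "commodity":    ["guide to use", "built on", "best price"],
}

def classify(title: str) -> str:
    title = title.lower()
    for stage, markers in STAGE_MARKERS.items():
        if any(marker in title for marker in markers):
            return stage
    return "unclassified"

print(classify("The wonder of the horseless carriage"))  # genesis
print(classify("A guide to use of electric utilities"))  # commodity
```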

Each component can evolve independently or, in certain cases, co-evolve. For example, with computing infrastructure provided as a product we had novel practices for architecture which evolved to become best practices (scale-up, disaster recovery, N+1). As the act of computing infrastructure itself evolved to more of a utility, new novel architectural practices appeared and are currently evolving (scale-out, chaos engines, design for failure). Hence best architectural practice for an activity provided through a product is not the same as best architectural practice for the same activity provided through a utility.

To understand the connections we need to invoke mapping techniques, but that's another conversation. What I'm looking at here is one of the underpinning axes of mapping ... evolution.

By 2008, I had amassed several thousand data points exhibiting the same pattern, and the causation had been modelled (simple competition, including both user and supply). However, one thing has niggled me from the beginning. Back when I started to collect the data, I needed descriptive terms for the characteristics of the extremes of evolution, with the in-between state being a transition from one to the other.
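As a heavily simplified sketch of that causation - and I stress this is a toy, not the model itself - competition among suppliers drives certainty and agreement up, while users adopt the increasingly certain act, driving ubiquity up, both tracing the familiar S-curve:

```python
# Toy dynamics, purely illustrative: supply-side competition raises
# certainty; users adopt what is certain, which raises ubiquity.

def step(certainty: float, ubiquity: float, k: float = 0.5):
    certainty = certainty + k * certainty * (1.0 - certainty)           # suppliers converge
    ubiquity = ubiquity + k * certainty * ubiquity * (1.0 - ubiquity)   # users adopt
    return certainty, ubiquity

c, u = 0.01, 0.01
for t in range(31):
    if t % 10 == 0:
        print(f"t={t:2d}  certainty={c:.2f}  ubiquity={u:.2f}")
    c, u = step(c, u)
```

Ubiquity lags certainty, which matches the intuition: agreement on what a thing is precedes everyone having it.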

At one extreme you had the un-modelled, the uncertain, the constantly changing, the unpredictable, lacking agreement and convergence, the novel and rare, the exciting (in terms of future potential) and the poorly understood. 

At the other extreme, the same act would evolve to the understood, the stable, the predictable, the measured, the agreed and converged, the common and widespread, the dull, mastery of perceived simple operation.

The latter point about simple operation is important, as the actual act, through standard interfaces, often hides a world of complexity in the same manner that putting a plug into a socket and switching it on hides the complexity of electrical generation and distribution from the consumer. We have a perceived notion of mastery - we insert the plug and switch on - and expect power to be delivered. We are often unaware of the myriad of systems - some simple, some complex - that enable this. Our notion of mastery is only shaken when our expected outcome is not delivered.

The same holds with currency. When I hand over a pound to buy my 20p newspaper I have an expected outcome. I don't expect to get a varying amount of change in return. When I turn on the taps, I expect to get running water not nothing, not sand, not sludge and not a fizzy drink. It's our expectation of a simple linear type operation which is essential in defining commodity and utility despite the actual complexity of any systems the interface obscures.

I struggled to find terms to describe these two extremes of evolution. Nothing was truly suitable. I could have used chaotic vs ordered, but this implied a permanence of state and the terminology was confusing. I did, however, find a descriptive framework in Stacey's Matrix, which had axes of certainty and agreement. One extreme was chaos and anarchy (low certainty and agreement), the other was simple (high certainty and agreement). See figure 2.

Figure 2 - Stacey's Matrix


Simple in this sense referred to a simple linear type of operation and, on reading a paper by Rogers which discussed how things not only diffused but (as a footnote mentioned) matured to become less chaotic and more linear in appearance, I decided to use the terms Chaotic vs Linear to describe the extremes. However, I've never been happy with those descriptive terms, mainly because the notion of linear quickly degenerates into discussions of linear vs non-linear systems and the notion of chaotic quickly degenerates into other discussions of its own.

So, finally, after all these years, I'm revisiting this and looking for two terms which more aptly describe the characteristics of the extremes, i.e.

Something more apt than "chaotic" to describe the characteristics of the un-modelled, the uncertain, the constantly changing, the unpredictable, that which lacks agreement and convergence, the novel and rare, the exciting (in terms of future potential) and the poorly understood.

Something more apt than "linear" to describe the characteristics of the understood, the certain, the stable, the predictable, the measured, the agreed and converged, the common and widespread, the dull, with mastery of perceived simple operation.

A good friend of mine, @jamesurquhart, has suggested "wild" vs "domesticated". I actually quite like those as descriptive terms for the characteristics of the extremes (see figure 3). I'm hoping someone might have a better suggestion.

Figure 3 - Wild vs Domesticated


What I want to avoid is meaningless or confusing terms such as "innovation", which is equally used to describe the genesis of something, feature differentiation of a product or more evolved models of providing pre-existing activities. I'm looking for something more apt at describing those characteristics, which is not tarnished by common use or loaded with other connotations.