Monday, April 30, 2007

For every action, there is an equal and opposite reaction

Over the last year, I've seen some interesting news about software patents and copyrights.

In the UK we had the response to the e-petition on software patents, which stated

"The Government remains committed to its policy that no patents should exist for inventions which make advances lying solely in the field of software."

In the US, just recently the Supreme Court has limited a decades-old test centred on the requirement that an invention be "non-obvious". You can get more details at Groklaw.

Two wonderful quotes included :-

"Granting patent protection to advances that would occur in the ordinary course without real innovation retards progress and may, in the case of patents combining previously known elements, deprive prior inventions of their value or utility"

Wow, that's an amazing statement. I particularly like the part about patents which combine previously known elements, which could extend to all sorts of places.

But wait, there was more :-

"One of the ways in which a patent's subject matter can be proved obvious is by noting that there existed at the time of invention a known problem for which there was an obvious solution encompassed by the patent's claim"

So, if it is a known or existing problem, which can be solved without real innovation, then bye bye.

But wait, hold on, PatentlyO says there is more. In the Microsoft vs AT&T case, the Supreme Court said the "Patent Act does not extend to cover foreign duplication of software".

Excellent, so how are we doing in Europe? Well, when it comes to copyright we seem to be heading towards a dismal future thanks to the IPRED2.

This makes the "aiding, abetting, or inciting" of copyright infringement on a "commercial scale" a crime. Doesn't seem such a bad idea, but what, by the way, is a "commercial scale"?

Well that's the problem, it's very fuzzy and it seems to include "open source coders, media-sharing sites like YouTube, and ISPs that refuse to block P2P services."

Er.... hang on, if the US is finally trying not to be at a competitive disadvantage, why are we in Europe trying to bash up the open source movement?

Every time something smart happens, there is always that opposing reaction to be found.

Shut the stable door ..... damn!

I've posted and made comments several times in the past about open source not just being a license model. I'm going to explain a personal view of this.

I believe that the use of a free and open standard (HTTP) accelerated the adoption of the world wide web in a manner that had not been achieved previously with any proprietary service. Why?

If a company invents something novel and new (a potential source of competitive advantage or CA), the tendency is to maintain control over that invention (through patents, secrecy or some other process). There is, however, the inevitable cloning of a good idea (as consultants move from one company to another), the eventual creation of a standard product and, if it is useful enough, the eventual conversion to a common and ubiquitous standard or commodity (often a cost of doing business or CODB).

All of this takes time.

By creating a free and open standard, you circumvent this process, which allows for the faster adoption of an idea as a common and ubiquitous standard - if it is useful enough. This is what I believe happened with the world wide web: the act of Tim Berners-Lee releasing HTTP as a free and open standard accelerated the adoption of a good idea. Had he not done so, and had a proprietary licensed HTTP been released instead, I doubt I would be blogging today.

Once something is common and ubiquitous we tend to worry less about it and focus our creative energies on the novel and new. It would be pretty difficult to start building a virtual web business today, if the first things you had to concern yourself with were chip manufacture, building an electricity generation system and network lines and network equipment connected to your end users.

Hence I'd argue that the process of moving from CA (competitive advantage) to CODB (cost of doing business) drives innovation. Furthermore, the real power of these "infrastructural" ideas can only be realised when they are adopted by society at large, e.g. national energy standards for power transmission, national transport standards for rail track widths and so forth.

The reason I can catch a train from London to Glasgow, is principally down to such agreed standards - the Gauge of Railways Act, 1846.

This is why in my view open source is such a powerful force. It circumvents the traditional process and has the potential to drive an idea to become common and ubiquitous, an adopted emergent standard. This drives further innovation to the next idea and so forth.

Of course, just open sourcing an idea is not enough. It needs to be a good enough idea that a community will want to adopt it, and be involved with it. The internet and the world wide web were not only examples of such good ideas but also created new means for such communities to communicate. This created new social networks, which are themselves another source of idea generation.

So three factors seem to have been at play - first an idea for a new way of communicating was rapidly spreading due to its free and open nature. Secondly the new form of communicating was allowing new social networks to form and hence new ideas to be created. Thirdly many of those new ideas were themselves "open sourced" and quickly spreading.

Now the concepts of free software (as in speech not beer) were established well before this. However the development of the world wide web helped to spread the paradigm of open source not only in software but into other areas - such as user content. New ideas such as agile development have also propagated by such means.

Of course it didn't stop there nor were these new formed social networks limited to technologists, and why should they be? The basic principle is that freely sharing your work and ideas with others helps generate more ideas - this is applicable to any field.

So how does a business compete in such a world where ideas are so freely shared? Well it's not ultimately through controlling the spread of an idea, but instead through providing a better service. The old processes where an idea was controlled by the limited few and drip fed into the world are starting to crumble. Whether we are talking technology, arts, science, business, politics or news - open and free content, new social networks and consumers as producers are breaking down these old boundaries.

In this world the old, more established, more defined processes are being replaced with a much more dynamic system. For example, new forms of commerce are emerging where end users are more actively involved in the production of goods rather than just being passive consumers.

This is why I say that conversation is becoming as, if not more, important than the product. You need to strike up that conversation with your customers before someone else does.

As a company you'll also need to find ways of releasing the ideas within your organisation as competitors become more agile in business. Open source and virtual niche business models are just around the corner whilst manufacturing is in-line to undergo a serious change as a result of such ideas and the fabbing revolution.

This "new world" is far more dynamic, no-one can plan it in detail, no-one knows where it is heading, you have to simply "try, measure and adapt".

This "new world" is being driven by the idea of open source, the impact this idea is having in many industries and the consequential increase in the rate of innovation.

It's more than just another licensing model and there is no turning back.

The horse, in my opinion, has well and truly bolted.

It's like faith but without hope.

A month ago, I posted a comment on why I believed Atheism was a faith.

I've had a crackingly good conversation with George Felis on this subject since then, and had initially moved my position to :-

Atheism (strong) is not a faith because Atheists believe in this viewpoint ONLY from the conclusion of rational (a priori) argument rather than say as a matter of hope or revelation and it is that matter of hope or revelation that distinguishes Atheist beliefs from a matter of faith.

However, upon reflection a refinement is needed, as Atheism may contain an element of revelation in the sense that the belief may be revealed through insight or a particular event. However, I can find no examples of where Atheism contains hope - an end game such as salvation. It is this absence of hope which distinguishes Atheism from other faiths.

Hence, I take the stance that :-

Atheism (strong) is not a faith because Atheists believe in this viewpoint ONLY from the conclusion of rational (a priori) argument rather than as a matter of hope and it is that matter of hope that distinguishes Atheist beliefs from a matter of faith.

My original view was wrong and hopefully this explains my title.

Sunday, April 29, 2007

Stop all that chattering! I'm talking ...

I've read some disturbing posts over the last few weeks regarding freedom of speech, the consumer as a producer and the web 2.0 phenomenon.

I thought I'd post something on these, and finally get round to replying to Jenny's questions.


As Yochai Benkler and Eric von Hippel have studied the open source movement and developed an understanding of the "Wealth of Networks" and "Democratizing Innovation", am I understanding correctly that the lessons from your open source experience for creating sustainable networked organizations include:

  • i. "you cannot efficiently plan out the process of development as it is more akin to research and therefore dynamic".
  • ii. "three axes of technology, people and requirements being relatively unknown"
  • iii. "try, measure and adapt"


My experience comes not from creating open source communities but from dealing with static or dynamic processes of building. I'm not sure how applicable these are to your work on networked organisations, but let me explain these processes and how the three points you mention relate to my experience.

ii. "three axes of technology, people and requirements being relatively unknown"

The process of building a software system can loosely be described as involving people, technology and a set of requirements.

If all three of these "axes" are well known or well defined, the process of building can be described as static. Whereas if these "axes" are not well known or defined, the process can be defined as dynamic.

Hence mass copying CDs is a fairly static process - you know exactly what the requirements are, CD writing technology is well known etc. Conversely, developing a new and novel web site can be described as a dynamic process - even the requirements are generally not well known.

iii. "try, measure and adapt"

Unfortunately in the software industry there seem to be two common recurring issues:

  • Firstly, despite much being CODB (cost of doing business), there seems to be an over-tendency to develop or customise. Fortunately, with the rise of SaaS (software as a service), utility computing markets etc., some of this tendency may diminish.
  • Secondly, where things were novel and new - and therefore technology, requirements and individual performance are relatively unknown - there has been an over-tendency to attempt to use static planning processes. Concepts such as "software factories" and the scientific approach to management (e.g. Taylorism) have been misapplied in this context.

About a decade ago, when developers started to use more dynamic planning methods to deal with dynamic problems, there was a significant improvement in productivity for the companies they worked for. Today, techniques like SCRUM & XP (also known as agile development) are becoming widely used because they are inherently dynamic and are designed to deal with new and novel software development, unlike static planning systems.

These new methods are based upon the concept of "try, measure, adapt". In the case of test driven development, you write a test, you write some code to try and pass this test, you measure whether this worked, you adapt if it didn't and on and on.
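The "try, measure, adapt" cycle of test-driven development can be sketched in a few lines of JavaScript (a made-up example - `slugify` is a hypothetical function invented for illustration, not anyone's real code):

```javascript
// A minimal sketch of "try, measure, adapt" in test-driven style.
// First, write a test that describes what we want (it fails at first):
function testSlugify() {
  return slugify("Hello World!") === "hello-world";
}

// Then write just enough code to try to pass it:
function slugify(title) {
  return title
    .toLowerCase()                // normalise case
    .replace(/[^a-z0-9]+/g, "-")  // collapse anything non-alphanumeric
    .replace(/^-|-$/g, "");       // trim stray separators
}

// Measure: run the test. If it fails, adapt the code and repeat.
console.log(testSlugify() ? "pass" : "fail - adapt and retry");
```

The point is the loop rather than the function: the test is the measurement, and each failure is a prompt to adapt, not a deviation from some grand plan.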

i. "you cannot efficiently plan out the process of development as it is more akin to research and therefore dynamic".

Now "try, measure and adapt" is a valid form of control, but notice there is no specific planning step. This doesn't mean you don't have a plan, it just tends to be fairly minimal.

I'd like to make a joke that "you wouldn't try to Gantt chart a cure for cancer"; unfortunately, in today's R&D environments in the UK, for some quixotic reason such static planning methods are being enforced. Lunacy ... no, just wasted energy and effort.

Novel software development is more like a game of football - you never play the same game twice. You have a common goal, an idea of how to attack the game, but fundamentally you try out something, see if it worked and adapt - during and between games.

Every football team dreams of playing a team whose players are following a rigid plan. Could you imagine Gantt charting a football game before the game? Could you imagine a team who followed such a plan? What happens if the ball isn't where you planned it to be?

This illustrates why static planning processes are good for static systems, whilst dynamic processes are good for dynamic systems.

Hence my points :-

  • i. "you cannot efficiently plan out the process of development as it is more akin to research and therefore dynamic".
  • ii. "three axes of technology, people and requirements being relatively unknown"
  • iii. "try, measure and adapt"

Now let me try and link these ideas to the concepts of agile enterprises.

First, onto Enterprise 2.0 technology and specifically wikis. At Fotango, we adopted a wiki some four years ago as the static process around our intranet (this person writes this bit, this person approves etc.) had created an information resource which was useless. So we decided to try something new. We put up a blank wiki and before long everyone was contributing something.

Unfortunately, so much information was put onto the wiki that it became overloaded with "noise". So we needed to adapt and try something else - "gardening". By "gardening" I mean a regular pruning of information within the wiki.

In some organisations "gardening" may emerge naturally; in ours it didn't. This is a critical point: you shouldn't plan out in detail the adoption of Enterprise 2.0 technologies within an organisation because you don't know what behaviours will emerge. Instead you need to "try, measure and adapt".

Note, I say you shouldn't plan out. This doesn't mean you can't. I could always plan out exactly how a football game is going to go - who is going to be where and at what moment in time etc. I'm just not going to get the best result if I do - especially if the other team don't follow my plan.

So on to my title ... stop all that chattering! I'm talking ....

Whenever I've been involved in introducing more dynamic processes, I've generally come up against a fairly resistant and incumbent "old guard" who like the "old way".

So we come on to the latest spats about amateur online journalism. The "old guard" of the news world has been very comfortable with the well defined macro level processes of them collecting information, editing and disseminating it. Sometimes, they have been caught out spinning or doctoring information - reinforcing the old adage of "don't always believe what you read in the papers".

The "old guard" also selected who had a voice, it decided upon the criteria of expertise, it chose.

Unfortunately for them the ball has moved, and now we are in a world where anyone can collect, edit and disseminate information.

This "new world" does create a lot more noise. It also provides powerful new mechanisms for anyone to be heard. Much of what is behind the attack on "net neutrality" in the US, the involvement of more traditional news organisations in the internet space, and the recent spate of articles about the need for curators or editors for the internet appears to be about re-establishing that neat, old view of the world.

What is needed however, are new forms of control that are more adaptable to the reality we find ourselves in. I do want to know what is happening in the world. I do want to trust the source and sometimes I do want to have my say.

However, in this "new world", I get to decide whom I listen to. The only issue is who do I choose?

Unfortunately, whilst the mechanisms of dissemination exist, the mechanisms of choice or trust don't. What is needed are reputation-based social networks: a method for searching for information from people whom I, my friends, or the general public trust.

It may emerge that we choose to trust the same "old guard" as before. If not, they'll just have to adapt and try something new. It may emerge that a "new guard" is created through the Stentorocracy, as I called it.

You cannot understand everything on the Internet; you cannot make perfect sense of all the noise. In much the same way, in economics you cannot make perfect decisions, be that "rational man" or reach "pareto-optimality". There is always too much information.

Something needs to separate the noise from that which is useful. Hence a new system, the reputation-based social network, is needed to separate the wheat from the chaff.
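Purely as a hypothetical sketch (the names, trust values and threshold below are all invented for illustration), a reputation-based filter could be as simple as weighting each piece of information by the trust placed in whoever endorses it:

```javascript
// A hypothetical sketch of reputation-weighted filtering: score each item
// by the trust I place in the people who vouch for it, then keep only the
// items whose combined trust clears a threshold (the "wheat").
const trust = { alice: 0.9, bob: 0.6, spammer: 0.05 }; // my trust in each source

function score(item) {
  // Sum the trust of everyone endorsing the item; unknown sources count for little.
  return item.endorsedBy.reduce((sum, who) => sum + (trust[who] ?? 0.01), 0);
}

function separateWheatFromChaff(items, threshold = 0.5) {
  return items
    .filter((item) => score(item) >= threshold)
    .sort((a, b) => score(b) - score(a)); // most trusted first
}

const items = [
  { title: "Useful article", endorsedBy: ["alice", "bob"] },
  { title: "Noise", endorsedBy: ["spammer"] },
];
console.log(separateWheatFromChaff(items).map((i) => i.title));
```

The interesting design question is the threshold - set it too high and you only ever hear the "old guard", too low and the noise floods back in. That choice itself would have to be found by "try, measure and adapt".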

Of course, there will always be winners and losers. There will be the included and the excluded. Maybe we'll all end up listening to some random 15 yr old blogger because he made some good points.

Maybe not.

However, I don't think an approach of "Stop all that chattering! I'm talking ..." is going to get the "old guard" very far. Especially if they use the same techniques, such as blogs, which they complain about. Hence I'm far from convinced by Andrew Keen.

Still he has a voice, he has a right to be heard. But then so does that 15 yr old blogger.

You see, in my simple world it's the idea that's important, not the source.

Saturday, April 28, 2007

You suck ... thank you.

Neil McGovern posted some very valid comments about Zimki. Now, we're a small company and we know about the shortfalls in documentation with Zimki.

We're trying to improve it, all the time.

I hope that when we open source Zimki, we might be able to gain the support of others in the wider community in creating a utility computing market based upon an outstanding product. We will of course push it as far as we can.

So Neil's comments are fair, but they are also very much appreciated. Why? Because Neil has taken the time to tell us what he thinks sucks with Zimki. This is positive and helps us improve things.

So thank you and we'll get it fixed as fast as we can.

Friday, April 27, 2007

Web 2.0 catchup

The Web 2.0 Expo was fantastic - we had an excellent outing with Zimki, as per Koby's post on our blog. Overall, great fun.

I gave two talks - one for Ignite, one at the Web 2.0 Expo itself - both seemed to go ok. I also met a large number of interesting people and listened to some very interesting ideas, you just can't ask for more than that.

We didn't get much take up on the carbon offset idea. We'll try that again at OSCON as I'm keen to encourage people to think about such things.

I saw many exciting ideas, talks, companies and products but I'll mention one in particular - BungeeLabs.

We launched Zimki, well its predecessor called LibApi, back in early 2006 with the service going public in Mar 2006. The idea was based around :-

  • an online development environment which took care of all the "Yak shaving" which normally occurs with software development
  • a "pay as you consume" model for charging for the use of our computing cloud
  • the creation of a competitive utility computing market through the open sourcing of Zimki
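
A "pay as you consume" charge is, at heart, just metered usage multiplied by unit rates. As an illustrative sketch only - the resource names and rates below are invented for the example, not Zimki's actual tariff:

```javascript
// Illustrative utility-computing bill: charge only for what was consumed.
// The unit rates are invented for this example, not a real price list.
const rates = {
  cpuSeconds: 0.0001,   // per CPU-second consumed
  storageMbDays: 0.002, // per MB-day stored
  bandwidthMb: 0.001,   // per MB transferred
};

function monthlyCharge(usage) {
  // Sum rate * quantity across every metered resource.
  return Object.entries(usage).reduce(
    (total, [resource, quantity]) => total + (rates[resource] ?? 0) * quantity,
    0
  );
}

const usage = { cpuSeconds: 120000, storageMbDays: 3000, bandwidthMb: 5000 };
console.log(monthlyCharge(usage).toFixed(2)); // 12 + 6 + 5 units
```

The contrast with up-front licensing is the point: an idle application costs next to nothing, and the bill scales with actual consumption rather than anticipated capacity.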

Later that year Amazon's EC2 launched, and we knew we had competition - so it's good to see another company enter the same market space. Why good? Well, it validates the market, it creates competition, and I have to agree with Shai Agassi's sentiment that utility computing clouds are the most important development in the software industry of the last ten years.

Agassi, however, refers to Amazon's EC2 directly, whereas my view is that the really important step is the establishment of a competitive utility computing market - not just one provider - hence the reason for open sourcing Zimki to try and kick-start this process (see my earlier posts on open sourcing Zimki, large scale disruption, utility based grids etc.).

So how about the new kids on the block with their next-generation on-demand environment? What do they provide? A web-based IDE, on-demand scalable deployment, highly instrumented infrastructure and utility computations that combine computing, storage and network interaction - hey, it sounds like another Zimki-like concept.

Unfortunately they are not going to be in beta until May, apparently - so it's not out yet, but it is direct competition at the same level of the stack, and this is good news.

The IDE they demonstrated looks good - I didn't get a chance to play with it myself - and from Alex Barnett's blog they've got the concepts right: utility computing and removing "yak shaving" (note: must buy shares in the little company that makes our Yaks for us). Of course, I'd have to agree, since we've been doing and talking about this for over a year. I wouldn't exactly call it revolutionary though - we didn't even think it was revolutionary back when we started, and that was some time ago.

However, there were a couple of disappointments for me. First, they seem to have created a new language - BungeeLogic. I wish they hadn't; there are enough languages out there already, which is why we chose JavaScript.

Secondly, from what I was told by them, they are only open sourcing the connection component, not the entire environment and engine. This is a shame, as the real value is in having many providers in the same space, with developers freely able to move between providers.

Still, it's exciting.

Wednesday, April 18, 2007

3, 2, 1 ... Ignite ...

As soon as I arrived at the hotel in San Francisco, I was out of the door heading to a Werewolf game. It was a great evening organised by Artur Bergman.

I accused Brady Forrest of being the Werewolf! He said I was only accusing him because he'd turned down my talk for Ignite - I laughed and still said he was the Werewolf (I can't remember if I was right or wrong in the end?).

The following day, about four hours before Ignite, Brady and Jessie sent me an email and asked me to do a talk. I didn't have time to write a new talk to fit in with the standard format, which is 20 slides in 5 minutes, automatically switching. So we agreed I could control the pace of the talk if I did 70 slides in 5 minutes!

So I was on for Ignite. 70 slides in 5 minutes, jet lagged, hungry and four hours to go! Arghhhh. I was still writing it whilst the other speakers were presenting.

It was a mix of my talks from FOWA and E-Tech.

Well, I did it ... ducks, kittens with guns, commoditisation, utility computing, yak shaving, fabbing etc. ... and it wasn't too bad. At least I didn't go down in flames - which could have happened oh so easily.

Thanks to Brady and Jessie for letting me do this, crazy, but fun...

Thursday, April 12, 2007

My voice, your voice ... our voice.

The market is changing, and conversation is becoming as, if not more, important than the product. It's all about our experience of or relationship with something, and that isn't just the thing itself.

This is one of the reasons why I believe that the adoption of Enterprise 2.0 like technologies is inevitable. As a company you need to become the canonical source of information about your product, warts and all, otherwise you run the risk of someone else becoming that voice.

Then again, why not allow someone else to become that voice? If you can build a community around your product, why not let that community set your direction?

With Zimki we wanted to build a forum, but we also have to get ready for the open sourcing of Zimki; there is also the documentation to be written, this feature and that feature, and so on. Oh yes - we also need to provide more tools, improve the IDE etc.

We're only a small company, the team are doing an amazing job but we have a huge number of projects to manage and many things we need to improve.

So, I am very grateful to Joel for creating the unofficial Zimki forum.

Thank you Joel.

Hopefully when we get to OSCON and the open sourcing of Zimki, we will find others willing to help create this idea of an open utility computing environment with much less Yak-shaving and moveable applications.

So it is truly encouraging that people are already starting to help.

Sunday, April 01, 2007

E-Tech was fab.

As usual E-Tech was an outstanding event - there were some excellent speakers, an interesting crowd and good conversations all round.

I'd like to have seen more, but there is never enough time. The talks I really enjoyed were :-

  • Body Hacking, Quinn Norton. This is a really interesting subject matter, enhancing the human body and the potential social impacts of this. Wonderful.
  • The Making of Virtual Earth, John Curlander. Captivating stuff on how to go about creating entire 3D images of the environment from a mass of 2D pictures. This is definitely one to watch.
  • From Pixels to Plastic, Matt Webb. Well it's Matt and I couldn't miss one of his talks.
  • JavaScript: It's Happening All Over Again! James Duncan. I work with James and so I know the talk, but I do enjoy his presentations and yes JavaScript is very cool.
  • Spintronics, Kevin Roche. Harnessing quantum spin in modern electronics and methods of creating streams of electrons with the same spin. I was stunned by this.
  • Haml: A Semantic Rebellion in Template Land, Hampton Catlin. It's always dangerous to use live demos but building your presentation in your own tool - crazy. It worked, it was great fun and Hampton presents with flair. Shame the room was pitch black.

The list could and should go on, but all I'll say is that if you've not been to E-Tech I would strongly recommend it for next year.

I also gave a talk on commoditisation (with ducks, Zimki and fabbing as usual).

Unfortunately there was a mini gale at the time, so the doors were blasted in on several occasions and also the room was located outside the main area, down a small track, through the woods and past the signs which said "beware of the dragons".

I'm very grateful and extremely surprised that people made the trek. Unfortunately we didn't get many people to sign up for our carbon offset site in Zimki. We'll have to try again later in the year.

Now this is fairly old, but I'd never seen it until Craig Dwyer pointed it out to me. If you like Lego, you have to watch this.