Friday, August 09, 2013

Why Google Glass will change the world.

I've only played with Google Glass a few times (not owning a set), and it takes only a few seconds of use to realise that there exist multiple killer applications which will change the way we interact with the world.

For me, Google Glass is more of a 1.0 and a predecessor to more powerful systems where your entire field of vision becomes interactive through printed and transparent electronics over the lens. In that latter case a whole host of other potential applications becomes immediately clear, such as the annotation of objects and people of interest in your field of view. However, this is all rather obvious and inevitable stuff, but even in its current form there is a long list of killer apps, i.e. multiple "where to attack" points around which a company might build a business. Of these, one of my favourite examples is the second opinion model, which actually takes advantage of the current display characteristics of Google Glass.

Before explaining how it works, I thought I'd explain why it's a killer application. I'm a great fan of mapping competitive environments through user needs and using this to predict market changes and opportunities (i.e. where to attack) by finding better ways of meeting those needs. The ideal scenario for me is to find a universal but poorly met need, and there is one abundant example today which Google Glass solves.

To explain, I want you to think back to the last time you were going to buy a car, rent a home or fix something, or in fact any time when a second opinion would have been useful. You probably phoned someone, had to describe the situation, and they may have given you some advice, but the process would most likely have been tiresome. Trying to explain a particular car over the phone and get their advice on what's a good price, what you should look for etc. is never easy. Try asking someone whether a particular item you've seen at an auction is a fake.

It's much easier if you can show them and they can talk you through it. I've tried this in the past using Skype on the phone, but whilst that's better, it's still less than ideal. The core component of the killer app in Google Glass is the hangout. When I create a hangout I can see and hear the person on it whilst they can see what I'm looking at. Now obviously, if we had full field-of-view interaction then they could point to areas of my vision that I should take an interest in, but even in its current form the floating window of a person who can see what I'm looking at is highly effective and bizarrely reassuring. They can guide and direct me.

Now, ideally, whatever situation I'm in, I'd like an expert at hand. Burst pipe? Instant plumber available who can see what I can see and give me advice. Travelling on a plane where the pilot and co-pilot are taken out by a mysterious ailment (I've obviously watched too many disaster movies)? Instant pilot available. Is this car a good buy? Is this antique a fake? Funny-looking boil on my leg: should I go to the hospital?

There's an enormous list of situations where second opinions are good to have, especially from someone who knows what they're talking about. And that's the killer app: a connection to a personal assistant with a long list of available experts on a wide range of topics, who can create a one-to-one hangout for me with someone who is knowledgeable about what I'm looking at.

This one thing alone will change the way we interact with the world and stop me buying lemon cars, buying fake Picassos and pressing the wrong button on an airplane. As for the boil, I suppose I'll call NHS Direct, but I bet it would be easier if I could just show them.

--- 9th Sept 2013

I was asked for other examples of "where to attack" with Glass. To be honest, the list is huge; there's lots of potential, and the hangout is just one. A few of the more obvious examples include:-

Interpretation of audio / visual events. Hear a bird singing, a song, a foreign language spoken or the roar of a motor car, or see some impressive building or some other event, then click to interpret and identify it.

Augmentation / Annotation of objects in the field of view. See something you like, then click to buy it, along with variations of this form including in-field translation of text, i.e. "What do these hieroglyphs mean?" will be a thing of the past. If my partner sees a present which she thinks our son might like (e.g. a new toy) then I'd expect her not only to be able to send me a photo but to leave a virtual note on it. So, when I go into some other toyshop, my glasses will identify the toy and I can add my own views on its suitability etc.

Streaming interpretation of audio / visual events. Having a conversation on some subject? Don't worry, Glass will be constantly streaming relevant information on the discussion to your field of view, i.e. "Who was in the Rolling Stones?" will be a thing of the past; the answer will be available immediately. Think MindMeld combined with Google Glass. Watching football will never be the same again.

Remote viewing and control. Worried you've left the house without turning the cooker off or setting the fish tank to "automated feed" mode? Quickly transport your vision back home and reset or change what you need.

Augmentation / annotation of location. Need information on where you are (history, culture, practices), need a taxi (or a self-driving car, i.e. the future "utility" taxi) to come to your location, or simply want to leave a virtual message in this space for others (think a virtual "I was here" or "This building is unsafe, do not enter" note)? Glass will have a solution for that.

Basically there's a mass of new activities related to augmentation, annotation, interpretation, remote viewing and remote control based upon audio, visual and location information. No-one should be in any doubt that Google Glass will change the world.

And whilst the above is huge, it is but peanuts compared to what is coming with the rise of intelligent software agents. The combination of these agents with Glass will create true marvels of incredible usefulness.

Few will care that privacy will further evaporate. In ten years' time, as I wander through a local craft store and my Glasses identify something they calculate my mother would like for her birthday (based upon a discussion with her Glasses), privacy won't be top of my thoughts.

What I'll be thinking about is the suggestion my Glasses make: if I take a twenty-minute stroll (the weather is good, I need the exercise) up to another shop (directions provided), I'll find a better version that an acquaintance has seen (my Glasses asked their Glasses), and I could also stop and have a chat with that acquaintance at the coffee shop next door. As my Glasses will point out, they're working for company XYZ and producing some product relevant to my research.

Today's browser-based / smartphone world will seem like a bad memory in a decade. Like flared trousers or mullets.