Don’t Panic

Global warming? I think some people are getting hot-headed and should cool it. The latest example of climate confusion was a polar bear that floated from Greenland to Iceland on some melting sea ice. Apparently, this was our fault because of our careless use of non-renewable energy. Thank goodness the Monaco luxury yacht show is a carbon-neutral event. However, the Arctic ice sheet is actually larger than it was in 2007, global temperatures started to fall last year, and it’s global cooling that is now starting to look like a threat.

So is the world’s thermometer going up or down? As far as I’m concerned very few people know what’s going on and absolutely nobody knows exactly what’s going to happen next. What I am fairly certain about is that climate change is the latest in a long list of issues that we are using as a focus for our everyday anxieties. Remember the threat of a global flu pandemic that we were worrying about a few years ago? Or how about Deep Vein Thrombosis on long-haul flights? The list goes on to include rogue asteroids, Y2K, acid rain and digital privacy.

But here’s the thing. Many of these issues will prove to be real but most won’t be nearly as much of a problem as we imagine. In short, we’ll muddle through as we always do. There are problems that need to be confronted but, compared to what previous generations endured, our current concerns are a walk in the park. On almost any measure that matters, the human race is better off than it was 25, 50 or 75 years ago. Life expectancy has increased, infant mortality has decreased, literacy levels are up and chronic hunger has fallen significantly worldwide.

But what about oil I hear you shout. Well we’ve been here before. In the late 1700s Britain suffered from ‘peak wood’. Land was being deforested to make way for agriculture and a rapidly growing population was putting pressure on the fuel supply. I imagine the doom merchants of the day thought we were all screwed. It was a wood-based economy after all. But the rising price of wood led to the emergence of new fuels and the doom merchants were soon replaced with coal merchants. It’s also worth remembering that in the first half of the 19th Century, people used sperm whale oil for lighting and in the 1820s it cost $200 a barrel in today’s money. Sound familiar?

So, to sum up, a little bit of perspective please. Our achievements over the last hundred years have been considerable and we should stop focussing on worst-case scenarios and celebrate the odd successes once in a while.

Battery Farmed Children

According to a survey, 25% of them have difficulty walking and many of them are becoming disabled by rapid weight gain and a lack of proper sleep. The story in question is about battery-farmed chickens but it could equally be about our children.

In the UK, 25% of children between the ages of eight and ten years old have never played outside unsupervised. Meanwhile, Australia is in the middle of an allergy epidemic. According to the government, 40% of Australian children suffer from an allergy of one kind or another. Holy guacamole.

One reason for this is probably because our houses have become too clean and our kids are not exposed to enough dirt. Filth yes, there’s plenty of that on the various screens we allow them to sit in front of, but kids (and chickens) need to scratch around outside. But I don’t think blaming technology is fair. The real culprit here is parental paranoia. We have become afraid of life itself. For example, back in 2003 there were fewer than 200 non-food anti-bacterial products launched onto supermarket shelves worldwide. By 2006, this had jumped to 1,610.

And it’s not just microbes we’re trying to ban. Many schools now have a strict policy relating to food allergies. Bags are searched every morning to identify illegal foodstuffs, which can include yoghurt, homemade cakes and, of course, anything that has ever come into contact with — or might have once said hello to — a nut. Nuts? I’d say so. But we are putting fear in front of fact.

The food allergy epidemic is largely a myth. According to the US Food Allergy & Anaphylaxis Network (FAAN), around 150-200 people die each year in America due to allergic reactions to food. But according to the US Centers for Disease Control the actual figure is closer to 10. That’s a big difference. And let’s put this into perspective. Around 40,000 people are killed every year on American roads, including more than 2,500 kids. I suspect the ratios would be similar in Australia.

Don’t get me wrong here. Food allergies are real and can kill. Only last year a fourteen-year-old Melbourne boy died whilst on camp due to a food allergy. However, exaggerating the risk could be doing more harm than good because it feeds a culture of fear where children are overwhelmed by anxiety. Moreover, nothing can ever be 100% safe and squeezing the risk out of one area only displaces it somewhere else. Risk will not disappear simply because you regulate it.

I think we are creating a false sense of security and learned helplessness in other areas too. In some schools, running in the playground has been banned because children might bump into each other or fall over. Some schools have even gone so far as to introduce soft, impact-absorbing surfaces to replace old-fashioned dirt or tarmac. Perhaps this is working — when was the last time you saw a kid with grazed knees or a broken arm? These kids exist but they are an endangered species. This is a shame because these accidents actually have a benefit. They teach kids to push the boundaries but to be careful. They also teach resilience. Moreover, according to some experts, these surfaces may actually cause more serious accidents because children believe that they are safe.

Our protectionist and interventionist impulse may be harming us in other areas too. For instance, schools are now asking parent helpers to supply personal information that will be used to conduct criminal background checks. Good idea? Possibly, but the implication is that all adults are guilty until proven innocent. The plan could also backfire in a number of ways.

First, the checks could result in fewer parent helpers. Fancy coaching football at the weekend? Well how would you feel about it if it meant ongoing criminal checks? The argument in favour of checks is that if you are innocent you have nothing to worry about. But what worries me is that once we start to view all adults as potential sex offenders there will be subtle changes to how everything from policing to law making operates.

Second, spontaneous acts of random kindness could disappear under a mountain of bureaucratic red tape. Fancy baking a cake for the school raffle? You can’t. The cake may have come into contact with nuts and we can’t tell whether you’re a nutcase until you fill out a form.

To be civil means to be polite or courteous and civilisation is built upon the idea of mutual trust. Most people are trustworthy and most things are not dangerous, but if we teach our children that they are not we are laying the foundations for a society where fear becomes an epidemic.

Another Case of Bad Language

Here we go again. A report by something called the National Organisation of University Art Schools says that schools should be teaching ‘visuacy’. The National Association for the Visual Arts (NAVA) is similarly focussing on the “outcome” of visuacy being a stand-alone subject for years K-10 and the National Review of Visual Education says visuacy should be given the same prominence as literacy and numeracy.

So what is this strange new skill that will be so fundamental to students in the 21st Century? Judging by the fact that the report cites the example of deconstructing an advertisement for Elle Macpherson’s knickers to establish “conditions of value and meaning” alongside an examination of Picasso’s Guernica, visuacy appears to mean visual literacy plus post-modernism minus a sense of humour.

Am I taking the #*&!? Absolutely. It would be silly not to. Doubtless members of the Visual Education Roundtable (“a coalition of key stakeholders to be an advisory body to CMC and MCEETYA”) will paint me as a pedantic philistine but I can live with that. Newspeak like this is a mutant life form from outer space and needs to be killed off before it infects the whole planet.

To put the record straight I’m all in favour of visual literacy. So is my mum, who used to be an art teacher. Our brave new world is saturated with images and it’s going to get much worse in the future. Everything from walls and tabletops to cereal packets and clothing will soon have the potential to become screens displaying the almost infinite amount of information and entertainment created by you, me and everyone else.

Thus we will be drowning in digital dross and there will be a real need to filter this material, either by visualising information or by understanding the difference between semi-stylish eye candy and items of real substance.

But according to post-modernist academics with a love of jerry-built jargon all of this imagery is of equal value. A video by Kylie is as meaningful as a painting by Van Gogh. We should be so lucky. My point here is not a discussion about postmodernism. What’s getting my goat is simply the use of bad language, especially in schools. Yes we live in a visually cluttered culture but that doesn’t mean that words don’t matter.

Always Turned On

Things are getting really weird. I have a female friend that goes to bed with an electronic device every night. Her husband is getting fed up and claims it’s ruining their sex life. Her response is that she’s in meetings all day and needs to take a laptop to bed to catch up with her emails. This is a bit extreme but I know of lots of other people that hardly ever switch off.

You can see this first hand when passengers switch their mobile phones back on the minute their plane lands. Quite what is so important that it can’t wait ten minutes until they are inside the terminal building, I have no idea. Perhaps it’s yet another example of how people feel insecure if they are not always available or constantly connected.

Something is going on here. Soon after the millennium (probably the Tuesday around 7.08am) we collectively decided to redefine the concept of freedom to include notions of speed and wireless connectivity. It’s certainly a seductive idea.

We are now free to work anywhere we want. We can do it on aeroplanes at 39,000 feet, in the back of the car, in bed or on the kitchen table. Should children be exposed to their parents doing this? Most people will say, “what’s the harm?” but I am worried about the signals these devices are sending out. Surely what people are saying is: “I am more interested in being alone with my digital network than in being with you”?

The desire to be connected isn’t limited to work either. Twitter, as some people will know, is a micro-blogging service that allows people to answer the question “What are you doing?” by sending regular up-to-the-minute newsflashes of their daily existence to chosen friends. Messages are limited to 140 characters and, judging by most of the messages I’ve seen, most users are limited to an IQ of 110. (Actually that’s not true. Twitter seems to be the domain of uber-geeks with very high IQs but I couldn’t resist saying that).

It’s a sort of stream of consciousness thing that results in babblings about being “thirsty” or “going to lunch”. In theory Twitter is a fun way to keep in touch but I am starting to wonder whether it’s possible to be too in touch.

For example, I have a friend that’s a ‘Twit’. If I wanted to I could sign up and find out that he was “eating vegemite toast” at 7.08pm or that he was “in bed now” at 12.04am. But I don’t need to know this.

Back in the old days people kept certain types of information to themselves. I think it was called either privacy or security. Nowadays such digital exhibitionism is practically compulsory. Even the next President of the United States is on Twitter (“Just saw Hillary, she has a gun”?).

Are We All Going Google Eyed?

Here’s a fun game. Visit an ATM and play “Guess your PIN.” If you win you get some money. If you don’t you lose your card, usually after the third attempt. Other versions of the game are available for credit cards and removable car radios.

In a survey, 63% of Australians said that they had difficulty remembering things like PINs. Personally, I can’t even remember my own home phone number these days. I’m also starting to struggle with passwords to social networks that I’ve joined during moments of midnight madness. I even ‘lost’ my new bicycle for a few days last month because I couldn’t remember the padlock code and couldn’t remember where I’d written it down either. Still, it was better than last year. I padlocked my bike somewhere but, to this day, I can’t remember where.

This might be my age but I doubt it. My lack of memory is caused by too much data. Digitalisation has made it too easy to create and distribute information, with the result that I’m drowning in a sea of endless trivia.

But there’s a much bigger problem on the horizon. The Internet might be making us stupid. For example, have you noticed how attention spans are shortening? Perhaps you are scanning newspaper articles because they appear too long? Or perhaps you’ve read bits of books twice because you weren’t properly concentrating.

The problem is that infinite choice is fragmenting our attention. Digitalisation is also fuelling an obsession with speed, with the result that information and entertainment are now only acceptable if they are delivered in snack-sized formats. Similarly, speed of information retrieval is becoming more important in some instances than accuracy.

Google is also making us somewhat stupid because when we need information the first thing we do is what everyone else does: we go to Google and look at the first few pages of results. This is a problem because if information isn’t ranked on the first couple of pages it might as well not exist. It’s also a problem because everyone is looking for things in the same place and everyone is using the same sources to create ‘new’ information. Hence, we are becoming increasingly self-referential and our knowledge is narrowing. Don’t believe me? Well, the first thing I did when I started writing this was to Google ‘Google Goggles’. Guess what I found? Something I’d already written but forgotten about. Brilliant.

Human Contact a Cyber Failing

Something I wrote for the Herald Sun Newspaper…

In America, you can attend a Cuddle Party. This is where a group of strangers pay to hug each other. According to cuddleparty.com, it’s “a place for people to rediscover non-sexual touch and affection”. Only in America, right?

Wrong. Cuddle Parties are taking off in Australia, too.

These parties are clearly a fad, and I’m sure they attract more than their fair share of Harry Potter-reading, pyjama-wearing weirdos, but they’re also perhaps an early indication of the fact that an increasing number of people, many of whom now live alone, crave the sensation of being physically held and touched.

Like many fringe ideas, it may also represent an unmet need – in this case a remedy for sadness or loneliness.

These Cuddle Parties represent a safe form of intimacy-on-demand that appeals to singles and married couples alike, many of whom are either too busy or tired to become involved with any other form of shared physical activity.

On one level this craving for instant intimacy is ironic because in many other areas we are being told to accept as “normal” behaviour that is exactly the opposite. Physically touching a colleague at work (and I’m talking here about affectionate hugs) is now strictly verboten.

Research by Manchester Metropolitan University also says there is a growing anxiety in childcare circles about touching children. Recent panics include a male teacher who instructed a small child to apply a plaster himself because the teacher was too afraid to touch the child.

This is clearly insane but the madness isn’t restricted to loony politicians and paediatricians. Organisations are also trying to convince us that reducing human contact is a good thing because it saves us time and money.

It’s the new economy dude. Society 2.0. They are liars. Reducing human contact saves them time and money.

It’s the same with social networking sites. Some people claim that we have never been better connected. Apparently, people on Facebook have an average of 150 friends. But these are digital acquaintances.

These are superficial friends and we are confusing familiarity with intimacy. Hence our growing need for physical contact. Even the environment has been roped in to help sell us the lie that less human contact is healthy.

There are organisations out there right now encouraging us to hold meetings in cyberspace because this reduces carbon emissions. Really?

I thought that computers were powered by electricity from burning oil and coal, so where’s the logic in that? But even if these e-vangelicals are right they are still wrong. For instance, sociologists at the University of Arizona and Duke University in North Carolina have found that Americans have fewer real friends than they used to.

In 1985, the average American had three people to talk to about their problems. Now the figure is just two. Why? Longer working hours are one reason but the real culprit is technology. Use of the internet and mobile phones has reduced face-to-face contact.

People need to intimately connect with other people.

If they don’t, there is a danger that they will spend too much time inhabiting virtual worlds like Second Life. This is not good for them and is not good for the planet either. People need people because happiness comes from intimate interactions with friends and family.

Moreover, as everyone instinctively knows, new ideas are born serendipitously in places like stairwells and over lunch, not at overly orchestrated brainstorms or government summits where most of the solutions are so small that they could be mistaken for homeopathic remedies.

We need to establish an intimate relationship with the thought that a life lived remotely, or at a physical distance from others, is ultimately unbearable. Time, in other words, for a physical revolution.

How to unearth growth by digging in the dirt

My columns for Fast Company are being migrated to the new Fast Company.com website. In the meantime I’m posting as many of the originals as I can find here. Be aware that some of these go back to 2004 and many ideas have moved on to say the least. ‘Columns’ also includes various other magazine and newspaper writings.

—-

Everything you need to know about innovation is growing (and dying) in a garden near you. So forget balanced scorecards, six sigma and SWOT analysis and read this instead.

There is an aspect of business which, as far as I know, has never been written about. Business is like gardening. That’s right; growing a business is like growing a tree. I know this sounds flaky, and I’ve probably lost many of you at this point, but for those of you that remain, consider this: most metaphors about business are about sport or war. This is useful, but the fatal flaw in these analogies is that both have an end point in the immediate future. Moreover, the objective of both is to defeat a clearly defined enemy. Aims and outcomes are always fairly clear.

But business isn’t like that and neither is gardening.

Gardening has no end. There is no finish line. It is about a journey not a specific destination. Moreover, whilst business and gardening certainly have enemies, focussing on them too much can divert your attention away from the real game. A good example of this is the historical war between Coca-Cola and Pepsi, which, in my view, has all too often shifted attention away from the customer.

The feeling in most organizations like these is that business is a mechanical process. In this context the analogies of war and sport are very apt. It’s all about pre-planned strategies, resources and control within a fairly fixed environment or known set of rules.

But real life doesn’t work quite like that does it? We cannot control everything and it is egotistical to think that we can. So perhaps better metaphors are rooted in plants not machinery, especially as we move away from fixed pyramidal structures to informal (and often temporary) organizational networks.

If you start to think of business ideas as plants your mindset shifts. In this metaphor you plant a business idea in a patch of soil, which is set within an overall grand scheme or design, water it and watch it grow.

But, as any gardener knows, half your plants won’t grow. There is an early American saying about gardening that you can apply to business: “One for the blackbird, one for the crow, one for the cutworm, and one to grow”. Business, like gardening, is about flexibility and persistence in the face of changing external circumstances.

However, even tenacity doesn’t always work. Sometimes plants don’t grow because they have been put in the wrong place or because pests have destroyed them. Either way you have to nurse them back to health or yank them out and start all over again.

Planting things in the right place is vital. This is something that McKinsey might agree with. According to McKinsey: “In sectors such as banking, telecommunications and technology, almost two-thirds of the organic growth of listed Western companies can be attributed to being in the right markets and geographies”. In other words, a good business idea in the wrong place can struggle whereas an average business idea in a perfect spot is likely to do well.

Then there’s the opposite problem. Sometimes things grow so fast that they overshadow what’s next to them and they have to be moved if both plants are to flourish. Perhaps the parallel here is with ‘skunkworks’, where teams are moved away from the shadow of the parent company.

For example, the telecommunications firm Vodafone was created by accident as a tiny division of Racal Electronics. Someone, somewhere, was given the green light to plant something and see whether it would grow. It did and Vodafone is now a GB £80 billion colossus that dwarfs its former parent, although I wonder whether this rapid growth would have been achieved if it had been left in the shadow of Racal.

Of course, sometimes things grow so well that, over many years, the soil becomes exhausted and the only solution is to start again. This is not a bad thing. It is just part of a natural cycle. Fields must be allowed to lie fallow every so often if they are to regain their natural health and vitality. This applies to organizations but it also applies to people. Sometimes there is a tendency to think that you’re useless when in fact all that is wrong is that you are working in the wrong place or you are exhausted. So if you’re sick and tired of being sick and tired take some time off and take a rest.

Does any of this apply to innovation?

Yes and no. Business is like gardening but ultimately the metaphor falls down because innovations are like weeds. They grow where they’re not supposed to and cannot be cultivated like orchids in a greenhouse. You cannot sow weeds in any meaningful sense, you can only provide the conditions necessary for them to grow, which in many instances means leaving them well alone. Weeds thrive on neglect.

Therefore, if you want innovation in your business, all you can really do is recognise what a weed looks like and allow certain of them to carry on growing even when they are in the ‘wrong’ place in your garden.

Is M&A the new R&D?

A while ago I wrote a piece for Fast Company called An Evolutionary Approach to Innovation. The central idea was that Darwinism teaches us quite a bit about innovation. In particular, random mutations and adaptations caused by a particular local context or by rapidly changing conditions can spread to become the norm through a process of natural selection. Innovations are generally mutations created when one or more old ideas are cross-fertilised by another.

The same is true with trends. New trends emerge when someone starts to think or behave differently – or starts to create or customize something because existing offers do not fit with their needs or circumstances. If conditions are right a trend will become widely accepted, eventually moving from the fringe to the mass-market and from early adopters and trendsetters to laggards. Trends that occur at an intersection of other trends may also turn into mega-trends, which are the key disrupters and drivers of innovation and change across all industries.

Creative leaps also tend to emerge when someone with a differing perspective tries something new – either through bravery or sheer naivety. If that person is young or comes from another place (i.e. a different discipline or perhaps a different country) things sometimes start to happen. Put two or more differing people together and the sparks can really fly.

But why is this so? In my experience it’s because older people have usually invested too much in the current system and therefore have too much to lose if a new idea displaces an older one. Equally, people that don’t move around or come from the same department or discipline sometimes fail to see what is hidden under their own noses, whereas people from ‘somewhere else’ often see it.

For these reasons game changing ideas and radical innovations tend to come, not from well-funded industry incumbents (i.e. large organizations), but from lone inventors or a couple of individuals in a cramped garage. In other words, too much experience, too much familiarity or too much money can kill innovation faster than phrases like “I like it but” and “We tried that once”.

Perhaps this explains why, for instance, 25% of Silicon Valley start-ups are created by either Indian or Chinese entrepreneurs. They see things differently. Another example of outsider thinking and mutation is Virgin Atlantic Airways. Richard Branson managed to shake up the airline industry precisely because he did not have an airline industry background. So when other airlines were worrying about legroom, routes and punctuality, Branson was cross-fertilising his experience from the entertainment industry and worrying about why flying wasn’t more fun.

Not all new ideas and innovations make it of course. It’s a case of survival of the fittest (or luckiest). Eventually, however, the sheer number of new ideas that are hatched means that a few emerge and make it into the mainstream where they do battle with deeply set vested interests. Then it’s usually youth and energy versus experience and money. Organizations are like this too in a sense. They start off hungry, agile and curious and end up bloated, lazy and stiff.

So my question is this. If external events and adaptation are the driving forces of innovation, is it possible to develop an innovation culture and process that seeks out change and mutation? Moreover, if evolution is the result of genetic accidents is it possible to replicate such accidents through experimentation? An imminent threat of extinction would certainly explain why it often takes a crisis to spur a lazy and bureaucratic organization to adapt and embrace change.

My answer is that generally speaking it’s not. This may be a heretical statement, especially coming from someone that makes a living advising companies how to create innovation systems, but I think it’s true. Some large companies are excellent at innovation. It’s their reason for being and is imprinted in their DNA.

However, for most large organizations innovation is an inconvenience. Organizational cultures develop a kind of corporate immune system that subconsciously suppresses or rejects any new idea that could threaten the existing business. Quite right too. The primary aim of established organizations is to extract revenue and profit from legacy businesses and not to do anything that would upset the apple cart.

This primarily means executing flawlessly in the present and requires tight control and strict hierarchies. Small companies, in contrast, have less to lose and are not encumbered by their history. Their mental models about ‘what works’ are less fixed and they are more open to picking up weak signals about change.

So here’s my idea. If you are the kind of organization that does innovation well, great. Equally, if you’re half-way decent at innovation, keep with the programme and perhaps play around with some of these thoughts about using trends as a framework for innovation and scenario planning. If you’re lucky you may give birth to a strange mutation. If this happens recognise it as a gift and run with it as far as it goes.

If, however, you are the type of organization that’s not very good at innovation then give up. That’s right. Throw in the towel and get into hunting instead of agriculture. In other words stop trying to grow your own through research & development and go out hunting with mergers and acquisitions instead. Seek out small innovative companies and buy them.

Big organizations, even ones that are really bad at innovation, are very good at scaling up an idea and dealing with everything from intellectual property and sales to marketing and finance. This is handy because these things are precisely what start-ups and small companies are often very bad at.

Why the Future Keeps Catching Us Out

Why is it that some innovations score a home run, whereas others leave the field almost as soon as they walk on? The reasons obviously vary according to the specific context, but two culprits are undoubtedly timing and the irritating and irrational behavior of human beings.

The future is not what it used to be. In the 1950s, as Daniel H. Horne has pointed out, there were flying cars, x-ray specs and personal jetpacks. In the 1960s we had personal teleportation devices, moon bases, Smell-O-Vision television and warp-drive. At least we had all these things in our imagination. So why have none of these things happened yet in reality? Where did the expected future go?

Our futuristic frustration has been building up for many years. First there was the millennium. Despite all the Y2K hype, nothing really seemed very different on 1 January 2000, did it? 2001 was also expected to be a futuristic date, but all we got was a bunch of lunatics with old-fashioned box-cutters taking over some airplanes (not to be underestimated for its impact, but not quite the life-changing event that many people were anticipating). Indeed, it almost feels as though ‘progress’ has slowed down or been put on hold recently. Look, for instance, at the boom in retro video games, ‘vintage’ sneakers, retro car design or remakes of movies from the 1970s. I haven’t made my mind up about why this is the case, but it could be the thought that we are living in anxious and uncertain times, so we escape into eras that we perceive (often incorrectly) as simpler, safer and more certain.

When we think of the future we usually think of it in terms of space travel or time machines, but what has arrived is no less fantastic. Innovations from the last thirty years or so include: the Internet, iPods, mobile phones, industrial robots, microwave ovens, smoke detectors, GPS, Wikipedia, Second Life, and DNA fingerprinting. These innovations, and others, are just as futuristic as silver space suits and ray guns. Moreover, many of the things that were predicted back in the ‘50s and ‘60s have made a brief appearance, but then vanished again before you could say Segway or Apple Newton.

So what is the takeaway here? First, many futuristic ideas haven’t shown up, but given enough time they will. Remember space tourism? Or how about meat grown in a laboratory or domestic robots? Well, the US Federal Aviation Administration has already published a set of proposed regulations for space tourism operators, cultured meat is around the corner and it is predicted that there will be 6.1 million robots in domestic service worldwide by the end of 2007. In other words, future predictions can come true if only you give them enough time to happen.

Timing is everything, and from a purely commercial standpoint being too early can be just as disastrous as being too late. However, as the inventor Ray Kurzweil points out, “an invention has to make sense in the world in which it is finished, not the world in which it started.” So unless you have very deep pockets, think very carefully about long-term trends and the world in which your innovation will live. This isn’t easy, but it is essential. For example, there is an argument that the more life speeds up and becomes virtual, the more that some people will want to slow it down and take their lives offline. So products that make our lives quicker will work for a while, but ultimately I’d expect there to be a significant demand for products and services that do the opposite. MetroNaps is, to my mind at least, an early example of this.

The second key takeaway is perhaps the point that the future usually arrives subtly and unannounced. We are all somehow waiting for aliens, hover-boards, time travel, robocops, dinner-in-a-pill and eternal youth, whereas what actually shows up is computer speech recognition, data mining, artificial ears, Astroturf, carbon dating, IVF, digital photography, pocket calculators, disposable contact lenses and Viagra.

But we shouldn’t get too hung up on technology. The reason that many of our scientific fantasies haven’t made it into reality is that innovators and futurists often make the mistake of forgetting about human history and psychology. Technology tends to change fast and exponentially, while people tend to change slowly and incrementally. So whilst reading e-books on cell phones might look good on paper (and they are a big hit in Japan already), it may take more than a generation for such an idea to significantly displace traditional reading habits in countries like the United States.

In other words, one reason the future is never quite as we expect is that innovations that are logical and technically feasible smash up against people, who are irrational and emotional. And that’s one thing that I predict won’t change in the future.

Have I got your full and undivided attention?

Life is speeding up and we are constantly inventing new ways to make things move even faster. But what are some of the consequences of this constant busyness? Are we losing our ability to think and properly relate to other people?

‘A Crowd of One: The Future of Individual Identity’ is an interesting book by John Henry Clippinger – a senior fellow at the Berkman Center for Internet & Society at Harvard Law School. One of the central thoughts of the book is that people only become themselves through their relationships with others. If we become isolated, our growth becomes stunted. This seems like a great idea although, being rather busy this month, I haven’t actually read the book myself so I can’t tell you for sure.

According to a snapshot review that I did manage to read on my way to work, technology is changing our territorial and psychological boundaries. This point is somewhat picked up in a recent essay by another thought-leader, Sherry Turkle, who is Professor of Social Studies of Science and Technology in the Program in Science, Technology, and Society at MIT and the founder of the MIT Initiative on Technology and Self. She argues that “what people mostly want from public spaces these days is to be left alone with their personal networks” and that a new “state of self” is now developing whereby people can transport themselves somewhere else at the touch of a button.

I think I’ve witnessed this first hand very recently. The first instance was on holiday last month, where numerous couples were sunbathing next to a swimming pool, each of them on some kind of portable electronic device. What were they doing? I have no idea, but they certainly weren’t talking to each other. They were undoubtedly connected to something, but I couldn’t tell you whether their ‘self’ was developing or not.

The second instance was when I took my brother’s kids to an indoor playground last Christmas. Soon after I sat down, a couple in their late twenties sat down next to me with a girl of perhaps six years of age. The girl was dispatched into the safe play area and both parents took out Blackberries and proceeded to check email. They did this for over sixty minutes without once speaking to each other or acknowledging the presence of their small daughter. Again, they were certainly connected, but to what and for what reason I’m not sure.

It’s the same at work. Ten or fifteen years ago people didn’t take calls in the middle of meetings. Today it’s commonplace. I was in a meeting last year with News Corp when someone from their ad agency took a call and the rest of the room was put on hold for almost ten minutes until the call had ended. You can see this teleportation process in operation in countless restaurants too, where couples are talking to each other one minute and then divert to receiving phone calls or checking emails the next. In my day this would have been considered rather rude and people would have switched these devices off or hidden them under the table. These days it’s just considered normal, and these devices are proudly and openly on public display.

In short, we are becoming so tethered to our electronic devices that we never entirely switch off and escape from the presence of others. Now this may be a very good thing in terms of the development of individual identity, because we are constantly connected to other people, but I wonder what it’s doing to the quality of our thinking.

Firstly, our connectedness to others through digital networks means that a culture of rapid response has developed, in which the speed of our response is sometimes considered more important than its substance. We shoot off emails that are half thought out, and long-term strategic thinking is constrained by a lack of proper thinking time. We are always responding to what’s urgent rather than what’s important. I could have probably put all that together a lot better but I’m pushed for time and really can’t be bothered.

This connectedness is constant but our full attention is only partial as a result. If I can mention just one more thought leader then it needs to be Linda Stone, an ex-Microsoft researcher, who has coined the term Continuous Partial Attention to describe the fact that we feel some kind of need to scan electronic and digital environments to ensure that we are not missing out on something more important. We don’t want to be left out of the loop. As a result, nobody feels secure enough to leave these electronic devices off for an hour during a meeting, let alone for a week when they are sitting next to a pool on holiday.

But it’s not necessarily the speed thing that worries me. There is evidence that many of our best decisions are made when we have little or no time to think. What concerns me the most, rather, is that we just don’t switch off. Ever. We are now so connected, available and never alone that we have left ourselves no time to properly reflect. We scroll through our days without thinking about what we are really doing or where we are ultimately going. We can probably get away with this for a while, especially when the decisions that need to be made are at a fairly low level, but sooner or later I suspect our lack of aloneness and reflection will catch up with us.