Why the future needs more people in it

Last year Facebook launched a virtual assistant called Moneypenny, after the secretary in the James Bond books. Yet again, a vision of the future was shaped by the past, possibly with a nod to Walt Disney’s Tomorrowland in the 1950s. Is this sexist or just a natural outcome of the fact that more than two thirds of Facebook’s employees are men? Whatever the reason, the future is generally shaped by white, middle-aged, male Americans. The majority of the World Future Society’s members are white men aged 55 to 65, and the media’s go-to commentators on the future are almost invariably men too. What this means is that visions of the future are overwhelmingly created by – and to some extent shaped for – a tiny slice of society, one that’s usually in some way employed in science or technology and has not had to struggle too much.

This is perhaps why technological advances usually define the future and why portrayals of the future are almost always optimistic scenarios in which technology will solve all of mankind’s problems. In the future, for example, we’ll all live far longer, which is fine if you have enough money, but less fine if you are already struggling to survive in the present.

Is this a problem? You bet it is. For one thing, a lack of diversity among the people imagining the future means that we are missing out on vast networks and frameworks of perspectives, experience and imagination. Second, by focussing on technology we are missing out on the social and emotional side, not to mention the politics of futurism. Scientists and technologists are essential to explore what’s possible in the future, but as Alvin and Heidi Toffler pointed out in their book Future Shock in the 1970s, we also need people from the arts and humanities to explore what’s preferable. We need ethical code alongside computer code. At the moment a tiny minority of people has hijacked the future – less than 0.1 per cent of the world’s population perhaps. What the remaining 99.9 per cent urgently need to do is reclaim it, and especially to add a softer, more human perspective to the discussion.

Sensing the Future

Here’s a prediction. You are reading this because you believe that it’s important to have a sense of what’s coming next. Or perhaps you believe that since disruptive events are becoming more frequent you need more warning about potential game-changers, although at the same time you’re frustrated by the unstructured nature of futures thinking.

Foresight is usually defined as the act of seeing or looking forward – or to be in some way forewarned about future events. In the context of science and technology it can be interpreted as an awareness of the latest discoveries and where these may lead, while in business and politics it’s generally connected with an ability to think through longer-term opportunities and risks be these societal, technological, economic, environmental or geopolitical. But how should one use foresight or otherwise think about the future? What practical tools are available for companies to stay one step ahead of the future and to deal with potential disruption?

The answer to this depends on your state of mind. In short, if alongside an ability to focus on the here and now you have – or can develop – a corporate culture that’s furiously curious, intellectually promiscuous, self-doubting and meddlesome, you are likely to be far more effective at foresight and futures thinking than if you doggedly stick to a single idea or worldview. This is because the future is rarely a logical or singular extension of present ideas or conditions. Furthermore, even when it looks as though this may be so, everything from totally unexpected events and feedback loops to behavioural change, pricing, taxation and regulation has a habit of tripping up even the best-prepared linear plans.

Always look both ways
In other words, when it comes to the future most people aren’t really thinking, they are just being logical based on small sets of data or recent personal experience. The future is inherently unpredictable, but this gives us a clue as to how best to deal with it. If you accept – and how can you not – that the future is uncertain then you must accept that there will always be numerous ways in which things could play out. Developing a prudent, practical, pluralistic mind-set that’s not narrow, self-assured, fixated or over-invested in any singular outcome or future is therefore a wise move.

This is similar in some respects to the scientific method, which seeks new knowledge based upon the formulation, testing and subsequent modification of a hypothesis. The scientific method is perhaps best summed up by the idea that you should always keep an open mind about what’s possible whilst simultaneously remaining somewhat cynical about new discoveries or ideas.

Not blindly accepting conventional wisdom, being questioning and self-critical, looking for opposing forces, seeking out disagreement and, above all, remaining open to anomalies are all ways of ensuring agility and, most of all, resilience in what is becoming an increasingly febrile and inconstant world.

This is all much easier said than done, of course. We are a pattern-seeking species and two of the things we loathe are randomness and uncertainty. We are therefore drawn to forceful personalities with apparent expertise who build narrative arcs from a subjective selection of so-called ‘facts’. Critically, such narratives can force linkages between events that are unrelated or ignore important factors.

Seeking singular drivers of change or maintaining a simple positive or negative attitude toward any new scientific, economic or political development is therefore easier than constantly looking for complex interactions or erecting a barrier of scepticism about ideas that almost everyone else appears to agree upon or accept without question.

Danger: hidden assumptions
In this context a systems approach to thinking can pay dividends. In a globalised, hyper-connected world, few things exist in splendid isolation, and one of the main reasons that long-term planning can go so spectacularly wrong is the over-simplification of complex systems and relationships.

Another major factor is assumption, especially hidden assumptions about how industries or technologies will evolve or how individuals will behave in relation to new ideas or events. The historical hysteria about Peak Oil might be a case in point. Putting to one side the assumption that we’ll need oil in the future, which we may not, the amount of oil that’s available has always depended upon its price. If the price of oil is high there’s more incentive to discover and extract more oil. A high oil price also fuels a search for alternative energy sources, and incentivises behavioural change at both an individual and governmental level. It’s not quite an equal and opposite reaction, but the dynamic tensions inherent within powerful forces mean that significant balancing forces do often appear over time.

Thus we should always think in terms of technology plus psychology, or one factor combined with others. In this context, one should also consider wildcards. These are forces that appear out of nowhere, or which blindside us because we haven’t seen them coming or have discounted their importance. For example, who could have foreseen the importance of the climate change debate in the early 2000s, or its relative disappearance due to economic conditions in the 2010s?

Similarly, it can often be useful to think in terms of future and past. History gives us clues about how people have behaved before and may behave again. It’s therefore often worth travelling backwards to explore the history of industries, products or technologies before travelling forwards into the future. As a rule of thumb, for every year you want to look forward, look backward for five.

If hidden assumptions, the extrapolation of recent experience, and the interplay of multiple factors are three traps, cognitive biases are a fourth. The human brain is a marvellous thing, but too often it tricks us into believing that something personal or subjective is objective reality. For example, unless you are aware of confirmation bias it’s difficult to unmake your mind once it’s made up. Back to Peak Oil hysteria and climate change scepticism perhaps.

Once you have formed an idea about something – or for that matter someone – your conscious mind will seek out data to confirm your view, while your subconscious mind will block anything that contradicts it. This is why couples argue, why companies steadfastly refuse to evolve their strategy and why countries accidentally go to war. Confirmation bias also explains why we persistently think that things we have experienced recently will continue into the future. Similar biases mean that we stick to strategies long after they should have been abandoned (loss aversion) or fail to see things that are hidden in plain sight (inattentional blindness).

In 2013, a US study called the Good Judgement Project asked 20,000 people to forecast a series of geopolitical events. One of its key findings was that an understanding of these natural biases produced better predictions. An understanding of probabilities was also shown to be of benefit, as was working as part of a team in which a broad range of options and opinions was discussed. You have to be aware of another strong bias – groupthink – in this context, but as long as you are aware of the power of consensus you can at least work to offset some of its more negative aspects.

Being aware of how people relate to one another also brings to mind the thought that being a good forecaster doesn’t only mean being good at forecasts. Forecasts are no good unless someone is listening to you and is prepared to take action. Thinking about who is and who is not invested in certain outcomes – especially the status quo – can improve the odds when it comes to being heard. What you say is important, but so too is whom you speak to and how you illustrate your argument, especially within organisations that are more aligned to the world as it is than the world as it could one day become.

Steve Sasson, the Kodak engineer who invented the world’s first digital camera in 1975, showed his invention to Kodak’s management and their reaction was: “That’s cute, but don’t tell anyone.” Eventually Kodak commissioned some research, the conclusion of which was that digital photography could be disruptive. However, it also said that Kodak would have a decade to prepare for any transition. This was all Kodak needed to hear to ignore it. It wasn’t digital photography per se that killed Kodak so much as the emergence of photo-sharing, together with an internal groupthink that equated photography with printing, but the end result was much the same.

Good forecasters are good at getting other people’s attention through the use of narratives or visual representations. Just look at the power of science fiction, especially sci-fi movies, versus that of white papers or PowerPoint presentations. If the engineers at Kodak had persisted, or had brought changing customer attitudes and behaviours to life through vivid storytelling – or perhaps photographs or film – things might have developed rather differently.

Find out what you don’t know
Beyond thinking about your own thinking and thinking through whom you speak to and how you illustrate your argument, what else can you do to avoid being caught on the wrong side of history? According to Michael Laynor at Deloitte Research, strategy should begin with an assessment of what you don’t know, not with what you do. This is reminiscent of Donald Rumsfeld’s infamous ‘unknown unknowns’ speech.

“Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know….”

The language that’s used here is tortured, but it does fit with the viewpoint of several leading futurists including Paul Saffo at the Institute for the Future. Saffo has argued that one of the key goals of forecasting is to map uncertainties. What forecasting is about is uncovering hidden patterns and unexamined assumptions, which may signal significant opportunities or threats in the future.

Hence the primary aim of forecasting is not to precisely predict, but to fully identify a range of possible outcomes, which includes elements and ideas that people haven’t previously known about, taken seriously or fully considered.
The most useful starter question in this context is: ‘What’s next?’ but forecasters must not stop there. They must also ask: ‘So what?’ and consider the full range of ‘What if?’

Consider the improbable
A key point here is to distinguish between what’s probable and what’s possible. Sherlock Holmes said: “Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth.” This statement is applicable to forecasting because it is important to understand that improbability does not imply impossibility. Most scenarios about the future consider an expected or probable future, and some then move on to include other possible futures. But unless improbable futures are also considered, significant opportunities and vulnerabilities will remain hidden.

This is all potentially moving us into the territory of risk management rather than foresight, but the two are connected. Foresight can be used to identify commercial opportunities, but it is equally applicable to due diligence or the hedging of risk. Unfortunately this thought is lost on many corporations and governments, which shy away from such long-term thinking or assume that new developments will follow a simple straight line. What invariably happens, though, is that change tends to follow an S-curve, and developments have a tendency to change direction when counter-forces inevitably emerge.

Knowing precisely when a trend will bend is almost impossible, but keeping in mind that many will is itself useful knowledge. The Hype Cycle developed by Gartner Research is helpful in this respect because it can separate recent developments or fads (the noise) from deeper or longer-term forces (the signal). The Gartner diagram links to another important point too, which is that because we often fail to see broad context we have a tendency to simplify.

[Figure: the Gartner Hype Cycle]

This means that we ignore market inertia and consequently overestimate the importance of events in the shorter term, whilst simultaneously underestimating their importance over much longer timespans, as technologies or other forces develop and counter-forces or contingencies start to emerge. An example of this tendency is the home computer. In the 1980s, many industry observers were forecasting a personal computer in every home. They were right, but it took much longer than expected and, more importantly, we are not using our home computers for word processing or to view CDs as predicted. Instead we are carrying mobile computers (phones) everywhere. This is driving universal connectivity, the Internet of Things, smart sensors, big data and predictive analytics, which are in turn changing our homes, our cities, our minds and much else besides.


Drilling down to the real why

What else can you do to see the future early? One trick is to ask what’s really behind recent developments. What are the deep technological, regulatory or behavioural drivers of change? But don’t stop there.

Dig down beyond the shifting sands of popular trends and ephemeral technologies to uncover the hidden bedrock upon which new developments are being built. Then balance this out against the degree of associated uncertainty. Other tips might include travelling to parts of the world that are in some way ahead technologically or socially. If you wish to study the trajectory of ageing or robotics, for instance, Japan is a good place to go. This is because Japan is the fastest ageing country on earth and has been curious about robotics longer than most. Japan is therefore looking at the use of robots to replace people in various roles ranging from kindergartens to aged-care.

You can just read about such things, of course. New Scientist, Scientific American, MIT Technology Review, The Economist Technology Quarterly and Wired magazine are all ways to reduce your travel budget. But seeing things with your own eyes tends to be more effective. Speaking with early adopters (often, but not exclusively younger people) is useful too as is spending time with heavy or highly enthusiastic users of particular products and services.

Academia is a useful laboratory for futures thinking too, as are the writings of some science fiction authors. And, of course, these two worlds can collide. It is perhaps no coincidence that the sci-fi author HG Wells studied science at what was to become Imperial College London or that many of the most successful sci-fi writers, such as Isaac Asimov and Arthur C. Clarke, have scientific backgrounds.

So find out what’s going on within certain academic institutions, especially those focussed on science and technology, and familiarise yourself with the themes the best science-fiction writers are speculating about right now.

Will doing any or all of these things allow you to see the future in any truly useful sense? The answer depends upon what it is that you are trying to see. If your aim is granular – to see the future with 100% precision – then you’ll be 100% disappointed. However, if your aim is to highlight possible directions and discuss potential drivers of change, there’s a very good chance that you won’t be 100% wrong. Thinking about the distant future is inherently problematic, but if you spend enough time doing so it will almost certainly beat not thinking about the future at all.

Moreover, creating the time to peer at the distant horizon can result in something far more valuable than planning or prediction. Our inclination to relate discussions about the future to the present means that the more time we spend thinking about future outcomes, the more we will think about whether what we are doing right now is correct. Perhaps this is the true value of forecasting: it allows us to see the present with far greater clarity and depth.

This essay was first written as a series of blog posts for the Technology Foresight Practice at Imperial College London. Future essays and posts will concern the use and misuse of scenario planning and global trends.

Links between science fiction and science fact


How Ideas Happen

Here’s a perfect example of how random events combine to create ideas and insights. I’ve been writing something about whether or not forecasting the future is futile or functional. It’s been a disaster. It jumps around, it doesn’t flow and I’m not really sure what the key thought is. I’ll persist for a while, but my prediction is that it’s heading for the wastebasket.

At about the same time as writing this piece I was at Imperial College and visited the science fiction library. Nothing dramatic, although the experience sparked off a thought about the extent to which science fiction influences invention. If you took a long enough time period would sci-fi writers prove to be better than futurologists at predicting the future? This didn’t really go anywhere initially, although a couple of lines in my piece did reference this thought and I had the idea of a call-out box (above) showing a couple of ideas in science fiction that became science fact.

A week later I was at Imperial again when it suddenly hit me that you could create a rather wonderful graphic showing the connections between imagination and invention. With enough examples (50? 100?) you could possibly make an interesting point about the time lag between speculation and appearance. For example, is the time between these two points getting shorter?

Very rough pencil sketch to come….

Memory & Understanding: Paper versus Pixels

A study by Pam A. Mueller of Princeton and Daniel M. Oppenheimer of UCLA has found that US college students who take notes on laptop computers are more likely to record lecturers’ words verbatim. Sounds like a good thing, but the study goes on to say that because notes are verbatim, students are therefore LESS LIKELY to mentally absorb what’s being said.

In one study, laptop-using students recorded 65% more of lectures verbatim than did those who used longhand; 30 minutes later, the laptop users performed significantly worse on conceptual questions. According to the researchers, longhand note takers learn by re-framing lecturers’ ideas in their own words.

This chimes with anecdotal evidence in the UK that some students aged around 16-18 are going back to index cards for exam revision because, as one said to me quite recently: “stuff on screens doesn’t seem to sink in.”

Source: Sage Journals: ‘The pen is mightier than the keyboard: The advantages of longhand over laptop note taking’. See also Scientific American (Nov 2013) ‘Why the brain prefers paper.’ (summary here).

Future Orientation

Not sure I get this, but I thought I’d share. Needs more info I’d say…at least an example.

“The structure of the language spoken by a company’s top team affects the firm’s planning for the future, according to doctoral student Hao Liang, Christopher Marquis of Harvard Business School, and two colleagues. If the language is English, Spanish, or one of many others that use mainly grammar, rather than context, to distinguish present from future (“It is raining,” “It will rain”), people tend to focus less on the future, presumably because it seems more distant. On corporate social responsibility, which is a highly future-oriented activity, firms in countries speaking these “strong-future-time-reference” languages under-perform firms in weak-future-time-reference countries by more than 1.2 grade on a 7-step scale, the researchers say.”

Harvard Business Review, Daily Stat, 4 April 2024

Time to Reclaim the Future


I’d like to let you in on a little secret about the future, which is that there isn’t one.

Instead there are multiple futures, all of which can be influenced by how you and I – and all of us collectively – choose to act right now. Moreover, how we imagine the future to be influences present attitudes and behaviours, much in the same way that our individual and collective histories help to define who we are. Put in a slightly different way, both past and future are always present.

But many people can’t see this. For them the future is something that just happens. It cannot be influenced. Increasingly, the future is also something that people fear, not something they look forward to. This is especially true in parts of the US and Europe.

This wasn’t always the case. The 1950s and 1960s were generally periods of great optimism, especially around technology. Even during the 1970s, 1980s and into the 1990s, the future was generally thought to be a good thing. It would bring more of whatever it was that people wanted.

But this view changed very suddenly. We were expecting trouble on 1 January 2000, but it was not forthcoming. Our computers still worked, our trains still ran and no planes fell out of the sky.

But then, on 11 September 2001, some lunatics armed with nothing more than a strong idea and a few feeble box-cutters flew two planes into the Twin Towers. On this date a number of Western certainties collapsed.

Or maybe the date was 9 November 1989. This was the day the Berlin Wall fell. It was meant to be a good day, an opening up of freedom and democracy. But it soon became apparent that the two empires waging a war of words on either side of that wall had provided a kind of certainty for almost half a century.

Whatever the precise date, it seems that many people no longer believe in the future. Indeed, many people seem to have fallen out of love with the very idea of progress – the idea that tomorrow will generally be better than yesterday.

Back in the early 1970s, Alvin and Heidi Toffler wrote a best-selling book called Future Shock. In it the authors argued that too much technological change, or at least the perception of too much change, over what was felt to be too short a period of time, was resulting in psychological damage to our Stone Age brains. The Tofflers also placed the term ‘information overload’ into the general consciousness.

In many ways, the idea of future shock is similar to that of culture shock. Both refer to the way in which individuals feel disorientated and to some extent powerless as they move from one familiar way of life to another. In the case of culture shock, this usually refers to the physical movement from one country or culture to another. In the case of future shock, we might use the term to describe our shift from analogue to digital culture or from a period containing what were thought to be fixed truths and geopolitical certainties to an era where boundaries are more fluid and nothing seems to be very certain.

The rapid change argument is certainly plausible. Adherents could cite Moore’s Law in the case of computing, or rapid developments in synthetic biology, robotics, artificial intelligence or nanotechnology – or perhaps the breathless expansion of social media – as evidence for their case.

But was it not ever thus? The Internet, a fundamentally disruptive technology, can be compared in terms of impact with the rapid development of the telegraph, railways or even electricity in Victorian times. In fact I met an old gentleman not so long ago who was defending the erection of a mobile phone mast. Apparently, there was a similar fuss when the streetlights were first put in.

As for recent developments in the Middle East, history does seem to repeat itself, often as tragedy and sometimes as farce, as Karl Marx once observed.

Many things are indeed changing, but this has always been the case. Moreover, many of our most basic needs and desires – our quest for human connection, our sense of fairness, our need to belong to a community and our love of stories – have hardly changed.

We still eat. We still drink. We still fall in and out of love. We still watch movies. We still listen to music. We still wear clothes and wristwatches. Actually, I’m not totally sure about that last one. A friend of mine told me that he’d recently given his twelve-year-old son a watch for his birthday. The kid looked rather perplexed and asked why his father had given him a “single-function device”.

But I don’t think that the Swiss watch industry has much to worry about. Indeed we have been here before in the late 1970s when watches first went digital. People in the industry initially panicked, but then calmed down once they realised that people don’t buy watches just to tell the time.

Indeed, focusing purely on logic sometimes misses the point in the same way that focusing on technology at the expense of psychology can get you into a whole heap of trouble.

There are also cycles. I’m old enough to remember not only bicycles, but also beer and cider and a host of other things ranging from butter to bespoke suits being written off one moment and reinvented and reinvigorated the next. As someone once said, there’s no such thing as a mature industry, only mature executives who think certain things are impossible.

So why is pessimism about the future all the rage? To a great extent this is a Western phenomenon. If you speak with people in Mumbai, Shanghai or Dubai about the future there is generally more optimism on the streets. But even here there are worries surfacing that relate to everything from rapid urbanisation and income polarisation to water security, food prices and pollution. Fear of change, it seems, is increasingly universal. Why could this be so?

You might cite globalisation. Globalisation has brought many benefits, but as the world flattens and becomes more alike, cultural identity is becoming more important. Indeed, the more globalised the world becomes, the more the local seems to matter and the more that historical differences come to the surface.

A loss of cultural identity could be a reason for the unease. But there are many other contenders. The economy is another potential culprit. One could also mention the number of people living on their own, the fear of unemployment, the blurring of work and home life, the breakdown of marriage, the decline of trust or the general acceleration of everyday life.

It could also be due to the (supposed) Wane of the West – the idea that the American and European Empires are falling while those of China, Russia, India, Brazil, Africa and others are all on the rise. These are all credible explanations, but I don’t think that any, or all, are quite right.

I would like to suggest an answer in three parts.

First, I think we are anxious because we are indeed exposed to too much information. The Tofflers were right; their timing was just out by 40 years.

What you don’t know can’t hurt you, as they say, but in a digitally connected world everything, it seems, is visible and therefore everything has the potential to hurt or to embarrass you. Reputational risk is everywhere, not only for institutions but for individuals too.

I am not only referring here to privacy and the immortal nature of digital stupidity. I am also pointing towards cyber bullying, data and identity theft and a world in which we are all being watched, not by Big Brother, as Orwell predicted, but by our own social networks. These networks claim to be connecting us, but in so doing they never leave us alone and never allow us to be our real selves.

Second, I think we are anxious and worry about the future when we haven’t got anything more serious to be concerned about. Hence we imagine risks that barely exist or blow the risks that do exist out of all proportion. We indulge in self-loathing for similar reasons. As a species we have achieved a lot over the last few thousand years. If you look at almost any measure that matters – infant mortality, adult longevity, literacy rates, extreme poverty, access to education and health, the number of women in education, the number of women in the workforce, the number of people killed in major conflicts – we have rarely had it as good as we have it today.

But we don’t see this. What we see instead is the idea that progress itself has somehow become impoverished.

The popular view is that although technology and GDP have advanced, morals and ethics are treading water or, depending on your choice of broadcaster, are sinking back into decadence or barbarism. I would suggest that the reason for this has something to do with our own megalomania, our own sense of importance. I think it also has something to do with a lack of purpose and narrative, which brings me onto my third and final point.

I believe that the reason so many people around the world are anxious is because they can no longer see what lies ahead.

Back in the 60s, 70s, 80s and 90s, people believed that the future would be a logical extension of the present. They were wrong, deluded in many instances, but this hardly matters. What does matter is they had something – and it really could have been anything – to aim for and to orientate around.

But now we no longer believe in such things. Outside of a few pockets of massive wealth and/or techno-optimism, we have somehow fallen victim to the idea that the future is something that just happens to us. Something to which we can only react. Something we can do nothing about.

But this attitude is nonsense. The only thing we know for certain about the future is that it’s uncertain. And if it’s uncertain there must be a number of potential outcomes, a number of different ways in which the future might unfold. And all of these futures can, to a greater or lesser extent, be influenced by what we, as individuals and society, decide we want.

This, if you’ve not figured it out already, is intimately connected with leadership and to some extent innovation. In both cases an individual, or sometimes a small group, has a vision of a world that they’d like to see. This individual, or group, then creates a compelling story, a vision if you like, and convinces others around them to join them on a journey.

And that, I suppose, is the challenge.

Yes, we should all be aware of the drivers of change at both a global and local level.

We might, if we are not doing so already, consider the potential impacts of ageing societies, water and other resource shortages, climate change, the changing nature of influence, the impacts of re-localisation, the growth of more sedentary lifestyles, the shift from paper to pixels and its impact on understanding and the digitalisation of both friendship and community.

We should also imagine what various futures might look like and consider how we’d react if certain futures unfolded. We should develop a scenario for a world that turns out much better than we currently expect. We should also create a scenario for a world that gets much worse than we expect. We may even want to build a scenario for a future that turns out far weirder than we expect.

But fundamentally we need to make a choice. We need to decide, as individuals, organisations, nations or indeed the whole planet, where it is that we want to go next and start moving in that direction.


Historical views of the future

Couple of good things today. The first is something I stumbled upon looking for a reference to the economist John Maynard Keynes. It’s an essay he wrote in 1930 about life in the year 2030. It’s a good read, especially when you stop to consider what was happening in 1930.

Here’s a tiny taste.

“We are suffering just now from a bad attack of economic pessimism. It is common to hear people say that the epoch of enormous economic progress which characterised the nineteenth century is over, that the rapid improvement in the standard of life is now going to slow down – at any rate in Great Britain; that a decline in prosperity is more likely than an improvement in the decade which lies ahead of us. I believe that this is a wildly mistaken assumption…”

Click here for the essay (7 pages).

The other tasty morsel is another view of the future, this one a bit more fun, this time from a newspaper looking at 2011 from the perspective of 1911 (via Buzzfeed, via Sonny in Germany).

Click here to read why automobiles will be cheaper than horses, why you’ll be able to travel from New York to Liverpool in two days and why wireless telephones and telegraphs will span the world….

Preface From New Book (pre-edit)

“Google knows everything” – Nick, aged 8.

This is a book about how the digital era is changing our minds. It is about how new digital objects and environments, such as the internet, mobile phones and e-books are re-wiring our brains — at home, at work and at play.

Technology clearly has a lot to do with this, although in many instances it is not technology’s fault per se. Rather it is the way that many trends are combining and technology is either facilitating this confluence or accelerating and amplifying the effects. This may sound alarming but it needn’t be. We have created these digital technologies using imagination and ingenuity and it is surely within our grasp to decide how best to use them — or when not to.

But can something as seemingly innocent as a Google search or a mobile phone call really change the way that people think and act? I believe they can — and do.

This thought occurred to me one morning when I was looking out into space from the rooftop of a hotel in Sydney. But then I reflected. Would I have thought this if I had been on the phone, looking at a computer screen, in a basement office in London?

I think the answer is no. The hotel was a calm and relaxed environment with expansive harbour views, whereas an office can be a box of digital distractions. Modern life is indeed changing the quality of our thinking, but perhaps the clarity to see this only comes with a certain distance or detachment.

Does this matter? I think it does. Mobile phones, computers and iPods have become a central feature of everyday life in hundreds of millions of households around the world. There are currently more than one billion personal computers and more than four billion mobile phones*(1) on the planet. In 2005, 12% of US newlyweds met online, while kids aged 5-16 now spend, on average, around six hours every day in front of some kind of screen. This technological ubiquity must surely be resulting in significant attitudinal and behavioural shifts — but what are they? The answer is that nobody is really quite sure. The technology is too new (the internet is barely 5,000 days old) and our knowledge of the human mind is still too limited.

We do know the human brain is ‘plastic’. It responds to any new stimulus or experience. Our thinking is therefore framed by the tools we choose to use. This has been the case for millennia, but we have had millennia to consider the consequences. This has arguably changed. We are now so connected through digital networks that a culture of rapid response has developed. We are so continually available that we have left ourselves no time to properly think about what we are doing. We have become so obsessed with asking whether something can be done that we have left no time to consider whether it should be done. Perhaps the way our brains are constructed means that we just can’t see what is going on.

Moreover, the digital age (the internet, search engines and screens in general and mobile phones and digital books in particular) is chipping away at our ability to concentrate. As Professor Mark Bauerlein, author of The Dumbest Generation points out, screen reading “conditions minds against quiet, concentrated study, against imagination unassisted by visuals, against linear sequential analysis of texts, against an idle afternoon with a detective story and nothing else”. We are therefore in danger of developing a new generation that has plenty of answers but few good questions. A generation that is connected and collaborative but one that is also impatient, isolated and detached from reality. A generation that is unable to think in the ‘real’ world.

It’s not just the new generations either. We all scroll through our days without thinking deeply about what we are really doing or where we are ultimately going. We are turning into whirling dervishes, frantically moving from place to place in search of superficial ecstasy, unaware that many of the things we most yearn for are being trampled by our own feet. It is only when we stop moving and the dust settles that we can see this destruction clearly. Our attention and relationships are becoming atomised too. We are connected globally, but our physical relationships are becoming wafer thin and ephemeral. Digital objects and environments influence how we all think and are profoundly shaping how we interact.

Ultimately, I believe the quality of our thinking – and ultimately our decisions – is suffering. Digital devices are turning us into a society of scatterbrains. If any piece of information can be recalled at the click of a mouse, why bother to learn anything? We are all becoming google-eyed. If GPS*(2) can allow us to find anything in an instant, why master map reading? But what if one day the technology doesn’t work? What then?*(3)

It is the right kind of thinking – what I call deep thinking – that makes us uniquely human. This is the type of thinking that is associated with new insights and ideas that move the world forward. It is thinking that is rigorous, focused, deliberate, independent, original, imaginative and reflective. But deep thinking like this can’t be done in a hurry or in an environment full of noise and interruptions. It can’t be done in 140 characters or less. It can’t be done when you are doing three things at once.

Yes it’s possible to walk and chew gum at the same time but I am concerned about what happens when you add a Twitter stream, a Kindle and an iPod into the mix. In short, what happens to the quality of our thinking when we never really sit still or completely switch off?

Why does all this matter? Because a knowledge revolution is replacing human brawn with human brains as the primary tool of economic production.*(4) It is now intellectual capital (i.e. the product of human minds) that matters most. But we are on the cusp of another revolution. In the future, our minds will compete with smart machines for employment and even human affection. Hence, being able to think in ways that machines cannot will become vitally important. Put another way, machines are becoming adept at matching stored knowledge to patterns of human behaviour, so we are shifting from a world where people are paid to accumulate and distribute information to an innovation economy where people will be rewarded as conceptual thinkers. Yet this is precisely the type of thinking that is currently under attack.

So how should we as individuals, organisations and institutions (the latter being those deliberately built environments where we spend most of our lives) be dealing with the changing way that people think? How can we harness the potential of new digital objects and environments whilst minimising their downsides?

Personally, I think we need to do a little less and think a little more. We need to slow things down. Not all the time but occasionally. We need to stop confusing movement with progress and get away from the idea that all communication and decision making has to be done instantly. The tyranny of the next financial quarter is just as damaging to deep thinking as a noisy office fitted with fluorescent lighting.

I’m sure that by writing this book I will be accused by some people of going backwards, or of being a pastist. But remember that some of the tried and tested technologies of yesteryear have grown old precisely because they are good and we should think twice before deleting them. Equally, being a member of the Tech No movement doesn’t mean smashing the nearest digital device. It simply means questioning potential consequences or asking for some level of balance. It is about arguing that we need a little more of this and a little less of that.

This is a book about work, education, time, space, books, baths, sleep, music and other things that influence our thinking. It is about how something as physical, finite and flimsy as a 1.5 kg box of proteins and carbohydrates can generate something as infinite and potentially valuable as an idea. Hence, it is for anyone who’s curious about thinking about their own thinking and for everyone who’s interested in unleashing the extraordinary potential of the human mind.

Whether you are interested in how to deal with too much information, constant partial attention, our obsession with busyness, leisure guilt, the myth of multi-tasking, the sex life of ideas, or the rise of the screenager, this book explores the different aspects of how digital objects and environments are re-wiring our brains – and makes some practical suggestions about what we can do about it.


* (1) Half of British children aged between 5 and 9 now own a mobile phone. For 7 to 15 year-olds the figure is 75%. This is despite government advice that no child under 16 should be using one. The average age at which children in the UK now acquire a mobile phone is eight.

* (2) I interviewed someone for a job recently and one of her questions was whether or not she could use my car. I said she could, so she asked whether my car had a GPS in it. It doesn’t. She turned the job down. I wish her luck, whatever direction her life goes in. The point here is that GPS and Google give us information but they do not impart understanding and in some cases they can prevent us from properly planning ahead.

*(3) We assume the internet will always work. But what if it doesn’t? A US think-tank (Nemertes Research) says internet use is rising by 60% each year worldwide. Unless we can increase capacity they claim ‘brownouts’ (frozen screens, download delays etc) will become commonplace, relegating the internet to the status of a toy. How would you cope with that?

* (4) A study by McKinsey & Company, a management consultancy, claims that 85% of new jobs created in the US between 1998 and 2006 involved “knowledge work”.

Where Do People Do Their Best Thinking?

You may remember that as part of my new book (Future Minds) I spoke to a few people asking them the question: “Where and when do you do your best thinking?” Here are some more responses…

“Over the decades, I think that my best thinking has occurred when I am visiting a foreign country, have my obligations out of the way, and am sitting in a pleasant spot – in a café, near a lake – with a piece of paper in front of me.”

“Usually when I am not working, and most often when I am travelling!”

“The most relevant (issue) for me is ideas needed for a piece of writing. As a drummer I am generally required to avoid deep thinking of any sort. So it’s probably whilst driving on a motorway, or on the start of a transatlantic flight. I think it’s to do with some distractions so that the thinking is a little freer – there is nothing worse than tidying the desk, sharpening a pencil and sensing the creative part of the brain creeping out the back door… also there’s a nice reward element that can be employed. No motorway fry up, or extra dry martini before there’s an opening line invented.”

“Lying in bed in the dark, with the white noise generator producing a soothing whoosh, I sometimes have a few seconds of modest insight.”

“I love doing household chores: loading the dishwasher, scrubbing the floors, scouring the pans; the polishing, the cleaning. All the time I am thinking of ways to improve upon the equipment; what would bring forward the technology.”

“I’ve had creative thoughts while walking down the street, in the shower, on the squash court, in the bathroom (of course), while shaving… .”

“I do my best thinking in bed – sometimes even when asleep. I wake up having solved a problem.”

Future blogs

I was thinking that I should add ‘serendipity’ to my extinction timeline when I stumbled upon a great blog about the future by accident. I was looking for a precise definition of Horizon 1, 2 & 3 innovations when I came across the Next Big Future blog. Cracking stuff.

Check it out…. www.nextbigfuture.com

Link to seven horizons timeline below in comments.