Everything I’ve Ever Learnt About The Future

Here’s a prediction. You are reading this because you believe that it’s important to have a sense of what’s coming next.

Or perhaps you believe that since disruptive events are becoming more frequent you need more warning about potential game-changers, although at the same time you’re frustrated by the unstructured nature of futures thinking.

Foresight is usually defined as the act of seeing or looking forward – or of being in some way forewarned about future events. In the context of science, it can be interpreted as an awareness of the latest discoveries and where these may lead, while in business it’s generally connected with an ability to think through longer-term opportunities and risks, be these technological, geopolitical, economic, or environmental.

But how does one use foresight? What practical tools are available for individuals to stay one step ahead and to deal with potential pivots?

The answer to this depends on your state of mind.

In short, if alongside an ability to focus on the here and now you have – or can develop – a culture that’s furiously curious, intellectually promiscuous, self-doubting, and meddlesome you are likely to be far more effective at foresight than if you doggedly stick to a single idea or worldview. This is because the future is rarely a logical extension of single ideas or conditions.

Furthermore, even when it looks as though this may be so, everything from totally unexpected events and feedback loops to behavioural change, pricing, taxation, and regulation has a habit of tripping up even the best-prepared plans.

Looking both ways

In other words, when it comes to the future most people aren’t really thinking, they are just being logical based on small sets of recent data or personal experience. The future is inherently unpredictable, but this gives us a clue as to how best to deal with it. If you accept – and how can you not – that the future is uncertain, then you must accept that there will always be numerous ways in which the future could play out. Developing a prudent, practical, pluralistic mind-set that’s not narrow, self-assured, fixated, or over-invested in any singular outcome or future is therefore a wise move.

This is similar in some respects to the scientific method, which seeks new knowledge based upon the formulation, testing, and subsequent modification of a hypothesis.

Not blindly accepting conventional wisdom, being questioning and self-critical, looking for opposing forces, seeking out disagreement and above all being open to disagreements and anomalies are all ways of ensuring agility and most of all resilience in what is becoming an increasingly febrile and inconstant world.

This is all much easier said than done, of course. Homo sapiens is a pattern-seeking species and two of the things we loathe are randomness and uncertainty. We are therefore drawn to forceful personalities with apparent expertise who build narrative arcs from a subjective selection of so-called facts. Critically, such narratives can force linkages between events that are unrelated or ignore important factors.

Seeking singular drivers of change or maintaining a simple positive or negative attitude toward any new scientific, technological, economic, or political development is therefore easier than constantly looking for complex interactions or erecting a barrier of scepticism about ideas that almost everyone else appears to agree upon or accept without question.

Danger: hidden assumptions

In this context a systems approach to thinking can pay dividends. In a globalised, hyper-connected world, few things exist in isolation and one of the main reasons that long-term planning can go so spectacularly wrong is the oversimplification of complex systems and relationships.

Another major factor is assumption, especially the hidden assumptions about how industries or technologies will evolve or how individuals will behave in relation to new ideas or events. The hysteria about Peak Oil might be a case in point.  Putting to one side the natural assumption that we’ll need oil in the future, the amount of oil that’s available depends upon its price. If the price is high there’s more incentive to discover and extract more oil especially, as it turned out, shale oil.

A high oil price fuels the search for alternative energy sources, but also incentivises behavioural change at both an individual and governmental level. It’s not an equal and opposite reaction, but the dynamic tensions inherent within powerful forces mean that balancing forces do often appear over time.

Thus, we should always think in terms of technology plus psychology, or one factor combined with others.  In this context, one should also consider wildcards. These are forces that appear out of nowhere or which blindside us because we’ve discounted their importance.

Similarly, it can often be useful to think in terms of future and past. History gives us clues about how people have behaved before and may behave again. Therefore, it’s often worth travelling backwards to explore the history of industries, products, or technologies before travelling forwards.

If hidden assumptions, the extrapolation of recent experience, and the interplay of multiple factors are three traps, cognitive biases are a fourth. The human brain is a marvellous thing, but too often it tricks us into believing that something that’s personal or subjective is objective reality. For example, unless you are aware of confirmation bias it’s difficult to unmake your mind once it’s made up.

Once you have formed an idea about something – or someone – your conscious mind will seek out data to confirm your view, while your subconscious will block anything that contradicts it. This is why couples argue, why companies steadfastly refuse to evolve their strategy and why countries accidently go to war. Confirmation bias also explains why we persistently think that things we have experienced recently will continue.  Similar biases mean that we stick to strategies long after they should have been abandoned (loss aversion) or fail to see things that are hidden in plain sight (inattentional blindness).

In 2013, a study in the US called the Good Judgment Project asked 20,000 people to forecast a series of geopolitical events. One of its key findings was that an understanding of these natural biases produced better predictions. An understanding of probabilities was also shown to be of benefit, as was working as part of a team where a broad range of options and opinions were discussed.
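The Good Judgment Project scored its forecasters with proper scoring rules such as the Brier score, which is simply the mean squared error between the probabilities a forecaster states and what actually happens. As a hedged illustration (the function name and the toy data below are my own, not the project’s), the whole idea fits in a few lines:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts (0..1) and
    binary outcomes (0 or 1). Lower is better; a forecaster who
    always hedges with 0.5 scores exactly 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A sharp, well-calibrated forecaster beats a perpetual fence-sitter:
sharp = brier_score([0.9, 0.1, 0.8], [1, 0, 1])   # ≈ 0.02
vague = brier_score([0.5, 0.5, 0.5], [1, 0, 1])   # 0.25
```

The point of such a rule is that it rewards honest probabilities rather than confident-sounding certainties: overclaiming is punished every time an ‘impossible’ event happens.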

You must be aware of another bias – Group Think – in this context, but if you are aware of the power of consensus you can at least work to offset its negative aspects.

Being aware of how people relate to one another also recalls the thought that being a good forecaster doesn’t only mean being good at forecasts. Forecasts are no good unless someone is listening and is prepared to act.

Thinking about who is and who is not invested in certain outcomes – especially the status quo – can improve the odds when it comes to being heard. What you say is important, but so too is whom you speak to and how you illustrate your argument, especially in organisations that are more aligned to the world as it is than the world as it could become.

Steve Sasson, the Kodak engineer who invented the world’s first digital camera in 1975, showed his invention to Kodak’s management, whose reaction allegedly was: ‘That’s cute, but don’t tell anyone.’ Eventually Kodak commissioned research, the conclusion of which was that digital photography could be disruptive.

However, it also said that Kodak would have a decade to prepare for any transition. This was all Kodak needed to hear to ignore it. It wasn’t digital photography per se that killed Kodak, but the emergence of photo-sharing, combined with group think that equated photography with printing; the result was much the same.

Good forecasters are good at getting other people’s attention using narratives or visual representations. Just look at the power of science fiction, especially movies, versus that of white papers or PowerPoint presentations.

If the engineers at Kodak had persisted or had brought to life changing customer attitudes and behaviours using vivid storytelling – or perhaps photographs or film – things might have developed rather differently.

Find out what you don’t know.

Beyond thinking about your own thinking and thinking through whom you speak to and how you illustrate your argument, what else can you do to avoid being caught on the wrong side of history? According to Michael Raynor at Deloitte Research, strategy should begin with an assessment of what you don’t know, not with what you do. This is reminiscent of Donald Rumsfeld’s infamous ‘unknown unknowns’ speech.

“Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know….”

The language that’s used here is tortured, but it does fit with the viewpoint of several leading futurists including Paul Saffo at the Institute for the Future. Saffo has argued that one of the key goals of forecasting is to map uncertainties.

What forecasting is about is uncovering hidden patterns and unexamined assumptions, which may signal significant revenue opportunities or threats in the future.

Hence the primary aim of forecasting is not to precisely predict, but to fully identify a range of possible outcomes, which includes elements and ideas that people haven’t previously known about, taken seriously or fully considered.

The most useful starter question in this context is: ‘What’s next?’ but forecasters must not stop there. They must also ask: ‘So what?’ and consider the full range of ‘What if?’

Consider the improbable

A key point here is to distinguish between what’s probable, and what’s possible. (See Introducing the 4Ps post).

Sherlock Holmes said that: “Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth.” This statement is applicable to forecasting because it is important to understand that improbability does not imply impossibility. Most scenarios about the future consider an expected or probable future and then move on to include other possible futures. But unless improbable futures are also considered, significant opportunities or vulnerabilities will remain unseen.

This is all potentially moving us into the territory of risks rather than foresight, but both are connected. Foresight can be used to identify commercial opportunities, but it is equally applicable to due diligence or the hedging of risk. Unfortunately, this thought is lost on many corporations and governments who shy away from such long-term thinking or assume that new developments will follow a simple straight line.  What invariably happens though is that change tends to follow an S Curve and developments tend to change direction when counterforces inevitably emerge.

Knowing precisely when a trend will bend is almost impossible but keeping in mind that many will is itself useful knowledge.
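The S-curve described above is commonly modelled with the logistic function: change looks negligible at first, then near-exponential, then it saturates as counterforces bite. A minimal sketch (the parameter names and numbers are illustrative assumptions, not data):

```python
import math

def logistic(t, ceiling=100.0, midpoint=0.0, rate=1.0):
    """Logistic S-curve: slow start, steep middle, saturation at 'ceiling'."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Early on, the curve is easily dismissed as noise or hyped as a straight
# line; only later does the bend toward the ceiling become visible.
early = logistic(-4)   # ≈ 1.8 (looks negligible)
mid   = logistic(0)    # 50.0 (growth at its steepest)
late  = logistic(6)    # ≈ 99.8 (flattening out)
```

The practical difficulty, as the text notes, is that the midpoint and rate are only knowable in hindsight; the useful lesson is merely that most trends have a ceiling.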

The Hype Cycle developed by Gartner Research is also helpful in this respect because it helps us to separate recent developments or fads (the noise) from deeper or longer-term forces (the signal). The Gartner model links to another important point too, which is that because we often fail to see broad context, we tend to simplify.

This means that we ignore market inertia and consequently overestimate or hype the importance of events in the shorter term, whilst simultaneously underestimating their importance over much longer timespans.

An example of this tendency is the home computer. In the 1980s, most industry observers were forecasting a Personal Computer in every home. They were right, but this took much longer than expected and, more importantly, we are not using our home computers for word processing or to view CDs as predicted. Instead, we are carrying mobile computers everywhere, which is driving universal connectivity, the Internet of Things, smart sensors, big data, predictive analytics, which are in turn changing our homes, our cities, our minds and much else besides.

Drilling down into the bedrock to reveal the real why.

What else can you do to see the future early? One trick is to ask what’s behind recent developments. What are the deep technological, regulatory, or behavioural drivers of change? But don’t stop there.

Dig down beyond the shifting sands of popular trends to uncover the hidden bedrock upon which new developments are being built. Then balance this out against the degree of associated uncertainty.

Other tips might include travelling to parts of the world that are in some way ahead technologically or socially. If you wish to study the trajectory of ageing, for instance, Japan is a good place to start. This is because Japan is the fastest ageing country on earth and consequently has been curious about robotics longer than most. Japan is already running out of humans and is looking to use robots to replace people in various roles ranging from kindergartens to aged care.

You can just read about such things, of course. New Scientist, Scientific American, MIT Technology Review, The Economist Technology Quarterly are all ways to reduce your travel budget, but seeing things with your own eyes tends to be more effective. Speaking with early adopters (often, but not exclusively younger people) is useful too as is spending time with heavy or highly enthusiastic users of products and services.  

Academia is a useful laboratory for futures thinking too, as are the writings of some science fiction authors. And, of course, these two worlds can collide. It is perhaps no coincidence that the sci-fi author HG Wells studied science or that many of the most successful sci-fi writers, such as Isaac Asimov and Arthur C. Clarke, have scientific backgrounds.

So, find out what’s going on within certain academic institutions, especially those focussed on science and technology, and familiarise yourself with the themes the best science-fiction writers are speculating about.

Will doing any or all of these things allow you to see the future in any truly useful sense? The answer to this depends upon what it is that you are trying to achieve. If your aim is to get the future 100% correct, then you’ll be 100% disappointed. However, if your aim is to highlight possible directions and discuss potential drivers of change there’s a very good chance that you won’t be 100% wrong. Thinking about the distant future is inherently problematic, but if you spend enough time doing so it will almost certainly beat not thinking about the future at all.

Creating the time to peer at the distant horizon can result in something far more valuable than prediction too. Our inclination to relate discussions about the future to the present means that the more time we spend thinking about the future, the more we will think about whether what we are doing right now is correct. Perhaps this is the true value of forecasting: It allows us to see the present with greater clarity and precision.

Richard Watson April 2023. richard@nowandnext.com

Future Shock at Fifty

Here’s my chapter from the new book that’s just out looking back at Future Shock. The book, After Shock, can be bought here. US link here.

Future Shock @ 50.

One of the enduring issues futurists face is bad timing. Like science-fiction writers and technologists, futurists often believe that things will happen sooner rather than later. Future Shock, which accurately described the turmoil of the 1970s, is perhaps more accurate and relevant now than it was then. The book’s central idea, that the perception of too much change over too short a period of time would create instability, perfectly describes the volatility, uncertainty and confusion currently being created by geo-political events, climate change and technology, most of all digital technologies, that accelerate everything except our capacity to cope with change. Hence a global epidemic of anxiety that expresses itself in everything from the rise of mental illness to the mass prescription of painkillers and anti-depressants.

Looking backwards, it’s hard to imagine how the upheavals of the early 1970s could have been seen as shocking. Surely the pace of change then was glacial compared to what it is now? But we have largely forgotten about the seismic shift from ‘we’ to ‘me’ that accompanied the fading of the sixties and the blossoming of the seventies. Group love was replaced by empowered individualism. The invention of modern computing, which soon became personal computing, amplified the focus on the individual even further. Peace, too, was shattered, not only by the enduring war in Vietnam, but by the invention of international terrorism, while OPEC’s oil shock created inflation and economic uncertainty. Oh, and the US had a rogue President, which proves, to me at least, that some things do not change quite as much as we sometimes imagine.

Maybe the speed of change in the 70s wasn’t as rapid as today, but events back then were still alarming compared to the relative stability that had endured before. Global media made such events more visible too. Information Overload is an idea championed in Future Shock, and another that feels almost quaint when it’s used in the context of the 70s. Overloaded? Way back then? You cannot be serious.

The problem, of course, is that while many things did indeed change, and continue to do so today, we do not. Or, at least, we struggle to keep up. Edward O. Wilson, the American biologist, sums the situation up perfectly when he says that:

“Humanity today is like a waking dreamer, caught between the fantasies of sleep and the chaos of the real world.

The mind seeks but cannot find the precise place and hour.

We have created a Star Wars civilization, with Stone Age emotions, medieval institutions, and godlike technology.”

This clash, between technologies and behaviours that change rapidly (exponential is a word that’s thrown around with careless abandon, but rarely stacks up outside computing) and our brains, which do not change as fast, was seen by the Tofflers as injurious not only to our health, but also to our decision-making abilities.

Interestingly, decades after Future Shock was written, a study published by Angelika Dimoka, Director of the Center for Neural Decision Making at Temple University, appears to support part of what the book proposed.

This study found that as information is increased, so too is activity in the dorsolateral prefrontal cortex, a region of the human brain associated with decision-making and the control of emotions. Yet eventually, if incoming information continues to flow unrestrained, activity in this region falls off. The reason for this is that part of the brain has essentially left the building. When information reaches a tipping point, the brain protects itself by shutting down certain functions. Outcomes include a tendency for anxiety and stress levels to soar and for people to abstain from making important decisions. Not so much future shock as present paralysis, perhaps.

Future Shock did not propose a solution to this problem.

The book, as the authors pointed out, was always more of a diagnosis than a cure, although education is mentioned as a critically important factor going forward. So too is a need to guard against change that’s either unguided or unrestrained.

This, again, is very much where we stand today. Technologies such as autonomous trucks, surgical robots and battlefield drones are being developed with little or no public debate.

If Tesla, for example, were Boeing or Bristol-Myers Squibb, I find it hard to imagine how their products would be allowed anywhere near the public at such a stage of development. As for artificial intelligence, the situation is even worse.

Broad or General AI could be the best or the worst invention humankind will ever make and yet there is hardly any broad discussion about the risks, let alone any public debate about what this technology is ultimately for. Like many other things, AI is largely being imposed upon people for economic gain and it is hard to see how it will enhance rather than diminish our humanity overall. However, this pessimistic view ignores two things I’ve learned about the future: that the future is not linear, and that it’s not binary either. Things will happen that nobody, especially futurists, saw coming and the most likely scenario is an uneasy balance between people wishing to push forwards and others wanting to pull back.

So, fifty years on, is there now a cure for Future Shock?

I think there is and I believe that we are beginning to see some early signs of it.

In my view, the central problem at the moment is not change or acceleration per se, but that we have lost trust in government, science, business, the media and even each other. We’ve also simultaneously lost our anchor points and thrown away our ballast, with the result that we are being tossed about in a high sea without any sign of land to navigate towards. No wonder we are feeling disorientated and somewhat nauseous.

It is this lack of direction, more than anything else, that is fuelling our anxiety in my view. But a simple solution is at hand. First, we need a moderate level of disconnection.

We need to stop treating all information as power and reclaim some control over what enters our brains. The information universe is infinite, but our attention spans and mental processing capabilities are not. More importantly, without a strong sense of identity, in other words a sense of who we are and where we stand, we will continue to be thrown around by the slightest disturbance.

I’m not for a moment suggesting that we ditch our cell-phones or throw out our televisions in the style of Peter Finch in the movie Network, rather that we consider more carefully the ideas that we let into our homes and our brains.

Once we have regained some sense of calm and perspective, we then need to talk to each other about where it is that we want to go in the future. How do we want to live? Even the ‘fact’ that the future has arrived all at once, which some people cite as being a source for many of our troubles, would be easy to deal with if we only knew what destination we were heading towards. Then, as Future Shock suggests, we could restrain or reject anything that impedes our progress.

Having a view of what lies ahead, a shared vision of a promised land if you will, would also allow us to focus more on the present and worry less about endless unknowns. We should spend far less time individually worrying about what might happen and much more time collectively thinking about what it is that we want to happen.

There are echoes of this happening already in everything from the growing disenchantment with our political elites and Big Tech to the criticisms that are emerging concerning globalisation and the sterile nature of free-market economics and the inequality of wealth and opportunity that results.

In my opinion, the next big thing will be a seismic shift concerning what we value, which is us. All we are waiting for is a trigger. People can feel that something or someone is coming already, although nobody has expressed it yet.

It is the absence of a future, and especially a future where humanity matters, that is being felt, but I believe we are on the cusp of changing things in the right direction.

As Future Shock says, we need to humanize distant tomorrows and the best way of doing this is to take control of the future. Whether we like it or not, we are on the cusp of developing technologies that will give us godlike powers. The simple question is what to do with this power. Who do we, as individuals, societies and a species, want to be?

Alvin Toffler died in 2016, his wife and co-author in 2019, and I think we all owe them a great debt. Not only did they place future thinking firmly on the world’s stage, they established a literary genre that continues to this day.

Richard Watson is the author of Digital Vs. Human and the founder of nowandnext.com

Future Shock 50 Years On

It’s been 50 years since the publication of Future Shock, arguably the most famous non-fiction book ever written about the future. To celebrate, a new book called After Shock is being published next week. I have contributed a chapter, which I will post next week. In the meantime here is a link to a pdf of the original book.

Are Futurists History?

Apparently, being a futurist is one of the coolest jobs on the planet. Shame I’m in the process of getting out of being one then. Am I just way ahead of everyone else or seriously behind? Personally, I think futurists come and go. They move in and out of fashion, and at the moment we’re at peak futurist (or, at least, futurists have reached the peak of inflated expectations).

Firstly, the article, in Bustle. Not bad, but it’s really describing trend spotters, not futurists, in my opinion. Real futurists are much closer to sci-fi writers in my view. They tend to think at least 10–20 years out, not next year, and they don’t predict either. OK, maybe they do, for fun, but “based on stone cold facts”? How can you have stone cold facts about the future when the future hasn’t happened yet? What is being described here is extrapolation from historical data or current trends (“future trends” is another oxymoron btw – guilty!). I think futurists, like good sci-fi writers, warn of possible dangers or at least illuminate current concerns. They point to worlds that might be rather than worlds that are, and again this is rarely trend-based and is impossible to ‘prove’ with numbers. In other words, they get people to think. Also, just look at the history of bad predictions from futurists. The accuracy of forecasting is shocking, although I’m sure that won’t stop anyone. So here’s a prediction: one thing that won’t change in the future is our interest in the future – what’s coming next. But something else that won’t change is our total inability to remember the people who, with hindsight, made shockingly incorrect statements. Thanks to Tara for spotting this article. Link to article here.

Predictions for 2018 (from 1968)

Lovely article in The New Yorker (part above) about what a bunch of smart people in 1968 thought 2018 might look like. Lots of things way off the mark, but a few, especially around climate change and computing, that are more or less on the money.

The best bit is the end bit. Let’s worry less about what might happen 50 years into the future and worry more about effecting change in the present. Amen to that.

Full article.

Sensing the Futures (how to challenge ruling narratives)


Here’s a prediction. You are reading this because you believe that it’s important to have a sense of what’s coming next. Or perhaps you believe that since disruptive events are becoming more frequent you need more warning about potential game-changers, although at the same time you’re frustrated by the unstructured nature of futures thinking.

Foresight is usually defined as the act of seeing or looking forward – or of being in some way forewarned about future events. In the context of science and technology it can be interpreted as an awareness of the latest discoveries and where these may lead, while in business and politics it’s generally connected with an ability to think through longer-term opportunities and risks, be these geopolitical, economic or environmental. But how does one use foresight? What practical tools are available for companies to stay one step ahead of the future and to deal with potential disruption?

The answer to this depends on your state of mind. In short, if alongside an ability to focus on the here and now you have – or can develop – a corporate culture that’s furiously curious, intellectually promiscuous, self-doubting and meddlesome you are likely to be far more effective at foresight than if you doggedly stick to a single idea or worldview. This is because the future is rarely a logical extension of single ideas or conditions. Furthermore, even when it looks as though this may be so, everything from totally unexpected events and feedback loops to behavioural change, pricing, taxation and regulation has a habit of tripping up even the best-prepared linear forecasts.

Always look both ways
In other words, when it comes to the future most people aren’t really thinking, they are just being logical based on small sets of data or recent personal experience.

The future is inherently unpredictable, but this gives us a clue as to how best to deal with it. If you accept – and how can you not – that the future is uncertain then you must accept that there will always be numerous ways in which the future could play out. Developing a prudent, practical, pluralistic mind-set that’s not narrow, self-assured, fixated or over-invested in any singular outcome or future is therefore a wise move.

This is similar in some respects to the scientific method, which seeks new knowledge based upon the formulation, testing and subsequent modification of a hypothesis. The scientific method is perhaps best summed up by the idea that you should always keep an open mind about what’s possible whilst simultaneously remaining somewhat cynical.

Not blindly accepting conventional wisdom, being questioning and self-critical, looking for opposing forces, seeking out disagreement and above all being open to disagreements and anomalies are all ways of ensuring agility and most of all resilience in what is becoming an increasingly febrile and inconstant world.

This is all much easier said than done, of course. Homo sapiens is a pattern-seeking species and two of the things we loathe are randomness and uncertainty. We are therefore drawn to forceful personalities with apparent expertise who build narrative arcs from a subjective selection of so-called ‘facts’. Critically, such narratives can force linkages between events that are unrelated or ignore important factors.

Seeking singular drivers of change or maintaining a simple positive or negative attitude toward any new scientific, economic or political development is therefore easier than constantly looking for complex interactions or erecting a barrier of scepticism about ideas that almost everyone else appears to agree upon or accept without question.
Danger: hidden assumptions

In this context a systems approach to thinking can pay dividends. In a globalised, hyper-connected world, few things exist in isolation and one of the main reasons that long-term planning can go so spectacularly wrong is the oversimplification of complex systems and relationships.

Another major factor is assumption, especially the hidden assumptions about how industries or technologies will evolve or how individuals will behave in relation to new ideas or events. The historical hysteria about Peak Oil might be a case in point. Putting to one side the assumption that we’ll need oil in the future, the amount of oil that’s available depends upon its price. If the price is high there’s more incentive to discover and extract more oil. A high oil price fuels a search for alternative energy sources, but also incentivises behavioural change at both an individual and governmental level. It’s not an equal and opposite reaction, but the dynamic tensions inherent within powerful forces mean that balancing forces do often appear over time.

Thus, we should always think in terms of technology plus psychology, or one factor combined with others. In this context, one should also consider wildcards. These are forces that appear out of nowhere or which blindside us because we’ve discounted their importance. For example, who could have foreseen the prominence of the climate change debate in the early 2000s, or its relative disappearance in the aftermath of the global financial crisis of 2007/8?

Similarly, it can often be useful to think in terms of future and past. History gives us clues about how people have behaved before and may behave again. Therefore it’s often worth travelling backwards to explore the history of industries, products or technologies before travelling forwards.

If hidden assumptions, the extrapolation of recent experience, and the interplay of multiple factors are three traps, cognitive biases are a fourth. The human brain is a marvellous thing, but too often it tricks us into believing that something that’s personal or subjective is objective reality. For example, unless you are aware of confirmation bias it’s difficult to unmake your mind once it’s made up. Back to Peak Oil and Climate Change scepticism perhaps.

Once you have formed an idea about something – or someone – your conscious mind will seek out data to confirm your view, while your subconscious will block anything that contradicts it. This is why couples argue, why companies steadfastly refuse to evolve their strategy and why countries accidentally go to war. Confirmation bias also explains why we persistently think that things we have experienced recently will continue into the future. Similar biases mean that we stick to strategies long after they should have been abandoned (loss aversion) or fail to see things that are hidden in plain sight (inattentional blindness).

In 2013, a study in the US called the Good Judgement Project asked 20,000 people to forecast a series of geopolitical events. One of its key findings was that an understanding of these natural biases produced better predictions. An understanding of probabilities was also shown to be of benefit, as was working as part of a team in which a broad range of options and opinions were discussed. You have to be aware of another bias – groupthink – in this context, but as long as you are aware of the power of consensus you can at least work to offset its negative aspects.

Being aware of how people relate to one another also brings to mind the thought that being a good forecaster doesn’t only mean being good at forecasts. Forecasts are no good unless someone is listening and is prepared to take action. Thinking about who is and who is not invested in certain outcomes – especially the status quo – can improve the odds when it comes to being heard. What you say is important, but so too is whom you speak to and how you illustrate your argument, especially in organisations that are more aligned to the world as it is than the world as it could become.

Steve Sasson, the Kodak engineer who invented the world’s first digital camera in 1975, showed his invention to Kodak’s management and their reaction was: ‘That’s cute, but don’t tell anyone.’ Eventually Kodak commissioned research, the conclusion of which was that digital photography could be disruptive. However, it also said that Kodak would have a decade to prepare for any transition. This was all Kodak needed to hear to ignore it. It wasn’t digital photography per se that killed Kodak so much as the emergence of photo-sharing, combined with groupthink that equated photography with printing, but the end result was much the same.

Good forecasters are good at getting other people’s attention through the use of narratives or visual representations. Just look at the power of science fiction, especially movies, versus that of white papers or PowerPoint presentations.

If the engineers at Kodak had persisted, or had brought to life changing customer attitudes and behaviours through the use of vivid storytelling – or perhaps photographs or film – things might have developed rather differently.

Find out what you don’t know
Beyond thinking about your own thinking and thinking through whom you speak to and how you illustrate your argument, what else can you do to avoid being caught on the wrong side of future history? According to Michael Raynor at Deloitte Research, strategy should begin with an assessment of what you don’t know, not with what you do. This is reminiscent of Donald Rumsfeld’s infamous ‘unknown unknowns’ speech.

“Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know….”

The language that’s used here is tortured, but it does fit with the viewpoint of several leading futurists including Paul Saffo at the Institute for the Future. Saffo has argued that one of the key goals of forecasting is to map uncertainties.

What forecasting is about is uncovering hidden patterns and unexamined assumptions, which may signal significant revenue opportunities or threats in the future.

Hence the primary aim of forecasting is not to predict precisely, but to identify the full range of possible outcomes, including elements and ideas that people haven’t previously known about, taken seriously, or fully considered. The most useful starter question in this context is: ‘What’s next?’ but forecasters must not stop there. They must also ask: ‘So what?’ and consider the full range of ‘What if?’

Consider the improbable
A key point here is to distinguish between what’s probable and what’s possible. Sherlock Holmes said: “Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth.” This statement is applicable to forecasting because it is important to understand that improbability does not imply impossibility. Most scenarios about the future consider an expected or probable future and then move on to include other possible futures. But unless improbable futures are also considered, significant opportunities or vulnerabilities will remain unseen.

[Diagram 1: Timeline of emerging science & technology, Tech Foresight, Imperial College, 2015]

This is all potentially moving us into the territory of risk rather than foresight, but the two are connected. Foresight can be used to identify commercial opportunities, but it is equally applicable to due diligence or the hedging of risk. Unfortunately this thought is lost on many corporations and governments, which shy away from such long-term thinking or assume that new developments will follow a simple straight line. What invariably happens, though, is that change tends to follow an S-curve, and developments have a tendency to change direction when counter-forces inevitably emerge.

Knowing precisely when a trend will bend is almost impossible, but keeping in mind that many will is itself useful knowledge. The Hype Cycle developed by Gartner Research is also helpful in this respect because it helps us to separate recent developments or fads (the noise) from deeper or longer-term forces (the signal). The Gartner diagram links to another important point too: because we often fail to see broad context, we have a tendency to simplify.

[Diagram 2: Gartner Hype Cycle]

This means that we ignore market inertia and consequently overestimate the importance of events in the shorter term, whilst simultaneously underestimating their importance over much longer timespans.

An example of this tendency is the home computer. In the 1980s, most industry observers were forecasting a personal computer in every home. They were right, but this took much longer than expected and, more importantly, we are not using our home computers for word processing or to view CDs as predicted. Instead we are carrying mobile computers (phones) everywhere, which is driving universal connectivity, the Internet of Things, smart sensors, big data and predictive analytics, which are in turn changing our homes, our cities, our minds and much else besides.

Drilling down to reveal the real why
What else can you do to see the future early? One trick is to ask what’s really behind recent developments. What are the deep technological, regulatory, or behavioural drivers of change? But don’t stop there.

Dig down beyond the shifting sands of popular trends to uncover the hidden bedrock upon which new developments are being built. Then balance this out against the degree of associated uncertainty.

Other tips might include travelling to parts of the world that are in some way ahead technologically or socially. If you wish to study the trajectory of ageing or robotics, for instance, Japan is a good place to start. This is because Japan is the fastest ageing country on earth and has been curious about robotics longer than most. Japan is therefore looking at the use of robots to replace people in various roles ranging from kindergartens to aged-care.

You can just read about such things, of course. New Scientist, Scientific American, MIT Technology Review and The Economist Technology Quarterly are all ways to reduce your travel budget, but seeing things with your own eyes tends to be more effective. Speaking with early adopters (often, but not exclusively, younger people) is useful too, as is spending time with heavy or highly enthusiastic users of particular products and services.

Academia is a useful laboratory for futures thinking too, as are the writings of some science fiction authors. And, of course, these two worlds can collide. It is perhaps no coincidence that the sci-fi author H.G. Wells studied science at what was to become Imperial College London, or that many of the most successful sci-fi writers, such as Isaac Asimov and Arthur C. Clarke, have scientific backgrounds.

So find out what’s going on within certain academic institutions, especially those focussed on science and technology, and familiarise yourself with the themes the best science-fiction writers are speculating about.

Will doing any or all of these things allow you to see the future in any truly useful sense? The answer to this depends upon what it is that you are trying to achieve. If your aim is granular – to get the future 100% correct – then you’ll be 100% disappointed. However, if your aim is to highlight possible directions and discuss potential drivers of change, there’s a very good chance that you won’t be 100% wrong. Thinking about the distant future is inherently problematic, but if you spend enough time doing so it will almost certainly beat not thinking about the future at all.

Moreover, creating the time to peer at the distant horizon can result in something more valuable than planning or prediction. Our inclination to relate discussions about the future to the present means that the more time we spend thinking about future outcomes, the more we will think about whether what we are doing right now is correct. Perhaps this is the true value of forecasting: it allows us to see the present with far greater clarity and precision.

This is the first of three essays about foresight and thinking about the future written by Richard Watson for Tech Foresight at Imperial College. Part two is about the use of scenarios to anticipate change while part three is about the use – and misuse – of trends.

Illustrations:
Diagram 1: Timeline of emerging science & technology, Tech Foresight, Imperial College, 2015.

Diagram 2: Gartner Hype Cycle

Why the future needs more people in it

Last year Facebook launched a virtual assistant. It was called Moneypenny, after the secretary in the James Bond books. Yet again, a vision of the future was shaped by the past, possibly with a nod to Walt Disney’s Tomorrowland in the 1950s. Is this sexist, or just a natural outcome of the fact that more than two thirds of Facebook’s employees are men? Whatever the reason, the future is generally shaped by white, middle-aged, male Americans. The majority of the World Future Society’s members are white men aged 55 to 65, and when it comes to the media’s go-to guys for discussing the future, they’re men too. What this means is that visions of the future are overwhelmingly created by – and to some extent shaped for – a tiny slice of society, one that’s usually in some way employed in science or technology and has not had to struggle too much.

This is perhaps why technological advances usually define the future and why portrayals of the future are almost always optimistic scenarios in which technology will solve all of mankind’s problems. In the future, for example, we’ll all live far longer, which is fine if you have enough money, but less fine if you are already struggling to survive in the present.

Is this a problem? You bet it is. For one thing, a lack of diversity among the people imagining the future means that we are missing out on vast networks and frameworks of perspectives, experience and imagination. Second, by focussing on technology we are missing out on the social and emotional side, not to mention the politics of futurism. Scientists and technologists are essential to explore what’s possible in the future, but as Alvin and Heidi Toffler pointed out in their book Future Shock in the 1970s, we also need people from the arts and humanities to explore what’s preferable. We need ethical code alongside computer code. At the moment a tiny minority of people has hijacked the future – less than 0.1 per cent of the world’s population perhaps. What the remaining 99.9 per cent urgently need to do is reclaim it, and especially add a softer and more human perspective to the discussion.

Trends are not where the future is found


I’ve had a bit of an epiphany. I was talking to someone last week about what the future of retail might look like… in 2020. That’s right, how the future will look in 48 months. Such a timescale automatically pulls anyone back into the now and the projection of current trends and technologies forward. This is when reality destroys your thinking. I was polite and hopefully helpful, but that’s me done. I’m no longer interested in the year after next or anything less than a decade away. As far as possible I am going to stop using trends as an excuse to talk about the future and think about the real future instead. From now on it’s timescales of 2030, 2040 and 2050.