Past views of the future

I’m researching a few ideas for one of my new books and just came across an old book called Looking Backward: 2000-1887. The book was written in 1888 about someone who falls asleep in 1887 and wakes up in the year 2000 to a socialist utopia. Here’s a bit of it.

“It was the sincere belief of even the best of men at that epoch that the only stable elements in human nature, on which a social system could be safely founded, were its worst propensities. They had been taught and believed that greed and self-seeking were all that held mankind together, and that all human associations would fall to pieces if anything were done to blunt the edge of these motives or curb their operation. In a word, they believed — even those who longed to believe otherwise — the exact reverse of what to us seems self-evident; they believed, that is, that the antisocial qualities of men, and not their social qualities, were what furnished the cohesive force of society … It seems absurd to expect anyone to believe that convictions like these were ever seriously entertained by men …”

What’s quite interesting about this, and other books like it, is that they seem to go to one of two extremes – utopia or dystopia. Why is that, do you think?

The Seven Sides of Cyber (Part 1)

This topic is huge, and rapidly evolving, so I’d like to restrict myself to a small area. But before I do I’d like to add a bit of context and discuss the area of forecasting and prediction. In short, why do we get so many things about the future so utterly wrong?

In 1886, the engineer Karl Benz predicted that: “The worldwide demand for automobiles will not surpass one million.” Eight years later, in 1894, an article appeared in the Times newspaper in London predicting that: “In 50 years, every street in London will be buried under 9 feet of horse manure.”

How could they have got things so wrong? Simple. Both predictions were based on current (at the time) trends. Put another way, they used critically false assumptions.

In the case of Karl Benz, his mistake was to assume that cars would always require a chauffeur and that the supply of skilled chauffeurs would eventually run out.
In the case of London being buried under horse manure the mistake was to assume that the volume of horse transport would increase indefinitely alongside population.
The article also totally missed, or at the very least misjudged, the disruptive impact of motorised transport, invented by Mr Benz eight years earlier.

This is one reason why I’m such a fan of scenario planning – it’s a philosophy that rejects the idea of a singular future. Scenario planning insists that there are multiple possibilities and encourages the idea of things being part of a wider system. It focuses on external rather than internal shocks. In short, it’s a framework for dealing with ambiguity and uncertainty and combinations of external events. It doesn’t claim to predict the future in the sense of helping people get things 100% right, but used well it can get you to places you’ve not previously imagined and prevent individuals – and especially institutions – from getting things 100% wrong.

Why else, apart from projecting current trends and personal experience forward, do we make big mistakes? The main reason, I would suggest, is because of how our brains are wired and what we choose to do with the rest of our bodies each and every day.

Let’s start with our brains.

Our brain’s default setting is to believe what it already knows, largely because our brain is lazy and this is the easier, more economical, energy-conserving position. Confirmation bias says that our brains prefer to deal with information and ideas they are already familiar with, because the pathways for this information to travel down have already been built and traffic flows freely. This is why we subconsciously hunt for facts that fit pre-conceived ideas, or play bad cop and fit up evidence to support pre-established views.

Unfamiliar information and ideas, in contrast, have difficulty entering our brains, because new pathways need to be built to deal with new data or experiences. Hence our natural neural default is a combination of “Yes, I know what that is, I’ve seen it before” mixed up with “I have no idea what this is so I’m going to ignore it.” The result is that critical information is missed, ignored or not prioritized.

However, when the brain is especially busy it takes all this to extremes and starts to believe things that it would ordinarily question or distrust. I’m sure that you know where I’m going with this but in case you are especially busy – or on Twitter – let me spell it out for you.

If you are very busy there is every chance that your brain will not listen to reason and you will end up supporting information, or ideas, that are dangerous or perhaps you will support people that seek to do you, or others, harm. Fakery, insincerity and big fat lies all prosper in a world that is too busy or too distracted to listen properly. Hence the importance of occasionally switching our devices, and ourselves, off.

There are some other well-known reasons for getting things wrong. The first is something called sunk cost. This is essentially the idea that people sink time and money into things and therefore continue to back actions or strategies well past the point of logic in a quest to get either their time or their money back.

This is similar to the endowment effect, which says that people behave differently with things that they own (spending other people’s research money versus their own). Other reasons include egocentric bias (usually fatal when combined with alpha-male competitive behaviour, the collapse of RBS in the UK perhaps being a prime example), overconfidence, expediency and conformity. Here’s a classic example of a mistake cited by Joseph Hallinan in his book ‘Why We Make Mistakes’.

“A man walks into a bar. The man’s name is Burt Reynolds. Yes, that Burt Reynolds. Except this is early in his career, and nobody knows him yet – including a guy at the end of the bar with huge shoulders. Reynolds sits down two stools away. Suddenly the man starts yelling obscenities at a couple seated at a table nearby. Reynolds tells him to watch his language. That’s when the guy with the huge shoulders turns on Reynolds.” Here’s how Reynolds recalls the event: “I remember looking down and planting my right foot on this brass rail for leverage, and then I came around and caught him with a tremendous right to the side of the head…he just flew off the stool and landed on his back in the doorway, about fifteen feet away. And it was while he was in mid-air that I saw…that he had no legs.”

How could this happen? Easy. Reynolds was distracted. He looked but he didn’t see. He had a view of reality that was influencing what he saw. He thought he knew the context, and what might happen next, so he didn’t question his assumptions.

Our brains aren’t alone in deceiving us either. Most of us wake up at the same time each morning, we leave our houses in the same manner, we take the same route to work, we read the same newspapers and websites, we hang out with the same people at work and then we go home and sit down to do more or less the same things we always do. This is fine, it’s comfortable and convenient, but we are restricting our experience (what our brains are familiar with). This means that we are allowing the lens with which we view the world to be distorted, or at the very least narrowed.

It also means that the raw material from which new insights and new ideas are made is similarly restricted.

But I have some good news for you. It needn’t be like this.

First, you can exercise your brain much as you can exercise your other muscles.
You can feed it a diet of cerebral snacks that make it stronger and more resilient.
How can you do this? Go to places you’ve never been to before, talk to people you don’t know in spheres that you aren’t familiar with, read magazines you wouldn’t normally read and above all indulge in intellectual promiscuity and encourage serendipity.

The second bit of good news is we are getting better at forecasting and prediction, largely because digital and wireless technology is allowing us to see things that were previously hidden from view. As many people know, using digital money and shopping online leaves vast trails of data, as does walking around with a phone that’s switched on, joining Facebook or searching for things on Google.

In short, it’s becoming far easier to know where people are all of the time and what they are doing. It’s even becoming possible to work out what they are likely to do next based on historical patterns of observed behaviour and connections overlaid with basic human psychology.

Is this a good or a bad thing? That’s not really for me to say, largely because it can be a bit of both depending upon whether or not the surrender of information is consensual and what a second or third party does with the data. Interestingly, one thing that I have observed studying other people is that many of us are developing what can only be called an addiction to mobile devices, which I would suggest is harming our ability to think deeply about matters of substance, but more on that later (Part 2).

A great many people also seem to have little or no idea what they are doing with regard to surrendering personal information or information about their precise whereabouts. Maybe this is generational? Gen Y and below know that privacy is dead and have gotten over it. Gen X is deeply worried and Baby Boomers have absolutely no idea what’s going on.

There is perhaps also the thought that many people now live in the here and now and are either unconcerned about what might happen to them (or their data) in the future (having their identity stolen or being burgled because they broadcast too much information about themselves or their current location) or they have little or no appreciation of history (especially the historical misuse of technology). Maybe it’s simply that much of this technology is too new and its effects are largely unknown or misunderstood.

Talking of new technology, another thing that I’ve noticed is how little people are aware of what’s already happening now, let alone what’s likely to happen next.

For example, when I show a general audience things like Google Flu Trends, 23andMe or Life Circle + they are amazed. They are even more gob-smacked by London Underground’s suicide algorithm, the Department of Homeland Security’s Malicious Intent Detectors or DARPA’s Total Information Awareness project.

Imagine their reaction when they hear about what’s starting to happen with artificial intelligence, brain-to-machine interfaces, real-time crime mapping, prediction markets and mood-mining facial recognition technologies.

Overall, what I think is happening here is an explosion of connectivity, which is driving transparency, the growth of collaborative consumption, location specific data and dematerialisation (of goods and services). It’s also driving personalisation and various location-specific services.

The personalisation point, which at times spins off into unrestrained narcissism, is especially interesting, because it appears to clash with collaboration on one level.
Collaboration (co-creation, co-filtering and Web 2.0 generally) has what appears to be a collective ethos, which clashes with the individualistic ethos of personalisation.
I don’t have an answer as to whether ‘me’ or ‘we’ is in the ascendant, but it’s something I’m watching.

Moving back to the privacy point: in a future that’s highly connected, where digital data is difficult to contain and fear and anxiety are in the ascendant (created, I’d argue, by a mixture of Future Shock from the accelerating effects of technology, globalisation, the asymmetric nature of modern conflict and the volatility created by complex and highly connected systems), I’d venture to suggest that most people will indeed give up a degree of privacy and freedom in return for the promise of certainty, simplicity and risk-free environments.

Whether this promise can ever be delivered is, of course, an entirely different matter, especially when you stop to consider how the uses of technology tend to cascade.
We should also, perhaps, remember that privacy is to some extent a modern invention. Our historical preset was largely openness, collaboration and transparency because this allowed villages to thrive. Maybe we’re just returning to the village?

This may not be a good thing. Perhaps a loss of privacy (and to some extent secrecy) will lead to a reduction in original thinking and experimental behaviour, simply because people don’t want to be seen by the rest of the village as being stupid.

But let’s move on.

The second area I’d like to focus on is how some of the technologies I’ve just mentioned – especially digital, wireless and screen technologies – are changing how people think and act, and what this could mean.

I don’t claim to be an expert in this area, but I did recently write a book about what the digital era, and screen culture in particular, is doing to our thinking, especially with regard to the thinking of people who are younger than Google (under 14).

Here’s a quote from Cass Sunstein, a Professor of Law and Political Science at the University of Chicago.

“The Internet makes it far easier for us to restrict ourselves to groups of like-minded people – to live in echo chambers of our own devising. In this way, the Internet is creating an increase, in many places, of social fragmentation, and hence an increase in both intolerance and incivility, as people end up seeing their fellow citizens as stupid, or malicious, or despicable. This problem is increased by the fact that much of the Internet is intolerant and far from civil … this isn’t healthy for democracy or tolerance, because it encourages people to choose teams, rather than to think issues through. For many people the Internet is aggravating this problem.”

And another quote from Thomas Friedman of the New York Times:

“At its best, the Internet can educate more people faster than any media tool. At its worst, it can make people dumber faster than any media tool. Because the Internet has an aura of “technology” surrounding it, the uneducated believe information from it even more. They don’t realize that the Internet, at its ugliest, is just an open sewer: an electronic conduit for untreated, unfiltered information. Just when you might have thought you were all alone with your extreme views, the Internet puts you together with a community of people from around the world who hate all the things and people you do. You can scrap the BBC and just get your news from those Websites that reinforce your own stereotypes.”

Is this fair? It’s hard to say, because the internet, Google and Facebook, for example, are all still too new and because we still don’t know enough about how the human brain works to understand what the precise impacts things like BBM, YouTube and Twitter are having.

However, be assured that one thing is fairly certain, which is that because of the way our brains work, soaking up whatever stimuli are lying around, they are having some kind of impact. However, I’d suggest that these impacts are not isolated, but will be part of a much wider system of influences.

For example, I am sceptical about claims that Facebook and Twitter were responsible for the Arab Spring. What I think happened was that mobile phones and social media, like most forms of technology, acted as an accelerant to an already existing condition.

In other words, the dry wood was already lying around. What was the wood?
I’d suggest a relatively large number of 16–24-year-olds in the population, high levels of youth unemployment, reasonably high levels of education, state repression of media, corrupt and bureaucratic governments and possibly food inflation.

The spark, in the case of Tunisia, was Mohamed Bouazizi setting himself alight in response to official harassment. Social media merely fanned these flames and, critically, thereafter, provided a way for people to by-pass government information sources and self-organise against incumbent powers.

There are some parallels here with the London riots, although the strongest connection, in my view, was the use of social media and mobiles to organise protest – by definition evading the pyramidal command and control police structures. The other potential connection is the issue of fairness. In the case of the UK, this appeared to be a mixture of three things.

1) A culture of self-entitlement with individuals asking: “Where’s my share of the pie?” This is not a bad question in light of UK MPs’ recent expenses fiasco – the idea that anything is more or less OK so long as you don’t get caught.

2) Income polarisation – bankers earning bonuses that were often unrelated to wider performance. Gains from speculation belong to the individuals involved, but any losses, or wider social impact, belong to society at large – or so the theory goes.

3) There wasn’t much else to do and the weather was nice!

Interestingly, this self-organisation is indicative of what the internet and social media enable at a much broader level, which is the instant aggregation of opinion and the creation of highly fluid, collaborative and often leaderless networks that are often pitched against rigid and highly structured bureaucracies with very clear levels of command.

To be continued…

Unrest in Europe

Here’s a semi-serious prediction. High levels of youth unemployment in Europe will trigger unusually high levels of social unrest and crime and we will see a coup in a Western-European country (generals running Spain or Greece?) by year end.

Historical views of the future

A couple of good things today. The first is something I stumbled upon looking for a reference to the economist John Maynard Keynes. It’s an essay he wrote in 1930 about life in the year 2030. It’s a good read, especially when you stop to consider what was happening in 1930.

Here’s a tiny taste.

“We are suffering just now from a bad attack of economic pessimism. It is common to hear people say that the epoch of enormous economic progress which characterised the nineteenth century is over, that the rapid improvement in the standard of life is now going to slow down – at any rate in Great Britain; that a decline in prosperity is more likely than an improvement in the decade which lies ahead of us. I believe that this is a wildly mistaken assumption…”

Click here for the essay (7 pages).

The other tasty morsel is another view of the future, this one a bit more fun, and this time from a newspaper looking at 2011 from the perspective of 1911 (via Buzzfeed, via Sonny in Germany).

Click here to read why automobiles will be cheaper than horses, why you’ll be able to travel from New York to Liverpool in two days and why wireless telephones and telegraphs will span the world….

Trends for 2011 still good for 2012

Now that it’s almost 2012, I think it’s about time we revisited my top 10 trends for 2011 that were written in late 2010 and posted in early January. How right have they proven to be? Judge for yourself. Personally I think the only one that’s way off is number 8.

Ten Trends for 2011

#1. Uncertainty
In 2011, nothing is certain except uncertainty. With the economic recovery still brittle in many parts of the world, people will be looking for safety, reassurance and control. They will be largely disappointed. What we can expect, with some degree of certainty, is that there will be widespread anxiety, especially in financial markets, and there will be a background expectation that something will sooner or later go wrong.

Implications
Politicians and markets will swing between irrational pessimism and exuberance and this will create policies (and prices) that tend to overshoot and then over-correct. On the domestic front, people will hold off making big-ticket purchases until there is a clearer view of what lies ahead. They may have to wait a long time.

#2. Volatility
Volatility is intimately connected to uncertainty. The root cause of this is the universal connectivity now built into everything from financial markets to media and communications. This means that risk is now networked and failure can quickly spread. Financial markets are at the epicenter of potential seismic events. In some cases the risk can be real. The threat of a debt crisis in one country can lead to a genuine crisis of confidence in another. However, the contagion can also be imagined and in some cases manufactured. Information pandemics rapidly spread false or misleading information, but such is the rapidity of the cycle that nobody has the time to verify the facts – or call to account those spreading the disinformation – before another real or imagined threat breaks out.

Implications
Commodity price spikes and wild fluctuations in the price of assets. Indeed, volatility means that nobody is quite sure what the underlying value of many assets now is, which only adds to the volatility. Some speculators will make huge amounts of money from such movements, although where one can safely stash such rewards is far from clear. Watch out for individuals flocking to ‘safe’ investments such as art, prime real estate, agricultural land and gold (and then look out for bubbles bursting in gold, real estate, land and so on).

#3. Rage
In the US, 20% of American men aged between 25 and 55 are now unemployed. In the 1960s, 95% of the same group had a job. This could be reason enough to get angry, but the bad news doesn’t end there. Food prices are rising, energy costs are increasing and the US faces the prospect of economic decline relative to the emerging powers, most of which have come out of the global recession relatively unscathed. Add to this the twin evils of Washington and Wall Street and you have the makings of a new age of rage in middle America. According to the writer Frank Luntz, only 33% of Americans think that their kids will have a better quality of life than they’ve had, and 57% think that the next generation will inherit a generally worse situation.

In Europe things are looking especially nasty. In the UK, for instance, indirect taxes are going up and infrastructure investment (police, health, transport, education etc) is going down. This all adds up to people paying more to get less, which in turn leads to more frustration, which, thanks to social media, can easily tip over into large groups expressing their anger physically.

Implications
An age of rage founded on the realization that current generations may not enjoy the standards of living that were experienced (expected) by their parents and grandparents. Where this rage will be channeled is anybody’s guess, but we might see outbreaks of violence against individuals and institutions that are perceived to be immune from pain. We may also see a resurgence of protectionist economic policies, anti-immigration rhetoric and populist right-wing politics.

#4. Religion rising
Imagine no religion. It appears that many people can’t. Science and technology are supposed to be making religion redundant, but it appears that the opposite is increasingly the case. When times are complex and confusing organized religion offers people hope alongside universal truth. Indeed, when life becomes a struggle economically or uncertain environmentally, religion offers an easy to understand view of why things are as they are and how things will eventually work out. In an increasingly chaotic world it’s also comforting to believe that someone, or something, is in control.

Clearly the growth of fundamentalism is one aspect of this, but religion is enjoying a resurgence across the board, especially in the Southern Hemisphere. Part of the reason for this is globalization. Religious beliefs now move around more freely. Similarly, urbanization and social media are bringing people and religious beliefs closer together.

Implications
Expect to see a resurgence in traditional practices, rituals and beliefs, including pilgrimages. Also expect to see a rise in spiritualism and an interest in the afterlife (much of it linked to societal ageing). Finally, expect continued sectarian violence in some regions.

#5. Formality
Democracy and anti-elitism have led to the growth of informality in recent years. This has been accelerated by the egalitarian tendencies of the Internet – or at least by the egalitarian ethos of young Internet start-ups, many of which are run by t-shirt wearing revolutionaries. However, the global recession has created a counter-trend moving in the opposite direction. Seriousness is back in vogue and older people have started to dress smarter in a vain attempt to keep their jobs.

This formality has trickled down to younger individuals, especially those concerned with getting or keeping a professional career. To some extent this trend is cyclical, but it is also connected with an interest in tradition, craft and artisan skills. Another accelerant is popular revulsion at yobbish culture, especially in the UK.

Implications
A resurgence in good manners, formal dining and bespoke menswear. Also watch out for a resurgence in period drama, books and documentaries about the Victorians, and traditional Sunday roasts.

#6. Food inflation
Food has been cheap for a long time and many people now view ingredients that were once considered luxuries as necessities. But this situation is about to change. The primary problem is population. There are simply more mouths to feed. However, the real issue is not so much demand per se but people’s changing consumption habits. Put simply, more people – especially people in developing markets – are changing eating habits in line with rapidly rising incomes. Hence people that used to live on subsistence diets of rice are now demanding meat. Call it calorie inflation if you like, but whatever you call it the consequence is rapidly rising prices. According to Nomura, the investment bank, “the surge in commodity prices in 2003-8 was the largest, longest and most broad-based of any commodity boom since 1980.” Nomura goes on: “The prices of energy and metals surged the most but it was the agricultural market that saw the most fundamental change.”

Implications
Surging commodity prices and price spikes are one consequence, but also look out for changing eating habits. In Western markets there will be new interest in cheaper cuts of meat and in new species of fish. More people will be cooking at home to save money too. On the really nasty side, watch out for a return of food riots. In some countries this could boil over into serious violence. In terms of opportunities, fish farming and fish ranching look like sure-fire winners.

#7. “Long land”
According to the World Bank, agricultural production must increase by 70% by the year 2050. Why? The primary reason is demographic – there will be more people in the future and they will want something to eat. The second reason is consumption habits – more people with more money means switching to meat-based diets, especially in Asia (see food inflation trend). The third reason is bio-fuels. Energy companies are interested in land not because of what lies beneath but because of what can be grown on top. The result? People taking a ‘long’ position on fertile land, especially in foreign lands, with the expectation that the value of the land and the food grown on it will increase substantially over the years ahead. For example, according to the World Bank, purchases of land in developing regions increased tenfold in 2009 to 45 million hectares. This trend is set to make the value of good land soar, especially well-watered hinterlands in Africa and Latin America. But buyers beware. Land isn’t just another commodity. Land is tied up with notions of nationalism and is semi-sacred in many regions.

Implications
Expect a global land grab by wealthy foreign investors (especially Chinese and Middle-Eastern sovereign wealth funds) to increase substantially over the coming years – but also expect protectionist backlashes over the purchase, or attempted purchase, of land by foreign investors. Also expect the issue of water access to rise to the surface on the back of this trend.

#8. Digital disenchantment
Are you sinking in a sea of scurrilous spam? How about drowning in a deluge of digital dross? The internet, and Web 2.0 in particular, are wonderful things, but there are digital downsides, notably the fact that people are suffering from too much information and too much choice. The result is confusion on a grand scale. Our attention spans are dwindling (books are now seen by many younger generations as being “too long”) and we seem unable to retain important information, such as home phone numbers, ATM PINs, family birthdays and security codes. As for work, all some people seem to do is answer endless emails only to be faced with more once they have dealt with the first batch.

Of course, you could use technology to solve most of these problems. Use RSS feeds or Google alerts to filter the amount of incoming information or simply switch off your email or mobile phone. But filtering seems to create even more information and switching off isn’t really an option when everyone else is still switched on and expects an instant response.

Implications
People are seeking harmony with regard to their digital/analogue balance, much in the same way that they are seeking work/life balance. One way to achieve this is to set clear boundaries about when you do certain things or when you use certain technologies. This will work up to a point, but at some stage you will have to be more brutal. Switch your mobile off after 7.30pm. Don’t become “friends” with people you’ve never met and unblock your digital drain from time to time by unsubscribing from or disconnecting from unread or unused information.

#9. Pyjamas
In 1970, Alvin and Heidi Toffler wrote about the “Future Shock” created by rapid technological change. Fast-forward to 1991 and Faith Popcorn predicted the emergence of ‘cocooning’ – a reaction against the unpredictable and somewhat stressful nature of modern life. Scroll forward again a few digital decades and it looks like more of the same.
So what have pyjamas (PJs, jim-jams, sleep suits et al) got to do with rapid technological acceleration and rampant economic uncertainty? Simple. They offer physical and emotional warmth in cold and complex times. But escapism isn’t the only reason. More people are working from home nowadays, so what some people wear to work doesn’t really matter. We are also spending more leisure time at home surfing websites rather than going out, which is fuelling the trend further.

Implications
Rising sales of nighttime clothing, especially ‘third wardrobe’ items that can be worn in bed, around the house or in front of the television or computer. BTW, if you are the type of person who likes to go shopping, or collect the children, wearing PJs, there’s a word for you. It’s “No!”

Trend #10 – No Trend

My final trend for 2011 is that there is no trend. There are certain uncertainties but beyond this it’s impossible to see what lies ahead. Is the Euro finished? Will Portugal go the same way as Ireland and Greece? Is China heading for a fall? Who can say? One thing I would say is that we appear to have entered a phase where technology is acting as an accelerant – or at least creating a confluence – for a number of existing trends.

For example, cascading failure can be a feature of highly complex systems. Some things are now so finely engineered and universally connected that there are no tolerances. If one element fails (or is attacked) whole parts of the system come crashing down. Hopefully, the view ahead will clear sometime in 2011 or 2012 but until then all one can say with any degree of certainty is the future is a riddle wrapped up in a mystery inside an enigma.

TED talk (I won’t know what I think until I see what I say)

Here’s the text of my recent TEDx talk in Germany. Note that these are only my notes and not what actually came out of my mouth…

If you gave an infinite number of futurists an infinite amount of time, would one of them eventually be correct about something?

I wrote a book a few years ago about what I thought might happen over the next 50 years. Ever since I’ve been called a futurist and I’m now regularly called upon to make predictions.

But the history of prediction isn’t particularly good.

For example, in 1886, the engineer Karl Benz predicted that: “The worldwide demand for automobiles will not surpass one million.” Eight years later, in 1894, an article appeared in the Times newspaper in London predicting that: “In 50 years, every street in London will be buried under 9 feet of horse manure.”

How could they have got things so wrong? Simple. Both predictions were based on past experience. They were extrapolations built upon what turned out to be short-term trends. Both used critically false assumptions.

In the case of Karl Benz the mistake was to assume that cars would always require a chauffeur and that the supply of skilled chauffeurs would eventually dry up. In the case of London being buried under horse manure the mistake was to assume that the volume of horse transport would increase indefinitely alongside population. The article also totally misjudged the disruptive impact of motorised transport, invented by the aforementioned Mr Benz.

On the other hand, the accuracy of some predictions can be rather good, especially if you give them enough time to come true. Hindsight, it would appear, is a necessary accompaniment to futurism.

A few years ago I picked up a couple of old books in a junk shop in the middle of the English countryside. The first was called Originality, written by T. Sharper Knowlson in 1917. Here he is quoting Sir Aston Webb on the future of London, from the perspective of 20th January 1914.

“There are two great railways stations, one for the north and one for the south. The great roads out of London are 120 feet wide, with two divisions, one for slow-moving and the other for fast-moving traffic; and there will be a huge belt of green fields surrounding London.”

Not bad. Or how about this passage from the second book I bought, Future Shock, written by Alvin Toffler in 1970:

“The high rate of turn-over is most dramatically symbolised by the rapid rise of what executives call ‘project’ or ‘task-force’ management. Here teams are assembled to solve specific short-term problems. Then, exactly like the mobile playgrounds, they are disassembled and their human components re-assigned. Sometimes these teams are thrown together to serve only a few days. Sometimes they are intended to last a few years. But unlike the functional departments or divisions of a traditional bureaucratic organization, which are presumed to be permanent, the project or task-force team is temporary by design.”

Remember, this book was written more than 40 years ago. It sounds like Silicon Valley in 2011. The list goes on. Peter Drucker wrote about portfolio careers in 1988 and Warren Bennis was writing about the need for radical innovation in the late 1960s.

And let’s not forget HG Wells launching ballistic missiles from submarines in The Shape of Things to Come in 1933, Arthur C. Clarke envisioning a network of communications satellites in geostationary orbit above the earth in 1945 and Captain James T. Kirk using what appears to be a Motorola cell-phone way back before any such thing had been invented.

Admittedly some of these are broad concepts rather than specific predictions, but this doesn’t negate the fact that a few seers do occasionally get it right and that the future would be a good subject for serious study if only the sources were more forthcoming.

Of course, what you really need when you are thinking about distant horizons is a map, so I designed one last year. It contains far too much information and is far too complex, but then that’s probably the future, isn’t it?

The outside of the map contains a series of predictions that get more playful and more provocative as you move out towards the edges. For example:

• There will be a convergence of healthcare & financial planning
• We will have face recognition doors & augmented reality contact lenses
• Online communities will start physical communities

The centre of the map contains some mega-trends such as globalisation, urbanisation, sustainability, volatility, the power-shift Eastwards, ageing, anxiety and so on. But be careful. Trends like these can get you into all kinds of trouble. In fact they can easily get you into more trouble than predictions because people believe them.

Firstly, trends represent the unfolding of current events or dispositions. They tell us next to nothing about future direction, let alone the velocity of events. They do not take into account the impact of counter-trends, strange combinations or anomalies.

But there’s an even bigger problem, which is that in my view there’s no such thing as the future. The future is ambiguous. It’s uncertain. Therefore, there must surely be more than one future. In other words, there must be a number of alternative futures.

Look at this chart, for instance. It shows a series of forecasts about the number of active oil rigs drawn up by an oilfield supplies company in the early 1980s. Someone looked at the data and produced a series of entirely logical high, medium and low predictions (I, II & III). The bottom line is reality.

What nobody foresaw was that what looked like a long-term trend was in fact a short-term situation based on a high oil price, low interest rates and government subsidies. In short, they failed to see that while all our knowledge is about the past, all of our most important decisions are about the future.

So what can we do to address this central problem of prediction? Is there any point whatsoever in trying to predict the future or is it best just to sit back and let it happen?

Letting the future happen is actually not a bad option, but only if you are nimble. If you have an open mind and can move quickly, a fast-follower strategy can work. But most organizations are neither nimble nor open. They are mentally closed to the outside world and stuck with assets and systems that were created in the past. Most organizations are built around historical ideas, constrained by various legacy issues, concerned with numbers that relate to the last 12 months and worried about what will happen to those numbers over the next 12 weeks.

A much better bet, in many instances, is Scenario Planning. This has its origins in war gaming or battle planning, especially a 6th-century Indian game called Caturanga, meaning ‘four divisions’, and Kriegsspiel, a German war game invented in 1812. War gaming is still used by the military today, but also by companies such as Shell, which uses narrative scenarios to look at the potential impact of a number of external variables on strategy or long-term capital expenditure.

To illustrate how scenarios work here’s a set of scenarios I did with some friends a few years ago. When you do scenarios you generally start with a focussing question and I like to explain this by asking people to imagine that time travel really exists and to prove it the inventor has just come back from the future to say hello.

If you were allowed to ask this person just one question, what would it be? Now I should explain that questions like “When will I die?” or “When will the USA soccer team win the World Cup?” aren’t allowed. The question is supposed to relate to work, although if you were doing a personal set of scenarios then it would be fine.

In this case the question was simply around future customer mindsets. You’ll notice that there’s a vertical and a horizontal axis. These are created by finding two unrelated forces that are both highly impactful and highly uncertain. In this instance there’s one axis built around activism versus passivism (‘We’ versus ‘Me’ if you prefer) and one built around optimism versus pessimism, which is created by attitudes towards the economy and climate change.

Once you impose one axis against the other, four divergent futures are revealed, each of which becomes more distinct and extreme as you move outwards from the middle.

Starting in the bottom right corner we have a scenario called Moreism. The key drivers creating this scenario are optimism and individualism, so we get a world of globalisation, free markets, materialism and economic growth at almost any cost. This is a world primarily driven by greed.

Moving across to the bottom left, we get a scenario where the drivers are pessimism and individualism. So this is a world where people essentially give up hope and move into a survivalist mindset. It’s a world initially dominated by local community and self-reliance, but as you move outwards it starts to incorporate protectionism and to some extent isolationism – even xenophobia – where hatred is focussed upon anyone who is not considered to be part of the group. In a word, it’s a world driven by fear.

Moving up to the top left we have Enoughism. This is obviously the polar opposite to Moreism. It’s a world where people decide that they’ve got enough and that they’ve had enough. It’s a world where people decide to change how they live in relation to the planet and reinvent many of the institutions, models and structures that have grown up over the past hundred years. It is very sustainable, very ethical and very community driven. It is a post-materialist world where work-life balance features strongly, as do social value, meaning, purpose and happiness. It is to some extent idealistic and certainly altruistic.

Finally, in the top right, we have Smart Planet. This is a world driven by a strong belief in the power of science and technology. A world powered by human imagination and ingenuity. An accelerated world of genetics, robotics, internet and nanotechnology, where smart machines reshape the world. However, there are some unexpected counter-trends. Alongside emotionally aware machines and augmented reality, we see a need develop for physical objects and human interaction. It is smart and efficient, but this creates a certain coldness, so people crave heart and soul.
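For what it’s worth, the four quadrants can be written down as a simple lookup table – a sketch only, with each scenario keyed to its position on the two axes as described above. The bottom-left scenario has no name in the text, so “Fear” is purely my own label:

```python
# The 2x2 scenario matrix as a lookup table. Keys are (horizontal,
# vertical) axis positions: optimism/pessimism across, 'Me'/'We' up.
# "Fear" is an illustrative label for the unnamed bottom-left quadrant.

SCENARIOS = {
    ("optimism",  "me"): "Moreism",       # bottom right: driven by greed
    ("pessimism", "me"): "Fear",          # bottom left: survivalist mindset
    ("pessimism", "we"): "Enoughism",     # top left: post-materialism
    ("optimism",  "we"): "Smart Planet",  # top right: science and technology
}

def scenario(outlook: str, mindset: str) -> str:
    """Look up the scenario for a pair of axis positions."""
    return SCENARIOS[(outlook, mindset)]

print(scenario("optimism", "me"))  # Moreism
```

The useful property of the matrix is exactly this: any combination of the two uncertain forces lands you in one, and only one, distinct future.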

But there’s still a problem here. The point of scenarios such as these is to help people to anticipate change. To foresee, to some extent, a range of alternative futures against which current thinking and strategies can be tested. But whilst it’s possible to track emergent scenarios – even to develop contingency plans for each eventuality – at the end of the day I think you have to commit.

Cast your mind back to the map and some of the trends in the middle. One of the trends was anxiety, which I said was caused by several of the other trends. But it’s also caused by the fact that people no longer have a clear view of what lies ahead.

Doing nothing and waiting for the future to unfold is one option. Making a series of educated guesses – deeply questioning why things are happening and asking what might happen next – is a much better idea. But there’s a third option. Leaders are supposed to have a clear vision of what can be achieved and what the future could look like if we take a certain path. Unfortunately, most leaders nowadays don’t do this. They wait to see what the majority of people want and then they say that they agree. This is despite the fact that most people don’t know what they want and soon forget what they’ve said. Similarly, CEOs and politicians will not agree to anything that takes too long or is too difficult to deliver.

But we can all be leaders. We can all drive things forward ourselves. All we have to do – as individuals, countries, corporations, even the entire planet – is to decide upon where it is we want to go and to slowly start moving in that direction. In other words, we need to pick the future we want and start building it. This will be difficult. We will get many things wrong. But if we could at least agree amongst ourselves where we are going we would soon start to re-perceive the present and the future would be a much better place to live.

One Question about the Future

I’m doing a TEDx talk in Munich next month and originally intended to do something around The Architecture of Thinking (alternative title: Can Buildings Have Brains?). However, one of the organisers has put me off. Instead I’m doing The Perils of Prediction, about why predictions about the future go so horribly wrong and whether there is any serious way to think about what’s coming next.

Anyway, long story short, I have to give the audience (or perhaps it’s a panel?) three questions. One question that I’ve thought of is “If you met someone from the future, what would you ask them?” If I were given just one question it would have to be “When will I die?” Given another question I would be curious about whether there is anyone else out there – is there intelligent life on other planets? Then again, as Arthur C. Clarke pointed out, this question can only have two possible answers, both of which are quite terrifying.

Anyone out there got any better questions?

Predictions for the next 25 years

The Observer newspaper recently asked a group of experts for predictions of what might happen over the next 25 years. Here are some extracts from the predictions, together with my own comments. Thanks to Bradley, who sent me the original link to this article. BTW, the original article came out on 2 January 2011, but many of the predictions refer to the year 2035, despite the fact that 2011 + 25 = 2036.

http://www.guardian.co.uk/society/2011/jan/02/25-predictions-25-years

1 ‘Rivals will take greater risks against the US’ 

“The 21st century will see technological change on an astonishing scale. It may even transform what it means to be human. But in the short term – the next 20 years – the world will still be dominated by the doings of nation-states and the central issue will be the rise of the east” – Ian Morris, professor of history at Stanford University and author of Why the West Rules – For Now (see my post of January 14)

Comment

I totally agree. And, to quote another author, Michael Mandelbaum: “The one thing worse than an America that is too strong, the world will learn, is an America that is too weak.” However, remember that the US is probably more resilient than many people imagine and China has more problems than many can see. I also agree that while science & technology make for great predictions (and scenarios), the most practical things you can look at are a few key indicators, such as demographics, energy, food and so on – all of which impact nation states.

2  ‘The popular revolt against bankers will become impossible to resist’

“The popular revolt against bankers, their current business model in which neglect of the real economy is embedded and the scale of their bonuses – all to be underwritten by bailouts from taxpayers – will become irresistible.” – Will Hutton, executive vice-chair of the Work Foundation and an Observer columnist

Comment

I came up with the same prediction back in late 2006, so I agree – but I don’t. At the moment this looks certain, but Hutton is making the classic mistake of extrapolating from the present. Banker bashing is a knee-jerk reaction. Moreover, it is too simplistic. What about the people who borrowed the money from the banks? Or the fact that interest rates have been at such low levels? For many people things aren’t that bad. I believe that the issue of banks will remain for a short period, but we will become concerned about something else in due course and the issue of bankers and banker bonuses will fade away. Unless, of course, there is indeed a second crisis of similar proportions, in which case things will change (but we will need a bigger crisis to create real shifts).

3 ‘A vaccine will rid the world of Aids’

“We will eradicate malaria, I believe, to the point where there are no human cases reported globally in 2035. We will also have effective means for preventing Aids infection, including a vaccine. With the encouraging results of the RV144 Aids vaccine trial in Thailand, we now know that an Aids vaccine is possible.” – Tachi Yamada, president of the global health programme at the Bill & Melinda Gates Foundation

Comment

I don’t know enough about this area to comment, so I won’t.

4 ‘Returning to a world that relies on muscle power is not an option’

“The challenge is to provide sufficient energy while reducing reliance on fossil fuels, which today supply 80% of our energy (in decreasing order of importance, the rest comes from burning biomass and waste, hydro, nuclear and, finally, other renewables, which together contribute less than 1%). Reducing use of fossil fuels is necessary both to avoid serious climate change and in anticipation of a time when scarcity makes them prohibitively expensive….disappointingly, with the present rate of investment in developing and deploying new energy sources, the world will still be powered mainly by fossil fuels in 25 years and will not be prepared to do without them.” – Chris Llewellyn Smith, former director general of Cern and chair of Iter, the world fusion project. 

Comment

Bad choice of headline, but I agree with the prediction. In 25 years coal, oil and gas will still produce the vast majority (70-80%) of the world’s energy. Nuclear will be more important in some, but not all, regions. As for renewable energy, the issue is scale. Solar looks promising for some countries, but wind, wave and geothermal just don’t cut it as global solutions in my view. There is the possibility of some ‘magic bullet’ technology, but I think this will only happen when the cost of fossil fuels becomes so great that we are forced to really do something.

5 ‘All sorts of things will just be sold in plain packages’

“In 25 years, I bet there’ll be many products we’ll be allowed to buy but not see advertised – the things the government will decide we shouldn’t be consuming because of their impact on healthcare costs or the environment, but that they can’t muster the political will to ban outright. So, we’ll end up with all sorts of products in plain packaging with the product name in a generic typeface – as the government is currently discussing for cigarettes.” – Russell Davies, head of planning at the advertising agency Ogilvy and Mather

Comment

I disagree. There will be more warnings – on everything from alcohol and confectionery to credit cards and cars – but I don’t agree with the plain wrapper theory.

6 ‘We’ll be able to plug information streams directly into the cortex’

“By 2030, we are likely to have developed no-frills brain-machine interfaces, allowing the paralysed to dance in their thought-controlled exoskeleton suits. I sincerely hope we will not still be interfacing with computers via keyboards, one forlorn letter at a time….I’d like to imagine we’ll have robots to do our bidding. But I predicted that 20 years ago, when I was a sanguine boy leaving Star Wars, and the smartest robot we have now is the Roomba vacuum cleaner. So I won’t be surprised if I’m wrong in another 25 years. AI has proved itself an unexpectedly difficult problem.” – David Eagleman, neuroscientist and writer

Comment

Best prediction on the list, not because I necessarily agree with it, but because it’s so provocative (that’s what predictions are for, surely?). Personally, I think this is all highly likely except for the bit about robots and the thought of downloading data directly into your mind – or of uploading your mind into a machine. We will have direct brain-to-machine interfaces (we already do – I’ve got one), which is much the same as mind control. We will ask our computers questions and they will answer back. The mouse and the keyboard will be long gone and machines will have some kind of emotional intelligence (they will know what mood you are in, for instance, or they will be programmed to be afraid of doing certain things). However, I believe that the inventions the geeks really want (general AI or the ability to stream data directly into the cortex) are impossible for the foreseeable future.

7 ‘Within a decade, we’ll know what dark matter is’

“The next 25 years will see fundamental advances in our understanding of the underlying structure of matter and of the universe. At the moment, we have successful descriptions of both, but we have open questions. For example, why do particles of matter have mass and what is the dark matter that provides most of the matter in the universe?” – John Ellis, theoretical physicist at Cern and King’s College London

Comment

A 50% chance of this coming true. The real question, the biggest question of them all perhaps, is where we’ll go from there. Personally, I find the idea of where the galaxies came from and where they (we) are going over the much longer term quite fascinating.

8 ‘Russia will become a global food superpower’

“By the middle of that decade (2035), therefore, we will either all be starving, and fighting wars over resources, or our global food supply will have changed radically. The bitter reality is that it will probably be a mixture of both… in response to increasing prices, some of us may well have reduced our consumption of meat, the raising of which is a notoriously inefficient use of grain. This will probably create a food underclass, surviving on a carb- and fat-heavy diet, while those with money scarf the protein… Russia will become a global food superpower as the same climate change opens up the once frozen and massive Siberian prairie to food production.” – Jay Rayner, TV presenter and the Observer’s food critic

Comment

Disagree. There are some big assumptions here. The first involves climate change and global warming. Personally, I think one should remain open about some of the predictions on this subject and look at several scenarios, including global cooling.

If cooling were to happen things could look very nasty for Russia. Second, I think the timescale is too short for this prediction and, third, Russia has some other issues, especially demography and health, to focus on before it worries about becoming a food superpower. BTW, Rayner says population will be nudging 9 billion by 2035. I think that could be wrong. I think it’s 2050.

9 ‘Privacy will be a quaint obsession’  

“Some, like the futurist Ray Kurzweil, predict that nanotechnology will lead to a revolution, allowing us to make any kind of product for virtually nothing; to have computers so powerful that they will surpass human intelligence; and to lead to a new kind of medicine on a sub-cellular level that will allow us to abolish ageing and death… I don’t think that Kurzweil’s “technological singularity” – a dream of scientific transcendence that echoes older visions of religious apocalypse – will happen. Some stubborn physics stands between us and “the rapture of the nerds”. But nanotech will lead to some genuinely transformative applications.” – Richard Jones, pro-vice-chancellor for research and innovation at the University of Sheffield

Comment

I don’t see the relationship here between the prediction (the headline) and the text of the story. Yes, nanotech has privacy implications, but they are nothing compared to what the internet is doing – or possibly genetics. Do I agree with the prediction about privacy being deleted? I haven’t made my mind up yet.

10 Gaming: ‘We’ll play games to solve problems’

“In the last decade, in the US and Europe but particularly in south-east Asia, we have witnessed a flight into virtual worlds, with people playing games such as Second Life. But over the course of the next 25 years, that flight will be successfully reversed, not because we’re going to spend less time playing games, but because games and virtual worlds are going to become more closely connected to reality.” – Jane McGonigal, director of games research & development at the Institute for the Future in California

Comment

I agree (though taken too far, I don’t like the idea).

11 ‘Quantum computing is the future’

“The open web created by idealist geeks, hippies and academics, who believed in the free and generative flow of knowledge, is being overrun by a web that is safer, more controlled and commercial, created by problem-solving pragmatists. Henry Ford worked out how to make money by making products people wanted to own and buy for themselves. Mark Zuckerberg and Steve Jobs are working out how to make money from allowing people to share, on their terms… By 2035, the web, as a single space largely made up of webpages accessed on computers, will be long gone… and by 2035 we will be talking about the coming of quantum computing, which will take us beyond the world of binary, digital computing, on and off, black and white, 0s and 1s.” – Charles Leadbeater, author and social entrepreneur.

Comment

I don’t think this is saying very much. 

12 Fashion: ‘Technology creates smarter clothes’

“Technology is already being used to create clothing that fits better and is smarter; it is able to transmit a degree of information back to you. This is partly driven by customer demand and the desire to know where clothing comes from – so we’ll see tags on garments that tell you where every part of it was made, and some of this, I suspect, will be legislation-driven, too, for similar reasons, particularly as resources become scarcer and it becomes increasingly important to recognise water and carbon footprints.” – Dilys Williams, designer and the director for sustainable fashion at the London College of Fashion

Comment

Define ‘smart’? I see a polarization between high-tech fashion (wearable computers, smart fabrics, global brands) and sustainable clothing (often locally sourced and highly functional).

13 Nature: ‘We’ll redefine the wild’

“I’m confident that the charismatic mega fauna and flora will mostly still persist in 2035, but they will be increasingly restricted to highly managed and protected areas….. Increasingly, we won’t be living as a part of nature but alongside it, and we’ll have redefined what we mean by the wild and wilderness…Crucially, we are still rapidly losing overall biodiversity, including soil micro-organisms, plankton in the oceans, pollinators and the remaining tropical and temperate forests. These underpin productive soils, clean water, climate regulation and disease-resistance. We take these vital services from biodiversity and ecosystems for granted, treat them recklessly and don’t include them in any kind of national accounting.” – Georgina Mace, professor of conservation science and director of the Natural Environment Research Council’s Centre for Population Biology, Imperial College London

Comment

Sounds perfectly reasonable to me.

14 Architecture: What constitutes a ‘city’ will change

“In 2035, most of humanity will live in favelas. This will not be entirely wonderful, as many people will live in very poor housing, but it will have its good side. It will mean that cities will consist of series of small units organised, at best, by the people who know what is best for themselves and, at worst, by local crime bosses…Cities will be too big and complex for any single power to understand and manage them. They already are, in fact. The word “city” will lose some of its meaning: it will make less and less sense to describe agglomerations of tens of millions of people as if they were one place, with one identity. If current dreams of urban agriculture come true, the distinction between town and country will blur.” – Rowan Moore, Observer architecture correspondent

Comment

Urban agriculture is pure fantasy in my view, especially vertical farming. I agree that in 25 years most people will probably live in poor housing, but it will likely be an improvement on what people lived in 25 years earlier.

Most people will definitely live in cities, but I suspect that with the exception of a handful of iconic individual buildings, office complexes and retail developments, most people will continue to live in what we, living in 2011, would immediately recognize as cities. 

15 Sport: ‘Broadcasts will use holograms’

“Globalisation in sport will continue: it’s a trend we’ve seen by the choice of Rio for the 2016 Olympics and Qatar for the 2022 World Cup. This will mean changes to traditional sporting calendars in recognition of the demands of climate and time zones across the planet…Sport will have to respond to new technologies, the speed at which we process information and apparent reductions in attention span. Shorter formats, such as Twenty20 cricket and rugby sevens, could aid the development of traditional sports in new territories.” – Mike Lee, chairman of Vero Communications and ex-director of communications for London’s 2012 Olympic bid.

Comment

I am just about to start work looking at the future of sport, so it’s a bit early to really comment on this one. Yes, the globalization of sport will probably continue, but we should not underestimate counter-trends (e.g. local or ‘real football’ as an alternative to global football brands with mega-budgets). Yes to new technologies too, and I certainly agree with short-format sport as a reaction to shorter attention spans (golf is next for the short-format treatment, apparently).

16 Transport: ‘There will be more automated cars’

“It’s not difficult to predict how our transport infrastructure will look in 25 years’ time – it can take decades to construct a high-speed rail line or a motorway, so we know now what’s in store. But there will be radical changes in how we think about transport. The technology of information and communication networks is changing rapidly and internet and mobile developments are helping make our journeys more seamless. Queues at St Pancras station or Heathrow airport when the infrastructure can’t cope for whatever reason should become a thing of the past.” – Frank Kelly, professor of the mathematics of systems at Cambridge University.

Comment

Agree with the first bit, but good luck with the point about queues and seamless journeys. First, I think he could be overestimating the impact of technology and, secondly (more importantly), he is neglecting economics. In the West especially, governments are burdened by debt and rising urban populations, and it’s quite possible that much public infrastructure will be in very poor repair by 2036.

17 Health: ‘We’ll feel less healthy’

“Health systems are generally quite conservative. That’s why the more radical forecasts of the recent past haven’t quite materialised. Contrary to past predictions, we don’t carry smart cards packed with health data; most treatments aren’t genetically tailored; and health tourism to Bangalore remains low. But for all that, health is set to undergo a slow but steady revolution. Life expectancy is rising about three months each year, but we’ll feel less healthy, partly because we’ll be more aware of the many things that are, or could be, going wrong, and partly because more of us will be living with a long-term condition.” – Geoff Mulgan, chief executive of the Young Foundation

Comment

Agreed.

18 Religion: ‘Secularists will flatter to deceive’

“Organised religions will increasingly work together to counter what they see as greater threats to their interests – creeping agnosticism and secularity… I predict an increase in debate around the tension between a secular agenda which says it is merely seeking to remove religious privilege, end discrimination and separate church and state, and organised orthodox religion which counterclaims that this would amount to driving religious voices from the public square.” – Dr Evan Harris, author of a secularist manifesto

Comment

Fascinating subject (I’d love to do some scenarios on the future of religion). I agree with some of this. I think that the world will become more religious over the next 25 years but the idea that secularization will grow is potentially suspect.

19 Theatre: ‘Cuts could force a new political fringe’

“Student marches will become more frequent and this mobilisation may breed a more politicised generation of theatre artists. We will see old forms from the 1960s re-emerge (like agit prop) and new forms will be generated to communicate ideology and politics.” – Katie Mitchell, theatre director.

Comment

Disagree. Yes, this could happen, but I think a much stronger trend could be quite the opposite. If the economy (in the West) fails to pick up and we enter a prolonged period of doom and gloom, I'd predict that people will get so sick and fed up that sugary, escapist, nostalgic fantasies and musicals will thrive.

20 Storytelling: ‘Eventually there’ll be a Twitter classic’

“Twenty-five years from now, we’ll be reading fewer books for pleasure. But authors shouldn’t fret too much; e-readers will make it easier to impulse-buy books at 4am even if we never read past the first 100 pages…My guess is that, in 2035, stories will be ubiquitous. There’ll be a tube-based soap opera to tune your iPod to during your commute, a tale (incorporating on-sale brands) to enjoy via augmented reality in the supermarket. Your employer will bribe you with stories to focus on your job. Most won’t be great, but then most of everything isn’t great – and eventually there’ll be a Twitter-based classic.” – Naomi Alderman, novelist and games writer

Comment

I’d predict that in 25 years Twitter won’t exist (and I’ll take a bet on that).

Forensic phenotyping – part 2

Here we go. Further to my post on forensic phenotyping (Jan 6), here’s something from my book Future Files (2007).

If you are charged with a criminal offence in the UK, a sample of your DNA is taken and added to a national DNA database where it stays indefinitely, even if you are subsequently acquitted. So far the UK database contains the profiles of 3,130,429 people, or 5.23 per cent of the entire UK population. In contrast, the US DNA database contains just 0.99 per cent of its population, while most other national databases contain the names of fewer than 100,000 people. In theory, this DNA fingerprinting is a very good idea, not least because the technology allows police to create a DNA fingerprint using a single human cell (taken from a print on a broken window, for example). In the future, police officers will carry handheld devices that can instantly upload these samples and test them against the database. These samples will then be used to create 3-D photofits of suspects, giving police officers accurate information on likely height, skin colour, hair colour and even personality type.

Privacy campaigners are obviously concerned about this but the database and associated technology will be so useful that I’d expect the database to be enlarged as part of a national biometric identity-card scheme. Eventually, every single person in the country will therefore be listed ‘for their own security’, at which point adding some kind of GPS or other location-tagging component would seem an entirely logical idea. The problem with this, of course, is that once a government starts to view all its citizens as potential suspects, there will be subtle changes to how everything from policing to law making operates. There is also the issue of data accuracy and data security.

Forensic phenotyping

I’m trying to write something about prediction for Fast Company magazine. So far all I’ve got is this from Aston Webb, a Victorian architect, talking about London in the year 2014 from the perspective of 1914:

“There are two great railways stations, one for the north and one for the south. The great roads out of London are 120 feet wide, with two divisions, one for slow-moving and the other for fast-moving traffic; and there will be a huge belt of green fields surrounding London.”

The other thing I’m trying to weave in is a story I picked up last Saturday about some Dutch scientists who are on the verge of being able to develop physical descriptions of criminals based on traces of DNA left at crime scenes. Apparently, a blood test can predict the age of a suspect plus or minus nine years. Meanwhile, researchers in the US are close to being able to predict skin colour and perhaps facial geometry via analysis of DNA sequences.

Could such tests eventually prove better than eyewitnesses? Some people seem to think so, although the issue of whether or not such tests could generate false predictions seems rather vital. There’s also the question of whether, if one can predict likely age, race and facial features, such tests could be used in a kind of ‘Department of Future Crime’ way to predict criminal behaviour before it happens. That is, could DNA tests predict future aggression or a tendency to lie or steal?

BTW, I don’t have a copy with me, but I’m pretty sure that I outlined exactly such developments in my book Future Files back in 2007. How? I just read the right literature (New Scientist and Scientific American from memory), joined a few dots and made an educated guess. I’ll dig out my original text when I get a moment.