Update. Event in London on the future of work followed by a flying visit to Oxford to do a talk at Blackwell’s bookshop. (Thanks Zool). Bright audience. Very bright audience. To misquote Yeats, it’s a wonder that anyone does anything in Oxford. The whole place just seems to be built to encourage people to think about the past and dream about the future. Nice comment on the Blackwell’s blog about “literary festivals (and, by extension of that, all programmes of author events) being almost like the new public meetings – not just individual occasions of interest but actual forums where the world moves forward through discussion, free and frank exchanges of views, philosophical extemporisations.” Marvellous stuff.
After breakfast I popped into the Ashmolean Museum on a whim and nearly burst into tears it was so beautiful. (Where there is beauty lies truth etc). Then back to London to have lunch with an old boss and on to a meeting at the London Business School. Then back home to open an old bottle of red and watch some fantastically gripping reality TV.
The talk at Blackwell’s was based on the talk I did at the RSA, but tightened up a bit. If you are interested (i.e. you have a 21-hour flight to Sydney coming up and don’t have anything to read) here’s the transcript.
————-
The book (Future Minds) is about the impact of digital culture. It’s about how 4 billion mobile phones, 2 billion PCs, 550 million Facebook accounts and a Googlezillion internet searches, texts, tweets, zaps and pokes are changing how we think. At least that’s what the book ended up being about.
The reason I wrote the book is that I had an idea. An idea that occurred to me early one morning when I was looking out from the rooftop deck of a hotel in Sydney. Interestingly, I was the only person doing this. Everyone else was frantically answering calls or tapping BlackBerrys.
The idea I had was that physical spaces, like the one I was in, might influence thinking. But then I thought to myself – would I be having this thought if I were on the phone, looking at a computer screen, in a basement in London? I thought then – and I still think now – that the answer is no.
Modern life is changing how we think, but perhaps the clarity to see this only comes with a certain distance or detachment.
My original idea was to write a book about how physical spaces (like hotels or bookshops) influence thinking. A book about architecture essentially.
However, my publisher pointed out that such a book would probably sell about 3 copies, so I broadened the remit to include virtual spaces, digital devices and eventually screen culture.
It thus became a book about the future of thinking with a set of social and technological trends as the unifying force. Ultimately, though, it’s about something else. It’s about our addiction to digital technology and the way this is changing our relationships with each other.
The book is split into three parts.
The first part is about how attitudes and behaviours are changing. It looks at teens and pre-teens and considers, amongst other things, connectivity, multitasking, electronic games, the fate of physical books and whether IQ tests could be making kids stupid.
The second part of the book is about why this matters. This section looks at how our minds are different to machines, considers where ideas come from and contains a section about paperless offices and a plea for more organised chaos.
The third and final section is then about what we, as individuals and institutions, can do about a world choked with too much information and distraction.
I’d like to look briefly at each of the three sections.
We are constantly connected nowadays. This is largely due to digital technology.
A decade ago there were fewer than 500 million mobile phone subscribers worldwide. Now there are 4.6 billion. In the UK 50% of children aged between 5 and 9 now own a mobile phone.
When I was growing up there was only one phone in the house, which represented a physical location, and it didn’t take messages. There was also just one TV, which had 3 channels and these all closed down at around midnight. Children’s TV was on for just two hours on two channels. Nowadays, 79% of British kids have a TV in their bedroom and children’s programming never ends if you have satellite or cable.
In the US it’s worse. 1/3 of kids in the US live in a home in which the TV is on “always” or “most of the time.” 80% of US toys now contain an electronic component and tech gadgets are now the Christmas presents of choice – amongst preschoolers!
Is it just a coincidence, I wonder, that Ritalin prescriptions (for attention deficit disorder) have grown by 300% over the last decade?
What I’m getting at here is the thought that we never get a chance to really be by ourselves, which means we never get a chance to really know ourselves. We never get the opportunity to sit quietly and think deeply about who we are and where we are going.
Ironically, this universal connectivity also means that we tend to be alone even when we are together. You can see this when couples go out to dinner and spend most of their time texting – or when kids get together for play-dates and end up sitting next to each other on separate gaming consoles for hours on end. This is what I call Digital Isolation. It is what Sherry Turkle at MIT calls being Alone Together.
What worries me most though is what’s happening to the quality of our thinking, which I believe is becoming shallow, narrow, cursory, fractured and thin.
This is problematic because I firmly believe that originality largely depends on periods of deep thinking. I believe that serious creativity, whether in academia, business or the arts, is dependent on periods of thinking that are calm, concentrated, focused and above all reflective.
Moreover, unless you have been asleep for the last thirty years, you will have noticed that a knowledge revolution has replaced brawn with brains as the primary tool of economic production. It is intellectual capital that now matters. But we are on the cusp of another revolution. Soon, smart machines will compete with clever people for employment and even human affection. Hence being able to think in ways that machines cannot will, I believe, become vitally important.
Put another way, machines are becoming adept at matching stored knowledge to patterns of human behaviour, so we are shifting from a world where people are paid to accumulate and distribute fixed information to one where knowledge will be automated and people will be rewarded instead as conceptual and creative thinkers.
However, there’s a problem. Our education system is still largely designed to produce people that will compete head on with such machines. We are still teaching logical and convergent thinking, when what’s often needed, I would argue, is illogical and divergent thinking. In short, we are ignoring one half of our brains.
There is a woman in the US called Linda Stone who came up with the term Continuous Partial Attention to describe how our thinking is becoming fractured and fragmented. My version of this is Constant Partial Stupidity.
We tend not to fully concentrate on one thing nowadays. Instead, we continually scan the digital environment for new information. And we start to believe that we can do more than one thing at once. According to a 2009 US study, multi-tasking is now becoming the normal state. However, the same study found that the people who multi-task the most are in fact the worst at it. Heavy multi-taskers are poor at analysis and forward planning. They also lose the ability to ignore irrelevant data. They are suckers for distraction and become bored when they are not constantly stimulated.
Multi-tasking also means that we are more likely to make silly mistakes, although sometimes mistakes get serious. I don’t know whether you remember Mr de Silva but this was the man who famously used his laptop to get instructions on how to avoid a traffic jam on the M6 motorway. Unfortunately, he was driving a lorry at the time and smashed into a line of cars killing six people.
Such people are not alone in outsourcing their thinking to machines. For example, if you can Google any piece of information more or less instantly why bother learning anything? If a Sat Nav can always tell you where you are why worry about situational awareness or bother to learn how to read a map?
Now it’s true that you can always turn the technology off, but most of us don’t because there is cultural pressure to be constantly available and to instantly respond.
My first point here is that it seems to me that we need context as well as text. We need to understand principles before we move on to applications. We need breadth and depth not superficial facts.
Unless we know how things relate to one another we will just have information. For knowledge we need to understand connections. For wisdom we need to understand consequences.
Second, if everyone is using the same sources what of originality? You might think I’m exaggerating about this but I’m not. Far from creating an intellectual paradise, it appears that digitalisation might be narrowing our thinking. For example, 99% of Google searches never proceed beyond page 1 of results and, according to a meta-study of academic studies, academic papers are now citing fewer studies, not more.
Third, what if one day the technology doesn’t work? What then? We assume, for instance, that the internet will always work. But what if it doesn’t? What if one day the volume of data becomes so great that it becomes blocked? What if energy shortages disrupt access? What if cyber-attacks become such a problem that things move offline? What then?
Why does any of this matter? Who cares if our brains are changing? We’ve always invented new things. We’ve always worried about new things and we’ve always moaned about younger generations. Surely most of what I’m saying is conjecture mashed up with middle-aged technology angst?
I think the answer to this is that it’s a bit different this time. Digital devices are becoming ubiquitous. They are becoming addictive. They are becoming prescribed.
At the moment we have a choice. We can choose paper over pixels. We can choose to talk to a human being rather than a customer service avatar. We can choose to go to the library rather than Google.
But what if one day there is no choice? What if all books become e-books? What if all physical libraries are replaced by digital equivalents (or by Google)?
This probably sounds fanciful. But it’s all happening already. Governments and businesses alike are moving everything they can online for the sake of cost or convenience. But I am concerned that while the quantity of communications is increasing exponentially, the quality may be going backwards.
Secondly, and more importantly, people need people. It seems that one by-product of the digital age is that our relationships are becoming more superficial. Thanks to text messages, e-greetings and social networks we know more people but we know them less well. We have replaced intimacy with familiarity.
It is interesting to note that 10 years ago 1 in 10 Americans said they had nobody to confide in. Ten years on, this figure has jumped to 1 in 4.
Now I’m sure I will be accused of exaggerating this point, but it seems to me that empathy and tolerance of others could be two of the casualties of instant digital gratification. If we are constantly looking at a screen, in iPod oblivion as it were, we are less aware of others, some of whom may need our help and some of whom may have something important to say.
Equally, if we are able to personalise our experience of reality via RSS feeds, Google alerts and friendship requests, it is less likely that we will be confronted with ideas or people that challenge the way we think.
The internet is a wonderful invention. I couldn’t do most of what I do today without it. Moreover, I am not declaring war on digital devices. Many of them are extremely useful. Neither am I saying that Google is evil or that Apple is rotten. They are not. I’m just arguing for some level of analogue/digital balance.
I’m saying that we should think further ahead and question some of our assumptions.
Also that technology should be used in combination with human judgement, not as a replacement, and that we should use technology to enhance relationships, not to negate them.
So what can we do?
The first thing we need to do is think. We need to think about the relative merits of different analogue and digital technologies. For example, evidence is emerging that pixels are different to paper. When we use screens our minds are set on seek and acquire. This is great for the fast accumulation or distribution of facts. But with paper it’s different. Our minds are more relaxed, we tend to see context. Our thinking is more curious and questioning. In short, paper is more absorbent.
Another example. There’s evidence that people are more reckless with money when it’s digitalised. It’s as though it belongs to someone else and we spend it more impulsively. It’s the same, in my experience, with digital statements and bills. We look but we do not see.
Think about this for a second. What if it were proven beyond all reasonable doubt that reading something on a screen is inferior to reading something on paper? Can you imagine how that would impact education? Can you imagine the lobbying that would go on from the likes of Google, Apple, Microsoft and Nokia?
To sum up, I think that if originality and empathy are to survive the onslaught of the digital age we need to do three things.
First, we should restrict the flow of information. In the US, people consumed 300% more information in 2008 than they did in 1960. We should therefore learn how to control the flow of information and remember that not all information is useful or trustworthy. We should also remember that, despite the digital revolution, the medium still influences the message.
The second thing we should do is disconnect from time to time. Our brains need to relax. If they don’t they can’t function properly. Without rest our memories are not properly stabilised. Moreover, a lack of sleep can inhibit the formation of new brain cells. So switch your mobile off at certain times. It’s interesting to me that we try to set boundaries around screen use for our kids, yet we do not restrict our own usage.
So resist the urge to take your BlackBerry on holiday. Don’t answer texts in restaurants and don’t send emails when you are spending time with your kids.
Most of all, create the time and space to think. When, for instance, was the last time that you told someone that you were going off “to do a bit of thinking”? So go for a walk. Do something that is superficially mundane that allows your mind to wander.
Third, go to places where new ideas can find you. I did some research for my book that asked people where they did their “best thinking.”
Interestingly, not a single person mentioned digital technology. Nobody said, “On the phone”, “On Facebook”, “Twitter” or “Google”. Technology, it seems, is good for developing and disseminating ideas, but not much use for hatching them.
What I also found fascinating was that only one person said in the office, and they said very early in the morning – in other words, when the building wasn’t really functioning as an office at all.
Why don’t people have good ideas at work? The main reason is that they’re too busy. You need to stop thinking before you can have a good idea.
Did any of the answers I received from people about their thinking spaces have anything in common? I think so. Scale seems to be important. You need to feel small. That’s probably why so many people mentioned beaches, mountains and churches. I imagine that great libraries are good too. In such situations our minds seem to expand to fill the available space. Seeing a distant horizon also appears to help in that our thinking is projected forward.
Movement (especially planes, trains and automobiles but also water) is good, as are environments that are slightly restricted or beyond our control (i.e. a long-haul plane trip where you can’t go anywhere). I imagine prisons are quite good too.
Here’s a final thought. The human brain is, as far as we know, the most complex structure in the universe. But it has one simple feature. It is not fixed. It is malleable. It is impressionable to the point where it records every single thing that happens to it.
You might think that text messages and internet searches don’t affect you but you’d be wrong. They already have. The question is therefore not whether they influence your thinking but how.
Missed it!
Do you have any others planned within a reasonable drive time of Cheltenham?
I’ve just sent you an email but always open to offers. I don’t do enough public lectures so especially interested in these, certainly anything around books, bookshops, public libraries and so on. Also schools + anything with kids in the audience!