Monthly Archives: May 2015
Digital communication – on a higher level
I’ll have to find somewhere to add this to the book. I’ve just been reading Harper’s magazine (US) May 2015 issue. On page 24 there’s a small feature about Instapray, an app that allows people to post and request prayers. The best bit is undoubtedly a post that reads as follows:
“God, please help me overcome my internet addiction. It’s ruining my life.”
New book – help wanted!
I’m not happy with the chapter title for money. The rest are fine. Can anyone improve on it? Below is the current chapter title and some opening lines for context. Any thoughts gratefully received.
4. Money & Economy: Is invisible money tempting us down a path where reality will jump out and mug us?
A few years ago I was walking down a street in central London when a white van glided to a halt opposite. Four men stepped out and slowly slid what looked like a giant glass coffin from the rear. Inside it was a large live shark.
The sight of a shark in central London was slightly surreal, so I sauntered over to ask what was going on. It transpired that the creature in question was being installed in a giant underground aquarium in the basement of a house in Notting Hill. This secret subterranean lair should, I suppose, have belonged to Dr Evil. To local residents opposing deep basement developments it probably did. A more likely candidate might have been someone benefiting from the digital and globally networked nature of finance. A partner at Goldman Sachs, perhaps. This is the investment bank immortalised by Rolling Stone magazine as “a great vampire squid wrapped around the face of humanity.” Or possibly the owner was the trader known as the London Whale, who lost close to six billion dollars in 2012 for his employer, JP Morgan, by electronically betting on a series of highly risky and somewhat shady derivatives known as credit default swaps.
Go Slow Media
Back in 2010 I wrote that “A slow thinking movement will emerge as a parallel to the Slow Food movement, with people celebrating slow reading, slow writing and other forms of paper based communication” (Future Minds p 171).
I’ve previously commented on Delayed Gratification magazine (good article on journalism robots and automated writing in the current issue), but something else caught my eye last week. BBC television screened a two-hour long programme about a journey down a canal from the boat’s perspective – with no soundtrack whatsoever. All you can hear is the sound of water, dogs barking and birds tweeting (that’s wildlife singing, not women sending twitter updates).
Of course the temptation to record the show and play it back at x30 speed is almost too much to resist. Apparently the ‘go slow’ show is part of a series that was inspired by… wait for it… a nine-hour-long Scandinavian programme showing someone knitting. A good example of a counter-trend in action.
It’s a wonderfully weird world.
Digital v Human
I thought I’d share the preface to the new book, Digital v Human. I’ll share some more in a couple of months when things are more polished.
—
Preface: Taming the Future
The future casts a long shadow, in my case all the way back to Australia in 2006 when I was asked to write a book called Future Files about where I thought the world was heading over the next 50 years. But the future was always an excuse, used more as a satirical mirror than a distorting crystal ball. What I was interested in then, and remain interested in today, is people and how they respond to new ideas and events. Also, how we relate and respond to each other, which is what this book is about. It is people’s lives, their deepest dreams, what they believe in and what they are afraid of, that captivates me, not the latest ephemeral gadget, although these things can, and do, influence each other.
Future Files must have hit a nerve, because the book ended up being published in 16 languages. Part of the reason was timing – there hadn’t been a book about the distant future for a long time. But I was also lucky. I wrote then that debt levels were unsustainable and that a systemic shock to the financial system was inevitable. This was: “not a debt mountain, but an avalanche waiting to descend…. big banks, in particular, will come under increasing scrutiny about their lending practices, and there will be calls for salary and profit caps.”
Nothing like a page of prophecy to sell some books, although I’m still waiting patiently for the European Union to “splinter and ultimately collapse” and for the day when “Women with facial lines will be highly desirable.” It seems that I made the mistake of thinking we’d tire of pixelated perfection, but evidently not. We haven’t grown tired of debt either, in which case I suspect that history will very soon repeat itself in the form of another spectacular financial crash.
But the main reason the book sold well was due to an emerging epidemic of anxiety and insecurity. The world was changing and readers were seeking a narrative that explained where things were going. The book provided a comforting cloak of reassurance to those grieving the loss of an imagined future.
The distant future had once been hopeful and at times rather fun. It was a preview of coming attractions. But by late 2007 people had given up hope of seeing flying cars or owning personal jetpacks. All people wanted to know was whether everything would turn out all right. Would there be a comforting resolution after the explosive opening sequence? Would computer-generated effects continue to enthral or would the computer move from all-conquering hero to sinister villain lurking behind our flickering screens?
This dystopian discomfort was likely linked to a feeling that events had got out of control. Things were unfolding too fast for people to comprehend. Gone were the days when you could start a broken-down car by yourself or understand how a camera worked. Even by 2007, it wasn’t just financially engineered credit default swaps or ‘additionality’ linked to carbon credits that were incomprehensible – you almost needed a degree in complex systems theory simply to switch on a domestic washing machine. Seriously, do we really need 40+ washing choices, including the incomprehensible option to wash your clothes later?
Complexity, synonymous in engineering terms with instability, had become a hallmark of the early 21st Century and the world’s axis had shifted toward the outskirts of normal. This was unsettling, especially to anyone brought up in an analogue, western-centric world where globalization had meant Americanization and cheap washing machines.
There have always been generational overlays to future fatigue. As the writer Douglas Adams observed: “Anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilization as we know it, until it’s been around for about ten years when it gradually turns out to be alright really.”
Despite this, sometime after the Millennium (probably after the explosive events of 9/11 or, possibly, after the premature death of Douglas Adams) the future became obscured. The dream that we once called the future soured and its shadow became awkward and indistinct. This wasn’t true for everyone. How one imagines and responds to the future has always depended upon who you are and where you are. The future is a mental construct projected from things that have been experienced.
In large parts of Asia and Africa rapidly rising incomes and opportunities mean that optimism is in the ascendant, while across swathes of the US and Europe declining real incomes mean that it’s doom and gloom that’s projected forwards. Nevertheless, by 2008 the US financial crisis that had started with people borrowing too much money had become a global problem and unfolding events had created a vortex into which many age-old certainties were sucked.
If we had been able to remember the past and not over-react to the present we might have been all right. If the crisis had occurred much earlier, ignorance might have remained bliss. There was once less information and both people and money were less connected, which meant fewer systemic risks.
A study conducted by Angela Dimoka, director of the Center for Neural Decision Making at Temple University (US) found that as information is increased, so too is activity in the dorsolateral prefrontal cortex, a region of the brain associated with decision-making and the control of emotions. If information flow is steadily increased, activity in this region suddenly falls off. The reason for this is that part of our brain has essentially left the building. When incoming information reaches a tipping point the brain protects itself by shutting down certain functions. Key implications include a tendency for anxiety and stress levels to soar and for people to abstain from making important decisions.
Fast-forward a few years and some e-vangelists started looking at the world through Google Glass and other augmented reality devices. But others, a majority perhaps, put on rose-tinted spectacles and framed their gaze backwards. On the fringes, some squinted scornfully and aspired towards self-loathing. Others suggested that the very idea of human progress had become impoverished. Maybe they had a point, but there was no redemptive framework in clear sight.
What this amounts to is a clash between those racing towards the future and others fleeing from it. A similar tension between faith and scepticism plays out between Islamic fundamentalism and liberal agnosticism. Some fundamentalists would like to reinstate a 7th Century legal framework, while some online libertarians would like to escape legal constraints altogether.
Western self-loathing remains an especially odd development. On most measures that matter – life expectancy, infant mortality, literacy, extreme poverty, starvation, the number of women in education and employment – life has never been better for most of the planet. If you doubt this you have clearly not been paying attention.
But despite the good news about the expansion of the global middle class, the electrification of Africa or survival rates for cancer, we focus instead on doomsday forecasts about rogue asteroids, global pandemics and employment-eating robots.
These, along with climate change, obesity, resource depletion, biodiversity loss and pollution, are serious issues, but I’d suggest that they are focal points for deeper anxieties and are unlikely to be terminal for the human race. So why are we feeling so miserable when there’s so little to feel miserable about?
Prior to 9/11 (or the fall of the Berlin Wall in 1989 or the financial crisis of 2008…take your pick) people believed that they had a clear view of what lay ahead. With 20/20 hindsight such views were delusional. But the detail was irrelevant. At least people had a sense of direction, from which they could construct a narrative to make sense of things. For many people life was hard, but they knew where they stood, which is why countries like Russia long to go backwards and reconstruct previous certainties along with territorial borders.
Today many people feel that the future has evaporated, or else that they are held hostage by some unknowable and uncontrollable force. But this is nonsense. Firstly, certain elements of the future are pre-determined. Demographics retain a high degree of certainty, while geography and geology impose a number of constraints. Parts of the future can therefore be found on the flood plains and tributaries of history.
Secondly, the collective psychology of nations, influenced again by the past, can suggest a sense of direction. Thirdly, there’s technology. It’s true that technology is neutral, but only if you take humans out of the equation. Once technology reaches human hands it creates change.
More specifically, it is at the nexus of human history, human nature and what many regard as increasingly inhuman technologies that I’d expect the largest tensions to bubble up over the years ahead, especially as we struggle to adapt our slowly evolving monkey brains to a rapidly changing technological landscape.
These thoughts were on my radar in 2006 when I wrote that: “to a large degree, the history of the next 50 years will be about the relationship between technology and people,” but I believe that I underplayed the significance of this statement.
This is odd, because it is a point well made by Alvin and Heidi Toffler in Future Shock, published in 1970, of which I have a well-thumbed copy. Their book argued that the perception of too much change over too short a period of time would create psychological problems and mental instability at both an individual and societal level.
You might argue that they were wrong (it didn’t happen) or that they were right, but that they got their timing very wrong (futurists often use the line “give it time” in relation to suspect forecasts). One might also speculate about who gets listened to and what gets believed and why, but this isn’t the time or place.
Personally, I think the Tofflers anticipated something of significance and if my book has a dramatic chase scene this would be it – our desire for change and renewal crashing up against our need for permanence and stability. Will we be forced to adapt to new technologies and global norms or will we insist that our new technologies adapt to us, deleting, controlling or escaping from them as necessary?
How, for instance, should technology serve humanity and what, ultimately, is its purpose? Should all forms of automation and artificial intelligence be made to exist within an agreed moral framework and where, if anywhere, should the line be drawn in relation to what humans and machines are permitted to do?
Should humans and machines be allowed to merge, creating augmented, partially synthetic or cyber-human hybrids, and, if so, where would this leave any remaining natural, pure or organic humans?
Whatever happens we should never lose faith, because the future is always wide open. The future is shaped by the choices that we make and these choices can always be challenged and changed, even at the last minute.
In one sense the problem we currently face is not technology but humans, though more about us later. One thing we should certainly do is worry far less about what might happen over the coming decades and focus far more upon what it is that we, as individuals and institutions, want to happen. And it is not necessarily logic, but rather our deepest hopes and our darkest fears, that will help shape this.
The aim of this book is not prediction, but illustration. It is a critique of how we live now and a discussion about how we might live next. It is about who we are and where we could be going and about the need for human beings to remain central to any new digital interests or perspectives. Hopefully, the shadow cast by the future will henceforth be our own and will provide a small degree of comfort rather than bewilderment.
Richard Watson, somewhere in England, May 2015