The Strategic Use of Lunch

Here’s the third (of three) top posts from the past year or so.

6 million workers in the UK do not take lunch breaks, according to a study by BUPA, a health insurance company. 34% say this is because of pressure from managers, while 50% say it is due to excessive workloads. 48% also say that without lunch their productivity drops at around 3.00 pm, leading to around 40 minutes of lost work a day – worth around £50 million per day in lost productivity.

Seems it’s time to roll this essay out once again…

Lunch. Let’s do lunch. Let’s skip lunch. Lunch is for wimps. It is fourteen years since Gordon Gekko made that last infamous announcement, and yet ‘lunch’ is still a dirty word.

We need to eat; but we seem also to need to justify the time spent doing it. Sometimes we sit alone at our computer while we wolf a sandwich (extra points if purchased from an entrepreneur with a basket actually in the office). Sometimes we snatch a bite while we rush round doing the domestic errands that will allow us to stay later that night.

Sometimes we miss lunch altogether: we jog to burn up calories (the absolute opposite); or we go to the gym to work out (work up?) aggression before plunging back into the dog-eat-dog marketplace. Anything, anything but simply having lunch and enjoying it.

Why? When, indeed, the middle of the day is such a natural and healthy moment to eat – sustaining energy, allowing digestion and feeding conversation.

It is an Anglo-Saxon phenomenon, broadly speaking. Further south, societies have a stronger tradition of eating, and then resting, in the middle of the day. The day starts earlier, is broken by some hours in the afternoon, and then goes on later into the evening. The anthropological explanation of this is climatic: it is the heat which dictates, not the digestion.

Except that, now that Anglo-Saxon capitalism is dominant, in city after European city the habit is beginning to die, as desks must be staffed until Tokyo has gone to bed and New York has woken up. Global capitalism has overridden variations in the global climate.
This new capitalism is lean, mean and very hungry.

Back in the bad old days, when socialist sensitivities were keen, business lunches possibly earned a bad name. Fat cat capitalists sat late into the afternoon over brandy and cigars, while their workers toiled in satanic mills, only emerging late in the afternoon with pale, hungry faces and emaciated limbs.

But now bosses are thinner than shop-floor workers – they can afford more expensive gyms, and brandy and cigars too. Business entertaining goes on, but water is the order of the day for the ambitious lunchtime drinker, and the lunch – notoriously never free – must be justified by a concrete deal, a bottom line, a result.

Meals are significant social moments in all cultures. Meals have always attracted rituals and meanings. They used to be far more simply and recognisably significant in our own culture. The directors would have lunch in their own dining room, and would invite others to join them. Banks, shops and offices would close for lunch. Lunch was important. And it was important throughout the week. Sunday lunch involved all the family sitting down. Christmas dinner still does.

In America there is Thanksgiving. In church there is the Eucharist, the Mass, the Holy Communion, The Lord’s Supper … meals are where we find much that is significant about how we live, what is changing, what is enduring.

Lunch is an interface. Lunch is where work meets people (where colleagues became friends before the days of motivational workshops and team bonding courses). Lunch is where people talk and people think. It is where the new economy meets a very ancient set of rituals and customs.

How we approach lunch says a lot about our attitude to work, and work’s attitude towards us. Lunch has been on a long downhill trek – from luncheon to something we snatch, shamefaced, alone. So what have we gained by downgrading lunch? What have we lost?

“I’m going for lunch.” Yes, but are you going for lunch to eat; or are you going to do the things that you do instead of lunching? In one sense the latter could be said to be fraudulent because this hiatus in the working day is there in order that the natural human need to eat should be met.

But on the other hand we don’t want to eat. So we have turned lunch into something else, something broader – time out during the working day. And so employers negotiate about how long ‘lunch’ – and other breaks – should be.
Is the time taken at the employee’s expense, or the employer’s? If it is just used by the employee at will, then cannot employers reasonably argue that it should not count towards the working day? If it is used to eat, because an eight- or ten-hour stretch without food involves significant loss of efficiency towards the end, then cannot an employee regard that as benefiting the organisation?

Lunch, in those circumstances, becomes a necessary concomitant of employing people at all – and the employer’s business. This is lunch as a battleground. It suggests a workplace that is a battlefield. Investment banks are the new sweatshops, as much as the new call centres – only with bigger bonuses. So that’s all right then?

In a free market and a free society people are able to choose what they do with the time when they could be eating. That canteen, lunch-break culture was so paternalistic, so patronising. Yet now market pressures seem to work only one way. They have eaten up lunch for the keen employee. Modern business culture has become as food-friendly as a plague of locusts.

Historically, many communities dedicated to a common end have distrusted meals. Under the Rule of St Benedict monks eat in silence, listening to readings from improving texts. (When do the readers eat? But then, when do waiters have lunch?)

The trouble is, eating is so charged. Rows over the family table. Class war fought with serried ranks of cutlery and fish servers. Meals are the traditional moment to betray your enemy under the guise of friendship: the invitation to break bread speaks of peace, but treachery often strikes. Dante puts traitors to their guests into the very lowest Hell.
It is a busy place.

And yet, we should nevertheless try to reclaim lunch for the new economy. Because, what is wrong with eating? Can’t we simply enjoy that necessary break in the working day, make a virtue out of the necessity, feed ourselves, replenish ourselves, come back to give it back to our work? A solitary sandwich may be efficient, but is it effective?

Everyone should think about lunch more. Employers should value employees as people who need to eat. Employees should value employers as people for whose sake – among others – they eat. And maybe the ritual meal, the nourishing meal, the creative meal, food not as a weakness but as collaboration, can come back into business. So, re-build the subsidised canteen, bring back the dinner ladies!

Perhaps, even, we could invite Dionysos back to the lunch table, to oil our ideas, give us a little courage to make that intuitive leap, speak up to the boss, help us to dare outline that off-the-wall idea. Of course, it would be dangerous, but it’s food for thought.

New Rules for Innovation (on innovation process & culture)

This is the second of three ‘best of’ posts. This was originally written for Fast Company magazine last year, but posted on my blog this year. Again, it’s long…

“There is nothing more difficult to plan, more doubtful of success, more dangerous to manage than the creation of a new system” – Machiavelli (1469–1527)

One of the biggest issues facing organizations in the early part of the 21st century is the commercialization of new ideas – specifically, how organizations can create cultures and processes that support the original thinking that leads to innovation.

This is a tricky business, not least because innovation is a tricky word. Innovation used to mean newness in a meaningful sense. But increasingly the word has anesthetised the idea of ideas. Innovation is now a suffix applied to everything from new flavors of jelly to new colors of socks.

Despite this, there is still wide agreement about the value of innovation, and an emerging consensus that economic value increasingly flows from the human application of creativity, imagination and aesthetics. Nevertheless, there is scant understanding of where innovation comes from, or even what the word actually means.

Some people see innovation as a mystical process over which organizations have little control, whereas others see innovation as little more than a process that can be planted inside any organization and switched on or off at will. But whilst CEOs speak at conferences and annual meetings about the importance of innovation, very few of them can state clearly the direction that innovation should take, and fewer still are able to make this direction clearly understood throughout an entire organization.

A key problem is definition. For example, there is confusion between creativity (the ability to see things differently and have new ideas) and innovation (the ability to take these new ideas and make them happen or to extract value from them).

The best quote I’ve come across that can help clarify this confusion is from William Coyne, a Senior Vice President for R&D at 3M. In his view: “Creativity is thinking of new and appropriate ideas whereas innovation is the successful implementation of those ideas within an organization. In other words creativity is the concept and innovation is the process.”

One of the main problems with innovation is that companies fail to make this distinction. They embrace creativity (ideas) because it sounds like fun and expect (or hope) that innovation (action) will happen by osmosis. This is where things really start to fall apart. The first stumbling block is culture.

Most large organizations do not exist to create new ideas. Companies largely exist to manage a legacy (existing) business (an historical idea) and to return a reasonable level of profit to shareholders – without putting the ongoing success of the entire operation on the line. In this context business is about order and control and managers are quite right to be risk averse. For example, if you are a shareholder or a customer of a bank the last thing you want is creative accounting or radical experiments that endanger the very existence of the bank and your money.

However, unless an organization can respond to changes in the external environment over time it will eventually get into trouble.
Equally, if an organization does not come up with any new ideas at all it will wither, because products and markets tend to commodify over time and margins are eventually squeezed by newer, smaller, more nimble rivals – rivals able to embrace risk and uncertainty and cuddle up to the often messy, irrational and emotional business of coming up with new ways of seeing the world.

What’s needed, clearly, is a balance. So what is the right level of risk and how should companies allocate resources between managing old ideas and coming up with new ideas?
The answer will depend on numerous factors such as industry, geography and ambition but, according to Mehrdad Baghai at the Commonwealth Scientific and Industrial Research Organisation (CSIRO), companies should operate on three horizons and attach differing views of risk to each of these levels.

The first aim of any business should be to extend or consolidate (defend) the existing business. Innovation in this instance should be focused on incremental or evolutionary innovation using known technologies. The second aim is to create and grow new products, services, businesses and markets, whilst the third is blue-sky thinking focused on the distant future (10+ years out) and the cutting edge of emerging technologies. Investment of time and resources among the three segments should be roughly 6:3:1.
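To make the arithmetic concrete, here is a toy sketch (my own illustration, not from the original piece) of how a rough 6:3:1 split divides an innovation budget across the three horizons:

```python
# Toy sketch (mine, not the author's): dividing an innovation budget across
# the three horizons in the suggested rough 6:3:1 ratio.
def split_budget(total, ratio=(6, 3, 1)):
    """Return the per-horizon share of `total` for the given ratio."""
    whole = sum(ratio)
    return [total * part / whole for part in ratio]

# Example: a £1m innovation budget.
allocations = split_budget(1_000_000)
print(allocations)  # [600000.0, 300000.0, 100000.0]
```

Of course, as the article goes on to argue, the real difficulty is not computing the split but resisting the pull of the first horizon.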

However, research by the Doblin Group (US) says that 96% of innovation resources are focused on incremental innovation, so most companies have got the balance entirely wrong. Again this is probably, consciously or unconsciously, due to risk. As Robert Grudin, a professor of English at the University of Oregon, says: “Innovation is ipso facto dangerous, for if it endangers nothing else, it endangers the safety of a satisfied mind.” A related issue is that even when senior management believes that it has the right culture to nurture original thinking, people lower down the ladder often disagree.

A study of 600 executives quoted by Geoff Colvin in his book Talent is Overrated found that whilst the people at the top thought that not having enough of the right people was the biggest barrier to innovation, people lower down thought that the issue was culture. Cultural problems can take many forms. Risk aversion is one. Really big ideas scare the living daylights out of most organizations and individuals will go to great lengths not to get too close to them.

Lethargy is another problem. Some companies seem to be incapable of implementing anything new because it takes so long that any opportunity has usually passed by the time they eventually launch their idea or because an idea is eventually diluted so far that it becomes invisible both inside and outside the business.

Another big issue is that organizations think they can be great at creativity (ideas) and innovation (implementation) when generally they’re good at one or the other. The trick, it seems to me, is knowing what you’re good at and then going outside for the other. However, doing this involves a level of self-knowledge and confidence that many organizations lack.

The good news is that even if your company is hopeless at coming up with new ideas, you can employ all sorts of people for almost no money to do it for you – which probably says something about the value of ideas without implementation.

Alternatively, companies can employ consultants to put processes in place to help generate and capture more ideas internally. This will probably work quite well if what you’re after is a constant flow of incremental innovations (continuous improvements to existing ideas). However if you really want to change the world, you’re probably better off looking somewhere else.

Real originality has never emerged from a formula. Rules are precisely what innovators and other paradigm shifters break.
Perhaps this is why real breakthrough thinking tends to come from smaller companies, inventors, entrepreneurs, immigrants and mavericks working in garages and garden sheds that are, in a phrase familiar to Peter Drucker fans, ‘alert to anomalies’.

For example, 50% of all new pharmaceutical products launched in the US a few years ago came from companies that were less than ten years old. Equally, it has been claimed that 60% of major technical innovations do not come from large corporations, and in Blue Ocean Strategy, W. Chan Kim and Renée Mauborgne claim that only 14% of innovations are radical (although they generate 61% of company profits).

Immigrants are another widely overlooked source of innovation. Immigrants tend to move ideas geographically from one place to another, and they also have a tendency to cluster geographically – and proximity is another thing that ideas adore. Can you put any numbers against this? You bet. In the US, 47% of PhD-holding scientists and engineers are immigrants. Inside Intel, for example, 40% of patent applications in 2005 came from individuals of Indian or Chinese descent.

So what can you do if you’re seeking to feed off the insights of such people? What can be done, in short, if you want radical innovation and ideas inside your organization? One trick is not to try and act like a start-up by imitating their “processes” (which many of them don’t have in any formalized sense) but simply to seek out their ideas and then reapply them to your own situation. Procter & Gamble calls this “find and re-apply.” Unilever calls it “creative swiping.” I’d call it innovation transfer. Better still, rather than stealing their ideas, simply buy them (license an idea or just buy the whole company).

The one thing the small ideas guys usually don’t have is precisely what the big guys are so good at: the skills to turn a new idea into an innovation. Corporations have the experience to get new ideas into market. They also have the knowledge to scale them up if they’re successful, as well as the legal resources to protect them.
Another option is to dig into your past. All organizations have ideas that have been filed away, either in people’s heads or in the proverbial dust-covered filing cabinet. However, old ideas are mostly ignored, which is a real shame because much value lies in the treasury of things past. This represents a huge opportunity, and in my experience the problem inside most large organizations is not usually coming up with new ideas. It’s finding and filtering the good ideas that already exist.

Another important point is that organizations need to think about what type of innovation process they are after. According to James Andrew and Harold Sirkin at the Boston Consulting Group (BCG), there are three approaches to innovation: the integrator approach, the orchestrator approach and the licensor approach. Again the trick is to know which you’re best at and work out what suits the specific situation or market. Most big organizations try and control everything (the integrator approach) but this takes time and money. So much time and money in fact that urgency and focus have a tendency to dissipate.

However, this approach can suit big companies in established markets where a core skill or brand can be leveraged.
The second approach is to join forces with other companies to implement an idea (the orchestrator approach). This reduces the risk (and returns) but it’s a smart move when you’re dealing with an idea that’s outside your comfort zone such as a new technology or new distribution channel. The final approach (the licensor model) is to simply sell your idea to someone else. This suits ideas with a strong intellectual property component where you’re too busy (or you lack the necessary skills, funding or brand credibility) to implement things yourself.

However, there is a very big problem with all three of the processes described above. Despite strategy, stage gates, red lights and funnels, the number of new ideas that are developed and introduced into the market is vanishingly small. This is a problem because, with the best research in the world, nobody can ever know for sure what will work and what won’t until a new idea is launched into a market. Nobody can tell for sure what’s silly and what’s not without the benefit of hindsight. So rather than researching and worrying about whether an idea will work or not, just launch it anyway and find out. In other words, invert the traditional innovation process.

Instead of working up a handful of promising ideas and putting them into focus groups to establish which one is right, create the idea, polish it, and then launch it, letting your customers edit the concept for you. Most large organizations, outside the US at least, really don’t understand that there is no lasting humiliation in giving it a go in this manner. You can fail like crazy and still keep going until you eventually stumble on success.

Of course this presents many big organizations with something of a problem. How can they fail like crazy without looking like idiots? The answer is to facilitate and empower every employee, every customer and every stakeholder to become part of the innovation team and then to encourage them to perform small experiments. For example, Mozilla Corp is the company behind Firefox, the wildly successful internet browser. The company has 70 employees and almost 200,000 volunteer helpers. Moreover, Firefox 1.0 was developed not on purpose but by two renegade young programmers who went off in the wrong direction just because it felt like the right thing to do.

The idea of open or distributed innovation obviously links with other ideas like the wisdom of crowds, but the link I like the most is with James G. March’s idea of foolishness in organizations. March is a Professor Emeritus at Stanford Business School, and one of his key insights is that companies need to mess around more. What I think he means by this is that people should try more things out, even if rationally they don’t initially make sense.

For example, people should incorporate more ideas from outside their domain, or even do strange things just to see where this takes them. It’s a bit like going on holiday. You can follow the guidebooks, but often the most interesting and useful experiences come when you put the guidebook down and walk down an unknown street for no particular reason. It’s also like kids. Children initially make sense of the world by touching and manipulating it. They are born curious and they have no problem whatsoever making mistakes. It’s how they initially learn. The only problem is that the minute they enter formal education they are taught that making mistakes is bad.

This, in my view, is the wrong answer if you are trying to breed a generation of risk takers and inventors. Of course, the idea of setting up an innovation process focused on making deliberate mistakes is itself a silly idea. At the moment, most organizational innovation strategies and processes are too sequential and too rigid. They only deal with one side of the brain. But moving to some kind of ‘anything goes’ system would be equally disastrous.

What’s needed is a balance – a combination of tight and loose, where 85-90% of internal resources are spent on innovation that is tightly planned and controlled. The remaining 10-15% of time and money should then be spent on unplanned ideas that are developed by simply releasing them into the world and seeing if they survive.

All companies start off as an idea. Start-ups are usually small and poor, which tends to create focus and urgency. If they develop a great product or service with an easily communicable point of difference (and they are lucky) they usually grow. And therein lies yet another problem. One of the first issues to arise in a growing company is that management gets separated from innovation. Peter Drucker made this point many years ago, although he used the term entrepreneurship. Although managing and innovating are different dimensions of the same thing, most companies regard them as separate.

Moreover, as the urgency to stay alive evaporates, the focus shifts to internal management issues. But without continuing to innovate, companies (and I’d include institutions and governments) eventually die a slow and painful death. There are other challenges too. As companies grow, senior managers become physically separated from their customers. The entire board of one of the major banks in Australia takes calls from customers every week, but this is a rare exception.

A recent survey by Bain & Company found that 80% of companies believed that their firm delivered superior service. Only 8% of their customers agreed. Perhaps senior managers are confusing profitable customers with happy ones. Departments like sales, customer service and customer complaints are usually close to the needs of customers. Hence they are close to one of the primary sources of innovation. Managers generally aren’t – they are closest to the needs of management and money.

The culture of an organization can also contribute to failure. The dominant culture of most successful companies is conservative – to avoid risk and to proceed in an orderly fashion. This, as I’ve said, is fine, but in the longer term what made your company successful in the first place will not do so in the future. Eventually a kind of corporate immune system will develop that resists innovation and tries to free itself from any form of obligation to adapt, even when change is clearly on the horizon.

In this sense organizations can act like the human brain, ‘cementing’ certain experiences and rejecting information that does not fit with preconceived ideas. IBM failing to see the rise of desktop computing is a good example of such groupthink. One suspects that Sony’s loss of the portable music and entertainment market might be another. You can spot such organizations a mile off because they tend to distrust people from the outside (including their own customers, whom they also loathe). They also think that they have absolutely nothing to learn from anyone or anywhere else.

A classic mistake is only recruiting from the inside. I once worked with a retailer that strongly favored homegrown talent over external hires. Nothing wrong with that, except in this case it reinforced the arrogant and complacent attitude that there was nothing to be learnt from elsewhere.

There is also the issue of creating the reality you want, rather than seeing what is really happening. It is not uncommon for senior managers to edit news before it reaches the board level so things appear to be much better than they really are. There’s even a story about a supermarket chain in the UK that repainted its stores, and hired extra staff, just before the CEO was due to make a visit. I don’t think the company ever went as far as hiring customers for the day but once you start editing reality where do you stop?

In addition to corporate culture, corporate structure often gives rise to another problem. As Clayton Christensen (author of The Innovator’s Dilemma, among others) points out, large organizations are generally structured along departmental lines. As a result most innovation is incremental. For example, most innovation inside fast-moving consumer goods companies takes the form of endless line extensions to existing products.

Fortunately, young start-ups have no respect for these boundaries, so it is generally they who invent new categories and business models in response to changing conditions or new customer attitudes and behavior. In other words, unless you can look at innovation from a whole business perspective and make innovation truly cross-functional (twinning designers with R&D staff as Procter & Gamble does, for instance) innovation will never get beyond the component or existing category/product level.

But perhaps none of this is a bad thing. After all, survival is not compulsory. Perhaps everything (individuals, organizations, markets, countries) needs to die – or at least be threatened with extinction – so that a cycle of new thinking can begin again. Can biology teach us anything else about innovation? The essence of Darwinism is that progress is created by adaptation to changed conditions.

What starts as a random mutation can spread to become the norm through a process of natural selection and luck. The same is surely true with innovation. New ideas are mutations created when two or more old ideas combine (have sex, essentially), so what’s needed is random variation plus selective retention. In other words, you need to learn what to throw away and what to keep.
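For the playfully inclined, the variation-plus-retention loop can be sketched in a few lines of code. This is my own toy illustration, not anything from the essay: "ideas" are just numbers, and a made-up fitness function stands in for market feedback.

```python
import random

# Toy sketch (mine, not the author's): "random variation plus selective
# retention" as a tiny evolutionary loop over candidate ideas.
def evolve(population, fitness, mutate, generations=10, keep=5):
    """Repeatedly vary the population, then retain only the fittest."""
    for _ in range(generations):
        variants = population + [mutate(p) for p in population]   # variation
        population = sorted(variants, key=fitness, reverse=True)[:keep]  # retention
    return population

random.seed(1)
# Toy domain: an "idea" is a number; fitness prefers values near 42.
best = evolve(
    population=[random.uniform(0, 100) for _ in range(5)],
    fitness=lambda x: -abs(x - 42),
    mutate=lambda x: x + random.gauss(0, 5),
)
print(round(best[0]))  # the surviving ideas cluster near 42
```

The point of the sketch is the shape of the loop, not the numbers: lots of cheap variants, a ruthless cull, repeat.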

Similarly, having a plan is good but having too fixed a plan or predefined objective can mean that you miss out on opportunities that lie in a slightly different direction. Virgin Atlantic Airways (a company I’ve worked with on and off for twenty years) is an example of what happens when you cross an entertainment company with an airline business and was partly the result of an unsolicited approach from outside the company.

Virgin Records (retail) is another example, created when a postal strike threatened to shut down what was then a fledgling mail order record company. In my experience, what makes Virgin innovative is a strong sense of self, an ability to experiment, the skill to cross-fertilize ideas, and a willingness to change. The company has largely grown, not through the unfolding of some master plan, but through an accumulation of learning and ideas caused by threats, accidents, external approaches and luck.

So, if external events and adaptation are the driving forces of biological evolution, is it possible to develop an innovation process that seeks out accidents and mutations? The list of things created by accident is impressive. Aspirin, Band-Aids, Diners Club, DNA fingerprinting, dynamite (yikes!), inoculation, Jell-O, Lamborghini, microwave ovens, nylon, penicillin, PVC, the smallpox vaccine, stainless steel, Teflon, Ferrari road cars, Viagra, Velcro and Vodafone, to name just a few.

However, the use of the word accident is rather misleading. These inventions were not created by accident as such; rather these inventions were influenced by or associated with accidents. Having an accident is one thing but being able to see the opportunity embedded within an accident and then actually doing something about it is something else entirely. You have to have what Pasteur called a prepared mind or get beyond what Robert Austin at the Harvard Business School has called “the cone of expectations and intentions”.

Furthermore, one of the defining characteristics of business is a preoccupation with orderly process (“If you can’t measure it, you can’t manage it.”). So it’s hard to imagine corporate cultures embracing randomness – or agreeing with John Lennon, who said, “Life is what happens to you when you’re busy making other plans.” Accidents are born of experimentation, but the automotive and fashion industries are almost the only industries that publicly experiment with radical mutations. What, for example, is the soft drink industry equivalent of a concept car at the Detroit Motor Show?

Zara, the Spanish clothing retailer, is a classic example of experimentation and adaptation. Store managers send customer feedback and observations to in-house design teams via PDAs. This helps the company to spot fashion trends and adapt merchandise to local tastes. Just-in-time production (an idea transferred from the automotive industry) then gives the company an edge in terms of speed and flexibility. The result is a three-week turnaround time for new products (the industry average is nine months) and 10,000 new designs every year – none of which stays in store for more than four weeks.

The analogy of biology also leads to an interesting idea about whether companies are best thought of in mechanical or biological terms. Traditionally, we have likened companies to machines. Organizations are mechanical devices (engines if you like) that can be tuned by experts to deliver optimum performance. For companies that are looking to fine-tune what they already do, this is probably correct. A product like the Porsche 911 evolves due to a process of continuous improvement and slowly changing environmental factors. The focus is on repetition. Development is logical and linear.

However, if you’re seeking to revolutionize a product or market, the biological model is an interesting thinking tool. In this context, biology reminds us that random events and non-linear thinking can cause developmental jumps. Unlike machines, living things have the ability to identify and translate opportunities and threats into strategies for survival. Creative leaps are usually the result of accidental cross-fertilization (variation) or rapid adaptation caused by the threat of change. Hence the importance of identifying an enemy, setting unrealistic deadlines and using diverse teams to create paradigm shifts.

The latter is a route employed by MIT, which mixes different disciplines together. As Nicholas Negroponte puts it, “New ideas do not necessarily live within the borders of existing intellectual domains. In fact they are most often at the edges and in curious intersections.”

This is a thought echoed by Edward de Bono, who talks about the need for provocation and discontinuity. In order to come up with a new solution you must first jump laterally to a different start or end point. For example, if you want to revolutionize the hotel industry you need to identify the assumptions upon which the industry operates and then create a divergent strategy.

This could lead you to invent Formule 1 Hotels (keep prices low by focusing on beds, hygiene, and privacy) or another value innovator, easyHotel (keep rooms cheap by making guests hire their own bed linen and clean their own rooms).

In his classic 1962 book The Structure of Scientific Revolutions, Thomas Kuhn argued that the people who achieve “fundamental inventions of a new paradigm have either been very young or very new to the field whose paradigm they change.” In other words, when it comes to innovation, organizations can be disabled by experience and specialization. As I mentioned earlier, Einstein and Picasso were at their best in their early years – the young Einstein invented the special theory of relativity in 1905 when he was just 26 years old. In 1907, a 25-year-old Picasso painted Les Demoiselles d’Avignon and effectively invented cubism.

Of course the idea of youth was itself new in the early 1900s and it wasn’t until the 1950s that someone invented teenagers. But some companies still haven’t quite caught up with the idea that it’s young people (a company’s young staff and its young customers) that are the most likely to invent their future.

There are plenty of reasons why the most innovative people in any organization are the newest recruits. Young people tend to have the most energy and the most confidence. They’re also outsiders and have little respect for tradition or orthodoxy. Their lack of experience can also be an asset because they’re not restrained by history or preconceptions. Older employees, on the other hand, know that it has all been tried (and failed) before. Their minds are made up. Their brains are set.

This lack of experience was something that Seymour Cray (an early designer of high-speed computers) seized upon. Cray had a policy of hiring young, fresh-faced engineers because they didn’t yet know what couldn’t be done. A company called Fresh Minds works on a similar principle. They supply freshly minted minds to some of the world’s top companies. The longer you work for an organization, the more you also tend to adopt groupthink and the further removed you become from real life (how customers think, feel, and behave).

I once worked with Toyota, which wanted to understand how people really buy cars. In one meeting we innocently asked a group of 35 senior auto executives when they had last bought a car on their own, with their own money. Not a single person could remember. In contrast, the younger employees, who were not given company cars, had a genuine grasp of reality. Younger people are also less concerned with failure because, bluntly, they have less to lose financially.

They also have less invested emotionally. You don’t read about failure very often – and I’m not just talking about ideas that never see the light of day. I’m talking about people too.

Why is this? What are we afraid of? After all, it’s not as if it’s totally unknown. Most companies – indeed, most people – fail more often than they succeed. It is the proverbial elephant-in-the-boardroom. And yet by being scared of failure, we are missing a great opportunity.

The point about failure is not that it happens but what we do when it happens. Most people flee. Or they find a way to be “economical with the truth.” “We launched too late.” “People weren’t ready for it.” No. You failed. Own up to it. This is a beginning, not the end. The problem is this: Most people believe that success breeds success and they believe that the converse is true too, that failure breeds failure. Says who? There are plenty of people who fail before they succeed, some of whom are serial failures. Indeed, there is rumored to be a venture capital firm in California that will only invest in you if you’ve gone bankrupt.

Take James Dyson, the inventor of the bag-less vacuum cleaner. He built 5,127 prototypes before he found a design that worked. He looked at his failures and he learned. He then looked at his next failure and he learned some more. Each adaptation led him closer to his goal. As someone once said, there’s magic in the wake of a fiasco. It gives you the opportunity to second-guess.

None of this is to be confused with the mantra of most motivational speakers who urge you not to give up. Success is 1% inspiration and 99% perspiration they say, and if you just keep on trying, it will eventually happen. And if it doesn’t, you’re just not trying hard enough. This is a big fat lie.

Doing the same thing over and over again in the hope that something will change is almost a perfect definition of madness. What you need to do is learn from your failure and try again differently. It is what you do when you fail that counts.

Remember Apple’s Newton MessagePad? Possibly not. It was a commercial flop, but the failure was a glorious one. Indeed, who is to say that the tolerance of failure embedded in Apple’s company DNA is not one of the reasons for its later success with the iPod and iTunes?

Does this mean you abandon your failures? Yes and no. Your idea could be right but your timing, delivery, or execution could be wrong. Who could have guessed, for instance, that the one-time AIDS wonder drug AZT had been a failed treatment for cancer or that Viagra was a failed heart medication that Pfizer stopped studying in 1992?

As the designer Alberto Alessi once said, anything very new often falls into the realm of the not possible, but you should still sail as close to the edge as you can because it is only through failure that you will know where the edge really is. The edge is also where real genius resides.

Hence organizations should occasionally promote people whose ideas never get off the ground or end up somewhere other than where they intended. These are the people who fail on our behalf: the unknown innovators who push things so far to the edge that they fall off; the unlucky or naïve few who open up a new trail and get scalped before someone else can see a way through with the wagons. (How’s that for a new historical definition of second-mover advantage?)

There’s a great quote by the English sculptor Henry Moore that sums this up pretty well: “The secret of life is to have a task, something you bring everything to, every minute of the day for your whole life. And the most important thing is: it must be something you cannot possibly do.”

So here’s an idea. Rather than putting up statues to people who did something that was successful, let’s sometimes build monuments to the people who didn’t. Let’s celebrate the lives of people who invented things that didn’t work out or tried to do something just a little bit crazy. These are the people we all watch with perverse envy when we are too scared, too self-conscious, or too constrained to fail ourselves. Without these wonderful people, there would be no progress.

In praise of public libraries – and librarians

I’m putting together some 2012 trends material, but in the meantime I thought I’d repeat the 3 most popular postings from my blog in 2011. Here’s the first one. BTW, interesting fact. The 3 most popular posts have all been quite long. So much for short attention spans!

There was a report in a newspaper a while ago about a mother whose six-year-old had asked her whether he should put a slice of bread in the toaster “landscape or portrait?” I mentioned this to my ten-year-old son and he said: “He should have Googled it.”

I mention this because I am interested in how spaces and places change how we think. In particular I am interested in how new digital objects and environments are starting to change age-old attitudes and behaviours, including how we relate to one another.

And this leads me directly to a very particular place, namely public libraries, and the question of whether or not they have a future. In short, what is the role – or value – of public libraries and public librarians in an age of e-books and Google?

Now at this point I have to put my hand up and admit to being wrong. Some time ago I created an extinction timeline, because I believe that the future is as much about things we’re familiar with disappearing as it is about new things being invented. And, of course, I put libraries on the extinction timeline because, in an age of e-books and Google, who needs them?

Big mistake. Especially when one day you make a presentation to a room full of librarians and show them the extinction timeline. I got roughly the same reaction as I got from a Belgian after he noticed that I’d put his country down as expired by 2025.

Fortunately most librarians have a sense of humour, as well as keen eyesight, so I ended up developing some scenarios for the future of public libraries and I now repent. I got it totally wrong. Probably.

Whether or not we will want libraries in the future I cannot say, but I can categorically state we will need them, because libraries aren’t just about the books they contain. Moreover, it is a big mistake, in my view, to confuse the future of books or publishing with the future of public libraries. They are not the same thing.

Let’s start by considering what a public library is for. Traditionally the answer would have been a place to borrow books. This is where the argument that libraries are now dying or will soon be dead originates. After all, if you can download any book in 60 seconds, buy cheap books from a supermarket, or instantly search for any fact, image or utterance on Google, why bother with a dusty local library?

I’d say the answer to this is that public libraries are important because of a word that’s been largely ignored or forgotten, and that word is ‘public’. Public libraries are about more than mere facts, information or ‘content’. Public libraries are places where local people and ideas come together. They are spaces, local gathering places, where people exchange knowledge, wisdom, insight and, most importantly of all, human dignity.

A good local library is not just about borrowing books or storing physical artefacts. It is where individuals become card-carrying members of a local community. They are places where people give as well as receive. Public libraries are keystones delivering the building blocks of social cohesion, especially for the very young and the very old. They are where individuals come to sit quietly and think, free from the distractions of our digital age. They are where people come to ask for help in finding things, especially themselves. And the fact that they largely do this for nothing is nothing short of a miracle.

It is interesting to me that so much is made of the fact that most things on the internet are free. Indeed whole books have been written on the subject of this radical new price. But the idea of free information is nothing new and when free public libraries were invented the idea was even more radical because of the high cost of books.

Of course, there is the argument that virtualisation means that we will no longer need public libraries – or that if they continue to exist their services will be tailored to the individual and they will be capable of instantly sending whatever it is that we, as individuals, want direct to the digital device of our choosing. And perhaps some libraries will do this for a fee rather than for free.

This would be a costly mistake in my view, partly because what people want is not always the same as what they need, and partly because it focuses purely on information at the expense of overall learning and experience.

Some people have argued that content is now king and that the vessel that houses information is irrelevant. I disagree. I believe that how information is delivered influences the message and is, in some instances, more meaningful than the message.

As I’ve already said, libraries are about people, not just books, and librarians are about more than just saying “Shhh.” They are also about saying: “Psst – have a look at this.” They are sifters, guides and co-creators of human connection. Most of all they are cultural curators, not of paper, but of human history and ideas.

In a world cluttered with too much instant opinion, we need good librarians more than ever. Not just to find a popular book, but to recommend an obscure or original one. Not only to find events but to invent them. The internet can do this too, of course, but it can’t look you in the eye and smile gently whilst it does it. And in a world that’s becoming faster, noisier, more virtual and more connected, I think we need the slowness, quietness, physical presence and disconnection that libraries provide, even if all we end up doing in one is using a free computer.

Public libraries are about access and equality. They are open to all and no more judge a book by its cover than they judge a reader’s worth by the clothes they wear. They are one of the few free public spaces that we have left and they are among the most valuable, sometimes because of the things they contain, but more usually because of what they don’t.

Of course, we could put a Starbucks into every library – and we could allow mobile phone use and piped music throughout too – but then surely what we will be left with are more global outposts of Starbucks not local libraries.

What libraries do contain, and should continue to contain in my view, includes mother and toddler reading groups, computer classes for seniors, language lessons for recently arrived immigrants, family history workshops and shelter for the homeless and the abused. Equally, libraries should continue to work alongside local schools, local prisons and local hospitals and provide access to a wide range of e-services, especially for people with mental or physical disabilities.

In short, if libraries cease to exist, we will have to re-invent them.
Now, admittedly, many younger people still see no need to visit a library. Many, if not most, will not have done so in years. But this could be because they still see libraries as spaces full of old books rather than places full of new ideas. And this may change.

In my view it is inevitable that the ongoing digitalisation of culture will lead to an ever-greater integration of cultural institutions, and public libraries will shift from being book places to places that curate our cultural and intellectual heritage. Libraries will thus become memory institutions, like art galleries and museums. Indeed, why not physically combine all three?

This, of course, means that the role of librarians will change. The idea of professional librarianship will fade, and in its place will emerge the idea of professional information and cultural curators, a role that will embrace a variety of different skills.

But let’s bring it back to why the physical space that libraries occupy is so important. Again, libraries are not important because they contain books per se. They are, in my view, important because of how a place full of books makes people feel. Great libraries, like all great buildings, change how you feel and this, in turn, changes how you think.

So what’s my idea here? Two thoughts. The first is that we should accept that a library without books would still be a library, because it would continue to be an important community resource – a neutral public space – where serendipitous encounters with people and ideas take place. This, surely, is an idea worth spreading.

My second idea is that we should consider funding libraries in new and novel ways. This could mean libraries going back to their philanthropic roots and asking wealthy individuals to buy or build libraries rather than football clubs or art galleries. Or it could mean getting governments to impose taxes on certain leisure pursuits that are known to provide no mental nourishment or social cohesion and use the revenue generated to subsidise other, more useful, things like public libraries or good books.

There is a considerable amount of discussion at the moment about obesity – the idea that we should watch what we eat or end up prematurely dead. But where is the debate about the quality of what and where we read or write? Surely what we put inside our heads – how we create or consume information – is just as important as what we put inside our mouths.