What if the screens that we look at all day could read our faces? I’m not talking about facial recognition, but rather software and devices that can read the mood of individual users or perhaps the mood of a large crowd.
Such technology already exists. In 2009 scientists from MIT created a project in which the facial reactions of volunteers watching adverts during the Super Bowl were recorded using webcams. The data were then grouped by age to gauge how different age groups reacted to certain types of advertisement.
In the future this could perhaps be done in real time and broken down by age, gender and a host of other factors for anyone and everyone watching an event on a screen. So a clever new tool for advertisers then? Yes, but think a little more broadly. The concept of giving people a non-verbal voice and, in particular, of assessing the emotion of a large crowd could be useful in politics, especially during elections.
It could also be valuable to authoritarian regimes interested in judging the mood of a country, or perhaps in identifying small groups intent on dissent. The technology that exists right now can tell the difference not only between happiness and sadness, but also between interest, disgust and contempt.
In fact it can even distinguish joyful smiles from sad or frustrated ones. There’s still a long way to go, because humans can read facial expressions and body language to a degree that machines still cannot, but expect such emotion-reading technology to develop, for better and for worse, in the decades ahead.