A Cambridge (UK) scientist has developed a prototype computer that can ‘read’ users’ minds by capturing and then interpreting facial expressions – such as concentration, anger or confusion. In experiments using actors, the computer was accurate 85% of the time, although this dropped to 65% with non-actors. The technology raises a number of privacy-related issues, not least the collection of highly sensitive personal data.

Toyota is allegedly already working with Professor Peter Robinson, the inventor, to link the emotional state of drivers with various safety controls and mood-sensitive features in cars. Other customers might include insurance companies wanting to crack down on dishonest claims, banks targeting identity fraud, teachers trying to teach (does the student really understand?) or governments wanting to identify terrorists or social security cheats. In the future, car companies or local councils could even tailor road maps or directional signage to the level of aggression of individual drivers.

What intrigues me most, however, is whether you could link mood sensitivity to products like radios and televisions so that they tune into ‘happy’ music or programs. There is also the fascinating possibility of online retailers tailoring their home pages, product offerings and even product descriptions to the emotional state of their customers.