A raised eyebrow, a quizzical look or a nod of the head are just a few of the facial expressions computers are now using to read people's minds. An "emotionally aware" computer developed by British and American scientists can read an individual's thoughts by analysing combinations of facial movements that represent underlying feelings.
"The system we have developed allows a wide range of mental states to be identified just by pointing a video camera at someone," said Professor Peter Robinson, of the University of Cambridge, England. "Imagine a computer that could pick the right emotional moment to try to sell you something, a future where mobile phones, cars and Web sites could read our minds and react to our moods."
He and his collaborators believe the mind-reading computer’s applications could range from improving people’s driving skills to helping companies tailor advertising to people’s moods.
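The article does not describe how the system works internally, but the general approach it alludes to is mapping observed facial cues to inferred mental states. As a loose illustration only, here is a toy rule-based sketch of that idea in Python; the cue names, states and weights are all hypothetical, not the features or model actually used by the Cambridge system.

```python
# Toy sketch of inferring a mental state from detected facial cues.
# All cue names, states and weights are hypothetical illustrations;
# a real system would use trained classifiers over video features.

# Each candidate mental state is associated with a set of weighted cues.
STATE_CUES = {
    "interested": {"raised_eyebrow": 0.6, "head_nod": 0.4},
    "confused":   {"quizzical_look": 0.7, "raised_eyebrow": 0.3},
    "agreeing":   {"head_nod": 0.8, "smile": 0.2},
}

def infer_mental_state(observed_cues):
    """Score each state by summing the weights of the cues observed
    in the video, and return the best-scoring state."""
    scores = {
        state: sum(w for cue, w in cues.items() if cue in observed_cues)
        for state, cues in STATE_CUES.items()
    }
    return max(scores, key=scores.get)

print(infer_mental_state({"head_nod", "smile"}))   # agreeing
print(infer_mental_state({"quizzical_look"}))      # confused
```

In practice a system like the one described would replace the hand-written weight table with a model trained on labelled video, but the overall flow, detect facial movements and then score candidate mental states, is the same.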
At present, these computers are only reading the physical manifestations of what is going on in the subject's mind, but this is only the beginning. Soon, the advancing pace and convergence of information technology, neurology and psychology are going to bring that knee-trembling moment when, if we are willing, we make the extraordinary leap of melding our minds with intelligent machines.
The machines exist but are currently very expensive, at around £2 million each. They use a technology that allows researchers to tell whether a person is responding to consumer choices by detecting the parts of the brain that "light up" during specific types of mental activity.
What, then, will selling be like when, for example, instead of salespeople painting mental pictures, the client hooks into a cyberspace presentation that not only portrays the product but also engenders just the right blend of logic and emotion to connect with that client's dominant buying motives?