Westworld isn’t happening anytime soon because AI can barely even recognize human emotion
Maybe the androids of Westworld have been programmed with some kind of ultra-futuristic computer brain that switches on and reads human emotions, but for real AI, feelings are a fail.
AI can’t even get a read on human emotions. Dr. Damien Dupré of Dublin City University in Ireland led a new study showing that machines still can’t reliably figure out what the looks on our faces mean. Human subjects correctly identified the emotions on other people’s faces 72% of the time. Not so for things without actual neurons. Facial affect recognition AI, which is supposed to recognize emotions from facial expressions, scored as low as 48% and never higher than 62%, and the results got even worse when it came to spontaneous emotion. So AI isn’t just bad at telling how we feel; it’s also wildly inconsistent.
“Inspired by the vision of an emotionally intelligent machine, efforts have been targeted towards computer systems that can detect, classify, and interpret human affective states,” Dupré said in a study recently published in PLOS ONE. “This involves the ability to recognize emotional signals that are emitted by the face.”
Unfortunately, that ability doesn't seem so easy to program. It would be a breakthrough if AI could somehow capture it, because as Dupré and his team note, automatic facial affect recognition could save time and money compared to human coding. Such a thing (if it were accurate) could potentially be applied to everything from security to medicine to education and even marketing. That's still some way off. The team tested eight existing automatic classifiers, systems built on the same kind of machine learning often used to categorize text and documents, here applied to facial expressions.
Both humans and AI were shown 937 videos drawn from two extensive databases. The videos covered the six basic emotions: happiness, sadness, anger, fear, surprise, and disgust. Humans beat the AI. In a second round, the two automatic classifiers that had performed best the first time came close when presented with posed expressions, but still flunked at identifying spontaneous emotion. The team believes the solution could lie in databases of spontaneous expressions; the databases around now mostly store posed ones, and algorithms can only be as good as the data they’re trained on.
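If you’re curious what that kind of benchmarking boils down to, here’s a minimal Python sketch. Everything in it is illustrative: the labels, the guesses, and the resulting percentages are made up for demonstration and are not data from Dupré’s study. It just shows how accuracy gets scored when a human coder and an automatic classifier label the same set of videos.

```python
# Illustrative sketch of how an emotion-recognition benchmark is scored.
# All labels and guesses below are made up for demonstration; they are
# not data from Dupré's study.

# The six basic emotions used in the study.
EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

def accuracy(predicted, actual):
    """Fraction of videos where the guessed emotion matches the true label."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

# Hypothetical ground-truth labels for ten short videos.
true_labels = ["happiness", "anger", "fear", "sadness", "disgust",
               "surprise", "happiness", "anger", "sadness", "fear"]

# A human coder gets most of them right...
human_guesses = ["happiness", "anger", "fear", "sadness", "disgust",
                 "surprise", "happiness", "sadness", "sadness", "fear"]

# ...while an automatic classifier confuses several of the subtler ones.
machine_guesses = ["happiness", "happiness", "surprise", "sadness", "anger",
                   "surprise", "happiness", "sadness", "fear", "fear"]

print(f"human accuracy:   {accuracy(human_guesses, true_labels):.0%}")    # 90%
print(f"machine accuracy: {accuracy(machine_guesses, true_labels):.0%}")  # 50%
```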
“Subjecting only deliberately displayed expressions to automatic classification, analysis, and benchmarking may provide insufficiently robust validation results,” Dupré said, adding, “This issue is further exacerbated by the general trend to train computer algorithms on posed expressions that are highly intense and homogeneous.”
So you can only imagine what it would take to reach a Westworld level of facial emotion recognition alone, never mind robots being able to feel human emotion. There are so many possible spontaneous expressions that databases of them would take forever to compile, and even then, they would probably still be missing something.
Consider what kind of knowledge would have to be programmed into an AI before its recognition skills could face off against actual humans and possibly win. Individual facial expressions vary with situation, personality, and even facial features. Someone whose lips naturally turn downward may be judged less happy than someone whose lips turn upward, even if neither is actually happier than the other. And going back to Dupré’s point about algorithms being trained mostly on intense, posed expressions, today’s tech also clearly lacks subtlety. Current facial affect recognition would probably only tell us exactly how we feel if we were all walking emojis.
So don’t expect a robot to understand how you’re feeling anytime soon. Depending on how you imagine a machine pulling that off, that might actually be a relief. You probably wouldn’t want a nosy android reading your face and reporting to your boss what you’re really thinking.