Ohio State Scientists Just Created an AI That's Better Than Human Beings at Reading Emotions
Researchers have built an artificial intelligence that is better than humans at reading emotions. Its judgment isn't based on the facial expression a person is making, but on how red their face is.
Yes, that's right—robots can tell when you're blushing.
What's particularly interesting is that this AI was created as a side project for different research—it wasn't even the primary goal of the experiment.
A team at Ohio State University has been researching how humans display emotions. Their studies are based on the idea that humans show subtle but noticeable changes in skin color, caused by blood flow within the face, that communicate how they're feeling. If parts of a person's face get redder, they might be experiencing surprise or happiness, while if other parts change color, it could be an indicator of sadness or disappointment.
Apparently, it is possible to detect if someone is telling you a bald-faced lie, all by looking at how much blood is coursing through their face.
To test this theory, the scientists created an AI that measures the color of certain regions of a person's face and uses those measurements to infer their emotions.
The AI was given pictures of people experiencing different emotions and, with no training in reading expressions, had to make its guesses based entirely on color. Because the photos showed people with blank expressions, humans couldn't determine their emotions from whether or not they were smiling, so everyone involved had to rely on facial hue alone.
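The study's actual features and model aren't detailed here, but the basic idea of classifying emotion from region-by-region facial color can be sketched in a few lines. The region masks, the "redness" feature, the nearest-centroid classifier, and all of the numbers below are illustrative assumptions, not the team's method:

```python
import numpy as np

# Hypothetical per-region "redness" profiles (share of the red channel) for
# three emotions. These numbers are invented for illustration only.
REGIONS = ["forehead", "cheeks", "nose", "chin"]
EMOTIONS = ["happy", "sad", "angry"]
centroids = np.array([
    [0.33, 0.45, 0.33, 0.33],  # happy: flushed cheeks
    [0.33, 0.33, 0.33, 0.33],  # sad: uniform, paler
    [0.40, 0.36, 0.42, 0.34],  # angry: redder forehead and nose
])

def region_redness(face_rgb, masks):
    """Mean red-channel share inside each region mask of an HxWx3 image."""
    feats = []
    for m in masks:
        pixels = face_rgb[m]                             # Nx3 pixels in region
        feats.append(pixels[:, 0].sum() / pixels.sum())  # red / (r+g+b)
    return np.array(feats)

def classify(features):
    """Return the emotion whose color profile is nearest in feature space."""
    distances = np.linalg.norm(centroids - features, axis=1)
    return EMOTIONS[int(np.argmin(distances))]

# A synthetic 8x8 "face" with flushed cheek rows, plus crude region masks.
face = np.full((8, 8, 3), 100.0)
face[4:6, 1:7, 0] = 180.0          # boost red in the "cheek" rows
masks = [np.zeros((8, 8), bool) for _ in REGIONS]
masks[0][0:2, :] = True            # forehead
masks[1][4:6, :] = True            # cheeks
masks[2][2:4, 3:5] = True          # nose
masks[3][6:8, :] = True            # chin

print(classify(region_redness(face, masks)))  # -> happy
```

A real system would replace the toy masks with a face-landmark detector and learn the emotion profiles from labeled photos, but the pipeline shape (segment regions, extract color statistics, compare against learned profiles) is the same.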
The test was a success: the AI detected emotions correctly more often than the humans did, despite their years of experience reading faces.
The artificial intelligence identified happy faces correctly 90 percent of the time, sad faces 75 percent of the time, and angry faces 70 percent of the time. The humans, by comparison, scored 70 percent on happy faces, 75 percent on sad faces, and 65 percent on angry faces.
Not only does this demonstrate the competence of artificial intelligence, but it also makes a strong case that we can read people's faces based on their color.
Computers have been getting better at spotting patterns and trends for a while now. Machine learning has enabled them to spot tumors on X-rays before symptoms appear, so it makes sense that, by checking faces against a large database, they could also tell what a person is feeling.
This opens up a lot of possibilities for future technology. Imagine a world in which your phone can tell that you're upset, and can intuit your needs, ordering you a pizza or forcing you to call your mom to talk about it.
There's no denying this technology could feel intrusive, but in the right hands, it could be very useful for allowing machines to help humans with self-care.
Of course, the technology could also be used to allow your phone to tell when you're more susceptible to advertising, or to collect personal information about your daily life, but that's thankfully a dystopian nightmare that's still a little way away.