EMOTION DETECTION
The concept of emotional artificial intelligence or ‘emotion AI’ conjures up visions of humanoid robots in customer service roles, such as the lifelike ‘receptionist’ welcoming guests at a Tokyo hotel. A number of companies have added emotion recognition to their personal assistant robots so they too can have more humanlike interactions.
New uses are evolving quickly
In the past two years, emotion AI vendors have moved into completely new areas and industries, helping organizations to create a better customer experience and unlock real cost savings. These uses include:
Video gaming. Using computer vision, the game console or video game detects the player's emotions from facial expressions during play and adapts the game accordingly (a minimal sketch of this kind of pipeline appears after these use cases).
Medical diagnosis. Voice-analysis software can help doctors diagnose conditions such as depression and dementia.
Education. Prototype learning software adapts to children's emotions: when a child shows frustration because a task is too difficult or too simple, the program adjusts the task to make it less or more challenging. Another learning system helps autistic children recognize other people's emotions.
Employee safety. Based on Gartner client inquiries, demand for employee safety solutions is on the rise. Emotion AI can help analyze the stress and anxiety levels of employees with very demanding jobs, such as first responders.
Patient care. A ‘nurse bot’ not only reminds older patients on long-term medical programs to take their medication, but also converses with them every day to monitor their overall wellbeing.
Car safety. Automotive vendors can use computer vision technology to monitor the driver’s emotional state. An extreme emotional state or drowsiness could trigger an alert for the driver.
Autonomous car. In the future, the interior of autonomous cars will have many sensors, including cameras and microphones, to monitor what is happening and to understand how users view the driving experience.
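Several of the use cases above, such as video gaming and car safety, share the same basic pipeline: capture video, locate the face, and classify the facial expression so the application can react. The Python sketch below illustrates one way that pipeline could look, using OpenCV's bundled Haar-cascade face detector; the classify_emotion function is a hypothetical placeholder standing in for a real pretrained expression classifier, not any particular vendor's implementation.

import cv2

# Haar-cascade face detector shipped with opencv-python.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_img):
    """Hypothetical stand-in for a pretrained emotion classifier.

    A real system would resize the face crop and run it through a model
    trained on labeled facial expressions; here we simply return a fixed
    label so the sketch stays self-contained and runnable.
    """
    _ = cv2.resize(face_img, (48, 48))  # a typical input size for such models
    return "neutral"

def main():
    cap = cv2.VideoCapture(0)  # default webcam
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = face_detector.detectMultiScale(
                gray, scaleFactor=1.1, minNeighbors=5
            )
            for (x, y, w, h) in faces:
                label = classify_emotion(gray[y:y + h, x:x + w])
                # The application reacts to the label here, e.g. a game
                # lowers difficulty on frustration, or a driver-monitoring
                # system raises an alert on drowsiness.
                print(f"face at ({x}, {y}): {label}")
            cv2.imshow("emotion detection sketch", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    main()

In practice the placeholder classifier would be replaced by a model trained for the specific setting (in-cabin cameras, game consoles, and so on), and the printed label would feed whatever adaptation logic the product needs.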