Disrupt social robotics by adding emotions
How it works
Emotiva’s computer vision algorithms analyze a subject’s emotional and attentive responses, helping robotics teams develop more natural human-to-machine interaction. The technology uses a standard webcam to recognize and measure facial muscle activations, which are then correlated to determine the emotion being expressed.
Detection via a standard webcam
Measure each individual facial muscle activation with computer vision.
Get the right data
Gather meaningful information using the most accurate coding of action units.
Trigger the proper reaction
Once you get the right feedback on the interaction, you can reply in the most natural way possible.
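The three steps above form a sense-and-respond loop: detect facial muscle activations, extract meaningful data, then trigger a reaction. A minimal sketch of that loop is below; `detect` and `react` are hypothetical placeholders standing in for a real webcam-based Action Unit detector and a robot behavior controller, not Emotiva’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    # Maps an Action Unit number to its measured activation intensity (0..1).
    action_units: dict

def detect(frame) -> Observation:
    """Stub for a computer-vision Action Unit detector run on a webcam frame."""
    # A real detector would locate the face and score each muscle activation;
    # here we pretend it measured a strong AU12 (Lip Corner Puller).
    return Observation(action_units={12: 0.9})

def react(obs: Observation) -> str:
    """Choose the robot's response from the measured facial activations."""
    if obs.action_units.get(12, 0.0) > 0.5:  # AU12: Lip Corner Puller (smiling)
        return "smile_back"
    return "neutral"

frame = object()  # placeholder for a captured webcam frame
print(react(detect(frame)))  # → smile_back
```

In practice the loop runs continuously on the video stream, so the robot’s reaction tracks the subject’s expression in real time.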
Defining four layers of emotional states using Action Units
We detect and measure 23 Action Units, such as Inner Brow Raiser, Brow Lowerer, and Cheek Raiser, and the correlations among them, to define primary, compound, and complex emotional states.
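To illustrate how Action Unit correlations can define a primary emotional state, here is a hedged sketch using widely cited FACS-style AU combinations (e.g. Cheek Raiser + Lip Corner Puller for happiness). The specific combinations, thresholds, and function names are illustrative assumptions, not Emotiva’s actual model, which layers compound and complex states on top of these primaries.

```python
# Commonly cited Action Unit combinations for primary emotions (FACS-style).
# These mappings are illustrative, not Emotiva's proprietary coding.
EMOTION_AUS = {
    "happiness": {6, 12},        # Cheek Raiser + Lip Corner Puller
    "sadness":   {1, 4, 15},     # Inner Brow Raiser + Brow Lowerer + Lip Corner Depressor
    "surprise":  {1, 2, 5, 26},  # Brow Raisers + Upper Lid Raiser + Jaw Drop
}

def classify_primary(activations, threshold=0.5):
    """Return the primary emotion whose defining AUs are all active, else None.

    `activations` maps AU number -> measured intensity (0..1).
    """
    active = {au for au, intensity in activations.items() if intensity >= threshold}
    for emotion, required in EMOTION_AUS.items():
        if required <= active:  # all defining AUs exceed the threshold
            return emotion
    return None

# Example: strong AU6 and AU12, weak AU4 -> classified as happiness.
print(classify_primary({6: 0.8, 12: 0.9, 4: 0.1}))  # → happiness
```

Compound and complex states could then be defined over these primaries and additional AU correlations, which is the layering the section describes.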
ROBOTICS CASE STUDY
Powering robot Abel’s brain with emotions
Emotiva’s Emotion AI technology powers Abel, an empathetic humanoid created by the "Enrico Piaggio" Research Center of the University of Pisa. Abel connects with neurodivergent patients and those with neurodevelopmental conditions, using Emotiva’s technology to read and extract emotions, react appropriately, and steadily improve its empathetic capabilities in human interactions.
Learn more about robotics capabilities
Our team of emotion recognition experts is available to answer any questions.