The generation of emotional responses involves two factors: stimulus and action. Emotions arise from external stimuli perceived by the subject, who, often unconsciously, is led to react. As William James, the father of American psychology, once wrote, "it is difficult to imagine emotions without their physical expression".
Over the years, competing lines of research have emerged. Some scientists think that emotions are the threads that make up our mental life and that define who we are, both in our own eyes and in those of others. Others take a more skeptical view: they think that emotions are mental states too complex to be tracked in the brain and analyzed.
However, these two schools agree on one point: emotions are things that just happen and cannot be generated on command. The only thing you can do is present external stimuli that have the potential to trigger emotions automatically. Evidence of this is that every human being responds to stimuli in a unique and distinct way.
The generation of emotions in machines starts from the functionalist conception of the mind: the mind is understood as a program that can run on any type of machine, whether mechanical, electronic, or biological.
Nowadays, we are still far from being able to say that a machine can feel emotions like a human being, or that a computer can be programmed to feel and experience specific emotions. However, technological developments and Emotion AI are making significant strides in that direction.
How does it work?
As an IEEE article points out: “…the generation of emotions requires synthetic actions that can represent the emotional state of artificial intelligence”.
One way to achieve this is through natural language processing (NLP), which allows a machine or algorithm to parse and interpret the words and phrases used in a conversation. By analyzing the tone, syntax, and context of a conversation, NLP can provide insight into the speaker’s emotions and help generate appropriate responses.
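As a minimal illustration of this idea, the sketch below maps the words of an utterance to a likely emotion using a tiny hand-written lexicon. Production systems use trained models rather than word lists; the lexicon, labels, and function names here are purely illustrative assumptions.

```python
# Illustrative only: a rule-based emotion detector. Real Emotion AI
# systems use trained NLP models, not hand-written word lists.
EMOTION_LEXICON = {
    "joy": {"great", "happy", "love", "wonderful", "thanks"},
    "anger": {"terrible", "hate", "unacceptable", "furious"},
    "sadness": {"sad", "sorry", "miss", "lonely"},
}

def detect_emotion(utterance: str) -> str:
    """Return the emotion whose lexicon overlaps most with the utterance."""
    tokens = {t.strip(".,!?").lower() for t in utterance.split()}
    scores = {emotion: len(tokens & words)
              for emotion, words in EMOTION_LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

print(detect_emotion("I love this, thanks so much!"))   # joy
print(detect_emotion("The weather report says rain."))  # neutral
```

Even this toy version captures the core pipeline the paragraph describes: tokenize the input, score it against emotional categories, and return the best match (or "neutral" when nothing fires).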
What is the potential of emotion generation?
With advances in natural language processing, machines and algorithms will be able to generate more natural, human-like emotional responses, improving human-computer interaction. This would open up applications in many fields, from customer service to education to healthcare.
For example, Abel is an empathic robot that can communicate with non-verbal or neurodivergent healthcare patients. Abel’s “emotional brain” is powered by emotional artificial intelligence, which helps interpret what the patient is feeling or trying to communicate and, in turn, decodes those emotions to respond appropriately. Social robotics clearly has the potential to disrupt the healthcare sector.
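The "decode, then respond appropriately" loop described above can be sketched as a simple mapping from a detected emotional state to a synthetic action paired with a verbal reply. This is a hypothetical illustration of the pattern, not Abel's actual implementation; every label, action name, and reply below is an assumption.

```python
# Hypothetical sketch of an emotion-to-response loop: a detected
# emotional state selects a (non-verbal action, verbal reply) pair.
# None of these labels or replies come from Abel's real system.
RESPONSES = {
    "joy": ("smile", "I'm glad you're feeling good!"),
    "sadness": ("soft_tone", "I'm here with you. Take your time."),
    "anger": ("calm_posture", "I understand this is frustrating."),
    "neutral": ("idle", "How are you feeling today?"),
}

def respond(detected_emotion: str) -> tuple[str, str]:
    """Map a detected emotion to a synthetic action and a reply,
    falling back to a neutral prompt for unknown states."""
    return RESPONSES.get(detected_emotion, RESPONSES["neutral"])

action, reply = respond("sadness")
print(action, "-", reply)  # soft_tone - I'm here with you. Take your time.
```

The design point is the separation of concerns: one component detects the emotional state, and a second component chooses the synthetic action that represents the system's response to it, which is exactly the structure the IEEE quote above calls for.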
Learn more about Emotion AI
Emotion generation is one of many topics related to the field of Emotion AI. If you want to learn more, read our Emotion AI Glossary, which covers everything from the origins of Emotion AI to use cases.
You can also schedule a demo of Emotion AI (Emotiva’s EmPower tool) to see how it works.