How are you feeling today? A new generation of artificial intelligence is addressing this question. These technologies, known as emotional AI, use a variety of advanced methods, including computer vision, speech recognition, and natural language processing, to measure human emotions and respond accordingly.
Prof. Alan Smeaton, lecturer and researcher at the School of Computing at Dublin City University (DCU) and founding director of the Insight Centre for Data Analytics, is working on the application of computer vision to detect a very specific condition: inattention.
Necessity is the mother of invention: Help Me Watch was developed at DCU during the pandemic in response to student feedback on the challenges of online lectures.
“Attending lectures on Zoom is distracting when you have to do so in busy spaces like the family kitchen. And you are not among classmates; you are alone. It’s easy to get bored and let your attention wander,” says Smeaton.
“We developed an application that uses the students’ laptop webcam to recognize their faces. It does not matter whether they are far away, near the webcam or even in motion. Help Me Watch uses face tracking and eye tracking to measure attention throughout the presentation.”
Help Me Watch’s dashboard allows the instructor to observe general patterns to see which material was well received and which was less engaging. The use case for the individual student is arguably more compelling: the app notices when a student has missed a part of the lecture they should have paid attention to, and sends them what they missed.
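The “send what you missed” feature implies a simple aggregation step: turning a stream of per-moment attention readings into the lecture segments a student skipped. The sketch below is a minimal, hypothetical illustration — the per-second attention flags, the `min_gap` threshold and the function name are assumptions for illustration, not DCU’s actual implementation.

```python
def missed_segments(attention, min_gap=3):
    """Given per-second attention flags (True = attentive),
    return (start, end) second ranges where attention lapsed
    for at least `min_gap` consecutive seconds.

    Illustrative sketch only -- not the Help Me Watch code.
    """
    segments, start = [], None
    for t, attentive in enumerate(attention):
        if not attentive and start is None:
            start = t                      # a lapse begins
        elif attentive and start is not None:
            if t - start >= min_gap:       # long enough to report
                segments.append((start, t))
            start = None
    # handle a lapse that runs to the end of the lecture
    if start is not None and len(attention) - start >= min_gap:
        segments.append((start, len(attention)))
    return segments
```

A downstream step could then map each `(start, end)` range onto the lecture recording and send that clip to the student.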
Another DCU project – led by Suzanne Little – also measures alertness levels, but in the context of monitoring driver fatigue. Smeaton explains that this application of computer vision is particularly challenging because of the fluctuating lighting conditions a driver is exposed to, but the use case is valuable: it would be an important safety feature, especially for truck drivers.
Face image data
In the field of emotional AI, researchers and startups work with sensitive and personally identifiable data such as the facial image data mentioned above, but also with voice, text and even heart rate or galvanic skin response (how sweaty the skin is).
When faces are captured by the camera and the resulting data is processed, Smeaton points out, everything is GDPR-compliant and all data is anonymized: a lecturer does not see the name of an individual student, only numerical identifiers such as “Student 123”.
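One common way to achieve that kind of identifier scheme is simple pseudonymization: a lookup that maps each real name to a stable numeric label before anything reaches the dashboard. The snippet below is an illustrative sketch under that assumption, not the Help Me Watch code.

```python
import itertools

class Pseudonymizer:
    """Replace student names with stable labels like 'Student 1'
    so a dashboard never displays personal data.

    Hypothetical sketch of a GDPR-style safeguard; the real
    name-to-ID mapping would be stored and access-controlled
    separately from the analytics data.
    """
    def __init__(self):
        self._ids = {}
        self._counter = itertools.count(1)

    def label(self, name):
        # Assign the next number on first sight; reuse it afterwards.
        if name not in self._ids:
            self._ids[name] = f"Student {next(self._counter)}"
        return self._ids[name]
```

The same student always maps to the same label, so attention patterns remain comparable across lectures without exposing names.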
Beyond data protection law, there are other ethical considerations. Dr. Alison Darcy, psychologist and founder of digital therapeutics start-up Woebot, says transparency is essential to build trust with end users.
“AI should always announce itself,” she says of Google Duplex, the human-sounding AI assistant first introduced in 2018. While some were delighted by its eerily natural voice, complete with “ums” and “ahs”, AI ethicists were concerned that the person on the other end of the phone might mistakenly think they were talking to another human. Google responded by promising to include an automatic announcement notifying the user that they are interacting with an AI assistant.
“It should always be very clear whether you are talking to a person or a bot. We have to be transparent or the world will get really weird very quickly,” Darcy adds.
Her creation, Woebot, is an AI-powered therapeutic chatbot designed to help the user apply cognitive behavioral therapy (CBT) principles, including mood monitoring and self-tracking.
“How Woebot reacts to emotions depends on the emotional and cognitive state of the individual. If someone is really upset, it won’t start fooling around with them; it conveys appropriate empathy and invites the user to be guided through an evidence-based technique that will help them with the intense emotional state they are experiencing at that moment.”
The app also adjusts its tone and verbal complexity as needed. As Darcy explains, someone in a really difficult emotional state has less cognitive capacity to parse long, complex sentences, so Woebot’s verbosity drops: the warmth in its tone is retained while the humor is dialed back.
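A rule-based version of this adaptation is easy to picture: map an estimated distress level to a style profile that caps sentence length and disables humor. This is a toy sketch under invented assumptions (a 0–10 distress score and these particular profile fields), not how Woebot actually works.

```python
def response_style(distress):
    """Map a distress estimate (0 = calm, 10 = acute) to a style
    profile for the response generator.

    Purely illustrative thresholds: higher distress means shorter
    sentences, no humor, and a warmer register.
    """
    if distress >= 7:
        return {"max_words": 12, "humor": False, "warmth": "high"}
    if distress >= 4:
        return {"max_words": 25, "humor": False, "warmth": "high"}
    return {"max_words": 40, "humor": True, "warmth": "normal"}
```

A generator would then trim or rephrase its output to fit the returned profile, keeping warmth constant while humor and verbosity vary.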
“Many people assume that Woebot is passively aware of the user’s emotional state [using sentiment analysis techniques], but we declared from day one that we weren’t going to do that,” says Darcy.
“Woebot asks the user how they are feeling because it is more important to allow the user to experience their emotional self than for Woebot to recognize their emotional state. Telling a person that they sound angry, for example, can make them defensive and make them withdraw.”
Another way AI can be empathetic is by being there when it’s needed most. Woebot has developed a separate application for women monitoring their wellbeing during pregnancy and their mental health after childbirth. Darcy says that 78 percent of all postpartum conversations happen between 10 p.m. and 5 a.m., hours when new mothers get little sleep and have few people to talk to. At those times it may be impossible to reach a therapist, so the chatbot provides a lifeline.
While some think smart chatbots can’t interact with someone at the level of a real therapist, Woebot’s peer-reviewed study of 36,000 of its users suggests otherwise. It found that within three to five days of use, users establish a therapeutic bond with Woebot of a kind previously thought to be unique to human-to-human relationships.
And while it seems counterintuitive, other studies suggest that people feel more relaxed sharing personal information with an AI agent than they would with another human, says Paul Sweeney, EVP of Product at conversational middleware platform Webio.
“It’s easier to tell a smart assistant about sensitive issues like financial difficulties than to tell someone on the other end of the phone,” he says.
Webio creates intelligent chatbots for customers in the financial sector. These chatbots are far more advanced than the traditional FAQ or rule-based ones found on many websites. Clients can train a unique chatbot on their company data to teach it how to interact more effectively with their customers, and, similar to Woebot, the tone or formality of its language can be tweaked.
“Just changing the language can help. One of our clients saw a 30 percent improvement in responses because we changed the tone of voice and language.
“Webio automates human contact in customer service. The interface knows when it can’t help you and automatically connects you to a human agent who can. And over time, as long as a person is in the loop making the better decisions, it keeps improving,” says Sweeney.
The emotionally intelligent part of Webio’s technology is that it can tell if a customer is anxious about paying their credit card bill, for example. Among the natural language processing techniques used is vulnerability analysis: older customers, for example, can be more vulnerable, so their requests are prioritized.
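Conceptually, vulnerability-based prioritization can be sketched as a scoring function plus a priority queue. The features, weights and keyword list below are invented for illustration; Webio’s real signals would come from trained NLP models, not keyword matching.

```python
import heapq

def vulnerability_score(customer):
    """Toy scoring: age and worried-sounding phrases raise priority.
    Illustrative assumptions only -- not Webio's actual model."""
    score = 0
    if customer.get("age", 0) >= 65:
        score += 2
    worried_terms = ("can't pay", "struggling", "worried")
    text = customer.get("last_message", "").lower()
    score += sum(term in text for term in worried_terms)
    return score

def prioritize(customers):
    """Return customers ordered most-vulnerable first."""
    # Negate the score for Python's min-heap; the index breaks ties
    # stably so dicts are never compared directly.
    heap = [(-vulnerability_score(c), i, c) for i, c in enumerate(customers)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]
```

An agent-routing layer could then pull from the front of this ordering so the most vulnerable requests reach a human first.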
Sweeney is interested in other, as he puts it, more “noticeable” types of emotional AI, such as real-time speech emotion recognition, which uses voice biomarkers such as tone, stress levels, and emotional content. It can be very precise, he says, but this area is fraught and we must proceed with caution.
“People can say things they don’t mean. They can use idioms that mean one thing to them and something else to you. You have to be very careful how these technologies are used.
“The point of an emotionally intelligent AI is not to dazzle, but to understand the person and unobtrusively anticipate their needs in the context of the service it provides. Relax, be more welcoming, open and empathetic, use language better – and then invite people to talk.”