Further development of AI emotion sensors in the transportation and logistics industry (Series: Blog #3)


In the first blog of this AI in transportation and logistics series, I presented the AI transformation innovations at Purolator; the second blog focused on accelerating a more intelligent AI telematics infrastructure for fleet management. This third blog explores AI emotion sensors and the impact the affective computing market will have on the transportation and logistics industries.

We are now entering a space where intelligence is being embedded in everything.

We can already monitor a driver's vehicle movements to identify safety risks, from sharp cornering and excessive speeding to hard braking. Sensors are moving into steering wheels so that we can detect the tension a person feels from their grip: a grip that is too tight can be classified as tense rather than relaxed, and we can monitor whether that tension persists or subsides over the course of the day. Going one step further, studies of gait and posture, and of how people walk to and from their vehicles, can also suggest whether a person's posture is upright with the chin forward (which can be classified as confident) or with the head down (lost in thought, or perhaps sad).
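To make the first of these ideas concrete, here is a minimal sketch of how harsh braking and speeding might be flagged from raw telematics speed samples. The sampling interval and thresholds are assumptions chosen for illustration, not values from any production telematics system.

```python
# Illustrative sketch only: flag harsh braking and speeding from speed samples.
# The sampling interval and thresholds below are assumed, not industry standards.
from dataclasses import dataclass

HARSH_BRAKE_MPS2 = -3.0   # assumed harsh-deceleration threshold in m/s^2
SPEED_LIMIT_MPS = 27.0    # assumed limit (~100 km/h) for this road segment

@dataclass
class SafetyEvent:
    sample_index: int
    kind: str
    value: float

def detect_events(speeds_mps: list[float], dt_s: float = 1.0) -> list[SafetyEvent]:
    """Scan consecutive speed samples and flag harsh braking or speeding."""
    events = []
    for t in range(1, len(speeds_mps)):
        accel = (speeds_mps[t] - speeds_mps[t - 1]) / dt_s
        if accel <= HARSH_BRAKE_MPS2:
            events.append(SafetyEvent(t, "harsh_braking", accel))
        if speeds_mps[t] > SPEED_LIMIT_MPS:
            events.append(SafetyEvent(t, "speeding", speeds_mps[t]))
    return events

# Example: a drop from 25 m/s to 18 m/s within one second is flagged as harsh braking.
print(detect_events([25.0, 25.0, 18.0, 17.0, 28.0]))
```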

It seems that everyone in the automotive industry is thinking outside the box in order to get a complete picture of human behavior around the clock and bring humans and machines closer together.

How far these innovations will go, how they will be integrated into social norms, and what ethical and privacy implications they will have is still uncharted territory. What is clear is that the changes are in full swing.

You can already see how quickly Samsung is adding more vehicles to its branded SmartThings ecosystem. Although it currently focuses on vehicle innovations such as starting or stopping the engine of compatible vehicles from the SmartThings mobile app and turning on the heating or air conditioning before you get in your car, these early innovation projects are only a glimpse of what is to come. Samsung has already partnered with Google to provide a SmartThings dashboard for Android Auto, which lets you control your smart home products or open your garage door from your dashboard after connecting a Samsung phone.

In October, Samsung announced that Mercedes-Benz’s MBUX voice assistant supports SmartThings for hands-free control of smart home devices.

Understanding Affective Computing (the Emotion AI segment)

Affective computing, or emotion AI, is a market that is expected to grow from $28.6 billion in 2020 to $140.0 billion by 2025, a CAGR of 37.4% over the forecast period. Emotion AI enables computer systems and algorithms to recognize and interpret human emotions by tracking facial expressions, body language, or voice and speech.

Emotion AI strives to bring man and machine closer together.

Take facial recognition in vehicles: cars and trucks can recognize who the authorized driver is, automatically identify you, and follow your voice commands. Computer vision algorithms are now very accurate and can detect and decompose a person's face into landmarks (eyes, tip of the nose, eyebrows, corners of the mouth, and so on), and then track how those landmarks move to identify the person's emotions as well.
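As a rough sketch of the landmark-detection step, the snippet below uses MediaPipe's Face Mesh model to extract normalized facial landmark coordinates from a single image. The library choice and file path are assumptions for illustration; production emotion systems layer tracking and trained classifiers on top of a step like this.

```python
# Illustrative sketch: extract normalized facial landmarks from one image.
# Assumes the mediapipe and opencv-python packages are installed.
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh

def extract_landmarks(image_path: str):
    """Return (x, y) landmark coordinates for the first detected face, or None."""
    image = cv2.imread(image_path)
    if image is None:
        return None
    with mp_face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as face_mesh:
        results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return None
    # The landmarks cover points such as the eyes, nose tip, eyebrows, and mouth corners.
    return [(lm.x, lm.y) for lm in results.multi_face_landmarks[0].landmark]

landmarks = extract_landmarks("driver_frame.jpg")  # hypothetical dashcam frame
print(len(landmarks) if landmarks else "no face detected")
```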

This is currently done by comparing observed expressions against large databases of labeled facial expressions, which allows the software to identify emotion types such as joy, sadness, anger, contempt, disgust, fear, and surprise. Additional software can extend emotion classification to include face recognition and verification, age and gender recognition, ethnicity recognition, multiple-face recognition, and much more.
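A toy sketch of this database-comparison idea is shown below: a new expression's feature vector is matched to the closest entry in a small labeled reference set. The feature values and labels are placeholders; real systems rely on thousands of labeled examples and learned models rather than a handful of hand-picked vectors.

```python
# Toy sketch: classify an expression by nearest neighbour in a tiny labeled "database".
# The feature vectors are placeholders (e.g. mouth-corner lift, brow raise, eye openness).
import numpy as np

REFERENCE_DB = {
    "joy":      np.array([0.9, 0.2, 0.6]),
    "sadness":  np.array([0.1, 0.1, 0.3]),
    "surprise": np.array([0.4, 0.9, 0.9]),
    "anger":    np.array([0.2, 0.8, 0.5]),
}

def classify_expression(features: np.ndarray) -> str:
    """Return the emotion label whose reference vector is closest in Euclidean distance."""
    return min(REFERENCE_DB, key=lambda label: np.linalg.norm(features - REFERENCE_DB[label]))

print(classify_expression(np.array([0.85, 0.25, 0.55])))  # -> "joy"
```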

Then we add speech analysis software to the mix, which complements and correlates with the facial recognition software to identify emotional states from acoustic speech and to measure, with high accuracy, whether the speaker is happy, sad, surprised, angry, or neutral.
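On the acoustic side, a common pattern (assumed here for illustration, not taken from any specific vendor) is to extract MFCC features from an utterance and feed the pooled features to a trained classifier. The sketch below leaves the classifier abstract, since training data is out of scope.

```python
# Illustrative sketch: turn an utterance into a fixed-length acoustic feature vector
# suitable for a speech-emotion classifier. Assumes librosa is installed.
import numpy as np
import librosa

EMOTIONS = ["happy", "sad", "surprised", "angry", "neutral"]

def utterance_features(wav_path: str) -> np.ndarray:
    """Mean- and std-pool 13 MFCCs over time into a single 26-dimensional vector."""
    audio, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)  # shape: (13, frames)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# A trained model (e.g. scikit-learn) would then map this vector to one of EMOTIONS:
# predicted = model.predict([utterance_features("driver_call.wav")])[0]
```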

According to the 7-38-55 rule of personal communication, words account for only 7% of how we perceive another person's affective state, while tone of voice accounts for 38% and body language for 55%.
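Purely as an illustration of multimodal fusion, the rule's percentages could be used as fixed weights when combining per-modality emotion scores, as in the sketch below. This weighting is an assumption for the example only; real multimodal systems learn their fusion weights from data rather than hard-coding them.

```python
# Illustrative late fusion of per-modality emotion scores, weighted 7/38/55.
# The weights mirror the communication rule above purely for illustration.
WEIGHTS = {"words": 0.07, "voice": 0.38, "face": 0.55}

def fuse_scores(per_modality: dict[str, dict[str, float]]) -> dict[str, float]:
    """Combine emotion probability scores from each modality into one weighted score."""
    fused: dict[str, float] = {}
    for modality, scores in per_modality.items():
        for emotion, p in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + WEIGHTS[modality] * p
    return fused

example = {
    "words": {"happy": 0.2, "neutral": 0.8},
    "voice": {"happy": 0.7, "neutral": 0.3},
    "face":  {"happy": 0.9, "neutral": 0.1},
}
fused = fuse_scores(example)
print(max(fused, key=fused.get))  # -> "happy"
```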

The majority of emotion AI developers agree that the main goal of multimodal emotion recognition is to make human-machine communication more natural. However, because these areas are still relatively new, there are open questions about the ethical and privacy boundaries, and about how these technologies affect what it means to be human.

Another example is an AI-powered wearable developed by the MIT Media Lab that can track the wearer's cardiovascular signals and, when needed, release various combinations of fragrances to help manage mental health issues such as stress or anxiety.

Below is a scenario that is not yet mainstream, but that gives you an idea of what is possible today with the various AI-supported technologies that will impact the transportation and logistics industries.

Scenario

Alexia Bolt is a fictional character in The AI Dilemma, where she embodies a day in the life of each industry from both a positive and a negative perspective, challenging executives to think wisely about the future world order we want to create in an AI-driven world.

Below is a new scenario for Alexia Bolt to experience, and it is written from a positive perspective.

Alexia Bolt is a courier driver, a high performer with no safety incidents or accidents in her track record. Her typical day starts with waking up at 6 a.m. and heading to the New York terminal to pick up the truck assigned to her for that day. She checks her cell phone and learns that she has been assigned Truck 3 today. When she drives into the terminal parking lot, Truck 3 has already recognized her gait and flashes its lights toward her; as she comes closer, her door automatically unlocks and her truck says “Good morning.”

Alexia also wears a smart, sensor-rich driving uniform. Her morning temperature is checked for COVID-19 symptoms and her readings are recorded in her vehicle’s telematics software. Her daily route is preprogrammed and her truck is already loaded with the packages she will deliver for the day; the pre-loading was done earlier in the day by a robot named Robbie.

When she checks the route summary on her dashboard, she sees that there are roadworks on her route, so she activates her detour system to minimize traffic and congestion. She asks Alexa what the weather is like today; Alexa tells her it is a sunny day, everything is clear, and wishes her a wonderful, safe day. Alexia thanks her with a generous smile on her face and is automatically classified as a happy person with a great attitude.

A refreshing scent of lavender is briefly released to sustain the calm mood as her truck automatically exits the parking lot to begin its route for the day. When she stops at her first customer destination, she brings the package to the door, rings the bell, and meets a nice elderly lady who thanks her and wishes her a good day.

Her sensors automatically register that the customer she is interacting with is friendly and polite. She goes on to make more than twenty stops over a six-hour period and pauses for lunch at a drive-through stop, where robotic arms assemble and hand over the order she phoned in earlier while driving.

By the time she heads back to the parking lot, all voice communication from her customer interactions has been recorded, and the vehicle’s driving behavior has been used to calculate a risk value for different profiles: the customer, the vehicle, and the driver. All of these insights are aggregated at the terminal level to build a stronger health and safety footprint while keeping individual behavior confidential. Management can easily see the overall emotional and driving-behavior picture for its employees, and employees can easily see the behavior of their team.

Open, transparent, and collaborative communication is at the heart of the intelligent sensors in this modern T&L terminal: they bring humans and machines closer together to create a smarter, networked world.

Conclusion

If you are a CEO or board member of a small, medium-sized, or large transportation and logistics company, have you thought about your affective computing strategy for fleet management? Will you be one of the first innovators to pioneer wearable computing, with affective computing innovations tied to smarter vehicles?

In summary, the transportation and logistics industry is undergoing enormous change. Is your company ready for emotion AI and affective computing? We have already seen these methods work well in call centers, guiding agents to deal more effectively with customer emotions; will we see these innovations carried over to the logistics and transportation sectors?

One way or another, we will.

We also know that China is making headway in these innovations, and the US government is already signaling concerns about being left behind in the AI race.

Scenarios like this can help us advance the social conversation and recognize that we can only design effective emotion AI solutions from lived experience.

Alexia Bolt is a believable character, and she represents a world that is almost here. It’s just that some of us can’t see it yet.

We have the opportunity to develop affective computing approaches to advance our health and safety strategies.

So much of being human lies in our expressions and voices, so why not realize the full potential of human abilities and recognize that well-designed AI can improve our safety and health in ways that enhance our wellbeing?
