Robots may be mistaken for somebody who cares, according to a recent Amazon patent grant

  • Date: January 3, 2023
  • Read time: 5 min
  • Tags: AI, ML

Yes, all robots are faking it, but how do you make a robot seem less moody yet still emotionally invested in your life? Would you be more forgiving of an embarrassed robot? Amazon points out that people find emotion a useful component of communication with one another. Could it be just as useful between people and robots?

Amazon was recently granted a US patent that teaches simulating the emotional state of a device – such as a physical robot – based on its past emotional states and contextual information.

Amazon claims to improve upon traditional systems for simulating emotions, which exhibit “significant oscillation” from one emotion to another – laughing one moment, wanting to slap Chris Rock the next. Amazon posits that machine learning alone is simply too costly and cumbersome to overcome these sudden mood swings.

Instead, Amazon’s robot is more sophisticated, picking up cues – feelings, even – from multiple devices, user inputs, the time since the last interaction, its array of sensors, and, of course, words, ascribing pluses and minuses to each hint over time. Daily use is the training. No more erratic outbursts. This means your iPhone might really know when you’re being passive-aggressive.
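To make the idea concrete, here is a minimal sketch of what that cue scoring could look like, assuming a single running valence value with exponential smoothing to damp the mood swings; the cue names, weights, and update rule are illustrative choices, not details from the patent:

```python
# A hedged sketch, not Amazon's actual method: cues accumulate small
# plus/minus weights over daily use, and the emotional state drifts
# toward the evidence instead of lurching.

class EmotionScorer:
    def __init__(self, smoothing: float = 0.9):
        self.cue_weights: dict[str, float] = {}  # learned valence per cue
        self.smoothing = smoothing  # high smoothing damps oscillation
        self.valence = 0.0          # -1.0 (frustrated) .. +1.0 (happy)

    def observe(self, cue: str, outcome: float) -> None:
        """Ascribe a plus or minus to a cue based on how the
        interaction went (outcome in [-1, 1])."""
        w = self.cue_weights.get(cue, 0.0)
        self.cue_weights[cue] = w + 0.1 * (outcome - w)  # slow update

    def update_valence(self, active_cues: list[str]) -> float:
        """Blend current evidence into the running emotional state."""
        evidence = sum(self.cue_weights.get(c, 0.0) for c in active_cues)
        evidence = max(-1.0, min(1.0, evidence))
        self.valence = (self.smoothing * self.valence
                        + (1 - self.smoothing) * evidence)
        return self.valence


scorer = EmotionScorer()
scorer.observe("raised_voice", -1.0)  # hypothetical negative cue
scorer.observe("thank_you", +1.0)     # hypothetical positive cue
print(scorer.update_valence(["raised_voice"]))  # drifts, doesn't lurch
```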

The Tech

The system includes data processing modules responsible for external interactions and sensory processing. The robot perceives via audio receivers, cameras, and other sensors, and responds through speakers, displays, motors, projectors, and lights. The human-like sensors are augmented by complementary ones such as lidar, radar, location sensors, ultrasonic sensors, bumper switches, and temperature, air-pressure, and floor optical-motion sensors.
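One plausible way to handle such heterogeneous inputs is to normalize every reading into a common event schema before it reaches downstream modules; the field names and confidence threshold below are assumptions for illustration, not details from the patent:

```python
# A hypothetical sketch of sensor-event normalization. Field names,
# sensor labels, and the 0.7 threshold are invented for illustration.
from dataclasses import dataclass
import time


@dataclass
class SensorEvent:
    source: str        # e.g. "camera", "lidar", "bumper_switch"
    kind: str          # e.g. "face_detected", "collision"
    confidence: float  # 0.0 .. 1.0
    timestamp: float


def normalize(source: str, kind: str, confidence: float) -> SensorEvent:
    """Wrap a raw reading so every downstream module sees one schema."""
    return SensorEvent(source, kind, confidence, time.time())


events = [
    normalize("camera", "face_detected", 0.94),
    normalize("bumper_switch", "collision", 1.0),
    normalize("microphone", "raised_voice", 0.71),
]

# Forward only confident readings to the emotion-processing stage.
for e in (e for e in events if e.confidence >= 0.7):
    print(f"{e.source}: {e.kind} ({e.confidence:.2f})")
```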

The incoming data is processed and fed to the Emotion Module using techniques such as OpenCV tools, the OKAO machine vision library, and the Machine Vision Toolbox (MVTB) for MATLAB. Artificial neural networks (ANNs) are used, including convolutional neural networks (CNNs), along with active appearance models (AAMs), active shape models (ASMs), and principal component analysis (PCA). One of the challenges in speech recognition is wake-word detection; Amazon applies several approaches, including large-vocabulary continuous speech recognition (LVCSR) systems and Hidden Markov models (HMMs) with Viterbi decoding.
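For a flavor of the HMM-plus-Viterbi approach the patent cites, here is a toy decoder over a two-state “silence vs. wake word” model. The states, observations, and probabilities are invented for illustration; a real system operates on acoustic features, not neat labels:

```python
# Toy Viterbi decoding over a two-state HMM, in log space for
# numerical stability. All probabilities here are made up.
import math

states = ["silence", "wake_word"]
start_p = {"silence": 0.8, "wake_word": 0.2}
trans_p = {
    "silence":   {"silence": 0.7, "wake_word": 0.3},
    "wake_word": {"silence": 0.4, "wake_word": 0.6},
}
emit_p = {
    "silence":   {"quiet": 0.8, "speech": 0.2},
    "wake_word": {"quiet": 0.1, "speech": 0.9},
}


def viterbi(observations):
    """Return the most likely hidden-state path for the observations."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][observations[0]])
          for s in states}]
    path = {s: [s] for s in states}
    for obs in observations[1:]:
        V.append({})
        new_path = {}
        for s in states:
            # Best previous state leading into s at this step.
            prob, prev = max(
                (V[-2][p] + math.log(trans_p[p][s])
                 + math.log(emit_p[s][obs]), p)
                for p in states)
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]


print(viterbi(["quiet", "speech", "speech", "quiet"]))
# e.g. ['silence', 'wake_word', 'wake_word', 'silence']
```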

An emotionally intelligent robot would provide more fulfilling interaction and companionship while also monitoring its owner. Obvious use cases include healthcare, or any context where a person must remain isolated.

Of particular interest is the emotional state, which changes as triggers arrive. There are 10 trigger types, such as hardware/software failures, successful/unsuccessful tasks, or affirmative/negative user input. The robot then has a range of emotions, from happy to frustrated, which it expresses to the human through facial expressions and display movement.
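A hedged sketch of how such triggers could drive the emotional state follows. The trigger types come from the patent’s examples, but the valence deltas and the expression mapping are illustrative choices, not values from the patent:

```python
# Hypothetical valence shift per trigger type; the deltas are invented.
TRIGGER_DELTAS = {
    "task_success": +0.3,
    "task_failure": -0.3,
    "affirmative_user_input": +0.2,
    "negative_user_input": -0.2,
    "hardware_failure": -0.4,
}


def apply_trigger(valence: float, trigger: str) -> float:
    """Shift the emotional state on a trigger, clamped to [-1, 1]."""
    return max(-1.0, min(1.0, valence + TRIGGER_DELTAS.get(trigger, 0.0)))


def express(valence: float) -> str:
    """Map the internal state to an outward expression."""
    if valence > 0.3:
        return "happy face, upbeat display movement"
    if valence < -0.3:
        return "frustrated face, slumped display"
    return "neutral face"


v = 0.0
for t in ["task_failure", "hardware_failure", "affirmative_user_input"]:
    v = apply_trigger(v, t)
    print(f"{t:>24} -> valence {v:+.1f}: {express(v)}")
```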

Congratulations to the inventors Yelin Kim, Amin Hani Atrash, Raumi Nahid Sidki, Vikas Deshpande, and Saurabh Gupta. 

CONTACT CLEDAR

Want to predict and measure future behaviors with accuracy?

Let’s talk to understand how you can benefit from our CERN experience with machine learning, AI, data science, and algorithms.