Honda to show EV concept with AI emotion engine from joint project with SoftBank
06 December 2016
At CES 2017, Honda will showcase what it calls a future technology path toward a redefined mobility experience. The exhibit will include the NeuV, an automated EV commuter concept vehicle equipped with artificial intelligence (AI) called the “emotion engine” that creates new possibilities for human interaction and new value for customers.
The emotion engine is the focus of a joint research project with SoftBank Corp. that Honda announced in July 2016 to apply the AI technology in mobility products. (Earlier post.) The “emotion engine” is a set of AI technologies, developed by cocoro SB Corp., that enable machines to generate their own emotions artificially.
Pepper. In June 2016, SoftBank Robotics launched “Pepper”, the world’s first personal robot that can not only read emotions but also have emotions of its own. Pepper’s emotions are generated by emotion functions developed by cocoro SB Corp. These emotion functions are modeled on the human release of hormones in response to stimuli absorbed through the five senses, which in turn generate emotions.
Pepper with humans. © Troy House - Corbis.
In addition to its emotion recognition functions, Pepper can generate emotions autonomously by processing information from its cameras, touch sensors, accelerometer and other sensors within its “endocrine-type multi-layer neural network.”
With this emotion function, Pepper’s emotions are influenced by people’s facial expressions and words, as well as by its surroundings, which in turn affect Pepper’s own words and actions. For example, Pepper is at ease around people it knows, happy when praised, and scared when the lights go down.
Depending on its emotion at the time, Pepper may raise its voice or sigh, for example. Pepper’s emotions can also be seen on the robot’s heart display, which shows different colors and movements. A number of robot apps have been developed to make life with an emotional robot fun. Pepper’s Diary, for example, links Pepper’s emotions with daily family events recorded in photos.
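Neither Honda nor cocoro SB has published details of how the “endocrine-type” network maps stimuli to emotions, but the general idea, sensor stimuli nudging hormone-like internal variables that are then read out as an emotion and a display color, can be illustrated with a minimal sketch. Everything in the snippet below (the hormone names, stimuli, weights and color mapping) is an invented assumption for illustration, not cocoro SB’s actual model.

```python
# Toy sketch of an "endocrine-type" emotion model: sensed stimuli nudge
# hormone-like internal variables, which are then read out as a named
# emotion and a heart-display color. All names, weights and mappings are
# invented for illustration; they are not cocoro SB's actual model.

# How each stimulus nudges the internal "hormone" levels (made-up values).
STIMULUS_EFFECTS = {
    "praised":        {"dopamine": +0.4, "cortisol": -0.1},
    "familiar_face":  {"oxytocin": +0.3, "cortisol": -0.2},
    "lights_go_down": {"cortisol": +0.5, "dopamine": -0.1},
}

# How a dominant hormone is expressed (emotion label, heart-display color).
READOUT = {
    "dopamine": ("happy",   "yellow"),
    "oxytocin": ("at ease", "green"),
    "cortisol": ("scared",  "purple"),
}


class ToyEmotionEngine:
    def __init__(self):
        # Neutral starting state: all hormone-like levels at zero.
        self.levels = {"dopamine": 0.0, "oxytocin": 0.0, "cortisol": 0.0}

    def perceive(self, stimulus: str) -> None:
        """Apply one sensed stimulus (e.g. from camera or touch sensor)."""
        for hormone, delta in STIMULUS_EFFECTS.get(stimulus, {}).items():
            # Clamp each level to [-1, 1] so the state stays bounded.
            self.levels[hormone] = max(-1.0, min(1.0, self.levels[hormone] + delta))

    def readout(self) -> tuple[str, str]:
        """Return (emotion label, display color) for the strongest level."""
        dominant = max(self.levels, key=self.levels.get)
        if self.levels[dominant] <= 0.0:
            return ("neutral", "white")
        return READOUT[dominant]


if __name__ == "__main__":
    pepper = ToyEmotionEngine()
    pepper.perceive("familiar_face")
    pepper.perceive("praised")
    print(pepper.readout())  # e.g. ('happy', 'yellow')
```

In the real robot these readouts would presumably come from a trained multi-layer network rather than hand-set weights; the sketch only shows the flow from stimulus to internal state to expression.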
Kawasaki. In August 2016, Kawasaki Heavy Industries, Ltd. (KHI) announced it was moving forward with plans to develop next-generation motorcycles that have a personality and can grow along with the rider. The motorcycles being developed will also use the “Emotion Generation Engine and Natural Language Dialogue System” being developed by cocoro SB.
From the words spoken by the rider, this AI-controlled system can pick up on the rider’s intent and emotional state. Enabling rider and motorcycle to communicate and share an understanding of purpose will open the door to a new world of unprecedented riding experiences, KHI suggested.
Accessing Kawasaki’s bank of analytical chassis and running data stored in a cloud-based data center, or referencing the vast amount of information available on the Internet, the system will be able to offer the rider pertinent hints for enhanced riding enjoyment, or relay safety-related or reassuring advice as the situation dictates. Through advanced electronic management technology, the system will also be able to update machine settings based on the rider’s experience, skill and riding style.
Through repeated interaction, this kind of communication between rider and motorcycle will allow the motorcycle to develop a unique personality reflecting the individual idiosyncrasies of the rider. With mutual trust established, both rider and motorcycle will be able to improve and grow, offering an all-new kind of enjoyment.
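KHI has not described how such settings updates would actually be computed. As a rough sketch of the idea under stated assumptions, the snippet below maps a hypothetical rider profile (experience, skill and riding style) to a couple of illustrative machine settings; every field name, threshold and value here is invented for the example rather than taken from Kawasaki.

```python
# Illustrative sketch only: map an assumed rider profile to hypothetical
# machine settings. Field names, values and rules are invented for this
# example; Kawasaki has not published how its system derives settings.
from dataclasses import dataclass


@dataclass
class RiderProfile:
    experience_km: int     # total distance ridden, logged over time
    skill: str             # "novice", "intermediate" or "expert"
    style: str             # "relaxed" or "sporty"


@dataclass
class MachineSettings:
    throttle_map: str      # "soft", "standard" or "aggressive"
    traction_control: int  # intervention level, 1 (light) .. 3 (strong)


def suggest_settings(profile: RiderProfile) -> MachineSettings:
    """Start from conservative defaults and relax them as experience and skill grow."""
    settings = MachineSettings(throttle_map="soft", traction_control=3)

    if profile.skill == "expert" or profile.experience_km > 20_000:
        settings.traction_control = 1
        settings.throttle_map = "aggressive" if profile.style == "sporty" else "standard"
    elif profile.skill == "intermediate":
        settings.traction_control = 2
        settings.throttle_map = "standard"

    return settings


if __name__ == "__main__":
    rider = RiderProfile(experience_km=25_000, skill="expert", style="sporty")
    print(suggest_settings(rider))
    # MachineSettings(throttle_map='aggressive', traction_control=1)
```

A production system would presumably derive the profile from ride logs and the dialogue system rather than from hand-entered fields; the sketch only illustrates the profile-to-settings step.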
Continuing its pursuit of open innovation and collaboration, Honda will also announce initiatives at CES 2017 with startup companies and global brands aimed at creating a more productive and enjoyable mobility experience.
The 5 January 2017 Honda press conference will feature a keynote address from Yoshiyuki Matsumoto, President & CEO of Honda R&D Co., Ltd., who will unveil a concept motorcycle demonstrating an application of the company’s robotics technology.
More useless nauseating shizzle for the Starbucks crowd that sees everything through a computer terminal or PDA.
My gut hurts.
Posted by: Dr. Strange Love | 06 December 2016 at 07:48 AM
Vehicles equipped with artificial intelligence is one thing, but emotions? What happens when your car gets mad at you? Or depressed and suicidal? Or develops a crush on the BMW you park next to?
And yes, I am having a lark at this, so don't take it seriously.
Posted by: ai_vin | 07 December 2016 at 08:37 AM