Autoliv Inc. will reveal its Learning Intelligent Vehicle (LIV) 2.0, a research platform for real human/machine interaction that the company says will help shape consumer acceptance of autonomous vehicles, at CES 2018 in Las Vegas.
LIV 2.0 is a “5th occupant in the vehicle,” capable of sensing driver and passenger instructions and moods through an array of sensors that track sound and hand movements, and of communicating with its fellow human occupants. The company says LIV 2.0 is not a friendly voice overlay on a machine – it is the vehicle itself, equipped to sense and interact in various ways with drivers and passengers, while learning from those experiences.
“You do not use your car like a smartphone, but rather, you put your physical safety in your vehicle’s hands, so shared control and two-way trust between human and machine are crucial for the development and adoption of autonomous vehicles,” said Ola Boström, vice president of research at Autoliv.
At LIV’s core are deep learning algorithms that enable effective communication: the system senses driver gaze, emotion, cognitive load, drowsiness, hand position, and posture, then fuses this information with data on the external environment to yield driving experiences that are not only safer, but feel that way too.
The LIV research platform will use the latest technologies developed by Autoliv to answer questions, such as: Under what unique circumstances will a driver need to assert control? Do technically accurate functions like deceleration or turning need to be modified to address passenger concerns? How should a smart car interact with people who are occupied with entertainment or other mobility experiences?
“Consumer reaction to this evolutionary change in how they operate or ride in vehicles remains an open question,” added Boström. “LIV will help innovate ways to make cars equally as intelligent about what’s going on in the interior of the cabin as they are about the road outside.”