Toyota is very invested in love. The automaker has a central philosophy of creating vehicles that inspire "aisha," a term that literally means "beloved car" in Japanese. But the nature of "aisha" is changing, necessarily, just as the nature of automobiles themselves is changing as we usher in autonomous and semi-autonomous driving.
The key to making "aisha" work in this new era, Toyota believes, is to use artificial intelligence to expand its definition, and to transform vehicles from something that people merely think about and are passionate about into something that people can actually bond with, and come to think of as a partner.
To create a bond between a person and a car that's more than just skin (or topcoat) deep, Toyota believes the key is learning about and understanding drivers, combined with automated driving and an AI agent that's more friend than virtual assistant. That's why it created "Yui," the digital copilot built into all of its Concept-i vehicles, including the Walk and the Ride, both of which debuted at this week's 2017 Tokyo Motor Show.
Toyota is using deep learning to make this work, analyzing user attentiveness and emotional state based on observed body language, tone of voice and other forms of expression. It's also mining user preferences from signals extracted from social networks, including Facebook and Twitter, as well as location data from GPS and previous trips.
The goal is to combine this information to help its Yui assistant anticipate the needs of a driver, ensure their safety, and maximize their happiness with routes and destinations that suit their mood and preferences.
Using technology created by partner SRI International, Toyota does this by assessing a driver's emotional state and classifying it as neutral, happy, irritated, nervous or tired. Based on which of these states it detects, it offers different courses of action or destination recommendations, and it can evaluate the driver's reaction, even doing things like spotting momentary lapses in a put-on emotional facade, such as feigned happiness.
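To make the idea concrete, the classify-then-recommend loop described above can be sketched in a few lines of Python. Everything here is illustrative: the state names come from the article, but the scores, suggestions and function names are invented stand-ins, since Toyota and SRI have not published their actual models or APIs.

```python
# Hypothetical sketch of mapping a detected emotional state to a
# suggested course of action. Scores stand in for the output of a
# real emotion-recognition model; the suggestions are invented.

EMOTIONAL_STATES = ["neutral", "happy", "irritated", "nervous", "tired"]

# Illustrative per-state recommendations.
SUGGESTIONS = {
    "neutral": "continue current route",
    "happy": "suggest a scenic detour",
    "irritated": "propose a calmer, low-traffic route",
    "nervous": "offer to engage assisted driving",
    "tired": "recommend a nearby rest stop",
}

def classify_state(scores):
    """Pick the most likely of the five states from per-class scores."""
    return max(EMOTIONAL_STATES, key=lambda s: scores.get(s, 0.0))

def suggest_action(scores):
    """Translate a classified state into a course of action."""
    return SUGGESTIONS[classify_state(scores)]

print(suggest_action({"tired": 0.7, "neutral": 0.2}))
# -> recommend a nearby rest stop
```

In a real system the scores would come from a trained model fusing camera, voice and other sensor inputs, and the reaction to each suggestion would feed back into the next classification.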
Yui will offer different options to try to guide a driver toward a preferred state, and it can employ various types of feedback to help bring this about, including sight (cabin lighting, for instance), sound (piped through the vehicle's stereo), touch (warmth via the steering wheel, perhaps) and even smell, using scent emitters.
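The four feedback channels mentioned, light, sound, touch and scent, amount to a per-state lookup of cabin settings. A minimal sketch, assuming entirely invented channel names and values (none of this reflects an actual Toyota interface):

```python
# Illustrative multi-modal feedback selection: each detected state maps
# to a (cabin_light, audio, steering_wheel, scent) plan. All values are
# hypothetical examples of the channels the article describes.

FEEDBACK_PLANS = {
    "tired":     ("bright cool white", "upbeat playlist", "cool grip", "citrus"),
    "irritated": ("soft warm amber",   "calm ambient",    "warm grip", "lavender"),
    "nervous":   ("steady low glow",   "quiet",           "warm grip", "cedar"),
}

def feedback_plan(state):
    """Return a per-channel plan, leaving the cabin unchanged for
    states with no defined intervention (e.g. neutral, happy)."""
    return FEEDBACK_PLANS.get(state, ("unchanged",) * 4)

light, audio, touch, scent = feedback_plan("tired")
print(light, audio, touch, scent)
```

The interesting design question, which the article hints at with "feigned happiness" detection, is closing the loop: reading the driver's reaction to a plan and adjusting it, rather than applying a fixed table like this one.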
This isn't just about waking up a sleepy driver if they're in danger of nodding off, though it can do that, too. Toyota wants its agent to be able to combine information gathered about a user through social sources with emotion recognition, to suggest topics for conversation and enter into free-form discussions with the user in a distraction-free way, all with the end goal of building a connection between user and car.
A car is largely a symbol: up until now, it's often been a conduit to freedom, a means of escape, of exploration, or of getting where you need to go under your own power. In the future, it's bound to become something different once autonomous vehicles are readily available.
Dealing with virtual assistants today can often be a source of frustration (ahem, Siri), but Toyota thinks they'll one day be the key to unlocking a new kind of bond between human and machine: the carmaker believes that the best way to keep us loving our cars in this future is to make it seem like they love us back.
Disclaimer: Toyota provided accommodations and travel for this trip to the Tokyo Motor Show.