Toyota wants to get us truly crushing on cars

Toyota is very invested in love. The automaker has a central philosophy of creating vehicles that inspire ‘aisha,’ a term that literally means “beloved car” in Japanese. But the nature of ‘aisha’ is changing, necessarily, just as the nature of automobiles themselves is changing as we usher in automated and semi-autonomous driving.

The key to making ‘aisha’ work in this new era, Toyota believes, lies in using artificial intelligence to expand its definition, and to transform vehicles from something people merely think about and are passionate about into something they can actually bond with, and come to think of as a partner.

To create a bond between a person and a car that’s more than just skin (or clearcoat) deep, Toyota believes the key is a combination of learning about and understanding drivers, automated driving, and an AI agent that’s more friend than virtual assistant. That’s why it created ‘Yui,’ the digital copilot built into all of its Concept-i vehicles, including the Walk and Ride, both of which debuted at this week’s 2017 Tokyo Motor Show.

Toyota’s using deep learning to make this work, analyzing user attentiveness and emotional states based on observed body language, tone of voice and other forms of expression. It’s also mining user preferences based on signals drawn from social networks including Facebook and Twitter, as well as location data from GPS and previous trips.

The goal is to combine this information to help its Yui assistant anticipate the needs of a driver, ensure their safety, and maximize their happiness with routes and destinations that suit their mood and preferences.

Using technology created by partner SRI International, Toyota is doing this by assessing a driver’s emotional state and classifying it as neutral, happy, irritated, nervous or tired. Based on which of these states it detects in the driver, it’ll offer different courses of action or destination recommendations, and it can evaluate the driver’s reaction, even doing things like spotting momentary lapses in put-on emotional facades, such as feigned happiness.
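As a rough illustration, the classify-then-respond logic described above could be sketched as a simple lookup. The five state names come from the article; everything else here (the actions, the function, the API shape) is a hypothetical sketch, not Toyota’s or SRI’s actual implementation:

```python
# Hypothetical sketch: mapping a detected emotional state to a suggested
# intervention. State names follow the article; actions are illustrative.

# The five states the SRI-based classifier reportedly distinguishes.
STATES = {"neutral", "happy", "irritated", "nervous", "tired"}

# Entirely hypothetical state-to-action policy.
ACTIONS = {
    "neutral":   "suggest a destination matching known preferences",
    "happy":     "offer a scenic route to extend the good mood",
    "irritated": "soften cabin lighting and propose calming music",
    "nervous":   "enable additional driver-assistance support",
    "tired":     "recommend a rest stop and increase alertness cues",
}

def suggest_action(detected_state: str) -> str:
    """Return a recommended course of action for the detected state."""
    if detected_state not in STATES:
        raise ValueError(f"unknown state: {detected_state}")
    return ACTIONS[detected_state]
```

A real system would of course evaluate the driver’s reaction and adapt, rather than apply a fixed table like this one.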

Yui will offer up different options to try to guide a driver toward a preferred state, and it may employ various types of feedback to help bring this about, including sight (cabin lights, for instance), sound (piped through the vehicle’s stereo), touch (warmth via the steering wheel, perhaps) and even smell, using scent emitters.

This isn’t just about waking up a sleepy driver if they’re in danger of nodding off, though it can do that, too. Toyota wants its agent to be able to combine information gathered about a user from social sources with emotion recognition, to suggest topics for conversation and enter into free-form discussions with the user in a distraction-free manner, all with the end goal of building a connection between user and car.

A car is largely a symbol: up until now, it’s often been a conduit to freedom, and a means to the end of escape, of exploration, or of getting where you need to go under your own power. In the future, it’s bound to become something different once autonomous vehicles are readily available.

Dealing with virtual assistants today can often be a source of frustration (ahem, Siri), but Toyota thinks they’ll one day be the key to unlocking a new kind of bond between human and machine: the carmaker believes that the best way to keep us loving our cars in this future is to make it seem like they love us back.

Disclaimer: Toyota supplied accommodations and travel for this visit to the Tokyo Motor Show.
