Your Relationship with Your Car May Soon Be a Two-Way Street

January 30, 2017, Michigan – With uncanny accuracy, your car may soon have the ability to learn your mood, needs and wants simply from the sound of your voice.

“We’re well on the road to developing the empathetic car, which might tell you a joke to cheer you up, offer advice when you need it, remind you of birthdays and keep you alert on a long drive,” said Fatima Vital, senior director of Marketing Automotive at Nuance Communications, which helped Ford develop voice recognition for the SYNC in-car connectivity system.

It’s anticipated that by 2022, nearly 90 percent of all new cars will offer voice recognition capability, and 75 percent will offer cloud-based voice control.* Plus, with advanced microphones and in-car camera systems that pick up tiny changes in facial expression, and modulations and inflections in our speaking voice, future systems could evolve into personal assistants. They will be able to shuffle our appointments, order takeout when we’re held up in traffic, learn which songs we like to hear when we’re stressed, or even whether we’d prefer to enjoy silence.

Ford is currently conducting a research project with RWTH Aachen University that’s using multiple microphones to improve speech processing and reduce the effect of external noise and potential disruptions.

Nuance says that within the next two years, voice control systems could prompt us with: “Would you like to order flowers for your mom for Mother’s Day?”, “Shall I choose a less congested but slower route home?” and “You’re running low on your favorite chocolate and your favorite store has some in stock. Want to stop by and pick some up?”

By accessing cloud-based resources, future systems could also enable more drivers to speak their native language.

“Voice commands like ‘I’m hungry’ to find a restaurant and ‘I need coffee’ have already brought SYNC 3 into personal assistant territory,” said Mareike Sauer, voice control engineer on the Connectivity Application Team at Ford of Europe. “For the next step, drivers will not only be able to use their native tongue, spoken in their own accent, but also use their own wording, for more natural speech.”

Today at Ford, our in-car connectivity system SYNC 3 already enables voice control through Apple CarPlay™ and Android Auto™.

Apple CarPlay™ provides a simplified way to use the iPhone interface on a car’s touch screen, giving users access to Siri Eyes-Free voice controls, as well as Apple Maps, Apple Music, Phone, Messages, and a variety of third-party apps. Android Auto™ delivers Google Maps and music to a car’s screen while enabling voice controls for phone calls and messaging.

Soon, SYNC 3 will enable drivers to connect to Amazon’s virtual assistant Alexa in 23 different languages, with the ability to discern varying local accents.

“Lots of people already love their cars, but with new in-car systems that learn and adapt, we can expect some seriously strong relationships to form,” said Dominic Watt, senior lecturer in the Department of Language and Linguistic Science at the University of York. “The car will soon be our assistant, travel companion and sympathetic ear, and you’ll be able to discuss everything and ask anything, to the point many of us might forget we’re even talking to a machine.”