Over the past decade or so, cars have become pretty complicated machines, often with complex user interfaces. Mostly, the industry has added touch to the near-ubiquitous infotainment screen—it makes manufacturing simpler and cheaper and UI design more flexible, even if there’s plenty of evidence that touchscreen interfaces increase driver distraction.
But as I’ve been discovering in several new cars recently, there may be a better way to tell our cars what to do—literally telling them what to do, out loud. After years of being, frankly, quite rubbish, voice control in cars has finally gotten really good. In some makes, at least. Imagine it: a car that understands your accent, lets you interrupt its prompts, and actually does what you ask rather than spitting back a “Sorry, Dave, I can’t do that.”
You don’t actually have to imagine it if you’ve used a recent BMW with iDrive 8 or a Mercedes-Benz with MBUX—admittedly, a rather small sample population. In these cars, some of which are also pretty decent EVs, you really can dispense with poking the touchscreen for most functions while you’re driving.
You can be general with your commands—if you tell the car “I’m cold,” it will bump up the cabin temperature, for example. Or you can be specific—telling the car to “set the front temperature to 75 degrees” or “turn on the seat heater to level 2” is, to me at least, a lot easier than remembering which segment of a touchscreen I’m supposed to poke.
The voice recognition is even good enough to understand me when I tell it to navigate to a specific address, to the point that I actually use the native navigation systems if I’m driving a modern BMW or Mercedes rather than relying on CarPlay like everyone else. Having passengers in the car doesn’t pose many problems, either—something you can’t say about BMW’s gesture control when the front-seat passenger talks with their hands.
Some of that credit should probably be directed at Cerence, which supplies (among other things) the voice assistant to both BMW and Mercedes (as well as BYD, Renault, VinFast, and others, Cerence told Ars). Because much of the software runs locally in the car, it has access to functions that cars using Google’s Android Automotive OS don’t. What’s more, Google’s once-heralded voice assistant feels like it has gotten worse at understanding speech over the past 12 months or so, for reasons I have yet to fathom.
My enthusiasm for talking to cars appears to put me in a minority. Despite a generation of nerds growing up with the adventures of KITT and Michael Knight, it seems like no one else wants to talk to their cars. Some of that is an exposure problem—as mentioned earlier, good voice control systems are not widely distributed yet.
But even among my colleagues who test the same cars for other outlets, I’m mostly greeted with skepticism when I praise good voice interfaces.
A 5,000-lb car is not the same as a smartphone
“I think part of it is just that there’s something inherently social about language. For thousands of years it has developed as an inherently social system. So I think there is something in human beings that is hesitant to talk to something that is not another sentient being,” said Betty Birner, a professor of linguistics and cognitive science at Northern Illinois University.
“We’ll talk to our dogs, but we may not want to talk to our toaster. So I think that’s a little part of it. That we use language to communicate and we have a notion of what communication means, and it means another mind. Right? My mind in communication with yours,” she told me.
“The other thing I mean, the obvious thing, is your car can kill you. Your toaster—I mean it could kill you, but you’ve really got to work at it. With a car there’s a real danger, so you need to really, really trust the artificial intelligence there, and I think people don’t understand how far artificial intelligence and natural language processing have gotten, and they’re not going to trust it with their life. Which, you know, is understandable,” Birner said.