
Human-Machine Interfaces Evolve in Cars


Switches and dials have been the norm for controlling things in cars, from the side mirrors to audio volume. But norms evolve. As automakers prepare for a world of shared self-driving cars, they’re experimenting with an array of human-machine interface technologies, or HMIs, including interior-facing cameras, gesture and voice controls, and touch-sensitive surfaces — all augmented by ever-smarter computing platforms.

Voice controls are en route to becoming the second most prevalent interface by 2022, when they’re forecast to be built into 80 percent of car HMIs, up from 48 percent in 2016, according to the consulting firm Frost & Sullivan. Data published last year in the firm’s Global Connected Car Market Outlook show touchscreens on top, with 90 percent market share by 2022, up from 29 percent in 2016. Multifunctional controllers (50 percent, from 16 percent), handwriting recognition (30 percent, from nine percent), digital instrument clusters (25 percent, from seven percent) and head-up displays, or HUDs (20 percent, from five percent), follow. Only gesture controls will remain relatively rare in four years, with just five percent HMI penetration worldwide, though still up tremendously from 0.02 percent in 2016, Frost & Sullivan predicts.

These interfaces are helping the driver “get more accustomed to newer technologies, so that the user acceptance is there before he or she is going to give over control to the car in autonomous mode,” says Niranjan Manohar, research manager for connected car and automotive IoT (Internet of Things) at Frost & Sullivan in Detroit.

In fact, the “Revolutionizing of HMI” is one of seven trends that will reshape the global auto industry by 2030, according to a June report titled Future Automotive Industry Structure — FAST 2030 from the consulting firm Oliver Wyman. The others include connected car tech, self-driving cars, e-mobility, digital industry, pay-per-use distribution channels and changing customer structures. HMI, though, is the glue that ties everything together, explains Juergen Reiner, a partner in the automotive practice of Oliver Wyman in Munich, Germany.

Transformation in the auto industry is founded in convergence, Reiner says. “Consumer devices, digital services, car manufacturing, mobility services and passenger transport all will merge into a new service category embedded in the connected life. HMI will provide the customer interface. The revolution will be in a new simplicity.”

Even so, HMI in cars is in a messy transition phase, Reiner says. “Cars have become overloaded and more complex with all these diverse HMI controls in parallel,” he adds.

Seeing Beyond the Screen

The seeds of today’s HMIs were planted when automakers replaced some buttons and switches with in-dash screens and menu choices as well as with voice commands, allowing cars to start learning what the passengers and drivers need, Reiner says.

Those thin-film transistor (TFT) LCDs account for about 75 percent of the screens in cars, says Frost & Sullivan’s Manohar, while most of the remainder are passive-matrix LCDs, or PMLCDs. A tiny percentage of in-car displays are now of the newest AMOLED (active-matrix organic light-emitting diode) variety, which is also found in high-end TVs and smartphones such as the Apple iPhone X, Manohar says.

Today there are only about 20,000 to 25,000 AMOLED screens in vehicles worldwide, he says. The Audi A8 uses AMOLED screens for its rear-seat entertainment system.

Many more luxury automakers will introduce AMOLED into their vehicles in 2020, Manohar says, because of the clarity it offers for infotainment features. He anticipates an AMOLED screen debuting in the dashboard of Mercedes-Benz’s E-Class sedan that year.

[Chart: Forecast penetration of HMI technologies in cars, 2016 versus 2022. Source: Frost & Sullivan, Global Connected Car Market Outlook]

Yet HMI advances are spreading beyond the screen. By 2025, Manohar reckons, there will be five to seven million “Level 4” self-driving cars on roads worldwide. Because those models will still require drivers to take back control under some circumstances, they are spurring the development of gaze-tracking technologies such as cabin-facing cameras and iris scanners.
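The logic such driver-monitoring systems embody can be pictured as a simple escalation loop: the longer a takeover request goes unanswered while the cameras see the driver’s gaze off the road, the more insistent the alerts become. The Python sketch below is purely illustrative; the thresholds, signal names and responses are assumptions for this article, not any automaker’s specification.

    # Illustrative only: a toy escalation policy for a takeover request,
    # motivated by the gaze-tracking discussion above. All thresholds and
    # responses are invented for illustration.

    def takeover_alert_level(seconds_since_request: float, eyes_on_road: bool) -> str:
        """Escalate alerts the longer a takeover request goes unanswered."""
        if eyes_on_road:
            return "visual prompt only"        # driver already attentive
        if seconds_since_request < 2:
            return "visual prompt + chime"
        if seconds_since_request < 5:
            return "loud alarm + seat vibration"
        return "begin minimal-risk maneuver"   # e.g., slow down and pull over

    for elapsed, eyes in [(1, True), (1, False), (3, False), (6, False)]:
        print(elapsed, eyes, "->", takeover_alert_level(elapsed, eyes))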

“Voice and gesture controls are another dimension to this,” says Masa Hasegawa, principal in the global automotive practice at Deloitte Consulting. “From an HMI standpoint, you’re going to see more of those types of technologies become real.” The popularity of Amazon’s Alexa and the consumer adoption of voice recognition, especially with vehicles being connected to the cloud, offer a hint at where this is headed, he suggests.

“Machine learning will become mainstream very quickly,” Hasegawa says. “I can see a situation where you no longer have to take your eyes off the road to push any buttons if questions [from the car] are coming at you and all you have to say is yes or no.”

To be sure, passengers and drivers “interact with the machine” differently, “and that’s where you’re going to see a wide array of what the [automaker] considers to be necessary HMI versus not,” he adds.

With Level 3 self-driving cars blossoming in the next few years, “moments of autonomy are on the increase, creating new opportunities for secondary and tertiary tasks, which could mean a dial-down on the importance of driver distraction,” says Tim Smith, global auto and mobility lead at USTWO, based in London, U.K. “The more autonomous it gets, the more the vehicle needs to communicate that it knows what it’s doing, to instill trust, that the vehicle is safe.”

The BYTON press conference at CES 2018

Smith points to newcomer EV brand BYTON — which debuted at CES 2018 — as an example of an HMI that interacts differently with the driver and the passengers in self-driving mode. “It’s more about the comfort of the journey” than the performance of the vehicle, he says. HMI is undergoing an identity change to be more experiential for everyone in the car, rather than informational for the driver alone.

To improve its own automotive HMI work, Smith notes, USTWO first created a prototype motorcycle HMI. Since driver distraction on a motorcycle can be deadly, the delivery and “contextualization” of information are even more critical for its HMI than for a car’s. “It’s an R&D project that we’re using to learn,” and USTWO will apply the lessons to an HMI it’s developing for a Level 3 self-driving car slated to go on sale in 2020, he says.

UX Meets IQ: The Car as Intelligent Assistant

Mercedes-Benz showcased a plethora of controllers in the new MBUX (Mercedes-Benz User Experience) HMI that also debuted at CES this year — including a touchscreen, a touchpad, steering wheel buttons and natural speech interaction that eschews preset commands.

“We wanted people to be able to easily learn the system,” says Alexander Hilliger von Thile, senior manager of UI and advanced engineering at Mercedes-Benz Research & Development North America Inc. in Sunnyvale, CA. “It should be a no-compromise system” — and that means “you talk to your car like you talk to some person, and the car figures it out.” So, for example, saying “Hey Mercedes, I’m cold” triggers MBUX to adjust the car’s cabin temperature.
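In miniature, that kind of natural-speech interaction amounts to recognizing an intent in free-form input and mapping it to a vehicle action. The Python sketch below shows the general shape of the idea only; the wake-phrase check, phrase matching and temperature step are invented for illustration, and a production system like MBUX would use a natural-language model rather than string matching.

    # Illustrative sketch, not Mercedes-Benz code: map a free-form request
    # after a wake phrase to a climate-control action.

    WAKE_PHRASE = "hey mercedes"
    TEMP_STEP_C = 1.5  # hypothetical setpoint change per request

    def interpret(utterance: str, setpoint_c: float) -> tuple[str, float]:
        """Return a human-readable action and the new temperature setpoint."""
        text = utterance.lower()
        if not text.startswith(WAKE_PHRASE):
            return ("ignored: no wake phrase", setpoint_c)
        command = text[len(WAKE_PHRASE):].strip(" ,")
        # A real system would run a language model here; this sketch just
        # matches a couple of comfort-related phrasings.
        if any(p in command for p in ("i'm cold", "im cold", "too cold")):
            return ("raise cabin temperature", setpoint_c + TEMP_STEP_C)
        if any(p in command for p in ("i'm hot", "im hot", "too warm")):
            return ("lower cabin temperature", setpoint_c - TEMP_STEP_C)
        return ("not understood", setpoint_c)

    print(interpret("Hey Mercedes, I'm cold", 21.0))
    # -> ('raise cabin temperature', 22.5)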

Powerful graphics processors (GPUs) from NVIDIA produce fluid animations and make it easy to navigate among the on-screen menus and controls with no latency, “like you would see on a tablet or a phone,” Hilliger von Thile says. For instance, touching the headlights on a real-time 3D-rendered image of the car brings up all the related control options and visually demonstrates their effects (such as having the headlights move in sync with the steering wheel).
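Under the hood, that interaction pattern is essentially a mapping from touch regions on the rendered car to their associated controls. As a purely hypothetical Python illustration (the regions and options below are invented, not the actual MBUX interface model):

    # Hypothetical mapping from a touched part of the 3D car image to the
    # control options the screen would surface; contents are invented.

    TOUCH_REGIONS = {
        "headlights": ["on/off", "auto high beam", "adaptive cornering light"],
        "sunroof": ["open", "close", "tilt"],
    }

    def options_for_touch(region: str) -> list[str]:
        """Return the control options tied to a touched region, if any."""
        return TOUCH_REGIONS.get(region, [])

    print(options_for_touch("headlights"))
    # -> ['on/off', 'auto high beam', 'adaptive cornering light']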

Artificial intelligence (AI) and machine learning also enable MBUX to predict and proactively do what the driver is most likely to want based on past behavior — such as automatically tuning the radio to different favorite stations in the morning and the evening. “It’s like your companion in this regard,” Hilliger von Thile asserts. But unlike cloud-based assistants such as Apple’s Siri or Amazon’s Alexa, MBUX doesn’t send any data to a cloud server; all the user’s data resides in the car, he adds.
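The radio example hints at how simple such behavior-based prediction can be at its core: log past choices, bucket them by context (here, time of day) and surface the most frequent one. The Python sketch below is a bare-bones illustration under that assumption, not MBUX’s actual learning logic.

    # Hypothetical sketch of behavior-based prediction: pick the station a
    # driver most often chose in a given part of the day.

    from collections import Counter, defaultdict

    def day_part(hour: int) -> str:
        return "morning" if 5 <= hour < 12 else "evening" if 17 <= hour < 23 else "other"

    # (hour, station) pairs standing in for logged listening history
    history = [(7, "news"), (8, "news"), (7, "jazz"), (18, "rock"), (19, "rock")]

    counts = defaultdict(Counter)
    for hour, station in history:
        counts[day_part(hour)][station] += 1

    def predict_station(hour: int):
        """Most frequent past choice for this part of the day, if any."""
        bucket = counts[day_part(hour)]
        return bucket.most_common(1)[0][0] if bucket else None

    print(predict_station(8))   # -> news
    print(predict_station(18))  # -> rock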

MBUX launched this spring in Mercedes-Benz’s entry-level A-Class cars. Hilliger von Thile stresses that Mercedes-Benz is striving for consistency with the MBUX HMI over time: even as cars radically change from one generation to the next, MBUX will retain the same basic interface across versions and car lines. “We don’t want you to have to relearn things across vehicle generations or between updates. There should be a consistent base layer, and this is what we wanted to introduce with MBUX,” he says.

Meanwhile, Volvo has developed a different concept for its latest HMI, which prioritizes information by importance into “Now, in a While, and Whenever” areas, explains Malin Labecker, director of the user experience department at Volvo Cars in Gothenburg, Sweden. “Now” information appears in the head-up display projected onto the windshield in front of the driver; “In a While” information is relegated to the instrument cluster behind the steering wheel; “Whenever” information resides in the large touchscreen on the center stack. And going forward, this will evolve in the same direction as consumer electronics, towards smart devices that “know what you want,” she says.
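Volvo’s triage scheme is, in effect, a routing rule: each piece of information carries an urgency tier, and the tier determines which display surface presents it. A minimal Python sketch of that idea follows; the tier names come from the article, while the message examples and mapping details are illustrative assumptions, not Volvo’s code.

    # Rough sketch of "Now / In a While / Whenever" triage: route a message
    # to the display surface matching its urgency tier. Examples invented.

    DISPLAY_FOR_TIER = {
        "now": "head-up display",           # immediate: projected on the windshield
        "in_a_while": "instrument cluster", # soon relevant: behind the steering wheel
        "whenever": "center touchscreen",   # on demand: center stack
    }

    def route(message: str, tier: str) -> str:
        """Prefix a message with the surface that should present it."""
        return f"[{DISPLAY_FOR_TIER[tier]}] {message}"

    print(route("Collision warning: brake now", "now"))
    print(route("Turn left in 800 m", "in_a_while"))
    print(route("New podcast episode available", "whenever"))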

In 2017, Volvo unveiled a partnership with Google to co-develop the automaker’s next-generation infotainment and connectivity features based on the Android operating system, with the aim of offering more user personalization plus connected and predictive services in and around the car. In May, the companies announced that these features will include the embedded Google Assistant, the Google Play Store and Google Maps.

In essence, the car has evolved from being merely a means of transportation into being a connected device like a smartphone or tablet, Labecker says. But “just looking at it as a device is not enough. We have to bring in the full ecosystem,” she adds. For the automotive industry, Labecker says, “This is our biggest challenge — to keep up with the technology around us.”

Robert E. Calem
