I recently had the opportunity to Uber home in a Tesla. Being a Tesla fanboy, I was ecstatic when the Model X pulled up and welcomed me in through its falcon-wing doors. The driver started giving me the rundown of the car’s features while tapping on its 17-inch infotainment system. This was especially exciting for me as I was in the midst of an auto-related project at work.
While I was excited to learn more about the car, it slowly became apparent to me that the driver’s eyes were more glued to the screen than the road. Something about interacting with a touchscreen when driving made me curious to know: just how distracting are they?
To learn more about driver behaviour around these interfaces, my team and I invited 21 participants to use a driving simulator we assembled on-site, so we could analyze the cognitive stress that touchscreens placed on them as they drove.
It should come as no surprise that interacting with a touchscreen requires more hand-eye coordination than traditional buttons and dials do. Because a touchscreen offers no tactile feedback, we are more likely to need our eyes to see where we are pressing.
What came as a shock to us, however, was that even when our participants weren’t performing tasks associated with the touchscreen, their eyes were still drawn away from the road and towards the screen. They would routinely glance over to see if there was anything new to look at. This revelation was all the more surprising because screens have been in cars for as far back as 1986! (Ever heard of the Buick Riviera?)
When you think about it, though, this behaviour sort of makes sense. Rapidly glancing is something we do every day with our phones, whether to check for new notifications or even just to kill time. Screens in general, whether or not they yield to touch, are a powerful draw for humans, distracting us from whatever it is we might be doing. Thinking, talking… even driving.
To really make sure that this was the case, we explored how our participants felt when the screen was turned off. You can guess how they responded…
“Now that there’s nothing to look at … my eyes can stay on the road… It allows me to be more focused.”
“Seeing the interface isn’t as important as being able to control it.”
Okay, so get rid of all the touchscreens, right? If only it were so simple. Even though our participants generally felt more attentive and safe, some of them also didn’t like that the information the screen displayed was no longer available to them…
“To be honest, I like having the map open at all times when I’m driving… so if this was able to operate in the background to get traffic updates, that’d be cool.”
In fact, some participants voiced how having a screen to help them get from point A to point B felt like an integral part of driving, something that buttons and dials couldn’t provide on their own.
If screens are distracting yet at the same time a source of valuable information, what options do we have? How might we deliver that valuable information to our drivers without the use of screens?
An interface with increasing popularity and promise, voice immediately struck us as a viable candidate. A non-visual and non-tactile medium, voice assistants could provide drivers with valuable information while letting them keep their eyes on the road and both hands on the wheel.
Our hunch about voice was supported by the industry too. At this year’s CES conference, multiple automotive companies came out to show off their new in-car voice assistant systems.
However! Despite all our optimism, when it came down to actually validating our hunch, voice wasn’t the hero we were looking for. For starters, it was never our participants’ first choice: using voice to control temperature or find something to play felt laborious, and even when we reminded them that voice was an option, they still chose to use the centre console by hand.
It turns out that, as with touchscreens, using voice puts a lot of cognitive stress on the user. From thinking of what to ask for, to how to word it, to actually asking for it, participants found it much quicker and easier to just reach over and press a couple of buttons. Studies show a huge drop-off rate in voice app adoption, owing in part to the effort involved. A driver’s eyes and hands may be free to focus on driving, but their mind is preoccupied with the process of forming utterances.
“I was trying to concentrate on driving, but also trying to think of what to play next. But I didn’t feel comfortable doing it. It took me away from the focus of driving.”
The limited usefulness of voice in most cases also comes down to physical space. Using voice at home is convenient because the user is not always within arm’s reach of the thermostat or TV remote. In an automotive environment, however, the driver is always able to reach the centre console by hand.
Voice is simply not efficient enough to be the primary method of input in a car, and should only be used as a complement to the core experience. In particular, we noticed that voice works best when performing tasks that would take multiple steps to get to by hand, such as finding a particular song to listen to, switching playlists, calling a friend, or asking for directions. Simpler actions like turning something on and off, however, are best reserved for good old buttons and dials. The fact that Tesla is moving their autopilot controls from the touchscreen to buttons on the steering wheel is good evidence of this insight in action.
As more and more products gain connectivity, a lot of people are starting to look at cars as effectively smartphones on wheels. Given that viewpoint, it’s only natural to want to replace the car’s traditional centre console with a giant touchscreen. Heck, it’s exactly what happened when the world moved from BlackBerries to iPhones.
But if there’s anything I learned from this research, it’s that few environments are as demanding as driving a car. As a result, designing for the automotive use case can’t be approached with the same lens as designing for smartphone use cases. For each task a driver has to perform, we need to be especially mindful of which input method minimizes cognitive load the most. In my eyes, driving a vehicle that relies primarily on a touchscreen is no different from driving a vehicle with low safety ratings.
I admit, the minimalistic look that a giant touchscreen gives is so elegant that it’s hard to resist. But aesthetics should never trump usability, especially for products that could put people’s lives in danger. As excited as I am for the future to feel like Knight Rider, our vehicles are not ready for that future just yet.
If you ask me, Acura’s new Absolute Positioning system is a step in the right direction. The screen is positioned much closer to the driver’s line of sight than most displays, and the touch pad below uses locked-in positional mapping rather than a free-ranging cursor, so the need to look at the screen is kept to a minimum.
…and when vehicles achieve level 5 autonomy, then we can just kick back and enjoy the ride.
Jacky is a Product Designer at Connected, a software product development firm that works with clients to deliver products that drive impact.
Originally posted on UX Collective