June 27, 2017

7 things that keep Elon Musk awake at night

Tesla is taking the lead in making autonomous driving available to everybody. Google is currently testing a prototype that drives fully autonomously. Uber, Lyft and newcomers such as Faraday Future and Rimac are also focusing their propositions on achieving a similar outcome.

Everybody is talking about it: autonomous cars! After two years of thorough research, designing, prototyping and testing, we have formed an opinion of our own on this topic. In this article, I will share what we believe are the success criteria for future autonomous driving.

During this process, in the summer of 2016, Bas Bruining and I joined leading global automotive and technology companies at the CAR HMI conference in Berlin. There we had the chance to listen to, and take part in, discussions to understand where the industry is heading. Talks were given by Audi, BMW, Tata, Ford, Jaguar Land Rover, PSA (Peugeot Citroën), Opel, Bosch, Valeo, Continental and Volkswagen. Tech brands such as Google (Android Auto) also shared their insights.

From our research we distilled 7 major themes (or success criteria). They are:

1. Mode & Automation Confusion
2. Building Trust
3. Outside HMI (Human Machine Interface)
4. Take Over Sequence
5. Redefined UI paradigms
6. Direct Augmented Reality
7. Motion Sickness

1. Mode & Automation confusion - Driving vs supervising

Everybody knows it is going to happen. Cars will become autonomous. But first they will be SEMI-autonomous: fully capable of driving themselves, but only on a dedicated trajectory or in uncluttered traffic situations. In this "transition period" the driver will have to remain proactive and is expected to intervene when a situation demands it.

A car working independently from the driver may feel awkward at first, but will probably become a trustworthy companion as systems demonstrate very smooth piloting. When my car becomes this trustworthy companion, I will increasingly tend to believe it is always right, leading to disbelief when it is actually not, and ultimately to a failure to act.

Besides false confidence, a lack of action can also be induced by confusion over the level of automation the car is currently providing. Today's cars already have different levels of automation (e.g. lane keeping and adaptive cruise control) and it is expected that this complexity will increase significantly in the near future. Here we can learn from an industry that has been familiar with the concept of autopiloting for years: aviation. Airplanes offer various kinds of autopilot modes, and unfortunately accidents have happened due to confusion about which mode was active. Thus, different automation modes must be clearly identifiable to the driver, and it is key to avoid subtlety.

Different automation modes must appear clearly different to the driver. Avoid subtleness.
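The "avoid subtleness" rule can be made concrete: every automation mode should get its own unmistakable set of HMI cues, and the design can be checked for accidental overlap. A minimal sketch, where the mode names, colours, icons and chimes are all illustrative assumptions rather than any real vehicle's HMI:

```python
from enum import Enum

class Mode(Enum):
    MANUAL = "manual"
    LANE_KEEPING = "lane_keeping"
    ADAPTIVE_CRUISE = "adaptive_cruise"
    FULL_AUTO = "full_auto"

# Hypothetical HMI cue per mode: (accent colour, cluster icon, chime).
# The point is that no two modes share a colour or an icon, so a glance
# (or a sound) is enough to identify the active level of automation.
MODE_CUES = {
    Mode.MANUAL:          ("white",  "steering_wheel", None),
    Mode.LANE_KEEPING:    ("green",  "lane_lines",     "single_tone"),
    Mode.ADAPTIVE_CRUISE: ("blue",   "radar_car",      "double_tone"),
    Mode.FULL_AUTO:       ("purple", "pilot",          "triple_tone"),
}

def cues_are_distinct(cues):
    """Return True if every mode has a unique colour AND a unique icon."""
    colours = [c[0] for c in cues.values()]
    icons = [c[1] for c in cues.values()]
    return len(set(colours)) == len(colours) and len(set(icons)) == len(icons)
```

Running `cues_are_distinct(MODE_CUES)` as a design-time check catches exactly the kind of subtle overlap (two modes sharing a colour, for instance) that causes mode confusion.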

2. Trust building - Does the car really see the danger?

Ask just about anybody to take their hands off the steering wheel whilst doing 120 km/h on a busy commute. Like me, they will be very reluctant to do this; it goes against every instinct of self-preservation. It will take time for us to adapt and eventually trust the technology that will be driving us. The cars of the future need to earn this trust by being transparent. Tesla does a good job by showing what the car 'sees' and thus explaining why it is performing the maneuvers that keep us safe.

Show what the car 'sees'. Tell why it is doing what it is doing

3. Outside HMI - Communicating with other road users

Many challenges arise when the car is driving autonomously. The highway is the least challenging environment: there are few variables, and direction and behavior are quite predictable. But once we drive in a more populated area with other road users, some interactions go missing. A pedestrian intending to cross the road will seek eye contact with me as I approach, to be sure they are seen. My non-verbal communication tells them to either cross the road or wait. When my car is driving autonomously, I will not be engaged and will (probably) not look at the pedestrian. The car itself needs to communicate with the pedestrian here. It is time that, in addition to the indicator lights, we get more HMI solutions on the car's exterior. This might even be a great opportunity to strengthen the identity of a car brand: does the car facilitate traffic, or take priority when it can? And how assertive should it be?

Outside HMI solutions are needed to communicate with other road users.

4. Take over sequence - Automation takes focus away

In a semi-autonomous car, the driver needs to both hand over and take back control of the vehicle.

You want to be really sure about this when driving at more than 100 km/h. Imagine handing over control: how do you know that your vehicle is not going to hit that tree you are approaching? Tesla's push-button input and blue icon feedback on the instrument cluster seem a bit poor given the huge consequences involved.

But then, taking over control again. I was just reading my messages on my iPhone… How do I know where I am, where I need to go and what is around me? In more sophisticated terms this is called 'situational awareness', and you lack it after doing other things. So we need to design a sequence of steps to prepare the driver for driving again.

As a system becomes more reliable, take-over performance gets worse: our confidence in the system grows, we tend to do other things more, and we need to drive less ourselves. This will definitely have an impact on our driving capabilities.

Drivers need to be prepared through a sequence of steps for driving again.

5. Redefined UI paradigms - Bias towards actions

So we are driving ourselves again. Ping! I get a message. What to do? I really want to read it and reply! I also want to raise the temperature a bit and skip the song currently playing on my infotainment system. All this and more whilst keeping my eyes on the road!

Everybody can imagine that using a touch screen (Tesla!) requires too much eye-hand coordination in this non-static environment. Sometimes a big rotary button simply fits the context better.

So instead of applying tablet paradigms and giving the user many options, we need to be smarter: use opt-outs and suggest the most likely actions, rather than making the user do all the UI navigation themselves. Make those suggestions really accessible. We have to keep the number of interactions as low as possible.

On infotainment systems, bias towards actions and keep the number of interactions as low as possible.
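The "bias towards actions" idea can be sketched as a suggestion-with-opt-out mapping: for each event, the system proposes the single most likely action, and the driver accepts or dismisses it with exactly one input instead of navigating a menu. The event names and actions here are illustrative assumptions:

```python
# Hypothetical event -> most-likely-action mapping for an infotainment system.
SUGGESTIONS = {
    "incoming_message": "read_aloud_and_dictate_reply",
    "low_cabin_temp":   "raise_temperature_2_degrees",
    "disliked_song":    "skip_track",
}

def handle_event(event, accept):
    """One tap to accept the suggested action, one tap to dismiss it.

    Returns (action_performed_or_None, number_of_driver_interactions).
    """
    action = SUGGESTIONS.get(event)
    if action is None:
        return None, 0     # nothing to suggest, nothing to interact with
    interactions = 1       # a single confirm-or-dismiss input
    return (action if accept else None), interactions
```

The design point is the interaction count: whatever the driver decides, it costs one glance-free input, versus the many taps a tablet-style navigation flow would demand.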

6. Direct Augmented Reality - Introducing big Heads Up Displays

Speaking to suppliers at the conference and listening to their contributions, we can expect bigger HUDs (Head-Up Displays) in the future. This is a great opportunity to enrich the driving experience. Information on a HUD is more accessible, as it is closer to the point of focus and does not require the eyes to refocus at a different depth. We think that once manufacturers can build combiner HUDs with depth variation, a direct link between the real situation and driving assistance becomes possible. However, as the road is static while the car moves in all directions, there are immense technical challenges to overcome in the coming years.

With AR a direct link between real situation and driving assistance is possible.
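At its core, that direct link is a projection problem: a point on the real road has to be mapped to a position on the HUD's virtual image plane. A minimal pinhole-projection sketch follows; the focal distance is an arbitrary assumption, and the genuinely hard parts of a real AR HUD (compensating for vehicle pitch and roll, and for the driver's eye position, in real time) are deliberately omitted:

```python
# Map a point in the driver's eye frame (metres: x right, y up, z ahead)
# onto a virtual HUD image plane at distance `focal` via pinhole projection.
def project_to_hud(x_right, y_up, z_ahead, focal=1.2):
    if z_ahead <= 0:
        raise ValueError("point must be in front of the driver")
    return (focal * x_right / z_ahead, focal * y_up / z_ahead)
```

The perspective behavior this produces matches what the overlay needs: a lane edge 2 m to the right appears far from the display centre when it is 10 m away, and converges toward the centre at 40 m, so the painted guidance can hug the real road.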

7. Motion sickness - Visual vs physical experience

An effect that occurs often with virtual reality or simulators is that users experience forms of motion sickness. They become nauseous and/or dizzy. This is because the mind gets confused as it experiences motion visually but not physically.

The same effect occurs the other way around, when the body experiences motion physically but not visually. We already see this happening with passengers in a car: when they are reading a book, or at least not looking outside, they can get carsick. This is unpleasant but fine from a safety perspective, as they are not steering the vehicle. It becomes riskier when the driver gets this in autonomous driving mode and then has to take over; they will be less fit for the job.

The main way to solve this is to allow for an outside, forward view. We don't expect cars to become fully closed boxes. Informing the driver and passengers about the expected trajectory also has a considerable benefit. Even interior design can help prevent motion sickness: ambient displays can mimic the outside motion on the inside of the car.

Allow for an outside, forward view. Inform on the expected trajectory. Mimic the outside motion on the inside of the car.

So there it is. These are, we think, the seven themes that need to be solved from an HMI perspective to ensure smooth and safe autonomous driving. Although these success criteria are tailored to the future of driving cars, they can be applied to basically anything that has a cockpit and deals with some level of automation. You are probably not Elon Musk, but even if you are, please call us. We are happy to discuss opportunities for your business and see where and how automation can serve you. Our studio is always open to anyone interested in this field.

If you like this post and want to know more, drop me a line!

Would you like to hear more about this topic?

Get in touch with Pieter via email.
