Nissan believes robots will have to learn to second-guess

Machines will have to learn how to interact with humans on the road if an autonomous driving future is to become a near-term reality.

That’s because driverless car technology will have to join the community of road users who are constantly assessing each other’s actions and perceived intentions, said Melissa Cefkin, a design anthropologist working with Nissan on its autonomous driving projects.

Speaking exclusively to TU-Automotive after the Nissan Futures event in Barcelona, Cefkin said: “So we have been looking at what happens more generally among road users, how do they identify and establish expectations and negotiate when there is misunderstanding to proceed together down the road?”

Her research shows that road relationships between users are paramount to allow the free movement of traffic and reduce the potential hazards that exist in every car manoeuvre. Cefkin explained: “On the road, where people negotiate about who goes first, it’s an interaction between at least two parties while, very often, there are other parties that are informing what happens. For example, I’m letting you go first because I see I’m about to be blocked by a slower moving pedestrian. A lot of this is happening in the cars too, with a certain amount of what you’d call ‘social driving’ where the cue for the driver to pay attention is when someone else in the car stops talking, it’s ‘oh, they must have seen something I must pay attention to’.

“We are looking at how you can take advantage of the multiple parties that are engaged. So things like braking or taking back control of the car do not all have to be vested in a single agent. These are the ideas we are playing with to form different approaches to this problem.”

As a professional anthropologist, Cefkin’s work has been focused on HMI outside of the car cabin environment. She said: “This is a comparatively new area of research for the industry when we’re thinking about autonomous vehicles. Once you have these vehicles on the road, where you do not have a human driver involved, there are a lot of ways in which other road users have counted on the fact that there is a knowledgeable, thinking human behind the wheel, able to interpret what each other is doing to the extent of making eye contact and using hand signals and things like that.”

These interactions are clearly displayed in a driver’s lane discipline on highways and at every busy intersection. Cefkin said: “We are asking what do people do today and how do those situations play out in a collective manner? As anthropologists and sociologists, we are looking at the world through collective eyes and not just individual cognitive or psychological views. A lot of what we do is not just ‘what I am doing and seeing’ but how it interacts with what others are doing in that situation. We are trying to discern what might be the typical patterns of interactions in these situations and what rules and signals people are often tacitly acknowledging to make their judgements.

“In the example of an outside lane hog, there are two options: you hold back and let them take their decision based on the perceptions that they may be elderly or a stranger to the area; alternatively, there’s the option of undertaking and going around the car in front.

“Practices and norms occur in different contexts and this is one of the insights we, as anthropologists and sociologists, bring to the situation in recognising these different patterns and responses. Therefore, what you might do in one country may play out differently in another, and what you do on a busy city road might be different to what you might do in a small sleepy beach town.

“If we can begin to establish some of the parameters that go into forming these patterns, we are hoping this will let us ‘seed’ the decisions an autonomous vehicle will make by giving them these patterns and knowing what data to use to train them how to see the world.”

Cefkin said Nissan does not want to wait until a car’s connectivity with infrastructure is in place before fielding a driverless car in the market. She explained: “Right now we can identify things like moving objects, but the car will see a box of activity; what it doesn’t tell us is that it’s a small boy chasing a ball. We’re taking the stance that we don’t want to wait for all the infrastructure and connected capability to be in place before we know we’re safe. We believe that none of us today when we drive have that kind of information and, while it’s not as safe as we’d like it, it’s not that bad either and we have managed our lives without getting too caught up in the problems that occur.

“We think there is a lot to work on so that we don’t have to rely on that complete connectivity.”

— Paul Myles is a seasoned automotive journalist based in London. Follow him on Twitter @Paulmyles_
