AVs Will Have to ‘Read’ Pedestrian Body Language

Fatal car accidents claim more than a million lives globally each year, but few incidents have received as much attention as the self-driving Uber collision that occurred last spring.

The deadly accident sent shockwaves through the auto industry, with many of its players unsure how to respond. It also highlighted the risks of testing the technology on public roads. “Autonomous vehicles are extremely inefficient today because they only see a bounding box of a person, cyclist, car and the infrastructure, but not what people do,” said Maya Pindeus, co-founder and CEO of Humanising Autonomy. “Not the body language or the actions that people take.”

In the case of the Uber accident, those actions involved a woman walking a bike across a dark road. Some reports claim the vehicle did, in fact, see the woman, but a person pushing a bike at night is not a visual that AVs can easily interpret. Pindeus hopes her start-up can change that.

“Our software looks at the body language to determine what will happen next,” she explained. “The Uber accident is an exact use case we solve with what we do. We would have been able to understand the likelihood for her to be crossing the street with a focus on her body language. What is she doing at the moment? Okay, she was walking a bike. Was she alert or not? Was she looking at the car or not? With that we would have come up with a prediction, a ‘likelihood of crossing X’, and with this, the vehicle would be able to react accordingly.”
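In essence, Pindeus is describing a mapping from discrete body-language cues to a crossing probability. The sketch below is purely illustrative: the feature set and weights are invented for this example and are not Humanising Autonomy's actual model.

```python
from dataclasses import dataclass
import math

@dataclass
class PedestrianCues:
    """Body-language cues of the kind Pindeus describes (hypothetical feature set)."""
    walking_bike: bool        # pushing a bicycle across the road
    alert: bool               # appears attentive to surroundings
    looking_at_vehicle: bool  # has made eye contact with the car
    moving_toward_road: bool  # trajectory heads into the roadway

def crossing_likelihood(cues: PedestrianCues) -> float:
    """Toy logistic score for 'likelihood of crossing'; weights are made up."""
    score = -1.0  # baseline: most pedestrians are not about to cross
    score += 1.5 if cues.moving_toward_road else 0.0
    score += 1.0 if cues.walking_bike else 0.0
    # A pedestrian who hasn't seen the car, or isn't alert, is a higher risk.
    score += -0.5 if cues.looking_at_vehicle else 0.5
    score += -0.5 if cues.alert else 0.5
    return 1.0 / (1.0 + math.exp(-score))

# The Uber scenario: distracted, pushing a bike, heading into the road.
risk = crossing_likelihood(PedestrianCues(
    walking_bike=True, alert=False,
    looking_at_vehicle=False, moving_toward_road=True))
```

With a score like this, the planner can react proportionally: brake early above a high threshold, slow and yield in the middle band, and proceed otherwise.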

Pedestrian behavior varies throughout the world, so autonomous cars must be able to differentiate between locations and human behavior. “I think the cultural aspect, how widely it differs from city-to-city, is interesting,” said Pindeus. “If you take Mumbai, for example, people have a completely different crossing behavior. People will just cross the street and by not looking left or right, they’re indicating intent. If you do this in London, people would be there for a long time. Therefore, it’s very important that people make eye contact with their car and, through eye contact, there is a mutual understanding of the traffic involved.”

That mutual understanding can be difficult to attain if a pedestrian is completely distracted, which is why driverless cars must be able to identify and interpret even the subtlest of actions. For example, what is a car supposed to do if a pedestrian is staring at his or her phone while leaning forward? What happens if someone is merely thinking about crossing the street and then suddenly starts walking? These are just a few of the challenges.

“If the vehicle is able to understand that, then we have a very strong interaction for the vehicle to communicate in real-time with what’s happening,” said Pindeus. “It makes it safer but also more trustworthy.”

Alcohol presents an issue as well, one that has nothing to do with getting behind the wheel. While AVs may be able to prevent drunk driving, they can’t stop people from wandering aimlessly into the street. Jaywalking is yet another human behavior that autonomous cars must identify in order to navigate roads safely, and its frequency may differ by location; New York City residents, for example, might jaywalk more often than those who live in Detroit.

Pindeus isn’t sure how AVs will mindfully navigate these situations, but passenger expectations will remain high. As a result, these vehicles will be expected to anticipate any and all behaviors that a pedestrian can muster. That could prove too much for any machine to accomplish, let alone one that weighs 2,000 pounds.

“I think re-education will happen along the way,” she said. “I think it is very important that people don’t have to change their behavior completely because of new technology, but to get the technology to understand how people act. Of course, people will, over time, act differently as cars become autonomous but I do think there needs to be an evolution of that.”

That could lead to new regulations. “I don’t think it’s always about the pedestrian being abusive, I think it’s about vehicles being abusive,” Pindeus added. “We need a standard for how machines interact with humans, and how humans interact with machines. Autonomous vehicles are the biggest argument for this type of thing. Ideally, it needs to be globally regulated for how a vehicle indicates intent, what the vehicle needs to know or understand about pedestrians, cyclists and any road user in order to be safe to cross the road.”

Despite the challenges ahead, Pindeus remains optimistic about what can be achieved with driverless cars. “There are huge opportunities in understanding the specific needs of people,” she said. “For example, if someone has a disability or is in a wheelchair. Is someone elderly crossing the street or about to enter a vehicle? There are various types of challenges where technology can be used to make transportation more inclusive.”

