Visions of bot-friendly consumers

Making driverless cars consumer-ready, investigated by Siegfried Mortkowitz.

The development of autonomous vehicle technology has suffered further setbacks this year. First, a 49-year-old woman died after being struck by a driverless Uber automobile being tested on a street in Tempe, Arizona; the vehicle was in autonomous mode, with a safety technician on board. Less than a week later, a Tesla Model X crashed into a roadside barrier in California, killing the driver, while Autopilot was engaged. It was the second fatality in two years involving the Tesla Autopilot system.

Of the three fatal accidents involving cars equipped with self-driving technology, the Tempe crash, in which a pedestrian died on a public city street, has generated by far the most negative publicity and provoked the strongest public backlash. Although the investigations into the crashes are ongoing, it is not difficult to gauge their consequences for the development of the technology. According to Zac Doerzaph, director of the Centre for Advanced Automotive Research at the Virginia Tech Transportation Institute (VTTI), these accidents will, paradoxically, both quicken and slow the momentum of the research.

“In a way, it justifies the need for what we do,” he says. “So, I expect we’ll be doing a lot more research on how connectivity can improve automated systems and provide another layer of safety. In that sense, the pace of research may increase but if we’re talking about the pace of getting these systems widely deployed, it’s a reminder that we need to be careful and diligent and really take our time to ensure safety.”

For Harri Santamala, CEO of the Finnish start-up Sensible 4, fatal accidents involving driverless vehicles were just a matter of time and they provide a necessary pretext for stock-taking. “I think this was inevitable,” he says. “Now we need to adjust our deployment plans accordingly. This is a strong wake-up call to take a stand on what we are willing to accept and what we are not in terms of what we put out on the road. If it had to happen, it’s better that it happened before any full-scale deployment, because this is a very good time to reconsider policies and practices on what can be allowed and what not, and what can be approved as being road-capable.”

Regaining public confidence

It is certainly too soon to tell what the long-term effects on public confidence in the technology will be. More important, in the near term, will be the reaction of communities where public testing of self-driving cars will be held. It’s unclear to what extent pedestrians and drivers in Tempe were informed of the presence and location of the driverless test car, if at all. An article in the UK daily The Guardian, based on emails between Uber and the office of Arizona Governor Doug Ducey, reported: “Remarkably, the public appears to have been kept in the dark. Because of Arizona’s regulatory vacuum, neither Uber nor Ducey were obliged to inform the public that Uber’s cars would now be driving themselves on public roads.”

Though the cases are different, Santamala’s experience while testing autonomous technology in Helsinki may be instructive about the importance of making the public an informed part of the testing process. When he was project director for Smarter Mobility at the Helsinki Metropolia University of Applied Sciences, he ran a pilot using a self-driving shuttle bus in a suburb of the Finnish capital.

As part of the project, the on-board technician explained to passengers how the bus functioned and that they were part of a pilot project. This was filmed by one of the vehicle’s on-board cameras to fulfil a condition agreed with the Finnish Transport Safety Agency. “As far as I know, no one refused to board the bus,” Santamala says. “People are generally quite positive about the technology here. They were excited by being part of the trial.”

Passengers were also interviewed after the trip to collect feedback on their reaction. “The most common negative feeling about the ride was that it was too slow,” he notes. “The most typical positive comments were that it felt safe and that they would have no problem using it. I don’t believe we got a single negative comment about safety.”

The ‘social’ learning of robots

Santamala says Sensible 4 is currently developing Level 4 autonomous technology for last-mile solutions. The company is building a bus to be used as a shuttle and is about to run a public pilot using two pods on a suburban route. The technology on the bus will be the same as in the pods, he explains. “We will begin a pilot with pods in one town. We will have a few test-campaign trips to Lapland and get ready for open-road pilots. In Lapland we are testing on snow and also trying to learn how to stay on the map when the environment changes, because Lapland has ‘eight’ seasons. We’ll start with two pods, so we can also start testing some fleet features.”

These pods will be tested first on a low-traffic route, because dense traffic requires solutions based on data and machine learning, Santamala explains: “For example, if you come to a Yield sign, the AV wants to follow the rules. So, if there’s a rush of endless cars in the intersection, the AV will be stuck. That’s a basic problem. High-traffic volumes need the other drivers to be mindful of the AVs or the AVs to learn to bend the rules, just the way we do. Eventually we need to cut in. This is a very difficult dilemma for the deployment of AVs.”

Human drivers are familiar with this manoeuvre and have developed strategies for cutting into busy intersections, more or less aggressively, and always assuming that other drivers will eventually allow them to enter the traffic stream. “That’s very tricky for the AVs, just as it is for inexperienced drivers,” Santamala says. “For now, in the trials, we are avoiding this problem, but the idea is to build up data to learn how to identify that this might be a good slot to get in.”
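The waiting-time dilemma Santamala describes can be caricatured as a gap-acceptance rule whose threshold relaxes the longer the vehicle waits, crudely mimicking a human driver's growing assertiveness. The function and the numbers below are purely illustrative assumptions, not Sensible 4's actual method:

```python
# A minimal gap-acceptance sketch for the "yield sign" dilemma described
# above. All names and thresholds are illustrative assumptions.

def accept_gap(gap_s: float, wait_s: float,
               base_threshold_s: float = 6.0,
               min_threshold_s: float = 3.0,
               relax_per_s: float = 0.05) -> bool:
    """Accept a gap (seconds until the next crossing car) if it exceeds
    a threshold that relaxes the longer we have been waiting."""
    threshold = max(min_threshold_s, base_threshold_s - relax_per_s * wait_s)
    return gap_s >= threshold

# A strictly rule-following AV (relax_per_s = 0) would wait forever for a
# gap that never comes; one that grows more assertive eventually moves.
print(accept_gap(4.0, wait_s=0.0))    # False: too cautious at first
print(accept_gap(4.0, wait_s=60.0))   # True: threshold has relaxed to 3.0 s
```

The real problem, as Santamala notes, is learning the threshold and the relaxation rate from data rather than hard-coding them.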

According to Dave McNamara, director, connected vehicle services and new business at Brandmotion, there is a socio-technical issue regarding whether and how robots and real people understand each other on the street. “The answer is, they don’t,” he said. “A pedestrian will probably do some sort of non-verbal handshake with me before they step out in front of my car. They look at you and think, are you looking at me and are you going to let me do this? And they step out and know you’re going to stop, or they think, that guy’s not going to stop so I’m not going to step out. There’s a very complex interaction between pedestrians and drivers that has been learned. There is a lot to be learned about the robot-human interaction. Robots are still confounded by difficult environments – lots of pedestrians and weather.”

Learning to evaluate human behaviour appears to be crucial to the technology and its deployment, particularly while fleets on the roads consist of both human and robot drivers. “The vehicles that have all of these [autonomous] capabilities are going to have one limitation – they don’t have the ability to anticipate what a human driver around them is going to do,” says Donny Seyfer, executive officer, National Automotive Service Task Force (NASTF). “Good drivers have the ability to look and say, ‘I’m just going to play this safe because that driver is not sure what he wants to do’. That kind of algorithm or machine intelligence needs significant time to develop. When you have a mixed fleet of human drivers and driverless cars there will be a difficult transition period for a number of years.”

Beating the weather

Snow, heavy rain and dense fog pose difficult problems for optical sensors, such as cameras and LiDAR. In Finland, unlike in California, adverse weather is part of the daily mix for drivers. That is why Finnish engineers like Santamala are hard at work developing autonomous technology to cope with adverse weather conditions.

Another Finnish engineer, Matti Kutila, a senior project manager at the VTT Technical Research Centre of Finland, in December ran what might have been the first test of an autonomous vehicle on a public snow-covered road. The vehicle, a 2004 Volkswagen Touareg nicknamed Martti, drove itself for about 3 kilometres (1.86 miles) over a snow-covered public road in Lapland, reaching a top speed of 40kph. In February, Martti successfully drove at 80kph in a closed testing area in snowy conditions without lane markings.

Kutila says one of the solutions he and his team were testing was the effectiveness of the laser scanner in snow. “Laser scanning provides a three-dimensional measurement of the environment when the vehicle is moving,” he explains. “We use that in Martti to detect snowbanks on both sides of the road and to find the real trajectory when you don’t have specific lane markings and there is snow on the ground, which means you cannot see the lanes.”
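The snowbank idea can be illustrated with a toy example: from one lateral slice of laser-scanner height readings across the road, find the gap between the two banks and aim for its centre. The data, threshold and function names below are illustrative assumptions, not VTT's algorithm, which fuses full 3-D point clouds over time:

```python
# Toy sketch: locate the drivable corridor between two snowbanks from a
# single lateral profile of surface heights. All values are made up.

def corridor_centre(lateral_m, height_m, bank_height_m=0.3):
    """Return the lateral centre (metres) of the widest run of readings
    lower than bank_height_m, or None if everything looks like snowbank."""
    runs, start = [], None
    for i, h in enumerate(height_m):
        if h < bank_height_m:
            if start is None:
                start = i
        elif start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(height_m) - 1))
    if not runs:
        return None
    s, e = max(runs, key=lambda r: lateral_m[r[1]] - lateral_m[r[0]])
    return (lateral_m[s] + lateral_m[e]) / 2.0

# Snowbanks at both edges, open road in the middle: steer towards 0.0 m.
profile_x = [-4, -3, -2, -1, 0, 1, 2, 3, 4]                 # metres from scanner
profile_h = [0.8, 0.7, 0.1, 0.0, 0.0, 0.1, 0.1, 0.6, 0.9]   # surface height, m
print(corridor_centre(profile_x, profile_h))                # -> 0.0
```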

The aim of this development phase, Kutila says, was “to understand what is the best solution under hard weather conditions and also [to test] actuation devices for trajectory planning and to understand what the right speed is when we are approaching some object or curve”. In addition to Martti, VTT currently uses two other test vehicles: a 2008 Citroen C4 named Marilyn, which is used in urban environments, and a modular industrial vehicle tested on off-road terrains and meant for use in commercial vehicles, such as tractors.

In addition to adverse weather conditions, Martti is also used to test technology to respond to rare and unexpected incidents, such as an animal suddenly appearing in front of the vehicle. “That’s a really key challenge for us now,” Kutila says. “We work at it with our data fusion solution and optimising the different sensor systems, in all kinds of scenarios.” In addition, Marilyn has been put in V2X mode and can exchange information with Martti. “Eventually Marilyn will be connected to an infrastructure, such as traffic lights, which Martti will probably not do,” Kutila says. “But it can communicate its learning to the other car and exchange information.”

He is cautious about the rapidity with which the solutions he is working on will come to the market. “Some of the weather solutions will be commercially available in 2020, when automation comes to the market,” Kutila says. “Solutions for adverse weather and changing environmental scenarios may be ready in eight to 10 years. Our solutions are for semi-automated driving, where the driver supports the computer, and not where the computer is supporting the driver.”

A connected corridor

Zac Doerzaph and the VTTI work with the Virginia Department of Transportation, which owns and manages most of the state’s primary and secondary roads and provides most of the funding for the Vehicle-to-Infrastructure research. The applications are initially developed on a controlled-access Virginia Smart Road complex in Blacksburg, Virginia. “There is a lot there that we can tune, things like differential GPS, so I can get optimal localisation and then detune it so that I can see what it would look like in different environments,” he explains.

When a certain level of performance is achieved with the application, it is brought to the Virginia Controlled Corridor (VCC). “This is a kind of living laboratory intended to facilitate rapid prototyping of applications in V2V and V2I,” Doerzaph says. “There, it’s real drivers and actual roadways where we can take a look at small deployment, feature by feature, application by application, and measure whether or not the performance is effective and then tweak and tune before it’s brought to a broader deployment.”

Currently the focus of the research is on connectivity but Doerzaph says that it will slowly shift more towards automation over the next year. “Because the focus is connectivity, we are working on the underlying pipelines to share the data and the data itself. We’ve been working, for example, with our road operator and signal controllers to get very good timing on our intersections. We have 35 [intersections] altogether.”

Because the area is one of the most congested in the US, and the Virginia DOT is constantly maintaining and expanding the roadways, another focus has been the accurate mapping of work zones. “We are working on static elements, such as lane shifts, and more dynamic elements, such as workers, where they are located,” he says. “You can pass very custom messages to drivers when the car is manually driven and, looking a little more into the future, to get very accurate information into automated cars so they can make more intelligent decisions in those environments.” Doerzaph says the VTTI shares and exchanges research data with carmakers, through the Crash Avoidance Metrics Partnership (CAMP), a consortium of automobile manufacturers, and with other states, via the Connected Vehicle Pooled Fund Study.

The research network

There are three federally funded test sites in the US focusing on V2X technology – located in Tampa, New York City and Wyoming – and Brandmotion is active in all three, working with V2V and V2I. The company is developing solutions using connected vehicle technology, to discover “how to give drivers and systems a 360-degree view of the car and how to deal with threats around the car, outside the visibility of cameras,” Dave McNamara says. “Our connected vehicle projects are an important input to the autonomous driving software but we’re just a part of it.”

These three test sites are only part of a large network of research centres around the country developing connectivity and self-driving solutions in which his company is active. “There are a lot of smart-city deployments around the United States, focused on the vehicle-to-infrastructure part that we’re involved in, like Smart Columbus,” he explains. “There are a number of state efforts, such as California, Georgia, and many in Florida outside of Tampa. These are all stepping stones to autonomy, not only demonstrating robust sensing systems but also collecting important performance data for traffic systems.”

In Tampa, the company’s team is testing applications such as wrong-way entry, alerting vehicles that are going the wrong way on a one-way street. “When the public and governments see that this works and it has a compelling use case, you start thinking about what other life-saving applications are possible and the cost of the implementations becomes less of an issue. The business model for deployment becomes practical as more and more useful applications are identified.”
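Conceptually, the wrong-way-entry check compares the vehicle's heading with the street's permitted direction of travel. The sketch below uses hypothetical names and a 90-degree tolerance chosen for illustration; Brandmotion's actual implementation is not described in the article:

```python
# Illustrative wrong-way check: flag a vehicle whose heading differs from
# the one-way street's permitted heading by more than a tolerance.
# Names and the 90-degree tolerance are assumptions for illustration.

def wrong_way(vehicle_heading_deg: float, street_heading_deg: float,
              tol_deg: float = 90.0) -> bool:
    """True if the vehicle is travelling against the permitted direction.
    Headings are compass degrees; the difference is wrapped to [-180, 180]."""
    diff = abs((vehicle_heading_deg - street_heading_deg + 180.0) % 360.0 - 180.0)
    return diff > tol_deg

print(wrong_way(270.0, 90.0))   # True: driving head-on against traffic
print(wrong_way(95.0, 90.0))    # False: roughly the permitted direction
```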

McNamara notes that a mere 12% of the fleet is new cars and only 20-30% of cars on the road have the ADAS sensors that come with new cars. “Aftermarket has a tremendous opportunity in the area of safety and mobility, of putting life-saving technology on pre-owned vehicles,” he says.

For its rear-vehicle camera systems, Brandmotion uses commercial-grade cameras from other industries and adapts them for automotive applications, which provides excellent price points for the company because it is not saddled with non-recurring engineering costs. “We’re in a sweet spot relative to being able to learn from these early deployments,” he says. “We have solved real engineering issues, like how well does this integrate into the vehicle, can we make it a seamless and robust part of the vehicle. This is an inherent challenge of taking technology that has been tested and proven in, say, the consumer electronics market and moving it to the vehicle side.”

The independent repair technician

The unsung hero of the autonomous-car disruption may very well be the person trained to repair the vehicle because, says Seyfer, repair technicians will also be charged with explaining the technology to consumers and easing their acceptance of it. “We train the technicians to repair the cars and they, in turn, train the consumers to understand how their vehicle works,” he says. “I think the dealers and independents are in the same position. They’re struggling with the fact that there may be service information but this is the type of stuff that you have to put your hands on to learn. It’s not the kind of stuff you can read about and learn. You have to do it.”

However, there is a problem, Seyfer says. “The service technician has a fundamental misunderstanding of what’s coming at them. Or they’re not even aware of it. I still get people in the classes I teach that firmly believe that this idea of having vehicles that drive themselves is not going to survive. They think the technology will fail and it won’t happen.”

The technicians will have to repair sensors and cameras that have been damaged in a collision, so they will have to know how to install and calibrate them correctly. “And they will have to have such knowledge as the effect of paint thickness,” Seyfer explains, “because if you put too much paint on a component, for example, radar behind that component loses its resolution and effectiveness. Or if you use some aftermarket bumpers that weren’t designed for or don’t match that radar application, that can cause the system not to work. Technicians have to become experts in ADAS and self-driving technology.” Much of this repair work is now being carried out by collision shops, he notes, adding: “Collision shops don’t tend to be cutting edge in their diagnostic capabilities yet. There are very active groups within the industry working to catch up to the needs.”

Because the sensors and cameras are usually on the car’s front and rear, they can be damaged even in a low-speed collision. It’s easy to imagine that a bad repair job can sour a driver on the technology itself. “Independent repair technicians service 70% of the vehicles on the road,” Seyfer says. “That number is not likely to change much in the foreseeable future. There should, therefore, be a vested interest on everybody’s part to make sure that these folks can fix these cars.”

