Night Blindness Blinkers the Bots

The need for a cocktail of sensors challenges early mass adoption of driverless tech, Eric Volkman discovers.

It’s interesting how a little darkness can make a lot of trouble for drivers. According to data from the US National Highway Traffic Safety Administration, although there is significantly less traffic crawling around at night, in 2016 nearly 60% of car accidents in the country’s urban areas occurred during the hours of darkness.

There are solid reasons why. “Simply put, it’s harder to see at night. Less color, less contrast,” said Alex Epstein, director of transportation safety for America’s National Safety Council. “This creates a challenge of reduced visibility, as well as reduced depth perception and peripheral vision.” There are other variables that ratchet up the hazard factor. One is the risk of animals darting out suddenly on a road; many of them move around at night. Many drivers have had the scary experience of having to swerve around a creature to avoid killing it. In extreme cases, it’s not the animal that ends up losing its life but the swerving driver.

Alcohol consumption tends to happen at night, and the evening hours are when most of us start to get sleepy. Add volatile factors such as these to the ones mentioned above and you’ve got a substantially riskier environment for vehicle operation. The increasing sophistication of assisted driving solutions promises to greatly reduce this risk, but eliminating it entirely… well, that’s another story.

Of waves and pulses

Humans can’t see in the dark, but some machines can, particularly if they’re equipped with LiDAR. The technology uses laser pulses to measure the range from a vehicle to the objects around it, and that data can be used to build a precise topographical map on the fly. What makes LiDAR particularly useful in darkness is that it needs no ambient light in order to operate.
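As a rough illustration of that idea (a sketch, not any manufacturer’s code), a single laser return can be converted from the pulse’s round-trip time and the beam’s firing angles into a 3D point; sweep enough beams and you have the point-cloud “map”. The function and constant names below are invented for clarity.

```python
import math

SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_return_to_point(round_trip_s, azimuth_deg, elevation_deg):
    """Convert one laser return into a 3D point relative to the sensor.

    round_trip_s  -- time between firing the pulse and sensing its echo
    azimuth_deg   -- horizontal firing angle of the beam
    elevation_deg -- vertical firing angle of the beam
    """
    # Range is half the round-trip distance travelled by the pulse.
    range_m = SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)

    # Spherical-to-Cartesian conversion gives the point's position.
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A pulse echoing back after ~200 nanoseconds corresponds to an object
# roughly 30 meters away; millions of such points per second form the map.
print(lidar_return_to_point(200e-9, azimuth_deg=45.0, elevation_deg=2.0))
```

Because the sensor supplies its own light source, the same calculation works at midday or at midnight, which is the “active sensor” point made below.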

In fact, according to John Eggert, director of automotive sales and marketing at Velodyne LiDAR, it’s “the only technology that offers full high-resolution environmental perception at night.” Louay Eldada, CEO of solid-state LiDAR maker Quanergy, explains that cutting-edge LiDAR solutions “are active sensors, which means that the signals they sense are the signals they send, with no reliance whatsoever on ambient light. Passive sensors, such as cameras, can only detect in the presence of ambient light (sun, street lights).”

Not everyone is a true believer in the LiDAR gospel, though. One of these people is Yakov Shaharabani, CEO of Israel-based AdaSky. The company’s Viper autonomous sensing platform is built around a shutter-less camera that uses far infrared (FIR) technology, rather than laser pulses, to “see” its surroundings. Shaharabani said: “Because Viper’s FIR cameras scan the infrared spectrum just above visible light, they are able to detect objects that may not otherwise be perceptible to other sensing technologies, such as camera, radar, or LiDAR.”

Shaharabani also claimed that Viper’s small size, lack of moving parts and minimal power consumption make it “an ideal solution” for self-driving vehicles. He wouldn’t say that of LiDAR, citing a handy example: “High-resolution LiDAR sensors are too expensive for mass market use, and less expensive, lower-resolution LiDAR sensors cannot effectively detect obstacles that are far away and are, therefore, incapable of providing the accurate detection and coverage needed to traverse map-less, rural environments,” he said.

However, like any cutting-edge sensing platform, cost is a concern with Viper’s brand of technology. Infrared has historically been seen as prohibitively expensive for widespread adoption outside of the luxury segment, although Shaharabani said that Viper “is scalable for the mass market and available now.”

The total package

Aside from the cost factor, the lack of a single go-to solution for perfect sensing at night is also a challenge. In an interview last year with IEEE Spectrum, the magazine of the Institute of Electrical and Electronics Engineers, Tarek El-Gaaly, senior research scientist at self-driving car specialist Voyage, said that “there is no one-sensor-does-it-all for autonomous vehicles, it really comes down to fusing multiple sensors together into a sensor suite to complement each other”.

Even some who are up to their eyeballs in LiDAR and competing technology concede this point. Said Quanergy’s Eldada: “LiDARs, radars and cameras work together as a sensor suite; LiDAR is the primary sensor used for perception, localization, and navigation; radar acts as a redundant sensor for obstacle detection; cameras show color (traffic lights) and read street signs using [optical character recognition].”
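To make that division of labor concrete, here is a toy sketch of how such a suite’s outputs might be combined: LiDAR leading, radar corroborating, cameras supplying color and sign text. The class, fields and thresholds are hypothetical, invented purely for illustration, and not any real product’s API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorFrame:
    lidar_obstacle_range_m: Optional[float]  # primary perception: range to nearest obstacle
    radar_obstacle_range_m: Optional[float]  # redundant obstacle check
    camera_traffic_light: Optional[str]      # e.g. "red" or "green", from color
    camera_sign_text: Optional[str]          # e.g. "STOP", read via OCR

def assess(frame: SensorFrame) -> str:
    # LiDAR leads perception; radar corroborates, so a single sensor
    # failure does not silently hide an obstacle.
    ranges = [r for r in (frame.lidar_obstacle_range_m,
                          frame.radar_obstacle_range_m) if r is not None]
    if ranges and min(ranges) < 10.0:
        return "brake"
    # Cameras contribute what the ranging sensors cannot: color and text.
    if frame.camera_traffic_light == "red" or frame.camera_sign_text == "STOP":
        return "stop"
    return "proceed"

# Night-time case: the camera sees nothing useful, but LiDAR and radar agree.
print(assess(SensorFrame(8.5, 9.0, None, None)))  # -> "brake"
```

The point of the sketch is simply that each sensor covers a gap left by the others, which is why no single one of them carries the night shift alone.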

AdaSky admits its platform needs mates. Viper is “billed as a supplemental sensor, as it is used to generate an additional layer of information and detect objects that may not otherwise be perceptible to a camera, radar, or LiDAR,” said Shaharabani. We should also bear in mind that a sensor array on its own won’t get the job done, no matter how sophisticated the underlying technology and how affordable it becomes. Autonomous driving will also require enhancements to more classic means of piercing the dark.

Ellen Edmonds of the American Automobile Association (AAA), the federation of vehicle clubs, said that this is happening. For instance, according to her: “Automakers continue to improve safety for night driving through the introduction of improved vehicle lighting. LED stop and signal lamps increase vehicle visibility to other drivers, while expanded availability of high-intensity discharge and LED headlights improve drivers’ ability to see the road ahead.” The National Safety Council’s Epstein agreed, adding that there’s little problem with take-up. “Technology is improving with new headlight designs and advanced safety features such as adaptive headlight systems,” he said. “By and large, carmakers welcome these types of new technologies.”

Let’s not forget that classic accident prevention device: the humble brake, which is poised for a dramatic upgrade. “Automatic emergency braking systems will be standard on most cars by 2022 and hold the potential to reduce collisions, even at night,” the AAA’s Edmonds said. This is a firm date: in 2016, a group of 20 top US automakers pledged to include automatic emergency braking (AEB) as a standard feature in their models by 2022. Combined, according to the country’s National Highway Traffic Safety Administration, that group represents nearly all of the domestic vehicle market. Fortunately for those automakers, AEB technology has become increasingly cost-effective and can now be found in cars with sticker prices under $40,000.
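Part of why AEB helps “even at night” is that it typically acts on a projected time to collision rather than on the driver’s eyesight. The following is a generic, assumed sketch of that idea; the 1.5-second threshold is an illustrative value, not a regulatory or manufacturer figure.

```python
def should_auto_brake(gap_m: float, closing_speed_m_s: float,
                      ttc_threshold_s: float = 1.5) -> bool:
    """Return True when the projected time to collision drops below the threshold."""
    if closing_speed_m_s <= 0:  # not closing on the obstacle, nothing to do
        return False
    time_to_collision_s = gap_m / closing_speed_m_s
    return time_to_collision_s < ttc_threshold_s

# 20 m gap, closing at 15 m/s (~54 km/h) -> about 1.33 s to impact: brake.
print(should_auto_brake(20.0, 15.0))  # True
```

A sensor-fed calculation like this is indifferent to darkness, which is exactly what a drowsy human eye is not.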

Flesh and blood

The many cutting-edge innovations and improvements going into perfecting vehicle night vision are impressive, but we’re still a long way off from the ideal cocktail of sensors, systems and hardware that will help tackle all of dark driving’s challenges.

We are also virtually guaranteed that this won’t happen anywhere short of full autonomy; there’s the human factor, after all. “These advancements will decrease the number of night-time collisions,” said AAA’s Edmonds. “But as long as there are humans driving cars in unpredictable and unsafe ways (sometimes under the influence of drugs and/or alcohol) there are going to be situations in which a collision cannot be foreseen or prevented.”

The National Safety Council’s Epstein agreed, saying that we flesh-and-blood drivers have to lead the charge. “We shouldn’t wait around for a magical solution to appear,” he said. “We need to create a culture where we take driving for the important task that it is, use our seat belts, and avoid impairment, speeding, texting and all of the other potential pitfalls that lead to tragedies that might have been completely prevented.”

