Industry Voices: Embedding Vehicles with “Super Senses”

Opinion piece by Kobi Marenko, CEO of Arbe Robotics

Everyone is familiar with the old catchphrase “spidey sense” from Spider-Man – an innate superhuman sense that intuitively understood what was happening in the world and whether danger was lurking. Autonomous vehicles (AVs) and ADAS need this same kind of “spidey sense”: sensors that detect what surrounds the vehicle, recognize it, and map it so the vehicle can be directed safely on the road.

Purdue University studied bats, spiders, and birds, which have such a built-in “sense,” and created sensors that operate similarly. In these animals, nerve endings called mechanosensors are bound to mechanoreceptors and filter out information, passing along only what is necessary for survival. Purdue’s sensors mimic this filtering, forwarding only the information relevant to safety. The researchers plan to integrate these animal- and insect-inspired sensors into autonomous vehicles and drones to reproduce the abilities the animals’ natural sensors provide in the wild.

But is mimicking the wild – or human beings – our best bet, or do we need to do better than that? Developers are essentially building autonomous vehicles to drive better than a human being. For a machine to function like a human is nearly impossible – it is not human and cannot operate the same way – so it needs to perform better than humans do. People can accept vehicle collisions and fatalities caused accidentally by human error, but they will not tolerate an accident caused by a machine. This echoes Asimov’s first law: “a robot may not injure a human being or, through inaction, allow a human being to come to harm.” While there may be aspects of a tool that can cause injury, that injury should only ever be the result of the user’s error, not of the tool’s design or function. In other words, any accident involving an autonomous vehicle should not be a result of the design or performance of the vehicle itself. On the contrary, autonomous vehicles should be the catalyst for advancement and safety on the road, eliminating fatalities completely.

The goal of autonomous vehicle development is to bring the highest level of safety – zero fatalities – to the road, something the world has never experienced. And superhuman goals call for superhuman sensors. Unfortunately, some vehicles rely on vision alone because they are trying to replicate how a human being operates; yet so many senses and so much cognition are involved when a person drives a vehicle. Vision alone won’t work effectively. (It wouldn’t work for a person either.) Creating a vehicle with superhuman senses – a spider sense – involves sensor fusion, which is why most OEMs are relying on multiple sensors in their approach to full autonomy.

One sensor that provides superhuman abilities to autonomous vehicles is 4D Imaging Radar. Unlike typical radar, 4D Imaging Radar combines abilities that other sensors on the market can’t match: it maintains visibility in the dark and in poor weather, it can detect what lies beyond the object blocking the road ahead, and its long range lets it know what is coming before the vehicle is in close proximity. 4D Imaging Radar also senses the entire field of view at once, distinguishing every object in view while simultaneously and accurately determining each object’s speed and distance. Because all of this happens concurrently, reaction time is extremely fast – faster than it would take any human being even to detect a hazard, never mind process it and react.

4D Imaging Radar works in tandem with the cameras in the sensor suite, and in some cases with LiDAR as well. Each sensor contributes a different element to this superhuman sensory experience. Cameras are visually accurate but often struggle in challenging weather; LiDAR provides a 3D view and operates in the dark but cannot function effectively in poor environmental conditions. Combining these sensors – sensor fusion – provides the vehicle with the data to make accurate decisions on the road. This sophisticated union enhances the Imaging Radar’s spidey sense even further.
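To make the idea of sensor fusion concrete, here is a minimal sketch of one common approach: weighting each sensor’s detection confidence by how reliable that modality is under the current conditions. All names, weights, and values here are invented for illustration – this does not represent Arbe’s or any OEM’s actual fusion stack.

```python
# Hypothetical confidence-weighted sensor fusion sketch. The reliability
# values are assumed for illustration only: radar is treated as largely
# weather-independent, while camera and LiDAR degrade in fog.

from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar", or "lidar"
    confidence: float  # 0.0-1.0 detection confidence from that sensor

# Assumed per-condition reliability of each modality.
RELIABILITY = {
    "clear": {"camera": 0.9, "radar": 0.9, "lidar": 0.9},
    "fog":   {"camera": 0.3, "radar": 0.9, "lidar": 0.4},
}

def fuse(detections, condition):
    """Weight each sensor's confidence by its reliability in the
    current condition and return a single fused confidence."""
    weights = RELIABILITY[condition]
    num = sum(weights[d.sensor] * d.confidence for d in detections)
    den = sum(weights[d.sensor] for d in detections)
    return num / den if den else 0.0

# In fog, the radar's report dominates the fused result even though
# the camera barely sees the obstacle:
obstacle = [Detection("camera", 0.2), Detection("radar", 0.95)]
print(round(fuse(obstacle, "fog"), 2))  # 0.76
```

The point of the sketch is that no single sensor’s report is trusted unconditionally: the fused confidence shifts toward whichever modality is dependable in the moment, which is exactly why radar’s weather robustness matters in the suite.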

It is this fusion that provides autonomous vehicles with the data necessary to perform at a very high level of safety. Next-generation Imaging Radar and sensor fusion equip vehicles with super capabilities that are revolutionizing the roads and delivering a level of safety that is both “superhuman” and super powerful.
