Future of Automotive Sensor Fusion & Data Handling 

Chris Clark

Mar 04, 2021 / 5 min read

If you rolled off the lot with a brand new car five or six years ago, it likely contained around 60 to 100 sensors. Today, that number is realistically much closer to 200 or more. As vehicles continue to get smarter and more autonomous, the evolution and sophistication of sensors have kept pace.

For instance, LiDAR, traditionally a bulky device mounted as high on the vehicle as possible, is getting smaller and more data- and power-efficient while offering a higher level of definition.

But what’s seeing even greater advancement than the sensors themselves is the ability to take the data coming from multiple sensor types and fuse it on a powerful computational platform.

As the autonomous vehicle industry continues to advance, tech providers and automakers need to factor in the cost/performance tradeoffs between edge computing capability, sensor fusion, sensor degradation, monitoring, and the maintenance/servicing of the software over the lifespan of the vehicle.


Keeping Up with the “Automotive Joneses”

One of the biggest challenges that auto manufacturers and OEMs face is keeping up with the rapid pace of both sensor and data development. Sensors need to provide the level of data fidelity necessary for vehicle systems to meet their design requirements. For instance, if you drive any of the latest vehicles on the market, their sensors will look for lane markers and, if necessary, give you visual cues that you are drifting out of your lane.

Those types of capabilities are helpful, but still relatively rudimentary when it comes to autonomous vehicles. Driving in dense metropolitan areas with large numbers of drivers, pedestrians, bicyclists, and motorcyclists on the road requires a vehicle to make split-second decisions to stop or swerve to avoid hitting a pedestrian or another vehicle. We are starting to see sensor fusion come into play here, making decisions that a human driver may be incapable of making in the time an ADAS system takes to react.

Another challenge is the sheer number of solutions on the automotive market, whether competing software architectures or hardware platforms. Each player in the market has its own plan for how its technology should be implemented. The limiting factor is building an adaptable software framework, not the ability to manufacture high-quality sensors. As autonomous vehicles continue to improve, whether with real-time satellite imagery that lets cars see a mile ahead or with vehicle-to-vehicle communication to navigate dense roadways, technology providers are going to need to work together and build in some consistency to make those capabilities a reality.

Sensor Fusion vs. Edge Computing

Sensor fusion is the process of combining data from multiple sensors to generate a more accurate “truth,” helping the computer make safe decisions even though each individual sensor might be unreliable on its own.

In the case of autonomous vehicles, signals from a wide variety of sensors are traditionally transmitted to a central computing unit to estimate the position and type of each object (e.g., human beings, animals, other cars), the velocity at which it is moving, and its trajectory to help make navigation decisions.
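To make the idea concrete, here is a minimal sketch in Python of one common fusion technique, inverse-variance weighting, applied to two independent range estimates of the same object. The sensor names, noise figures, and the fuse_estimates helper are illustrative assumptions, not a description of any production system.

```python
import numpy as np

def fuse_estimates(estimates, variances):
    """Inverse-variance weighted fusion of independent 1-D estimates.

    Each sensor reports a value and a variance (its noise level). The fused
    estimate leans on the low-noise sensor, and its variance is smaller than
    any single sensor's -- the basic payoff of sensor fusion.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    fused = np.sum(weights * estimates) / np.sum(weights)
    fused_variance = 1.0 / np.sum(weights)
    return fused, fused_variance

# Hypothetical readings: range (m) to the same pedestrian from two sensors.
camera_range, camera_var = 24.8, 4.0   # camera: coarse depth, higher noise
radar_range, radar_var = 23.9, 0.5     # radar: precise range, lower noise

fused_range, fused_var = fuse_estimates(
    [camera_range, radar_range], [camera_var, radar_var]
)
print(f"fused range: {fused_range:.2f} m (variance {fused_var:.2f})")
```

In this toy example the fused range lands close to the radar reading because the radar is the more trustworthy sensor, yet the camera still contributes; a real system would fuse many more channels and track objects over time.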

Edge computing enables the vehicle to process data near or at the source of information (the sensor itself) rather than relaying it to the central processor, reducing latency and enabling close to real-time processing.

The more edge-computing sensors a car has, the better it can respond in real time; however, this also increases the cost and complexity of the solution.
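One rough way to picture that tradeoff is bandwidth: a sensor with edge computing runs detection locally and ships only a compact object list to the central unit, while a raw sensor streams every frame for central processing. The sketch below uses hypothetical frame sizes, record sizes, and frame rates purely for illustration.

```python
# Hypothetical figures for a single camera running at 30 frames per second.
RAW_FRAME_BYTES = 2_000_000     # uncompressed HD frame streamed to the central ECU
OBJECT_RECORD_BYTES = 32        # compact object record produced at the edge
FPS = 30

def central_processing_bandwidth() -> int:
    """Raw sensing: every full frame crosses the vehicle network."""
    return RAW_FRAME_BYTES * FPS

def edge_processing_bandwidth(objects_per_frame: int) -> int:
    """Edge sensing: detection runs at the sensor; only object lists cross."""
    return OBJECT_RECORD_BYTES * objects_per_frame * FPS

print(f"{central_processing_bandwidth() / 1e6:.0f} MB/s of raw frames")
print(f"{edge_processing_bandwidth(objects_per_frame=10) / 1e3:.1f} kB/s of detected objects")
```

Under these made-up numbers the edge approach moves kilobytes instead of megabytes per second, which is why pushing computation toward the sensor helps with latency and network load, at the cost of smarter (and pricier) sensor hardware.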

What About Sensor Degradation?

Sensor degradation is a natural part of the autonomous vehicle equation, especially given that a car today typically has a lifespan of 10 to 15 years. Top causes of degradation include general wear and tear on the sensors, harsh operating environments, and the degradation of other electronic system elements.

Additionally, there is temporary sensor degradation caused by environmental factors such as direct sunlight obstructing a camera’s view or high radio frequency (RF) noise in congested areas. To guard against this type of degradation, automakers usually install several types of sensors so the ADAS subsystem has an additional layer of information on which to base a better decision.
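A simplified sketch of that redundancy idea might look like the following, assuming each sensor self-reports a health flag; the sensor names, report format, and select_usable_detections helper are hypothetical.

```python
def select_usable_detections(sensor_reports):
    """Drop reports from sensors flagged as temporarily degraded.

    `sensor_reports` maps a sensor name to a detection list plus a
    self-reported health flag (e.g., a camera blinded by direct sunlight or
    a radar swamped by RF noise). The ADAS layer then fuses whatever remains.
    """
    usable = {
        name: report["detections"]
        for name, report in sensor_reports.items()
        if report["healthy"]
    }
    if not usable:
        raise RuntimeError("no trustworthy sensors -- hand control back to the driver")
    return usable

# Hypothetical frame: the camera reports itself blinded; radar and LiDAR are fine.
reports = {
    "camera": {"healthy": False, "detections": []},
    "radar":  {"healthy": True,  "detections": [("vehicle", 41.2)]},
    "lidar":  {"healthy": True,  "detections": [("vehicle", 40.8), ("pedestrian", 12.5)]},
}
print(select_usable_detections(reports))
```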

In terms of gradual degradation, automakers and technology providers must factor in the ability of the installed sensors, such as LiDAR, camera, and ultrasound, to perform at the same level they did at the beginning, if not better, for the entire expected lifespan of the vehicle.

Additionally, they must answer what happens if a sensor begins to fail. How will the vehicle alert the driver that it requires maintenance? Will the vehicle automatically suspend its autonomous driving capability until that maintenance is completed?
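One plausible shape for such a monitor, sketched here with illustrative signal-to-noise thresholds and a hypothetical evaluate_sensor_health helper, is to compare a sensor’s current performance against its as-built baseline and step down the vehicle’s autonomy accordingly.

```python
from enum import Enum, auto

class AutonomyState(Enum):
    FULL = auto()
    DRIVER_ALERTED = auto()
    AUTONOMY_SUSPENDED = auto()

def evaluate_sensor_health(measured_snr_db: float,
                           baseline_snr_db: float,
                           warn_loss_db: float = 3.0,
                           suspend_loss_db: float = 6.0) -> AutonomyState:
    """Compare a sensor's current signal quality to its as-built baseline.

    Thresholds are illustrative only: a modest drop prompts a maintenance
    alert, while a larger drop suspends autonomous driving until serviced.
    """
    loss = baseline_snr_db - measured_snr_db
    if loss >= suspend_loss_db:
        return AutonomyState.AUTONOMY_SUSPENDED
    if loss >= warn_loss_db:
        return AutonomyState.DRIVER_ALERTED
    return AutonomyState.FULL

# A sensor running 7 dB below its factory baseline triggers a suspension.
print(evaluate_sensor_health(measured_snr_db=18.0, baseline_snr_db=25.0))
```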

As autonomous vehicles become more common, additional regulations will come into play, along with consumer habits that will help shape the future of the market. Perhaps it will become an expectation that vehicle cameras be replaced every three years, just as consumers replace tires on a regular basis.

Synopsys provides a number of tools that help customers model and design semiconductors and other components within the vehicle to predict failure rates and evaluate alternatives. For instance, Saber is used to design and verify the interaction of multiple technologies (electrical, mechanical, hydraulic, magnetic, etc.). It can also model environmental conditions such as temperature, humidity, etc., and how those will change the characteristics of the device.

Cybersecurity and Automotive Sensors

There are a few different factors that need to be considered when making sure autonomous vehicle sensors are secure. Hacking into a system is always a concern and needs to be addressed in both the software and hardware design phases.

The other, less obvious, security factor is attackers influencing the machine-learning technology embedded in the vehicle to react in a malicious way. For example, a U.K. study tampered with a video billboard so that it displayed a stop sign for as little as a fraction of a second. Autonomous vehicles picked up the image and stopped, reacting exactly as they would have to a stop sign on the roadway.

As the complexity and capability of these devices increase, the avenues through which cybersecurity attacks can take place will grow in proportion. Designers and automakers will have to find ways to update their defenses and manage these types of unintended reactions to protect against nontraditional cyberattacks. Part of that defense should be a robust software lifecycle management program that allows organizations to apply lessons learned from experience.
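One simple mitigation in that spirit, sketched below with an illustrative window size and threshold rather than any vendor’s actual defense, is to require a detection to persist across several consecutive frames before the vehicle acts on it, which would filter out a sign flashed for a single frame.

```python
from collections import deque

class PersistenceFilter:
    """Require a detection to appear in most of the last N frames before acting.

    A stop sign flashed on a billboard for one frame would not meet the
    threshold, while a real roadside sign, visible frame after frame, would.
    Window size and threshold here are illustrative, not production values.
    """
    def __init__(self, window: int = 10, required: int = 7):
        self.history = deque(maxlen=window)
        self.required = required

    def update(self, detected_this_frame: bool) -> bool:
        self.history.append(detected_this_frame)
        return sum(self.history) >= self.required

flicker = [True] + [False] * 9   # one-frame flash on a tampered billboard
steady = [True] * 10             # a real sign seen continuously

f = PersistenceFilter()
print(any(f.update(d) for d in flicker))  # False -- transient detection ignored
f = PersistenceFilter()
print(any(f.update(d) for d in steady))   # True -- persistent detection acted on
```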

Vehicles built today have tens of millions of lines of code; Tier 1 automotive suppliers need to manage and develop vehicle software from a security perspective over the lifetime of the vehicle to make sure it continues to operate as safely as it did when it first rolled off the production line (or even better).

As we discussed earlier, that same philosophy applies to the physical performance of the sensors themselves in combating degradation. Keeping pace with all of the advancements in the sensor fusion space is challenging but rewarding as we continue to see autonomous vehicles drive into the mainstream.
