Jun 26 2023
In previous articles in this ADAS blog series, we discussed the growing prominence of ADAS and its impact across the auto industry, as well as some of the challenges with these advanced safety technologies. With several years of real-world data now demonstrating both the efficacy and the limitations of ADAS, manufacturers are constantly looking for ways to improve these systems and prevent even more accidents.
This article covers some of the significant trends shaping the future of ADAS as the auto industry works to transform automotive safety and autonomous driving.
While ADAS has proven effective in reducing vehicle accidents, as referenced in the LexisNexis white paper linked above, there is always room to improve the technology. One area that continues to improve is the accuracy of the various sensors that power these safety features, including surround-view cameras, short- to mid-range and long-range radar, ultrasonic sensors, and LiDAR. These improvements include more accurate object detection, better perception of an object's distance and speed, and the ability to navigate complex driving conditions with greater precision and safety.
Additionally, manufacturers have been working to integrate data from multiple sensors, an approach referred to as "sensor fusion," to enhance a vehicle's ability to understand and respond to its surroundings, such as changing road conditions or obstacles.
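To make the idea of sensor fusion a little more concrete, here is a minimal sketch in Python that combines distance estimates from a radar and a camera by weighting each reading by its confidence (inverse variance). The sensor names, noise figures, and readings are illustrative assumptions, not any manufacturer's actual implementation.

```python
# Minimal sensor-fusion sketch: combine radar and camera distance estimates
# by weighting each reading by its confidence (inverse variance).
# All numbers below are illustrative assumptions, not real sensor specs.

from dataclasses import dataclass

@dataclass
class Reading:
    source: str        # e.g., "radar" or "camera"
    distance_m: float  # estimated distance to the object ahead, in meters
    variance: float    # sensor noise; lower means more trusted

def fuse(readings: list) -> float:
    """Return the inverse-variance weighted average of the distance estimates."""
    weights = [1.0 / r.variance for r in readings]
    fused = sum(w * r.distance_m for w, r in zip(weights, readings)) / sum(weights)
    return fused

if __name__ == "__main__":
    readings = [
        Reading("radar", distance_m=42.0, variance=0.5),   # radar: strong range accuracy
        Reading("camera", distance_m=40.5, variance=2.0),  # camera: noisier distance estimate
    ]
    print(f"Fused distance estimate: {fuse(readings):.1f} m")
```

Production systems typically rely on more sophisticated filters (Kalman filters, for example) running across many tracked objects, but the underlying principle of trusting each sensor in proportion to its confidence is the same.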
Check out our latest white paper for even more detail about vehicle sensors and cameras, and how an option-level VIN decoding solution can help navigate complex vehicle safety features.
The journey toward fully autonomous vehicles (AVs), or level-5 autonomy, requires not only greater sensor and perception accuracy, especially in difficult driving situations (inclement weather, dense traffic, etc.), but also a better understanding of "human common sense," according to a Forbes article. Many ADAS features do a great job of alerting and actively correcting distracted drivers; however, these are semi-autonomous systems (designated level 1 or 2) that still require driver input and are essentially a stepping stone to fully autonomous vehicles.
There is still significant opportunity for ADAS to become even more effective, particularly when it comes to active features such as automatic braking and lane keeping. Over the next few years, we should start to see less involvement from the driver as manufacturers strive for greater autonomy.
According to Canada's Victoria Transport Policy Institute, level-5 autonomous vehicles may be commercially available and legal to use in some jurisdictions by the late 2020s. However, many sources, including McKinsey and Verdict, expect that level-5 autonomy won't arrive until sometime in the next decade. The good news is that manufacturers are actively working to increase vehicle autonomy, with Mercedes-Benz having already received level-3 approval.
Artificial intelligence (AI) and machine learning will play a significant role in the future of ADAS. Machine learning will enable these systems to learn from real-world driving scenarios, continually improving their performance and adaptability. Data gathered from onboard sensors and driver behavior will be valuable for recognizing patterns that help the vehicle avoid potential hazards and make split-second decisions.
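As a loose illustration of this pattern-learning idea, the sketch below trains a very small classifier on hypothetical sensor-derived features (closing speed, gap to the lead vehicle, driver reaction time) to decide whether a warning is warranted. The features, data, and model choice are assumptions for illustration only; production ADAS models are trained on vastly larger datasets with far richer inputs.

```python
# Illustrative sketch: a tiny classifier that learns to flag "hazard" situations
# from hypothetical sensor-derived features. Real ADAS models are trained on far
# larger datasets with far richer features; this only shows the idea.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row: [closing_speed_mps, gap_to_lead_vehicle_m, driver_reaction_time_s]
X = np.array([
    [12.0, 15.0, 1.8],  # fast closing, short gap, slow reaction -> warn
    [ 2.0, 60.0, 0.9],  # slow closing, long gap                 -> no action
    [ 9.0, 20.0, 1.5],  # warn
    [ 1.0, 80.0, 1.0],  # no action
    [11.0, 18.0, 2.0],  # warn
    [ 3.0, 55.0, 0.8],  # no action
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = issue a warning, 0 = no action

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# Score a new situation built from live sensor readings (values are made up).
new_situation = np.array([[10.0, 17.0, 1.7]])
print("Warn driver:", bool(model.predict(new_situation)[0]))
```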
Complex software systems are now intrinsic to modern vehicles, so it is essential that the software can be updated with the latest functionality and bug fixes. Thankfully, this can be done with over-the-air (OTA) updates, which are transmitted wirelessly and installed automatically without requiring a visit to the dealership. This is particularly important for the software that powers ADAS, as a single software error could be very costly for the vehicle and even the driver, depending on the severity of the accident it may cause. As drivers come to rely on ADAS features, those features need to be dependable and up-to-date.
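As a rough sketch of what an OTA update client might do, the snippet below checks a hypothetical manufacturer endpoint for a newer ADAS firmware version and verifies the download against a published checksum before staging it for installation. The URL, manifest format, and version scheme are assumptions for illustration; real OTA pipelines add cryptographic signing, staged rollouts, and rollback protection.

```python
# Rough OTA-update sketch: check a (hypothetical) manufacturer endpoint for a
# newer ADAS firmware image and verify its integrity before staging the install.
# Real OTA systems also use signed images, staged rollouts, and rollback protection.

import hashlib
import json
import urllib.request

UPDATE_MANIFEST_URL = "https://updates.example-oem.com/adas/manifest.json"  # hypothetical
INSTALLED_VERSION = (4, 2, 1)  # hypothetical currently installed firmware version

def fetch_manifest(url: str) -> dict:
    """Download the update manifest (assumed JSON with 'version', 'image_url', 'sha256')."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def is_newer(version_str: str, installed: tuple) -> bool:
    """Compare a dotted version string such as '4.3.0' against the installed version."""
    return tuple(int(part) for part in version_str.split(".")) > installed

def check_and_stage_update() -> None:
    manifest = fetch_manifest(UPDATE_MANIFEST_URL)
    if not is_newer(manifest["version"], INSTALLED_VERSION):
        print("ADAS firmware is up to date.")
        return
    with urllib.request.urlopen(manifest["image_url"]) as resp:
        image = resp.read()
    if hashlib.sha256(image).hexdigest() == manifest["sha256"]:
        print(f"Update {manifest['version']} verified; staging for installation.")
    else:
        print("Checksum mismatch; discarding download.")
```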
As valuable as they are, ADAS systems still have limitations, especially when dealing with hidden objects or poorly marked roads. This is where vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communications can offer additional layers of safety. According to the NHTSA, "V2V communication messages have a range of more than 300 meters and can detect dangers obscured by traffic, terrain, and weather. V2V communication extends and enhances currently available crash avoidance systems that use radars and cameras to detect collision threats."
Additionally, infrastructure data captured from V2I sensors can be valuable to drivers by providing real-time alerts that increase situational awareness of road conditions, pedestrians crossing at busy intersections, traffic, accidents, and roadwork.
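To illustrate how such messages might feed a driver alert, the sketch below filters incoming V2V/V2I hazard messages to those within a 300-meter radius, in line with the range NHTSA describes, and surfaces the nearest one. The message format and hazard types are assumptions for illustration, not a real V2X protocol.

```python
# Illustrative sketch: filter incoming V2V/V2I hazard messages to those within
# roughly 300 m (the V2V range NHTSA cites) and surface the nearest one.
# The message format and hazard types are assumptions, not a real V2X protocol.

from dataclasses import dataclass
from typing import List, Optional

V2V_RANGE_M = 300.0  # approximate V2V message range per NHTSA

@dataclass
class HazardMessage:
    source: str        # "V2V" or "V2I"
    hazard: str        # e.g., "stopped vehicle", "pedestrian crossing", "roadwork"
    distance_m: float  # distance from our vehicle to the reported hazard

def nearest_relevant_hazard(messages: List[HazardMessage]) -> Optional[HazardMessage]:
    """Return the closest hazard within range, or None if nothing is nearby."""
    in_range = [m for m in messages if m.distance_m <= V2V_RANGE_M]
    return min(in_range, key=lambda m: m.distance_m, default=None)

if __name__ == "__main__":
    inbox = [
        HazardMessage("V2V", "stopped vehicle", 180.0),
        HazardMessage("V2I", "pedestrian crossing", 95.0),
        HazardMessage("V2I", "roadwork", 450.0),  # out of range, ignored
    ]
    alert = nearest_relevant_hazard(inbox)
    if alert:
        print(f"ALERT ({alert.source}): {alert.hazard} in {alert.distance_m:.0f} m")
```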
The future of ADAS is bright, with the rapid evolution of advanced safety technologies and the exponential growth of data unlocking significant potential for improved safety and greater autonomy. Key pieces of the puzzle, including sensor advancements, the use of AI, OTA updates, and V2V/V2I communications, are starting to take shape. Automotive organizations must diligently track the progress of these technologies to market them effectively, understand their integration within the vehicle, and evaluate their overall safety impact.