Sensor fusion technology for reliable AVs

Think of it from a human point of view.

We perceive our environment through various senses. Although vision is our primary sense in everyday life, the information available from this single sense is limited. We therefore complement it with information from other senses, such as hearing and touch.

The same concept applies to AVs. The sensors currently used in autonomous vehicles and advanced driver assistance systems range from cameras to lidar, radar, and sonar. Each of these sensors has its own advantages, but each also has its own limitations.

Safety is one of the most important values in the automotive field. This is especially true for AVs, as every mistake can deepen public mistrust of the technology. The ultimate goal of most companies working on autonomous driving technology is therefore to create a safe driving environment.

Each sensor excels at detecting and analyzing a specific kind of information. Its strengths and weaknesses are therefore well defined, and some of its limitations are difficult to resolve with any single sensor on its own.

Cameras are very effective at identifying vehicles and pedestrians and at reading road signs and traffic lights. However, their performance can be degraded by fog, dust, darkness, snow, and rain. Radar and lidar accurately measure the position and speed of an object, but cannot classify objects in detail. They also cannot distinguish different road signs because they do not perceive color.

Sensor fusion compensates for distorted or missing information by combining the different types of data available. Fusion algorithms fill in blind spots that a single sensor may miss, or combine and weight overlapping information from multiple sensors at once. With this consolidated picture, the technology produces a more accurate and reliable model of the environment and enables smarter driving decisions.
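To make the idea of weighting overlapping information concrete, here is a minimal sketch of confidence-weighted fusion for a single tracked object. The sensor names, fields, and confidence values are illustrative assumptions for this example, not a description of any particular production system, which would typically use more sophisticated methods such as Kalman filtering.

```python
# Minimal sketch: fuse overlapping position estimates from several sensors,
# weighting each by a confidence score so degraded sensors contribute less.
from dataclasses import dataclass

@dataclass
class Measurement:
    x: float           # estimated longitudinal position of the object (m)
    y: float           # estimated lateral position of the object (m)
    confidence: float  # 0..1; e.g. a camera reports lower confidence in rain

def fuse(measurements: list[Measurement]) -> tuple[float, float]:
    """Combine overlapping estimates into one confidence-weighted position."""
    total = sum(m.confidence for m in measurements)
    if total == 0:
        raise ValueError("no usable measurements")
    x = sum(m.x * m.confidence for m in measurements) / total
    y = sum(m.y * m.confidence for m in measurements) / total
    return x, y

# Example: the camera is degraded by rain, radar and lidar remain reliable,
# so the fused estimate leans toward the radar and lidar readings.
camera = Measurement(x=24.8, y=1.9, confidence=0.3)
radar  = Measurement(x=25.4, y=2.1, confidence=0.9)
lidar  = Measurement(x=25.2, y=2.0, confidence=0.8)
print(fuse([camera, radar, lidar]))
```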
