Evaluating Perceptual Uncertainty for Safe Automated Driving Under Consideration of Contextual Awareness
Human error is a contributing factor in the majority of vehicle crashes. Safe automated driving systems are therefore needed to save thousands of lives on our roads and to eliminate human driving errors. Beyond safety, these systems offer the potential for increased productivity, greater accessibility, better road efficiency, and a positive impact on the environment. Achieving this requires a dependable embedded automotive perception system in autonomous vehicles. An automotive sensing suite consisting of radar, lidar, camera, ultrasonic, and infrared/thermal camera sensors, together with AI-based algorithms, collects information and extracts relevant knowledge from the real-world environment. This perception information is subsequently utilized by the Intended Functionalities for safety-critical control decisions during different driving maneuvers. Because every sensor and every data processing step in the perception chain may produce measurement errors under various environmental or situational conditions, collectively referred to as perceptual uncertainty, additional safety measures are required to ensure deterministic perception capabilities within safety-critical systems. To address this issue, this work employs a Dempster-Shafer (DS) framework based sensor data fusion operator in the perception chain. The operator takes advantage of the available overlapping sensor-level evidences (expressed as probabilistic knowledge) regarding the existence of a target at the high-level fusion stage and, with the help of situational awareness information such as weather conditions, enables uncertainty evaluation of sensor- and fusion-level performance, providing meaningful, reliable, certain, and accurate target object data to the Intended Functionalities of the ego vehicle.
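To make the DS fusion step concrete, the following is a minimal sketch of Dempster's rule of combination applied to target-existence evidence from two sensors. The sensor names, mass values, and the two-element frame of discernment (target Exists vs. does Not exist) are illustrative assumptions, not values from this work; the idea that adverse weather shifts a discounted sensor's mass toward ignorance (the full frame) follows the situational-awareness notion described above.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2: dicts mapping frozenset focal elements to basic belief masses.
    Returns the normalized combined mass function.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b          # intersection of focal elements
        w = wa * wb
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w
        else:
            conflict += w      # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources fully contradict")
    # Normalize by (1 - K), K being the total conflict
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Frame of discernment: target Exists (E) or does Not exist (N)
E, N, THETA = frozenset("E"), frozenset("N"), frozenset("EN")

# Hypothetical sensor evidences; the camera is discounted (e.g. fog),
# so more of its mass sits on THETA, i.e. on ignorance.
radar = {E: 0.7, N: 0.1, THETA: 0.2}
camera = {E: 0.5, N: 0.1, THETA: 0.4}

fused = combine(radar, camera)
```

Combining the two sources raises the belief in target existence above either single sensor's mass, which is the effect the high-level fusion operator exploits when sensor fields of view overlap.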
In addition, a 2D environment perception visualization tool is designed and developed to assess and simulate the performance of the fusion operator under different sensor configurations and traffic scenarios. The simulation results show improved situation-aware high-level sensor data fusion performance for both fully and partially overlapping sensor configurations.
Kaiserslautern, TU, Master Thesis, 2020