Teaching a self-driving AI to see, analyze and act. A case study.

5-minute read

Highly automated and autonomous driving places enormous pressure on the safety and reliability of its technology. Flawed reference data can compromise the entire training process of an autonomous driving algorithm, from perception and situation analysis to behavior planning. Bosch showed how neural networks can be trained efficiently using highly accurate annotated data from understand.ai.

Efficient development of particularly reliable control systems for autonomous driving up to SAE Level 5 requires suitable technologies. Conventional control-based approaches had to give way to the capabilities of trained neural networks. When executed on fast graphics processing units (GPUs), neural networks are particularly well suited to processing the vast amounts of data from high-resolution sensors. This is how Bosch's approach worked.

1. Where do we start with AI?

The first step was to investigate potential application areas for artificial intelligence (AI) along the entire processing chain from perception to situation analysis and behavior planning. Bosch identified AI application areas focusing on multimodal perception, i.e., the environment perception of a vehicle with the merged data of video, RADAR, and LiDAR sensors.

Autonomous driving - annotated objects with attributes

2. The difference between highways and a city rush hour: Setting up a diverse data set

To train self-driving algorithms properly, they must be fed with diverse training data. Raw sensor data therefore had to be recorded in highly diverse traffic environments: highways, country roads, urban areas, varied traffic objects and scenarios, different weather conditions, etc. This was achieved by defining the ideal types and characteristics of routes as well as route categories. Tracks that matched these definitions and covered all defined categories were then selected for the actual recording rides.
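The coverage idea behind this track selection can be sketched in a few lines. This is a minimal, hypothetical illustration: the category names and track entries are invented for the example and are not taken from the actual Bosch measurement campaign.

```python
# Hypothetical sketch: verify that a set of recording tracks together
# covers every defined route category. All names are illustrative.

REQUIRED_CATEGORIES = {"highway", "country_road", "urban", "rush_hour", "rain"}

tracks = [
    {"name": "track_01", "categories": {"highway", "rain"}},
    {"name": "track_02", "categories": {"country_road"}},
    {"name": "track_03", "categories": {"urban", "rush_hour"}},
]

def missing_categories(tracks, required):
    """Return the route categories not yet covered by any selected track."""
    covered = set().union(*(t["categories"] for t in tracks)) if tracks else set()
    return required - covered

print(sorted(missing_categories(tracks, REQUIRED_CATEGORIES)))  # -> []
```

If the result is non-empty, additional recording rides are planned until every defined category is represented in the data set.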

3. How does an algorithm learn to see? Training neural networks using supervised learning

Like the human brain, neural networks learn by means of positive and negative examples: paths leading to correct results are reinforced, while those leading to incorrect results are discarded. Both tasks and solutions are required to identify correct results. In systems for autonomous driving, these tasks and solutions are typically the raw sensor data and the detected objects, respectively. This approach is called supervised learning. The solutions are provided in a preceding step as markings (labels/annotations), resulting in reference data (raw data plus label/annotation).
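The learn-from-labeled-examples principle described above can be shown with a deliberately tiny model. This is a toy sketch, not Bosch's actual network: a single linear unit trained perceptron-style on invented feature vectors standing in for sensor data, with +1/-1 labels standing in for the annotated solutions.

```python
# Toy supervised learning: a single linear unit adjusted whenever its
# prediction disagrees with the label ("incorrect paths are corrected").
# Data and model are illustrative only.

def train(samples, labels, lr=0.1, epochs=50):
    """Fit weights so sign(w·x + b) matches the +1/-1 labels."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else -1
            if pred != y:  # wrong answer: nudge the weights toward the label
                w = [w[0] + lr*y*x[0], w[1] + lr*y*x[1]]
                b += lr * y
    return w, b

# Reference data: (feature vector, label) pairs, i.e. task plus solution.
xs = [(2.0, 1.0), (1.5, 2.0), (-1.0, -2.0), (-2.0, -1.0)]
ys = [1, 1, -1, -1]
w, b = train(xs, ys)
print(all((1 if w[0]*x[0] + w[1]*x[1] + b > 0 else -1) == y
          for x, y in zip(xs, ys)))  # -> True
```

Real perception networks have millions of parameters and train on GPUs, but the loop is the same in spirit: compare prediction to annotation, then correct.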

4. Only high-quality training data will do the trick

Bosch knew that successful (machine) learning requires high-quality learning material. Relevant objects that the AI must later recognize on its own (pixel patterns etc.) have to be precisely marked and classified in the data. This step requires huge volumes of annotated data, a task that is largely performed manually. To keep the process efficient, automation proved essential. understand.ai, a dSPACE group company that specializes in labeling automation, was chosen to tackle the challenge. Bosch and understand.ai agreed on data quality and precision goals to ensure successful training.

UAI Annotator - the understand.ai algorithmic annotation tooling

5. 3D annotation of LiDAR data and its quirks

Objects in the LiDAR sensor data were marked with high-precision cuboids placed in the 3D point clouds of the LiDAR recordings. The camera sensor data was used for plausibility checks to identify object classes and their respective attributes. An iterative approach was applied to review the defined quality goals. After the initial feedback loop, the understand.ai team incorporated the requirements and processed the data with little final rework required.
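A 3D cuboid label of this kind can be pictured as a small data structure: a center in the point cloud, box dimensions, a heading angle, an object class, and attributes. The sketch below is hypothetical; the field names are illustrative and do not reflect understand.ai's actual annotation schema. The point-in-cuboid check rotates a LiDAR point into the box's own frame and compares it against the half-extents.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Cuboid:
    """Illustrative 3D annotation cuboid for a LiDAR point cloud."""
    cx: float; cy: float; cz: float            # center in the point cloud
    length: float; width: float; height: float # box dimensions
    yaw: float                                 # heading around the z-axis (rad)
    obj_class: str                             # e.g. "car", "van", "bicycle"
    attributes: dict = field(default_factory=dict)

    def contains(self, px, py, pz):
        """True if the LiDAR point lies inside the cuboid."""
        dx, dy, dz = px - self.cx, py - self.cy, pz - self.cz
        # Rotate the offset into the box frame, then do an axis-aligned check.
        lx =  math.cos(self.yaw) * dx + math.sin(self.yaw) * dy
        ly = -math.sin(self.yaw) * dx + math.cos(self.yaw) * dy
        return (abs(lx) <= self.length / 2 and abs(ly) <= self.width / 2
                and abs(dz) <= self.height / 2)

box = Cuboid(10.0, 2.0, 0.8, 4.5, 1.9, 1.6, 0.0, "car", {"roof_box": False})
print(box.contains(11.0, 2.5, 1.0))  # point near the center -> True
```

Attributes such as a roof box or bicycle rack, which the article names as tricky cases, would be carried in the `attributes` field and cross-checked against the camera images.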

The annotation journey for autonomous driving has its challenges, such as differentiating between cars and vans or detecting vehicles with roof boxes or bicycle racks. Specialized AD/ADAS knowledge and powerful tools make the difference in quality and throughput — they save precious project time and resources. With its AI-powered automated labeling tools for object detection, regression, and attribute setting, understand.ai was able to identify edge cases and set a new standard in annotation automation.

Feeding the Machine: Supervised learning with annotated sensor data

Finally, the neural networks were trained with the high-quality labeled data. Training can take several days or weeks depending on the depth of the network and the amount of data. A prerequisite for successful training is a coordinated IT infrastructure equipped with powerful GPU-based computing clusters.

The Moment of Truth

What were the lessons Bosch learned during the process?

  1. Highly accurate annotation is indispensable for supervised learning — the quality of the reference data determines the AI’s ability to clearly identify objects.

  2. Annotation is a learning process in which labeling specifications, processes and tools are continuously optimized in order to achieve the highest possible quality and automation. Powerful tools and efficient feedback cycles lead to the desired results.

  3. When it comes to the expertise of annotation specialists, experience in autonomous driving is critical for a successful training project.

“Highly accurate annotation is an indispensable prerequisite for supervised machine learning. We rely on the labeling service and tools from understand.ai.”

Dr. Florian Faion, Research Scientist LiDAR Perception at Bosch

What’s coming next? Annotating Surround View Data

A new measurement campaign is planned for 360° environment detection, in which the vehicle environment is recorded in high resolution with camera, LiDAR, and RADAR sensors. It creates new challenges in terms of data volume, synchronous processing and annotation of the merged data. The subsequent steps are currently being discussed by the experts from understand.ai and Bosch.

Read more about the project in an article in the dSPACE Magazine 01/2020 edition.

Benjamin Dilger, Head of Customer Success at understand.ai