Sensor fusion development for self-driving vehicles
Volvo Cars will play a leading role in the world’s first large-scale autonomous driving pilot project, in which 100 self-driving Volvo cars will use public roads in everyday driving conditions around the Swedish city of Gothenburg. A self-driving car makes use of multiple sensors, such as radars, cameras and lasers, to gather information about the surrounding environment. This information is combined in a sensor fusion system to estimate both the global position of the ego vehicle and the distance to, and properties of, surrounding objects.
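To illustrate the basic idea of combining sensor data, here is a minimal sketch (not Volvo's actual system) of fusing two independent, noisy range measurements of the same object by inverse-variance weighting, the simplest form of the estimation underlying such a fusion system; the sensor values and variances are invented for illustration:

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of independent Gaussian
    measurements (value, variance) of the same quantity, e.g. the
    range to an object seen by both a radar and a camera."""
    weight_sum = sum(1.0 / var for _, var in measurements)
    value = sum(m / var for m, var in measurements) / weight_sum
    return value, 1.0 / weight_sum

# Hypothetical readings: radar says 25.0 m (variance 0.25),
# camera says 25.6 m (variance 1.0).
rng, var = fuse([(25.0, 0.25), (25.6, 1.0)])
# The fused estimate (25.12 m, variance 0.2) lies closer to the
# more certain radar measurement and is more certain than either.
```

The same principle, generalized to multiple dimensions and tracked over time (e.g. with a Kalman filter), is what allows many heterogeneous sensors to produce one consistent estimate.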
This presentation will focus on:
Short description of the Drive Me project
- Goal and motivation
- Need for environment perception and localization
- Sensor configuration
Sensor fusion functional architecture
- Object tracking – a combination of data from 15-20 sensors to create a 360-degree environment perception
- Localization and mapping – creation of a high-density map and localization algorithms for accurate and robust positioning
- Grid fusion – low level fusion for detecting small objects on the road
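As a concrete sketch of the grid-fusion idea, the snippet below implements a textbook log-odds occupancy grid (a standard low-level fusion technique, not necessarily the one used in the project): each cell stores the log-odds that it is occupied, and independent sensor detections are fused by adding their evidence, so that repeated weak detections of a small object accumulate into a confident one. All grid sizes and probabilities here are made up for illustration:

```python
import math

def prob_to_log_odds(p):
    return math.log(p / (1.0 - p))

def log_odds_to_prob(l):
    return 1.0 - 1.0 / (1.0 + math.exp(l))

class OccupancyGrid:
    def __init__(self, width, height, prior=0.5):
        # Every cell starts at the prior occupancy probability.
        self.l0 = prob_to_log_odds(prior)
        self.cells = [[self.l0] * width for _ in range(height)]

    def update(self, x, y, p_occupied):
        # Fuse one sensor reading into cell (x, y): add its evidence
        # (log-odds relative to the prior) to the stored log-odds.
        self.cells[y][x] += prob_to_log_odds(p_occupied) - self.l0

    def probability(self, x, y):
        return log_odds_to_prob(self.cells[y][x])

grid = OccupancyGrid(10, 10)
# Two independent sensors each report 0.8 occupancy for the same cell:
grid.update(3, 4, 0.8)
grid.update(3, 4, 0.8)
# The fused belief (about 0.94) exceeds either single reading.
```

Operating on cells rather than tracked objects is what makes this kind of fusion suitable for small or oddly shaped obstacles that object-level trackers may miss.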
Sensor fusion software development
- SW-development environment (programming language, hardware platform, hardware acceleration...)
- SW-development process
Joakim Sörstedt, Chalmers University of Technology
Joakim Sörstedt received his M.Sc. and Ph.D. degrees in electrical engineering from Chalmers University of Technology in 2001 and 2007, respectively. Since 2007 he has been with Volvo Car Group in the active safety electronics department, and since 2011 he has been a technical expert in sensor data fusion. Currently, he leads the sensor fusion team in Volvo’s self-driving car project.