Cooperative perception over an inter-vehicle network promises a multitude of improvements for advanced driver assistance systems. First, for the simultaneous estimation of the vehicle's ego pose and the road network infrastructure, the authors propose a local, strongly model-based fusion architecture for digital map and video information. Second, to extend the vehicle's field of view, they include remote information on the state of each participating vehicle and its object detections. For the object-level fusion of these detections, a centralized Kalman-filter-based tracking process is employed. The algorithms are evaluated in simulated vehicle network scenarios based on real sensor data. With the presented fusion approach, a comprehensive description of complex traffic scenarios is obtained, including knowledge of the traffic infrastructure from the digital map and video. The centralized fusion enables tracking of information from multiple vehicles in a joint frame.
Information fusion for cooperative vehicles
Informationsfusion für zusammenarbeitende Fahrzeuge
2006
5 pages, 1 figure, 4 references
Conference paper
English
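The sketch below is a minimal illustration, not the authors' implementation, of the centralized object-level fusion outlined in the abstract: each vehicle reports its own pose and its detections in its local frame, and a central constant-velocity Kalman filter tracks each object in a joint frame. The class and function names, the constant-velocity motion model, and all noise parameters are assumptions chosen for illustration.

```python
import numpy as np


def to_joint_frame(vehicle_pose, detection_xy):
    """Transform a detection from a vehicle's local frame into the joint frame.

    vehicle_pose: (x, y, heading) of the reporting vehicle in the joint frame.
    detection_xy: (x, y) of the detected object in that vehicle's frame.
    """
    vx, vy, yaw = vehicle_pose
    c, s = np.cos(yaw), np.sin(yaw)
    dx, dy = detection_xy
    return np.array([vx + c * dx - s * dy, vy + s * dx + c * dy])


class CentralTrack:
    """Constant-velocity Kalman filter for one object in the joint frame."""

    def __init__(self, xy, pos_var=1.0):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])       # state [x, y, vx, vy]
        self.P = np.diag([pos_var, pos_var, 10.0, 10.0])  # initial uncertainty
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        self.R = np.eye(2) * pos_var                       # measurement noise (assumed)
        self.Q = np.eye(4) * 0.1                           # process noise (assumed)

    def predict(self, dt):
        # Constant-velocity state transition over time step dt.
        F = np.array([[1, 0, dt, 0],
                      [0, 1, 0, dt],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], float)
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        # Standard Kalman update with a joint-frame position measurement z.
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P


# Usage: two vehicles observe roughly the same object from different poses;
# the central tracker fuses both measurements in the joint frame.
track = CentralTrack(to_joint_frame((0.0, 0.0, 0.0), (10.0, 2.0)))
track.predict(dt=0.1)
track.update(to_joint_frame((5.0, -1.0, np.pi / 2), (3.0, -5.2)))
print(track.x)  # fused position/velocity estimate in the joint frame
```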