In the field of autonomous driving, collaborative perception has emerged as a promising solution for augmenting the capabilities of individual sensors by enabling vehicles to share sensor information with one another, thereby enhancing their situational awareness. This paper addresses the limitations of classical single-vehicle perception in autonomous driving by proposing a novel intermediate collaborative perception methodology that employs a graph attention network (GAT) to fuse multiple feature maps and selectively emphasize important regions within them. We construct a graph whose nodes embed the feature maps of the ego vehicle and its neighboring connected vehicles, and whose edge weights are defined by attention coefficients that capture the relationships between those nodes. The proposed approach leverages both channel and spatial attention-based aggregation, enabling the model to determine inter-feature-map relationships at specific channels and spatial regions while adaptively highlighting the informative ones. This adaptive highlighting mechanism directs the aggregation toward the most informative areas within the ego and received feature maps, thereby enhancing the representational power of the ego vehicle's feature map and improving object detection precision. We evaluate the proposed approach quantitatively and qualitatively against existing state-of-the-art collaborative perception methods, validating it on V2XSim, a large-scale multi-agent perception dataset. The results demonstrate that our methodology achieves superior object detection average precision.
Graph Attention Based Feature Fusion For Collaborative Perception. 2024 IEEE Intelligent Vehicles Symposium (IV), pp. 2317-2324, June 2, 2024.
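
As a rough illustration of the kind of fusion the abstract describes, the sketch below combines channel and spatial attention over agent feature maps with graph-attention coefficients computed per spatial location between the ego node and each neighbor node. It is a minimal PyTorch sketch under stated assumptions: the class names (ChannelSpatialAttention, GATFeatureFusion), layer choices, and hyperparameters (e.g., reduction=8, a 1x1 scoring convolution) are illustrative and do not reproduce the paper's actual architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelSpatialAttention(nn.Module):
    """Illustrative channel + spatial attention used to highlight
    informative regions of a BEV feature map (assumed design)."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):  # x: (N, C, H, W)
        # Channel attention from globally pooled statistics.
        c_att = torch.sigmoid(self.channel_mlp(x.mean(dim=(2, 3))))        # (N, C)
        x = x * c_att[:, :, None, None]
        # Spatial attention from per-pixel max / mean statistics.
        s_att = torch.sigmoid(self.spatial_conv(
            torch.cat([x.max(dim=1, keepdim=True).values,
                       x.mean(dim=1, keepdim=True)], dim=1)))              # (N, 1, H, W)
        return x * s_att


class GATFeatureFusion(nn.Module):
    """Fuse the ego feature map with feature maps received from connected
    vehicles using attention coefficients over the agent-node graph."""

    def __init__(self, channels: int):
        super().__init__()
        self.refine = ChannelSpatialAttention(channels)
        # Scores an (ego, node) pair at every spatial location.
        self.attn_score = nn.Conv2d(2 * channels, 1, kernel_size=1)

    def forward(self, ego_feat, neighbor_feats):
        # ego_feat: (C, H, W); neighbor_feats: (N, C, H, W), assumed already
        # warped into the ego coordinate frame.
        nodes = torch.cat([ego_feat.unsqueeze(0), neighbor_feats], dim=0)   # (N+1, C, H, W)
        nodes = self.refine(nodes)                                          # emphasize informative regions
        ego = nodes[0:1].expand_as(nodes)                                   # broadcast ego node to every edge
        scores = self.attn_score(torch.cat([ego, nodes], dim=1))            # (N+1, 1, H, W)
        alpha = torch.softmax(F.leaky_relu(scores), dim=0)                  # attention coefficients over nodes
        return (alpha * nodes).sum(dim=0)                                   # fused ego feature map (C, H, W)


if __name__ == "__main__":
    fuse = GATFeatureFusion(channels=64)
    ego = torch.randn(64, 32, 32)
    neighbors = torch.randn(3, 64, 32, 32)       # three connected vehicles
    print(fuse(ego, neighbors).shape)            # torch.Size([64, 32, 32])

One design note on this sketch: the softmax is taken over the node dimension, so each spatial location obtains its own set of attention coefficients and can weight the ego and neighbor contributions differently, which is one plausible way to realize the adaptive, region-wise aggregation the abstract describes.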