Understanding pedestrian behavior is essential for safe interaction between autonomous vehicles and pedestrians. Although autonomous driving has seen numerous achievements, this remains an open issue because pedestrian behavior is highly uncertain and dynamic. Early studies treated pedestrians as moving rigid objects and predicted their future positions through trajectory prediction. More recent studies have attempted to recognize pedestrian intention from body pose and actions. However, pedestrian intention is typically recognized only as a binary result, cross or not cross, which is not sufficient to describe the dynamic communication process between pedestrians and vehicles. We argue that pedestrian behaviors should be further explored and interpreted rather than reduced to a crossing decision, especially when pedestrians use hand gestures to communicate with vehicles. Hence, we establish a taxonomy of pedestrian interactive behaviors and publish a new large-scale dataset comprising nine types of behaviors and 3600 video clips. We adopt pose estimation to obtain 2D key points of the pedestrian skeleton. A covariance descriptor is applied to capture high-level spatio-temporal features based on body pose; it is independent of sequence length and robust to missing key points. Pedestrian behaviors are then interpreted using a Random Forest. Experiments on both the proposed dataset and the publicly available JAAD dataset show that the proposed method outperforms the state of the art by approximately 5%.
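The abstract describes a pipeline of 2D pose key points, a sequence-length-independent covariance descriptor, and Random Forest classification. The sketch below illustrates that idea under assumptions not stated in the record: the keypoint layout (17 COCO-style joints), mean imputation of missing key points, and the use of scikit-learn's RandomForestClassifier are illustrative choices, not the authors' exact implementation.

```python
# Hedged sketch: covariance descriptor over a 2D pose sequence + Random Forest.
# Keypoint count, normalization, and missing-point handling are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def covariance_descriptor(pose_seq, conf=None):
    """pose_seq: (T, K, 2) array of 2D key points over T frames.
    conf: optional (T, K) confidences; key points with conf <= 0 are treated as missing.
    Returns the upper triangle of the coordinate covariance matrix, a fixed-length
    vector whose size depends only on K, not on the sequence length T."""
    T, K, _ = pose_seq.shape
    feats = pose_seq.reshape(T, 2 * K).astype(float)      # [x0, y0, x1, y1, ...]
    if conf is not None:
        mask = np.repeat(conf > 0, 2, axis=1)              # flag missing coordinates
        col_mean = np.nanmean(np.where(mask, feats, np.nan), axis=0)
        feats = np.where(mask, feats, col_mean)             # impute with per-column mean
    feats -= feats.mean(axis=0, keepdims=True)              # center each coordinate
    cov = (feats.T @ feats) / max(T - 1, 1)                 # (2K, 2K) covariance matrix
    iu = np.triu_indices(2 * K)
    return cov[iu]                                          # symmetric -> upper triangle

# Toy usage with random pose sequences of varying length (illustration only).
rng = np.random.default_rng(0)
X = np.stack([covariance_descriptor(rng.normal(size=(rng.integers(20, 60), 17, 2)))
              for _ in range(40)])
y = rng.integers(0, 9, size=40)                             # nine behavior classes
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X[:3]))
```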
Pedestrian Behavior Interpretation from Pose Estimation
19.09.2021
1102454 bytes
Conference paper
Electronic resource
English
Pedestrian Motion State Estimation From 2D Pose
IEEE | 2020
Pedestrian Motion State Estimation From 2D Pose
British Library Conference Proceedings | 2020