To maximize safety and driving comfort, autonomous driving systems can benefit from implementing foresighted action choices that take different potential scenario developments into account. While artificial scene prediction methods are making rapid progress, an attentive human driver may still be able to identify relevant contextual features that the system does not adequately consider, or for which the driver lacks trust in the system's capability to treat them appropriately. We implement an approach that lets a human driver quickly and intuitively supplement the scene predictions of an autonomous driving system by gaze. We demonstrate the feasibility of this approach in an existing autonomous driving system running a variety of scenarios in a simulator. Furthermore, a Graphical User Interface (GUI) was designed and integrated to enhance the trust and explainability of the system. Such cooperatively augmented scenario predictions have the potential to improve a system's foresighted driving abilities and to make autonomous driving more trustworthy, comfortable, and personalized.
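The gaze-based augmentation described in the abstract can be pictured as a small re-weighting step between the driver's gaze and the system's scenario predictions. The sketch below is a hypothetical illustration, not the paper's implementation: the object representation, the nearest-distance gaze-to-object association, and the multiplicative boost factor are all assumptions made for illustration only.

# Hypothetical sketch (not the authors' implementation): associate the driver's
# projected gaze point with a tracked object and boost the weight of scenario
# predictions that involve that object. All names, thresholds, and data
# structures are assumptions.
from dataclasses import dataclass
import math

@dataclass
class TrackedObject:
    obj_id: int
    x: float  # position in the vehicle frame [m]
    y: float

@dataclass
class ScenarioPrediction:
    involved_ids: set   # objects the predicted scenario depends on
    weight: float       # prior probability assigned by the prediction module

def gaze_to_object(gaze_xy, objects, max_dist=3.0):
    """Return the tracked object closest to the projected gaze point, if any."""
    best, best_d = None, max_dist
    for obj in objects:
        d = math.hypot(obj.x - gaze_xy[0], obj.y - gaze_xy[1])
        if d < best_d:
            best, best_d = obj, d
    return best

def augment_predictions(predictions, attended_obj, boost=2.0):
    """Re-weight scenario predictions that involve the gaze-attended object."""
    if attended_obj is None:
        return predictions
    for p in predictions:
        if attended_obj.obj_id in p.involved_ids:
            p.weight *= boost
    total = sum(p.weight for p in predictions)
    for p in predictions:
        p.weight /= total  # renormalize to a probability distribution
    return predictions

if __name__ == "__main__":
    objects = [TrackedObject(1, 20.0, -1.5), TrackedObject(2, 35.0, 3.0)]
    preds = [ScenarioPrediction({1}, 0.3), ScenarioPrediction({2}, 0.7)]
    attended = gaze_to_object((21.0, -1.0), objects)
    for p in augment_predictions(preds, attended):
        print(p.involved_ids, round(p.weight, 3))

In this toy example, a fixation near the first object roughly doubles the relative weight of the scenario involving it; a real system would instead feed such cues into its prediction and planning modules.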
Human-Vehicle Cooperation on Prediction-Level: Enhancing Automated Driving with Human Foresight
2021-07-11
3113955 bytes
Conference paper
Electronic Resource
English
Speech improves human-automation cooperation in automated driving
DataCite | 2016
Method for processing failure of vehicle-mounted foresight camera in automatic driving
European Patent Office | 2024
V2X automatic driving decision-making system based on human-vehicle-road cooperation
European Patent Office | 2020
Foresight vehicle: drive-by-tyre
Automotive engineering | 2002