One way to increase user acceptance of automated vehicles is to explain their driving decisions, but current methods still rely on human interpretation and are thus prone to errors. Therefore, the presented method formulates summaries that clarify the automated vehicle's driving decision by extracting all necessary information automatically from the planning algorithm. This paper shows the generation of three exemplary statement types and their validation with an online survey that investigated users' preferences. The results suggest that participants favor statements describing the information that affects the driving decision as well as the applicable traffic rules. Additionally, individual information needs should be considered when constructing modular explanations. Although this analysis considers neither sophisticated human-machine interfaces nor real traffic scenarios, it does show, for the first time, how satisfying statements can be generated using a planning algorithm without any human-induced bias. This is an important step towards self-contained transparency of automated driving functions and can therefore lay the basis for future human-machine interfaces.
How Can Automated Vehicles Explain Their Driving Decisions? Generating Clarifying Summaries Automatically
05.06.2022
503,850 bytes
Conference paper
Electronic resource
English
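
The paper's actual generation pipeline is not included in this record. Purely as an illustrative sketch, the following Python snippet shows how the three statement types mentioned in the abstract (the chosen maneuver, the information affecting the decision, and the applicable traffic rules) might be filled from planner output via simple templates. All names used here (PlannerDecision, influencing_objects, traffic_rules, the describe_* functions) are hypothetical assumptions for illustration, not the authors' interface.

# Hypothetical sketch of template-based statement generation from planner
# output. The data structures and field names are illustrative assumptions,
# not the interface described in the paper.
from dataclasses import dataclass, field
from typing import List


@dataclass
class PlannerDecision:
    maneuver: str                                                  # e.g. "stop at the crosswalk"
    influencing_objects: List[str] = field(default_factory=list)   # perceived information that affected the decision
    traffic_rules: List[str] = field(default_factory=list)         # traffic rules that applied


def describe_decision(d: PlannerDecision) -> str:
    """Statement type 1: which driving decision was made."""
    return f"The vehicle decided to {d.maneuver}."


def describe_influences(d: PlannerDecision) -> str:
    """Statement type 2: which information affected the decision."""
    if not d.influencing_objects:
        return "No surrounding objects affected this decision."
    return f"This decision was affected by: {', '.join(d.influencing_objects)}."


def describe_rules(d: PlannerDecision) -> str:
    """Statement type 3: which traffic rules applied."""
    if not d.traffic_rules:
        return "No specific traffic rule applied."
    return f"The applicable traffic rules were: {', '.join(d.traffic_rules)}."


if __name__ == "__main__":
    decision = PlannerDecision(
        maneuver="stop at the crosswalk",
        influencing_objects=["a pedestrian waiting to cross"],
        traffic_rules=["yield to pedestrians at marked crosswalks"],
    )
    # Modular explanation: the individual statements can be combined
    # according to a user's individual information needs.
    for sentence in (describe_decision(decision),
                     describe_influences(decision),
                     describe_rules(decision)):
        print(sentence)

Because each statement type is produced by its own function, explanations can be assembled modularly, which matches the abstract's point that individual information needs should be considered when constructing explanations.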