Abstract
In this paper we follow a participatory design approach to explore what novice users find to be intuitive ways to control an Unmanned Aerial Vehicle (UAV). We gather users' suggestions for suitable voice and gesture commands through an online survey and a video interview, and we also record the voice commands and gestures used by participants in a Wizard of Oz experiment in which participants thought they were manoeuvring a UAV. We identify commonalities in the data collected from the three elicitation methods and assemble a collection of voice and gesture command sets for navigating a UAV. Furthermore, to obtain a deeper understanding of why our participants chose the gestures and voice commands they did, we analyse and discuss the collected data in terms of mental models and identify three prevailing classes of mental models that likely guided many of our participants in their choice of voice and gesture commands.
Exploring User-Defined Gestures and Voice Commands to Control an Unmanned Aerial Vehicle
2016-11-15
16 pages
Article/Chapter (Book)
Electronic Resource
English
Mental Model, Unmanned Aerial Vehicle, Horizontal Pole, Novice User, Unmanned Ground Vehicle
Computer Science, Media Design, User Interfaces and Human Computer Interaction, Artificial Intelligence (incl. Robotics), Computer Applications, Computers and Education, Computer Imaging, Vision, Pattern Recognition and Graphics