Collaboration between humans and robots is one of the most disruptive and challenging research areas. Considering recent advances in design and artificial intelligence, humans and robots could soon team up to perform a number of different tasks together. Robots could also become new playmates. In fact, an emerging trend is the so-called phygital gaming, which builds on the idea of merging the physical world with a virtual one so that physical and virtual entities, such as players, robots, animated characters and other game objects, can interact seamlessly as if they were all part of the same reality. This paper focuses on mixed-reality gaming environments created using floor projection, and tackles the issue of enabling accurate and robust tracking of off-the-shelf robots endowed with limited sensing capabilities. The proposed solution fuses visual tracking data gathered by a fixed camera in a smart environment with odometry data obtained from the robot's on-board sensors. The solution has been tested within a phygital gaming platform in a real usage scenario, by experimenting with a robotic game that exhibits many challenging situations which would be hard to manage with conventional tracking techniques.
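The abstract does not detail the fusion algorithm itself. As an illustration only, the sketch below shows one common way such camera/odometry fusion can be realized for a planar robot: an extended Kalman filter that propagates the pose with incremental wheel odometry and corrects it whenever the overhead camera reports a pose fix. The class and method names (PoseFusionEKF, predict, correct) and all noise values are hypothetical and are not taken from the paper.

    # Illustrative sketch (not the authors' implementation): fusing overhead-camera
    # pose fixes with wheel odometry for a planar robot via an extended Kalman filter.
    import numpy as np

    def wrap(a):
        """Wrap an angle to the interval [-pi, pi)."""
        return (a + np.pi) % (2 * np.pi) - np.pi

    class PoseFusionEKF:
        def __init__(self, x0, P0, Q, R):
            self.x = np.asarray(x0, float)   # state [x, y, theta]
            self.P = np.asarray(P0, float)   # state covariance
            self.Q = Q                       # odometry (process) noise
            self.R = R                       # camera (measurement) noise

        def predict(self, d, dtheta):
            """Propagate the pose with an incremental odometry step (d, dtheta)."""
            x, y, th = self.x
            self.x = np.array([x + d * np.cos(th),
                               y + d * np.sin(th),
                               wrap(th + dtheta)])
            # Jacobian of the motion model, evaluated at the prior heading.
            F = np.array([[1.0, 0.0, -d * np.sin(th)],
                          [0.0, 1.0,  d * np.cos(th)],
                          [0.0, 0.0,  1.0]])
            self.P = F @ self.P @ F.T + self.Q

        def correct(self, z):
            """Correct the pose with a camera fix z = [x, y, theta]."""
            y = np.asarray(z, float) - self.x
            y[2] = wrap(y[2])
            S = self.P + self.R              # measurement Jacobian H is identity
            K = self.P @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.x[2] = wrap(self.x[2])
            self.P = (np.eye(3) - K) @ self.P

    # Example usage: predict at the odometry rate, correct when a camera fix arrives.
    ekf = PoseFusionEKF(x0=[0.0, 0.0, 0.0], P0=np.eye(3) * 0.1,
                        Q=np.diag([1e-3, 1e-3, 1e-4]),
                        R=np.diag([5e-3, 5e-3, 1e-3]))
    ekf.predict(d=0.05, dtheta=0.01)         # wheel odometry increment
    ekf.correct([0.048, 0.003, 0.012])       # camera pose fix (metres, radians)

This kind of scheme lets odometry bridge short camera dropouts (occlusions, projection glare) while the camera fixes bound the odometry drift, which matches the robustness goal described in the abstract.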



    Title: Robust robot tracking for next-generation collaborative robotics-based gaming environments

    Publication date: 2020-01-01

    Type of media: Article (Journal)

    Type of material: Electronic Resource

    Language: English

    Classification: DDC 629




