This thesis presents the study, analysis, and implementation of a framework for trajectory prediction using an event-based camera for robotics applications. Event-based perception represents a novel computational paradigm built on unconventional sensing technology that holds promise for data acquisition, transmission, and processing at very low latency and power consumption, both crucial for the future of robotics. An event-based camera, in particular, is a sensor that responds to light changes in the scene, producing an asynchronous and sparse output over a wide illumination dynamic range. It captures only the relevant spatio-temporal information, mostly driven by motion, at a high rate, avoiding the inherent redundancy of static areas in the field of view. For these reasons, such a device is a potentially key tool for robots that must operate in highly dynamic and/or rapidly changing scenarios, or where the optimisation of resources is fundamental, as in robots with on-board systems. Prediction is a skill humans rely on daily, often unconsciously, for instance when driving, playing sports, or collaborating with other people. In the same way, predicting the trajectory or the end-point of a moving target allows a robot to plan appropriate actions and their timing in advance, and to interact with the target in many different ways. Moreover, prediction helps compensate for the robot's internal delays in the perception-action chain, due for instance to limited sensors and/or actuators. The question addressed in this work is whether event-based cameras are advantageous for trajectory prediction in robotics: in particular, whether the classical deep learning architectures used for this task can accommodate event-based data and work asynchronously, and which benefits they bring with respect to standard cameras. The a priori hypothesis is that, since the sampling of the scene is driven by motion, such a device allows more meaningful information to be acquired, improving the prediction accuracy ...
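
To make the asynchronous, sparse output described in the abstract concrete, the sketch below shows how an event stream is commonly represented in software: each event carries pixel coordinates, a timestamp, and a polarity sign, and events can be grouped into short time windows before being fed to a predictor. The field names, the fixed-window grouping, the window length, and the synthetic event stream are illustrative assumptions, not the representation used in the thesis.

    from typing import Iterable, List, NamedTuple


    class Event(NamedTuple):
        """A single event: pixel location, timestamp in microseconds, polarity (+1/-1)."""
        x: int
        y: int
        t: int
        p: int


    def window_events(events: Iterable[Event], window_us: int) -> List[List[Event]]:
        """Group an asynchronous event stream into consecutive fixed-length time windows.

        This is one common way to hand sparse events to a trajectory predictor;
        the window length is a free parameter chosen here for illustration.
        """
        windows: List[List[Event]] = []
        current: List[Event] = []
        window_end = None
        for ev in events:
            if window_end is None:
                window_end = ev.t + window_us
            while ev.t >= window_end:
                windows.append(current)
                current = []
                window_end += window_us
            current.append(ev)
        if current:
            windows.append(current)
        return windows


    if __name__ == "__main__":
        # Synthetic events from a target moving left to right (illustrative only).
        stream = [Event(x=10 + i, y=64, t=i * 500, p=1) for i in range(20)]
        for i, window in enumerate(window_events(stream, window_us=2_000)):
            print(f"window {i}: {len(window)} events")

In contrast to a frame-based camera, which would deliver full images at a fixed rate regardless of scene activity, the number of events per window here scales with how much the target moves, which is the property the abstract identifies as the potential advantage for prediction.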





    Title: Trajectory Prediction with Event-Based Cameras for Robotics Applications

    Publication date: 2021-05-26

    Remarks: doi:10.15167/monforte-marco_phd2021-05-26

    Type of media: Theses

    Type of material: Electronic Resource

    Language: English

    Classification: DDC 629



    ToF cameras for active vision in robotics

    Alenyà Ribas, Guillem / Foix Salmerón, Sergi / Torras, Carme | BASE | 2014

    Free access



    Holographical image based vibrometry with monochromatic and event based cameras

    Hartlieb, Simon / Boguslawski, Maciej / Haist, Tobias et al. | SPIE | 2022


    Independent motion detection with event-driven cameras

    Vasco, Valentina / Glover, A. / Müggler, Elias et al. | BASE | 2017

    Free access