Animals far outperform current technology in reacting to visual stimuli with minimal processing, demonstrating astonishingly fast reaction times to changes. Current real-time vision-based robotic control approaches, in contrast, typically require substantial computational resources to extract relevant information from the sequences of images provided by a video camera. Most of the information contained in consecutive images is redundant, which often makes vision processing a limiting factor in high-speed robot control. As an example, robotic pole balancing with large objects is a well-known exercise in current robotics research, but balancing arbitrarily small poles (such as a pencil, which is too small for a human to balance) has not yet been achieved, due to limitations in vision processing. At the Institute of Neuroinformatics we developed an analog silicon retina (http://siliconretina.ini.uzh.ch) which, in contrast to conventional video cameras, reports events ("spikes") from individual pixels only when the illumination changes within a pixel's field of view. Transmitting only these "on" and "off" spike events, instead of full image frames, drastically reduces the amount of data processing required to react to environmental changes. This information encoding is directly inspired by the spike-based information transfer from the human eye to visual cortex. In our demonstration, we address the challenging problem of balancing an arbitrary standard pencil based solely on visual information. A stereo pair of silicon retinas reports vision events caused by the moving pencil, which stands on its tip on an actuated table. Our processing algorithm then extracts the pencil's position and angle without ever using a "full scene" visual representation, simply by processing only the spikes relevant to the pencil's motion. Our system thus uses neurally inspired hardware and a neurally inspired form of communication to achieve a difficult goal.
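To illustrate the event-driven tracking described above, here is a minimal sketch of how a pencil's position and angle could be estimated from a spike stream without ever assembling a frame. This is a hypothetical reconstruction, not the authors' actual algorithm: it assumes events arrive as (x, y) pixel coordinates from one retina, maintains exponentially decaying moments of recent events, and reads out the pencil as the principal axis of the event cloud. The class name, decay factor, and thresholds are illustrative assumptions.

    # Hypothetical sketch of per-event pencil tracking from (x, y) spike events.
    # Not the authors' algorithm; all names and parameters are illustrative.
    import math

    class EventLineTracker:
        """Keeps exponentially decaying moments of recent event coordinates
        and reads out the best-fit line (pencil axis) on demand."""

        def __init__(self, decay=0.995):
            self.decay = decay          # per-event forgetting factor
            self.n = self.sx = self.sy = 0.0
            self.sxx = self.syy = self.sxy = 0.0

        def add_event(self, x, y):
            # Decay old statistics so the estimate follows the moving pencil.
            d = self.decay
            self.n   = d * self.n   + 1.0
            self.sx  = d * self.sx  + x
            self.sy  = d * self.sy  + y
            self.sxx = d * self.sxx + x * x
            self.syy = d * self.syy + y * y
            self.sxy = d * self.sxy + x * y

        def estimate(self):
            """Return (centroid_x, centroid_y, tilt) of the dominant line,
            where tilt is the deviation from vertical in radians."""
            if self.n < 3.0:
                return None
            cx, cy = self.sx / self.n, self.sy / self.n
            # Central second moments (covariance of recent events).
            cxx = self.sxx / self.n - cx * cx
            cyy = self.syy / self.n - cy * cy
            cxy = self.sxy / self.n - cx * cy
            # Orientation of the principal axis of the event cloud.
            theta = 0.5 * math.atan2(2.0 * cxy, cxx - cyy)
            # Convert to deviation from vertical for a balancing controller.
            return cx, cy, (math.pi / 2.0) - theta

    # Usage: feed every retina event as it arrives; query the estimate at
    # controller rate to drive the actuated table.
    tracker = EventLineTracker()
    for (x, y) in [(64, 10), (65, 30), (66, 50), (67, 70)]:  # fake events
        tracker.add_event(x, y)
    print(tracker.estimate())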





    Title:

    Balancing pencils using spike-based vision sensors


    Contributors:
    Conradt, J (Author) / Berner, R (Author) / Lichtsteiner, P (Author) / Douglas, R J (Author) / Delbruck, T (Author) / Cook, M (Author)

    Publication date:

    2009-10-02


    Notes:

    Conradt, J; Berner, R; Lichtsteiner, P; Douglas, R J; Delbruck, T; Cook, M (2009). Balancing pencils using spike-based vision sensors. In: Bernstein Conference on Computational Neuroscience 2009 (BCCN 2009), Frankfurt am Main, DE, 30 September 2009 - 2 October 2009, online.



    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Classification:

    DDC: 004 / 629


