Object shape information is an important parameter in robot grasping tasks. However, it may be difficult to obtain accurate models of novel objects due to incomplete and noisy sensory measurements. In addition, object shape may change as a result of frequent interaction with the object (e.g., cereal boxes). In this paper, we present a probabilistic approach for learning object models based on visual and tactile perception through physical interaction with an object. Our robot explores unknown objects by touching them strategically at parts whose shape is uncertain. The robot starts by using only visual features to form an initial hypothesis about the object shape, then gradually adds tactile measurements to refine the object model. Our experiments involve ten objects of varying shapes and sizes in a real robot setup. The results show that our method is capable of choosing a small number of touches to construct object models that resemble the real object shapes and of determining similarities among the acquired models.
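
The abstract describes uncertainty-driven tactile exploration only at a high level. As a purely illustrative aid, and not the authors' implementation, the sketch below assumes a Gaussian-process implicit-surface representation with a squared-exponential kernel: visually observed surface points initialise the model, and each subsequent touch is directed at the candidate location with the highest predictive variance. The class and function names (ImplicitSurfaceGP, select_next_touch), kernel parameters, and data are all hypothetical.

# Illustrative sketch: variance-driven tactile exploration of an
# implicit surface modelled by a Gaussian process (assumptions noted above).
import numpy as np

def rbf_kernel(A, B, length_scale=0.05, variance=1.0):
    """Squared-exponential kernel between point sets A (N,3) and B (M,3)."""
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

class ImplicitSurfaceGP:
    """GP regression on an implicit function f(x): f = 0 on the surface."""
    def __init__(self, noise=1e-3):
        self.noise = noise
        self.X = np.empty((0, 3))
        self.y = np.empty(0)

    def add_observations(self, points, values):
        """points: (N,3) locations; values: 0 on the surface, +/-1 off it."""
        self.X = np.vstack([self.X, points])
        self.y = np.concatenate([self.y, values])
        K = rbf_kernel(self.X, self.X) + self.noise * np.eye(len(self.X))
        self.L = np.linalg.cholesky(K)
        self.alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, self.y))

    def predict(self, Xq):
        """Posterior mean and variance of the implicit function at Xq (Q,3)."""
        Ks = rbf_kernel(Xq, self.X)
        mean = Ks @ self.alpha
        v = np.linalg.solve(self.L, Ks.T)
        var = rbf_kernel(Xq, Xq).diagonal() - np.sum(v ** 2, axis=0)
        return mean, var

def select_next_touch(gp, candidates):
    """Pick the candidate point where the shape estimate is most uncertain."""
    _, var = gp.predict(candidates)
    return candidates[np.argmax(var)]

# Usage: initialise from (noisy) visual surface points, then iterate touches.
rng = np.random.default_rng(0)
visual_points = rng.normal(scale=0.05, size=(50, 3))   # stand-in for a point cloud
gp = ImplicitSurfaceGP()
gp.add_observations(visual_points, np.zeros(len(visual_points)))
candidates = rng.uniform(-0.1, 0.1, size=(200, 3))     # reachable probe locations
for _ in range(5):                                      # a handful of tactile glances
    target = select_next_touch(gp, candidates)
    gp.add_observations(target[None, :], np.zeros(1))   # contact => on-surface sample

In this reading, "touching strategically at parts whose shape is uncertain" corresponds to maximising posterior variance; a real system would additionally account for reachability, contact normals, and sensor noise.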


    Title:

    Enhancing Visual Perception of Shape through Tactile Glances


    Contributors:

    Date of publication:

    2013-01-01


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English


    Classification:

    DDC:    629



    SIDE GLANCES

    Egan, Peter | Online Contents | 2013

