© 20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

We propose an efficient Human-Robot Interaction approach for modeling the appearance of all relevant objects in the robot's environment. Given an input video stream recorded while the robot navigates, the user only needs to annotate a very small number of frames to build a specific classifier for each object of interest. At the core of the method are several random ferns classifiers that share the same features and are updated online. The resulting methodology is fast (it runs at 8 fps), versatile (it can be applied to unconstrained scenarios), scalable (real experiments show we can model up to 30 different object classes), and minimizes the amount of human intervention by leveraging the uncertainty measures associated with each classifier. We thoroughly validate the approach on synthetic data and on real sequences acquired with a mobile platform in challenging outdoor scenarios containing a multitude of different objects. We show that, with minimal effort, the human can provide the robot with a detailed model of the objects in the scene.

Peer Reviewed ; Postprint (author's final draft)
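
The abstract sketches the core mechanism: a bank of random ferns classifiers, one per object class, that share the same binary features, are updated online from user annotations, and ask the user for a new label only when their prediction is uncertain. The following is a minimal illustrative sketch of that idea in Python/NumPy; the class and method names, the flattened-patch representation, the fern sizes, and the margin threshold are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

class SharedRandomFerns:
    """Illustrative online random ferns with features shared across all classes."""

    def __init__(self, n_classes, n_ferns=10, fern_depth=8, patch_dim=256, seed=0):
        rng = np.random.default_rng(seed)
        self.n_classes = n_classes
        self.n_ferns = n_ferns
        self.fern_depth = fern_depth
        # Shared binary features: each compares two positions of a flattened patch.
        self.pairs = rng.integers(0, patch_dim, size=(n_ferns, fern_depth, 2))
        # Per-fern leaf counts for every class; a prior of 1 keeps posteriors defined.
        self.counts = np.ones((n_ferns, 2 ** fern_depth, n_classes))

    def _leaf_indices(self, patch):
        """Map a flattened patch to one leaf index per fern."""
        a = patch[self.pairs[..., 0]]
        b = patch[self.pairs[..., 1]]
        bits = (a > b).astype(np.int64)            # shape (n_ferns, fern_depth)
        weights = 1 << np.arange(self.fern_depth)
        return bits @ weights                       # shape (n_ferns,)

    def update(self, patch, label):
        """Online update: increment the leaf counts of the annotated class."""
        leaves = self._leaf_indices(patch)
        self.counts[np.arange(self.n_ferns), leaves, label] += 1

    def posterior(self, patch):
        """Average the per-fern class posteriors (semi-naive Bayes style)."""
        leaves = self._leaf_indices(patch)
        leaf_counts = self.counts[np.arange(self.n_ferns), leaves]
        probs = leaf_counts / leaf_counts.sum(axis=1, keepdims=True)
        return probs.mean(axis=0)

    def needs_annotation(self, patch, margin_threshold=0.1):
        """Query the human when the best-vs-second-best margin is small."""
        p = np.sort(self.posterior(patch))[::-1]
        return (p[0] - p[1]) < margin_threshold


if __name__ == "__main__":
    ferns = SharedRandomFerns(n_classes=30)
    patch = np.random.default_rng(1).random(256)   # stand-in for a flattened image patch
    if ferns.needs_annotation(patch):
        ferns.update(patch, label=3)               # label supplied by the human
    print(ferns.posterior(patch))
```

In this sketch, sharing the feature pairs across classes means the ferns are evaluated once per patch and yield posteriors for every class at the same time, which is what keeps the per-frame cost roughly constant as the number of modeled object classes grows.
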


    Modeling robot's world with minimal effort

    Villamizar, Michael / Garrell, Anaís / Sanfeliu, Alberto et al. | BASE | 2015

    Free access

    Modeling a robot's tools

    Cattani, L.C. / Eagle, P.J. | Tema Archive | 1990


    Robot's charging station and robot's charging method

    CHOI DONG KYU / HONG EUL PYO | European Patent Office | 2023

    Free access

    This robot's gone fishing

    Buckingham, R. / Davey, P. | Tema Archive | 1995


    Robot's Appearance

    Popa, Inna | BASE | 2016

    Free access