Existing approaches to object detection generate object hypotheses by extracting multiple cues from natural and automotive images, relying on objects with sufficiently high resolution. Very few approaches, however, address hypothesis generation for very small or distant objects, such as those encountered on motorways. Here, we propose a simple yet effective approach to generating hypotheses for small and distant objects in images. Our key contribution is a novel voting scheme that efficiently exploits the difference in appearance between small candidate objects and their environment. We model the environment as a composition of very few regions of homogeneous appearance, extracted in an unsupervised fashion by evaluating the inner statistics of the image. Small regions that cannot be assigned to the environment form potential candidate locations. Experimental results on motorway scenes containing cars, traffic signs, and other automotive objects, evaluated with a variety of performance metrics, show that our approach yields promising results and outperforms one of the currently leading approaches to generating hypotheses for small and/or distant objects in images.
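
The abstract describes the method only at a high level; the sketch below is a hypothetical illustration of the general idea, not the authors' algorithm. It approximates the unsupervised "environment" model with k-means clustering over pixel colours (standing in for the paper's unspecified inner-statistics model and voting scheme), and returns small connected groups of pixels that fit none of the few environment clusters as candidate locations. All function names, parameter values, and thresholds are assumptions made for this example.

import numpy as np
from scipy import ndimage
from sklearn.cluster import KMeans


def generate_hypotheses(image, n_env_regions=3, outlier_quantile=0.95, max_area=400):
    """Return bounding boxes (x0, y0, x1, y1) of small regions whose
    appearance fits none of the few homogeneous environment clusters."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3)

    # Unsupervised environment model: very few appearance clusters.
    km = KMeans(n_clusters=n_env_regions, n_init=4, random_state=0).fit(pixels)
    dist = km.transform(pixels).min(axis=1)  # distance to the closest environment cluster

    # Pixels poorly explained by every environment cluster become candidate pixels.
    mask = (dist > np.quantile(dist, outlier_quantile)).reshape(h, w)

    # Group candidate pixels into connected blobs and keep only small ones,
    # since the targets are small / distant objects.
    labels, _ = ndimage.label(mask)
    boxes = []
    for ys, xs in ndimage.find_objects(labels):
        if (ys.stop - ys.start) * (xs.stop - xs.start) <= max_area:
            boxes.append((xs.start, ys.start, xs.stop, ys.stop))
    return boxes


if __name__ == "__main__":
    # Toy motorway-like scene: sky, road, and one tiny red object on the road.
    img = np.zeros((120, 160, 3))
    img[:60] = [0.55, 0.70, 0.90]           # sky
    img[60:] = [0.35, 0.35, 0.35]           # road
    img[80:84, 90:94] = [0.90, 0.10, 0.10]  # small distant object
    print(generate_hypotheses(img, n_env_regions=2))  # roughly [(90, 80, 94, 84)]

In the toy scene, the two large homogeneous regions (sky and road) are absorbed by the environment clusters, while the 4x4 object remains the only small blob that no cluster explains and is therefore returned as a hypothesis.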


    Title: Generic hypothesis generation for small and distant objects
    Contributors:
    Publication date: 2016-11-01
    Size: 3,065,976 bytes
    Type of media: Conference paper
    Type of material: Electronic Resource
    Language: English




    Scanned linear illumination of distant objects

    RAMTHUN KENT ALLAN / PESIK JOSEPH T | European Patent Office | 2020


    Scanned linear illumination of distant objects

    RAMTHUN KENT ALLAN / PESIK JOSEPH T | European Patent Office | 2019



    Image capturing apparatus for close and distant objects

    KIYONO MITSUHIRO / SUGAWARA RYOICHI / OYAMA KO | European Patent Office | 2016
