In this paper we propose a novel part-based approach to scene understanding that allows us to infer properties of traffic scenes, such as the location and geometry of lanes and roads. Lanes and roads are represented as parts in an undirected graphical model whose nodes represent parts or sub-parts of the scene and whose edges represent spatial constraints. These spatial constraints are formulated statistically, which lets us exploit low-level relations as well as high-level contextual information. The estimation of scene properties is formulated as an inference problem, which is solved using non-parametric belief propagation. Inferring high-level scene properties from error-prone sensory cues is challenging and computationally expensive; we therefore introduce a novel depth-first message passing scheme. The approach is applied to several challenging real-world scenarios and shows robust results and real-time performance.
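
To illustrate the scheduling idea, below is a minimal sketch, assuming a tree-structured model with discrete states; it is not the authors' implementation. The paper performs non-parametric belief propagation over continuous part configurations, whereas here each message is a plain probability vector, so only the depth-first message passing order is visible. All identifiers (Node, unary, pairwise, depth_first_belief) are hypothetical.

import numpy as np

# Minimal sketch of depth-first message passing on a tree-structured
# graphical model with discrete states. This is a hypothetical
# simplification of the paper's non-parametric belief propagation:
# messages here are probability vectors, not sample-based densities.

class Node:
    def __init__(self, name, unary):
        self.name = name
        self.unary = np.asarray(unary, dtype=float)  # local evidence per state
        self.children = []                           # list of (child, pairwise)

    def add_child(self, child, pairwise):
        # pairwise[i, j]: compatibility of parent state i with child state j,
        # a stand-in for the paper's statistically formulated spatial constraints
        self.children.append((child, np.asarray(pairwise, dtype=float)))

def depth_first_belief(node):
    # Recurse depth-first: each edge is traversed exactly once, and a
    # child's message is folded into the parent's belief on the way up.
    belief = node.unary.copy()
    for child, pairwise in node.children:
        child_belief = depth_first_belief(child)  # visit the subtree first
        msg = pairwise @ child_belief             # sum-product message to parent
        belief *= msg / msg.sum()                 # combine with local evidence
    return belief

# Toy scene: a road part with two lane sub-parts, three hypotheses each.
road = Node("road", [0.5, 0.3, 0.2])
left_lane = Node("lane_left", [0.6, 0.3, 0.1])
right_lane = Node("lane_right", [0.2, 0.5, 0.3])
compat = np.array([[0.8, 0.1, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.1, 0.1, 0.8]])
road.add_child(left_lane, compat)
road.add_child(right_lane, compat)

belief = depth_first_belief(road)
print("road belief:", belief / belief.sum())

In the paper's setting each message would instead be a set of weighted samples propagated through the spatial-constraint densities; the depth-first order evaluates a scene hypothesis one branch at a time rather than via a full flooding schedule, which is consistent with the real-time performance the abstract reports.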


    Title:

    Efficient scene understanding for intelligent vehicles using a part-based road representation


    Contributors:
    Töpfer, Daniel (author) / Spehr, Jens (author) / Effertz, Jan (author) / Stiller, Christoph (author)


    Publication date:

    2013-10-01


    Format / extent:

    1506385 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English





    Similar titles:

    Hierarchical Scene Understanding for Intelligent Vehicles

    Spehr, J. / Rosebrock, D. / Wahl, F.M. et al. | British Library Conference Proceedings | 2011


    Understanding Road Scene Situation and Semantic Representation of Road Scene Situation for Reliable Sharing

    Yalla, Veeraganesh / Parundekar, Rahul Ravi / Pillai, Preeti J. et al. | Europäisches Patentamt | 2017

    Free access
