Vision should provide an explanation of the scene in terms of a causal semantics: what affects what, and why. For mobile agents, the structural integrity of the immediate environment is a major concern. Thus, an important part of the causal explanation of static scenes is what supports what, or, counterfactually: Why aren't things moving? The authors use simple naive physical knowledge as the basis of a vertically integrated vision system that explains arbitrarily complex stacked block structures. The semantics provides a basis for controlling the application of visual attention, and forms a framework for the explanation that is generated. They show how the program sequentially explores scenes of complex block structures, identifies functional substructures such as arches and cantilevers, and develops an explanation of why the whole construction stands and of the role each block plays in its stability.
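
As a rough illustration of the abstract's core idea, the sketch below is a hypothetical Python toy, not the authors' system: the 2D block representation, the supports() test, and all names are assumptions. It answers the counterfactual "why isn't this block falling?" for a stacked scene, and flags any block whose support cannot be explained as the place where attention should be directed next.

    # Hypothetical sketch of naive-physics support analysis driving
    # focus of attention. Illustrative only; not the authors' code.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Block:
        name: str
        left: float    # x-extent of the block
        right: float
        bottom: float  # y-extent of the block
        top: float

    def supports(lower: Block, upper: Block, eps: float = 1e-6) -> bool:
        """Naive-physics test: `lower` supports `upper` if `upper` rests
        directly on top of `lower` and their x-extents overlap."""
        rests_on = abs(upper.bottom - lower.top) < eps
        overlaps = upper.left < lower.right and lower.left < upper.right
        return rests_on and overlaps

    def explain_stability(blocks: list[Block]) -> None:
        """Walk the scene bottom-up, answering 'why isn't this block
        falling?' for each block; blocks with no identified supporter
        are flagged as the next targets of visual attention."""
        for b in sorted(blocks, key=lambda blk: blk.bottom):
            if abs(b.bottom) < 1e-6:  # assume the ground plane is y = 0
                print(f"{b.name}: rests on the ground")
                continue
            supporters = [s.name for s in blocks if supports(s, b)]
            if supporters:
                print(f"{b.name}: supported by {', '.join(supporters)}")
            else:
                print(f"{b.name}: support unexplained -> attend here")

    # A simple arch: two posts and a lintel spanning both.
    scene = [
        Block("post_a", 0.0, 1.0, 0.0, 2.0),
        Block("post_b", 3.0, 4.0, 0.0, 2.0),
        Block("lintel", 0.0, 4.0, 2.0, 2.5),
    ]
    explain_stability(scene)

Run on the three-block arch, this prints that both posts rest on the ground and the lintel is supported by both posts, a miniature version of the arch explanation the paper describes.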


    Title:
    Using causal scene analysis to direct focus of attention

    Contributors:
    Birnbaum, L. (author) / Brand, M. (author) / Cooper, P. (author)

    Publication date:
    1993-01-01

    Size:
    1,067,544 bytes

    Type of media:
    Conference paper

    Type of material:
    Electronic Resource

    Language:
    English



    Similar titles:

    Using Causal Scene Analysis to Direct Focus of Attention

    Birnbaum, L. / Brand, M. / Cooper, P. et al. | British Library Conference Proceedings | 1993


    Causal Scene Understanding

    Cooper, P. R. / Birnbaum, L. A. / Brand, M. E. | British Library Online Contents | 1995


    Focus-aided scene segmentation

    Pertuz, S. / Garcia, M. A. / Puig, D. | British Library Online Contents | 2015


    SCENE CONTENT AND ATTENTION SYSTEM

    BROOKS BRIAN E / LONG ANDREW W / SMITH KENNETH L et al. | European Patent Office | 2022

    Free access
