Abstract: In augmented reality, accurate geometric registration between the real scene and virtual 3D models is important. In this paper, we propose a new method for accurately generating arbitrary views of 3D motion events by using the mutual projections between the user's cameras and the cameras around the user. In particular, we show that trifocal tensors computed from these mutual camera projections can be used efficiently to generate accurate user views of 3D motion events from multiple camera images. We also present a method for identifying which cameras are projected into other cameras by exploiting invariance properties of multiple view geometry. The proposed method is implemented and tested on real scenes.
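
As background (standard multiple view geometry notation, not necessarily the exact formulation used in the paper), the trifocal tensor allows a point observed in two views to be transferred to a third: given a point x in the first view and a line l' through its match in the second view, the corresponding point x'' in the third view satisfies, up to scale,

    x''^k \simeq \sum_{i,j} x^i \, l'_j \, T_i^{\,jk},

where T_i^{jk} are the 27 entries of the trifocal tensor. In the setting described by the abstract, such tensors are computed from the mutual projections between the user's cameras and the surrounding cameras, and are then used to transfer the observed 3D motion into the user's viewpoint.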





    Title:

    Generating Free Viewpoint Images from Mutual Projection of Cameras


    Contributors:
    Kato, Koichi (author) / Sato, Jun (author)


    Publication date:

    2006-01-01


    Format / Extent:

    10 pages





    Media type:

    Article/Chapter (book)


    Format:

    Electronic resource


    Language:

    English




    Generating Free Viewpoint Images from Mutual Projection of Cameras

    Kato, K. / Sato, J. | British Library Conference Proceedings | 2006



    Real-time free viewpoint video from a range sensor and color cameras

    Pelletier, S. P. / Cooperstock, J. R. | British Library Online Contents | 2013


    Free-Viewpoint Image Synthesis from Multiple-View Images Taken with Uncalibrated Moving Cameras

    Ito, Y. / Saito, H. | British Library Conference Proceedings | 2005


    Non-Single Viewpoint Catadioptric Cameras: Geometry and Analysis

    Swaminathan, R. / Grossberg, M. D. / Nayar, S. K. | British Library Online Contents | 2006