Abstract:
In augmented reality, accurate geometric alignment of the real scene and virtual 3D models is important. In this paper, we propose a new method for accurately generating arbitrary views of 3D motion events by using the mutual projections between the user's cameras and the cameras surrounding the user. In particular, we show that trifocal tensors computed from these mutual camera projections can be used efficiently to generate accurate user views of 3D motion events from multiple camera images. We also present a method for identifying cameras projected in the images of other cameras by using invariance in multiple view geometry. The proposed method is implemented and tested on real scenes.
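The view generation described in the abstract relies on transferring points through a trifocal tensor. As a minimal sketch only, and not the authors' implementation, the following NumPy snippet illustrates the standard point-line-point transfer x3^k = x1^i l2_j T_i^{jk}; the function name transfer_point and the choice of auxiliary line through the point in the second view are illustrative assumptions.

```python
import numpy as np

def transfer_point(T, x1, x2):
    """Minimal sketch of point transfer with a trifocal tensor.

    T  : (3, 3, 3) array, trifocal tensor T_i^{jk} relating views 1, 2, 3.
    x1 : homogeneous image point in view 1.
    x2 : homogeneous image point in view 2 corresponding to x1.

    Returns the transferred homogeneous point in view 3 via the
    standard point-line-point relation x3^k = x1^i * l2_j * T_i^{jk},
    where l2 is a line through x2.
    """
    x1 = np.asarray(x1, dtype=float)
    x2 = np.asarray(x2, dtype=float)
    # Any line through x2 works as long as it does not pass through
    # the epipole; the cross product of x2 with another point gives
    # such a line (degenerate if x2 is proportional to that point).
    l2 = np.cross(x2, np.array([1.0, 0.0, 0.0]))
    x3 = np.einsum('i,j,ijk->k', x1, l2, T)
    return x3 / x3[-1]  # dehomogenize (assumes a finite point)
```

In the paper's setting the tensor itself is computed from the mutual projections between the user's cameras and the surrounding cameras; estimating that tensor is the part this sketch does not cover.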


Title:
Generating Free Viewpoint Images from Mutual Projection of Cameras

Contributors:
Kato, Koichi (author) / Sato, Jun (author)

Publication date:
2006-01-01

Size:
10 pages

Type of media:
Article/Chapter (Book)

Type of material:
Electronic Resource

Language:
English

Similar documents:

Generating Free Viewpoint Images from Mutual Projection of Cameras
Kato, K. / Sato, J. | British Library Conference Proceedings | 2006

Real-time free viewpoint video from a range sensor and color cameras
Pelletier, S. / Cooperstock, J. R. | British Library Online Contents | 2013

Free-Viewpoint Image Synthesis from Multiple-View Images Taken with Uncalibrated Moving Cameras
Ito, Y. / Saito, H. | British Library Conference Proceedings | 2005

Non-Single Viewpoint Catadioptric Cameras: Geometry and Analysis
Swaminathan, R. / Grossberg, M. D. / Nayar, S. K. | British Library Online Contents | 2006