In recent years, camera-equipped Unmanned Aerial Vehicles (UAVs) have revolutionized aerial cinematography, allowing easy acquisition of impressive footage. In this context, autonomous functionalities based on machine learning and computer vision modules are gaining ground. During live coverage of outdoor events, an autonomous UAV may visually track and follow a specific target of interest under a desired shot type, which is mainly achieved by selecting an appropriate focal length and an appropriate UAV/camera trajectory relative to the target. However, the selected UAV/camera trajectory and the requirements of the 2D object tracker (which impose limits on the maximum allowable focal length) restrict the range of feasible shot types, thus constraining cinematography planning. This paper therefore explores the interplay between cinematography and computer vision in autonomous UAV filming. UAV target-tracking trajectories are formalized and geometrically modeled, so that the maximum allowable focal length can be computed analytically for each scenario, avoiding 2D visual tracker failure. Based on this constraint, formulas for estimating the focal length needed to achieve the desired shot type in each situation are derived, allowing shot feasibility to be determined. Such rules can be embedded into practical intelligent UAV shooting systems to enhance their robustness by facilitating on-the-fly adjustment of the cinematography plan.
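To make the shot-feasibility rule concrete, the following minimal Python sketch assumes a simple pinhole camera model: the first function gives the focal length required for the target to cover a desired fraction of the frame height (the shot type), and the second gives a maximum allowable focal length from an illustrative tracker constraint (a bound on the target's apparent per-frame displacement in pixels). All function names, the specific tracker criterion, and the numeric values are placeholders for illustration, not the paper's exact formulas.

# Sketch of the shot-feasibility check described in the abstract,
# under a pinhole camera model. The tracker constraint used here is
# an illustrative stand-in for the paper's derivation.

def required_focal_length_mm(target_height_m, distance_m,
                             frame_fraction, sensor_height_mm):
    """Focal length needed so the target spans `frame_fraction` of the
    frame height (i.e., the desired shot type), under pinhole projection."""
    return frame_fraction * sensor_height_mm * distance_m / target_height_m

def max_focal_length_mm(target_speed_mps, distance_m, frame_rate_hz,
                        pixel_pitch_mm, max_displacement_px):
    """Largest focal length that keeps the target's apparent per-frame motion
    below `max_displacement_px`, for motion roughly perpendicular to the
    optical axis (small-displacement approximation)."""
    per_frame_motion_m = target_speed_mps / frame_rate_hz
    return max_displacement_px * pixel_pitch_mm * distance_m / per_frame_motion_m

# Example: filming a running person from 20 m away (placeholder numbers).
f_req = required_focal_length_mm(target_height_m=1.8, distance_m=20.0,
                                 frame_fraction=0.6, sensor_height_mm=8.8)
f_max = max_focal_length_mm(target_speed_mps=5.0, distance_m=20.0,
                            frame_rate_hz=30.0, pixel_pitch_mm=0.0024,
                            max_displacement_px=40.0)
print(f"required f = {f_req:.1f} mm, max allowable f = {f_max:.1f} mm, "
      f"shot feasible: {f_req <= f_max}")

If the required focal length exceeds the maximum allowable one, the desired shot type is infeasible for that trajectory and the cinematography plan must be adjusted, which is exactly the kind of on-the-fly decision the abstract's rules are meant to support.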
Shot type constraints in UAV cinematography for autonomous target tracking
01.01.2020
oai:zenodo.org:6141173
Elsevier, Information Sciences, vol. 506, pp. 273-294
Article (journal)
Electronic resource
English
DDC: 629
Autonomous UAV Cinematography: A Tutorial and a Formalized Shot-Type Taxonomy | BASE | 2019
AIAA | 2003
British Library Online Contents | 2004
Coptervision Advances Robotic VTOL Cinematography | Online Contents | 2000