Abstract:

Missions requiring autonomous, close-proximity operations of spacecraft, such as On-Orbit Servicing, On-Orbit Assembly and Active Debris Removal, have become a thriving topic in the aerospace research community over the last decades, not only from an economic, operative, and scientific perspective, but also as a means of ensuring the sustainability of the space environment. These operations involve a variety of technological challenges, most of which are related to the need for autonomous and safe Guidance, Navigation and Control systems. Since the future of these mission scenarios is strictly tied to spacecraft standardisation and modularity, relative navigation employing monocular cameras on servicing platforms to approach targets equipped with artificial markers for pose estimation purposes has drawn great attention. Following this trend, this paper presents an original vision-based pose estimation architecture for relative navigation with respect to passively cooperative targets equipped with ArUco markers. The proposed architecture foresees two operative modes, namely Acquisition and Tracking. The former features the detection of ArUco markers through a hue-saturation-value image representation, their identification by decoding their built-in code, and the computation of the pose without a priori knowledge. The latter, instead, takes advantage of prior pose estimates to speed up the entire processing pipeline. Performance is assessed through an extensive numerical simulation campaign, considering as test scenario the final approach phase of a rendezvous manoeuvre towards a satellite belonging to a large constellation in Low Earth Orbit, and using the Planet and Asteroid Natural scene Generation Utility (PANGU) tool for realistic synthetic image generation. Dedicated tests on the Acquisition mode show that successful marker detection and pose initialization are achieved for up to 99.76% of the possible relative position and attitude states of the chaser with respect to the target at the beginning of the final approach trajectory. As the chaser gets closer to the target, results highlight significant robustness of both operative modes against illumination conditions and uncertainties in the knowledge of the camera intrinsic parameters. Overall, the architecture reaches pose estimation accuracies at millimetre and sub-degree levels.
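To make the described pipeline concrete, the following is a minimal sketch in Python using OpenCV's aruco module and solvePnP. It is not the authors' implementation: the marker dictionary, marker size, camera intrinsics, and the use of the HSV value channel as the detection image are illustrative assumptions only.

```python
# Minimal sketch (not the paper's code): ArUco detection and single-marker
# pose estimation with OpenCV. Dictionary, marker size and intrinsics are
# placeholder values.
import cv2
import numpy as np

MARKER_SIZE = 0.10  # assumed marker side length [m]
# 3D marker corners in the marker frame (z = 0), ordered to match the
# detector output (top-left, top-right, bottom-right, bottom-left) and the
# ordering required by SOLVEPNP_IPPE_SQUARE.
OBJ_POINTS = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0],
], dtype=np.float32)

# Placeholder intrinsics; in practice they come from camera calibration.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 512.0],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)

ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)


def acquire(image_bgr):
    """Acquisition-like step: detect markers with no prior pose."""
    # The paper works on an HSV representation; here the value channel is
    # used as a stand-in for that preprocessing step.
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    value = hsv[:, :, 2]
    # Legacy API; OpenCV >= 4.7 can use cv2.aruco.ArucoDetector instead.
    corners, ids, _ = cv2.aruco.detectMarkers(value, ARUCO_DICT)
    poses = {}
    if ids is None:
        return poses
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        img_points = marker_corners.reshape(4, 2).astype(np.float32)
        ok, rvec, tvec = cv2.solvePnP(
            OBJ_POINTS, img_points, K, DIST,
            flags=cv2.SOLVEPNP_IPPE_SQUARE)  # planar square fiducial
        if ok:
            poses[int(marker_id)] = (rvec, tvec)
    return poses


def track(image_bgr, prior_rvec, prior_tvec):
    """Tracking-like step: reuse the previous pose as an initial guess."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    corners, ids, _ = cv2.aruco.detectMarkers(hsv[:, :, 2], ARUCO_DICT)
    if ids is None:
        return None
    img_points = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(
        OBJ_POINTS, img_points, K, DIST,
        rvec=prior_rvec, tvec=prior_tvec,
        useExtrinsicGuess=True, flags=cv2.SOLVEPNP_ITERATIVE)
    return (rvec, tvec) if ok else None
```

The SOLVEPNP_IPPE_SQUARE solver is a common choice for planar square fiducials, and feeding the previous estimate as an initial guess in the tracking step mirrors the idea of exploiting prior pose knowledge to speed up processing; how the paper actually implements these steps may differ.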

    Highlights:

    - A vision-based pose determination algorithm for spacecraft close-proximity operations is presented.
    - ArUco markers are used as fiducials placed on the target platform.
    - Simulations are conducted to assess pose estimation accuracy and robustness against several sources of disturbance.
    - The PANGU tool is used for generation of realistic synthetic images.
    - Results highlight millimetric and sub-degree accuracies in pose estimation.


    Title:

    Pose determination of passively cooperative spacecraft in close proximity using a monocular camera and ArUco markers


    Contributors:

    Published in:

    Acta Astronautica, Vol. 201, pp. 22-38


    Publication date:

    2022-08-14


    Format / Extent:

    17 pages




    Media type:

    Journal article


    Format:

    Electronic resource


    Language:

    English




    Similar items:

    ArUco markers pose estimation in UAV landing aid system

    Marut, Adam / Wojtowicz, Konrad / Falkowski, Krzysztof | IEEE | 2019


    A review of cooperative and uncooperative spacecraft pose determination techniques for close-proximity operations

    Opromolla, Roberto / Fasano, Giancarmine / Rufino, Giancarlo et al. | Elsevier | 2017



    Optical multi-camera UAV positioning system via ArUco fiducial markers

    De Corso, Tony / De Vito, Luca / Picariello, Francesco et al. | IEEE | 2023


    Spacecraft Pose Estimation using Principal Component Analysis and a Monocular Camera

    Shi, Jian-Feng / Ulrich, Steve / Ruel, Stephane | AIAA | 2017