The National Aeronautics and Space Administration (NASA) aims to land crew on the lunar surface to establish a sustainable presence and develop operational concepts for future long-duration missions. New technologies will be necessary to extend planning and execution capabilities for lunar surface activities. NASA's Joint Augmented Reality Visual Informatics System (Joint AR) is one such technology. Joint AR is a suit-mounted augmented reality (AR) display and compute system that facilitates unprecedented information exchange and data visualization between mission support operators and suited crew. This paper describes challenges associated with developing AR technology for an envisioned work domain by applying a sociotechnical lens to its iterative testing and development through virtual reality (VR). A VR testbed was established to simulate a representative lunar surface environment, enabling a series of three human-in-the-loop (HITL) tests evaluating AR navigation interfaces for exploration extravehicular activity (xEVA). Our findings identify several considerations for future Joint AR design and testing efforts, including data overload, attentional demands, and environment-related perceptual challenges. Trade-offs and potential mitigation approaches are discussed, along with ways to improve the fidelity of future Joint AR testing.

    Title: Using Virtual Reality to Envision Deployment of Spacesuit-Compatible Augmented Reality Displays for Lunar Surface Operations

    Contributors: J. Keller (author) / L. Ma (author) / M. Miller (author) / S. Ray (author) / D. Welsh (author) / L. Brady (author) / F. Porter (author) / J. Vacca (author) / P. Mitra (author) / M. Noyes (author)

    Publication date: 2023

    Size: 14 pages

    Type of media: Report

    Type of material: No indication

    Language: English