First-person-view ground control stations are commonly used for experimental flights of unmanned aircraft systems. They help to overcome the limitations of visual-line-of-sight piloting, such as a restricted flight area and difficulty in observing the aircraft's attitude and distance. However, the camera-based first-person view can fail in poor visibility or if the video feed is lost. This paper discusses whether and how a synthetic-vision display could serve as a critical fallback solution by presenting the current state of the aircraft, particularly during the approach and landing phases of experimental flights. A synthetic-vision-based landing system utilizing third-party visualization software is presented. Emphasis is placed on the visualization of the real height above ground during landing. Different methods for determining and visualizing the current height are evaluated. Pilot-in-the-loop simulations assess these methods based on landing performance, pilot control strategy, and subjective pilot ratings. A method that combines an estimated height above ground, derived from a distance sensor and a digital elevation model, with the geodetic altitude received the most positive feedback from pilots and led to an improved control strategy. Initial flight-test results indicate that data-rate and latency issues need to be addressed before the system is ready for use.
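The best-rated method described above blends a sensor/DEM-based height estimate with the geodetic altitude. The following minimal Python sketch illustrates one way such a fusion could look; it is not the paper's implementation, and the names, validity handling, and blending weight are illustrative assumptions only.

# Minimal sketch (not the paper's implementation) of fusing a distance-sensor/DEM-based
# height above ground with the geodetic altitude; all names and the blending weight
# are assumptions for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HeightSources:
    distance_sensor_agl_m: Optional[float]  # downward distance-sensor reading, None if invalid or out of range
    geodetic_altitude_m: float              # GNSS geodetic altitude of the aircraft
    dem_terrain_elevation_m: float          # terrain elevation at the aircraft position from a digital elevation model

def fused_height_above_ground(src: HeightSources, sensor_weight: float = 0.8) -> float:
    """Blend the sensor-based and DEM-based height-above-ground estimates.

    The DEM-based estimate (geodetic altitude minus terrain elevation) is always
    available; the distance-sensor reading is weighted more strongly when valid,
    since it reflects the real height above ground close to touchdown.
    """
    dem_based_agl = src.geodetic_altitude_m - src.dem_terrain_elevation_m
    if src.distance_sensor_agl_m is None:
        return dem_based_agl  # fall back to the DEM-based estimate alone
    return sensor_weight * src.distance_sensor_agl_m + (1.0 - sensor_weight) * dem_based_agl

if __name__ == "__main__":
    # Example on short final: sensor reads 4.1 m, GNSS altitude 652.0 m, DEM terrain 648.2 m.
    sample = HeightSources(distance_sensor_agl_m=4.1,
                           geodetic_altitude_m=652.0,
                           dem_terrain_elevation_m=648.2)
    print(f"Fused height above ground: {fused_height_above_ground(sample):.2f} m")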
A Synthetic-Vision-Based Landing System for Remotely Piloted Experimental UAS
Year: 2024
Pages: 10
Document type: Miscellaneous
Medium: Electronic Resource
Language: English