State-of-the-art intelligent-vehicle, autonomous-guided-vehicle and mobile-robotics applications can usually be modelled as a collection of interacting, highly autonomous, complex dynamical systems (entities). In these application domains each individual entity has a unique range of possible interactions with its environment and must rely on its own information-gathering methods to understand that environment. This characteristic places strong emphasis on sensing and on the interpretation of sensory data. The resulting sensor-data-interpretation problem demands sophisticated evaluation and test environments that incorporate high-fidelity sensor models. The paper describes a multi-agent real-time simulation framework that allows high-fidelity virtual sensors (including imaging sensors) to be incorporated in hardware-in-the-loop (HIL) experiments. The resulting tool provides reproducibility, full control of the environment, and a flexible mix of real and virtual components in simulation experiments. A pre-crash control solution using a laser range finder sensor illustrates the approach.
Title: Multi-agent based HIL simulator with high fidelity virtual sensors
Date: 2003-01-01
Size: 474126 bytes
Type: Conference paper
Medium: Electronic Resource
Language: English
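As a rough illustration of the architecture sketched in the abstract, the following Python sketch couples an idealised virtual laser range finder to a simple time-to-collision pre-crash controller inside a fixed-step simulation loop. It is not the paper's implementation: all class names, parameters and the braking logic are assumptions chosen for illustration. A genuine high-fidelity sensor model would add beam geometry, noise, occlusion and latency, and an HIL setup would replace parts of this loop with real hardware.

```python
# Hypothetical sketch of a virtual range sensor feeding a pre-crash controller.
# Names and parameters are illustrative, not taken from the paper.
import math
from dataclasses import dataclass


@dataclass
class Obstacle:
    x: float      # position of the obstacle centre along the ego lane [m]
    width: float  # longitudinal extent used by the simplified range model [m]


class VirtualLaserRangeFinder:
    """Idealised 1-D range sensor: distance to the nearest obstacle ahead,
    quantised to a fixed resolution (no noise, latency or beam model)."""
    def __init__(self, max_range=80.0, resolution=0.05):
        self.max_range = max_range
        self.resolution = resolution

    def measure(self, ego_x, obstacles):
        ranges = [o.x - o.width / 2.0 - ego_x for o in obstacles if o.x > ego_x]
        r = min(ranges, default=self.max_range)
        r = min(max(r, 0.0), self.max_range)
        return round(r / self.resolution) * self.resolution


class PreCrashController:
    """Commands full braking when the estimated time-to-collision (TTC)
    drops below a threshold; a stand-in for the pre-crash logic."""
    def __init__(self, ttc_threshold=1.5, brake_decel=8.0):
        self.ttc_threshold = ttc_threshold
        self.brake_decel = brake_decel

    def command(self, range_m, ego_speed):
        ttc = range_m / ego_speed if ego_speed > 0.0 else math.inf
        return -self.brake_decel if ttc < self.ttc_threshold else 0.0


# Fixed-step loop for a single simulated entity.
ego_x, ego_v, dt = 0.0, 20.0, 0.01
sensor, controller = VirtualLaserRangeFinder(), PreCrashController()
obstacles = [Obstacle(x=60.0, width=2.0)]

for _ in range(1000):
    r = sensor.measure(ego_x, obstacles)      # virtual sensor reading
    a = controller.command(r, ego_v)          # pre-crash braking decision
    ego_v = max(ego_v + a * dt, 0.0)
    ego_x += ego_v * dt
    if ego_v == 0.0:
        print(f"stopped at x={ego_x:.1f} m, {r:.2f} m from obstacle")
        break
```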
Similar items:
- Physiological Based Simulator Fidelity Design Guidance (NTIS, 2012)
- Physiological Based Simulator Fidelity Design Guidance (NTRS, 2012)
- Real Time GPS Simulator Integrated with the High Fidelity Manned Flight Simulator (British Library Conference Proceedings, 1995)
- A Solar Simulator with Spectral Fidelity (AIAA, 1964)