A mobile inspection robot has been proposed for the NASA Space Station. It will be a free-flying autonomous vehicle that leaves a berthing unit to accomplish a variety of inspection tasks around the Space Station, then returns to its berth to recharge, refuel, and transfer information. The Flying Eye robot will accept voice commands to change its attitude, move at a constant velocity, or move to a predefined location along a self-generated path. This mobile robot control system requires the integration of traditional command and control techniques with a number of AI technologies: speech recognition, natural language understanding, task and path planning, sensory abstraction, and pattern recognition are all required for successful implementation. The interface between the traditional numeric control techniques and the symbolic processing of the AI technologies must be developed, and a distributed computing approach will be needed to meet the real-time computing requirements. To study the integration of these elements, a novel mobile robot control architecture and simulation based on the blackboard architecture was developed. The operation and structure of the control system are discussed.
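The abstract describes cooperating processes coordinated through a shared blackboard. The sketch below is a minimal, hypothetical illustration of that pattern, not the paper's implementation: the knowledge-source names, blackboard keys, and scheduler loop are assumptions for illustration. Knowledge sources standing in for speech understanding, task/path planning, and motion control fire whenever their inputs appear on the blackboard; in a real system each would run as a separate distributed process.

```python
# Minimal, hypothetical sketch of a blackboard-style control loop.
# Knowledge-source names and blackboard keys are illustrative only.

class Blackboard:
    """Shared memory where cooperating processes post and read entries."""
    def __init__(self):
        self.entries = {}

    def post(self, key, value):
        self.entries[key] = value

    def read(self, key):
        return self.entries.get(key)


class KnowledgeSource:
    """One cooperating process: fires when its inputs appear on the blackboard."""
    def __init__(self, name, needs, produces, transform):
        self.name, self.needs, self.produces, self.transform = name, needs, produces, transform

    def can_fire(self, bb):
        inputs_ready = all(bb.read(k) is not None for k in self.needs)
        return inputs_ready and bb.read(self.produces) is None

    def fire(self, bb):
        bb.post(self.produces, self.transform(*(bb.read(k) for k in self.needs)))


# Illustrative stand-ins for speech understanding, path planning, and motion control.
sources = [
    KnowledgeSource("speech", ["audio"], "goal",
                    lambda audio: {"move_to": "truss_node_A"}),
    KnowledgeSource("planner", ["goal", "position"], "path",
                    lambda goal, pos: [pos, "waypoint_1", goal["move_to"]]),
    KnowledgeSource("controller", ["path"], "motor_commands",
                    lambda path: [f"thrust toward {wp}" for wp in path[1:]]),
]

bb = Blackboard()
bb.post("audio", "inspect truss node A")   # simulated voice input
bb.post("position", "berthing_unit")

# Simple scheduler: repeatedly fire any knowledge source whose inputs are ready.
progress = True
while progress:
    progress = False
    for ks in sources:
        if ks.can_fire(bb):
            ks.fire(bb)
            progress = True

print(bb.read("motor_commands"))
```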
Distributed cooperating processes in a mobile robot control system
01.08.1988
Conference paper
Not specified
English
Analysis Of Control Of Cooperating Robot Arms
NTRS | 1991
Distributed, Cooperating Knowledge-Based Systems
NTIS | 1991
Distributed, cooperating knowledge-based systems
NTRS | 1991
Cooperating Intelligent Agents for Distributed Satellite Systems
British Library Conference Proceedings | 1998