Exploring unknown environments is a crucial task in scenarios where human life is at risk, such as search and rescue operations, abandoned nuclear plants, and covert operations. Autonomous robots could serve this task efficiently. Existing methods use uncertainty models for localization and map building to explore unknown areas, which require high onboard computation and time. We propose to use Deep Reinforcement Learning (DRL) for the autonomous exploration of unknown environments. In DRL, the agent interacts with the environment and learns from experience (feedback/reward). We propose extrinsic and curiosity-driven reward functions to explore the environment. The curiosity-based reward function motivates the agent to explore unseen areas by predicting future states, while the extrinsic reward function helps the agent avoid collisions. We train a differential-drive robot in one environment and evaluate its performance in another, unknown environment. We observe that the curiosity-driven reward function outperforms the extrinsic reward by exploring more area in the unknown environment. The test results show the generalization capability of the proposed methods to explore unknown environments.
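
The abstract describes two reward signals: a curiosity-driven intrinsic reward obtained by predicting future states, and an extrinsic reward for collision avoidance. Since the paper's code is not reproduced on this page, the following is a minimal PyTorch sketch of that idea, assuming a forward dynamics model whose prediction error serves as the intrinsic reward; the class and function names, network sizes, and the scale and penalty constants are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class ForwardModel(nn.Module):
    """Predicts the next state (or its embedding) from the current state and action."""
    def __init__(self, state_dim: int, action_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, state_dim),
        )

    def forward(self, state: torch.Tensor, action: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([state, action], dim=-1))

def curiosity_reward(model: ForwardModel, state: torch.Tensor,
                     action: torch.Tensor, next_state: torch.Tensor,
                     scale: float = 0.5) -> torch.Tensor:
    # Intrinsic reward = scaled prediction error of the forward model:
    # poorly predicted (i.e., unseen) states yield a higher reward, which
    # pushes the agent toward unexplored areas. `scale` is an assumed
    # hyperparameter, not a value from the paper.
    with torch.no_grad():
        predicted = model(state, action)
    return scale * (predicted - next_state).pow(2).mean(dim=-1)

def extrinsic_reward(collided: bool, step_penalty: float = -0.01,
                     collision_penalty: float = -1.0) -> float:
    # Simple collision-avoidance shaping (assumed values): a large penalty
    # on collision and a small per-step cost to encourage efficient motion.
    return collision_penalty if collided else step_penalty

In training, the two signals would typically be summed per step, so the agent balances seeking novel states against staying collision-free.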





    Title:

    Exploration of Unknown Environment using Deep Reinforcement Learning


    Contributors:
    Ali, Asad (author) / Gul, Sarah (author) / Mahmood, Tallat (author) / Ullah, Anayat (author)


    Publication date:

    2023-03-03


    Size:

    1802440 bytes


    Type of media:

    Conference paper


    Type of material:

    Electronic Resource


    Language:

    English