In this paper, we present an approach for estimating the leaf density of trees while navigating in a forest. To this end, we consider an Unmanned Aerial Vehicle (UAV) equipped with a biosonar sensor that mimics the sonar of echolocating bats. Such sensors provide a lightweight and cost-effective alternative to other widely used sensors such as cameras and LiDAR, and are gaining popularity in the robotics research community. The echo signals obtained during UAV navigation are processed to estimate the leaf density within the main lobe of the sonar, first using a mel spectrogram and then a Deep Convolutional Neural Network (CNN) trained on a set of known environments. We further evaluate our approach in simulation by considering trees with different leaf densities (that is, resolutions). Our method achieves promising results, with an accuracy of 98.7%.
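This record gives no implementation details, so the following is only a minimal sketch of the pipeline the abstract describes (echo signal, then mel spectrogram, then deep CNN classifier). The sampling rate, mel parameters, number of leaf-density classes, network layout, and the use of librosa and PyTorch are all illustrative assumptions, not taken from the paper.

```python
# Sketch of the described pipeline: echo signal -> mel spectrogram -> CNN classifier.
# All parameters below (sampling rate, mel bins, class count, network layout) are
# assumptions chosen for illustration; the paper's actual settings are not in this record.
import numpy as np
import librosa
import torch
import torch.nn as nn

SR = 192_000      # assumed ultrasonic sampling rate for a bat-like biosonar sensor
N_MELS = 64       # assumed number of mel bands
N_CLASSES = 4     # assumed number of leaf-density classes


def echo_to_mel(echo: np.ndarray, sr: int = SR) -> torch.Tensor:
    """Convert a 1-D echo recording into a log-scaled mel spectrogram tensor."""
    mel = librosa.feature.melspectrogram(
        y=echo, sr=sr, n_fft=1024, hop_length=256, n_mels=N_MELS,
        fmin=20_000.0, fmax=96_000.0,  # restrict to an assumed ultrasonic band
    )
    log_mel = librosa.power_to_db(mel, ref=np.max)
    return torch.from_numpy(log_mel).float().unsqueeze(0)  # shape: (1, n_mels, time)


class LeafDensityCNN(nn.Module):
    """Small CNN mapping a mel spectrogram to leaf-density class scores."""

    def __init__(self, n_classes: int = N_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))


if __name__ == "__main__":
    echo = np.random.randn(SR // 100).astype(np.float32)  # stand-in 10 ms echo
    spec = echo_to_mel(echo).unsqueeze(0)                  # add batch dimension
    model = LeafDensityCNN()
    print(model(spec).softmax(dim=1))                      # per-class probabilities
```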
Mel-spectrogram and Deep CNN Based Representation Learning from Bio-Sonar Implementation on UAVs
08.01.2021
1849526 bytes
Conference paper
Electronic resource
English
Vision-based deep learning for UAVs collaboration | SPIE | 2019
Vision-based deep learning for UAVs collaboration | British Library Conference Proceedings | 2019