Radar sensors are used for detection and classification in a wide range of applications. Deep learning techniques, however, require large amounts of training data, which in turn demands extensive measurement campaigns and labelling effort. For pre-training, or for evaluating initial ideas before putting them into practice, synthetic radar data are of great value. This paper presents a workflow for automatically generating radar data of human gestures, covering every step from creating the desired animations to synthesizing the radar data and assembling the final dataset, which can then be used to train deep learning models. A classification scenario applying this workflow is also presented.
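The record does not reproduce the paper's synthesis chain, so the following is only a minimal, hypothetical sketch of the kind of pipeline the abstract describes: animated joint trajectories are treated as point scatterers, each contributing a radar echo whose phase tracks its range, and a short-time Fourier transform of the coherently summed return yields a micro-Doppler spectrogram that could serve as training input for a classifier. All names and parameters here (the 24 GHz carrier, the sampling rate, synthesize_radar_return, micro_doppler_spectrogram) are illustrative assumptions, not values or functions from the paper.

    import numpy as np
    from scipy.signal import stft

    # Hypothetical parameters -- not taken from the paper.
    FC = 24e9                  # carrier frequency (Hz)
    WAVELENGTH = 3e8 / FC      # wavelength (m)
    FS = 1000.0                # slow-time sampling rate (Hz)

    def synthesize_radar_return(joints):
        """Coherently sum point-scatterer echoes from animated joint positions.

        joints: (n_frames, n_joints, 3) array of positions in metres,
                relative to a radar at the origin (illustrative model only).
        """
        ranges = np.linalg.norm(joints, axis=2)        # (n_frames, n_joints)
        phases = -4.0 * np.pi * ranges / WAVELENGTH    # two-way phase per scatterer
        return np.exp(1j * phases).sum(axis=1)         # complex slow-time signal

    def micro_doppler_spectrogram(signal):
        """Log-magnitude short-time Fourier transform of the slow-time signal."""
        _, _, spec = stft(signal, fs=FS, nperseg=128, noverlap=96,
                          return_onesided=False)
        return 20.0 * np.log10(np.abs(np.fft.fftshift(spec, axes=0)) + 1e-12)

    # Toy example: one scatterer oscillating toward/away from the radar,
    # standing in for an animated hand gesture.
    t = np.arange(0.0, 2.0, 1.0 / FS)
    joints = np.zeros((t.size, 1, 3))
    joints[:, 0, 0] = 1.0 + 0.05 * np.sin(2.0 * np.pi * 3.0 * t)   # 3 Hz motion
    spec = micro_doppler_spectrogram(synthesize_radar_return(joints))
    print(spec.shape)   # time-frequency image, usable as input to a CNN classifier

In a full workflow of the kind the abstract describes, each gesture animation would yield one such spectrogram, labelled automatically by the animation that produced it.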


    Title:

    Human Motion Training Data Generation for Radar Based Deep Learning Applications


    Contributors:


    Publication date:

    01.04.2018


    Format / Extent:

    1241049 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Similar titles:

    USING RADAR DATA FOR AUTOMATIC GENERATION OF MACHINE LEARNING TRAINING DATA AND LOCALIZATION

    ABDELGAWAD SHEHABELDIN | European Patent Office | 2025

    Open access


    A Deep Learning Model for Human Activity Recognition Using Motion Sensor Data

    Teng, Qianli / Kelishomi, A. Esmaeili / Cai, Zhongmin | British Library Online Contents | 2018


    Omnidirectional Human Motion Recognition With Monostatic Radar System Using Active Learning

    Zhou, Zhengkang / Yang, Yang / Li, Beichen et al. | IEEE | 2025


    Training Data Generation Method, Training Data Generation Apparatus, And Training Data Generation Program

    KOHASHI OSAMU / OHASHI SHINJI / TADAHIRA YOSHIO et al. | European Patent Office | 2021

    Open access