In this paper, we focus on designing a unified deep neural demodulation network for recovering multiple QAM signals, one that can adapt to adaptive QAM modulation. We introduce a convolution block, an identity block, and a self-attention block to extract features from the input complex signal, so that the unified demodulation network can jointly decide the modulation type and the symbol constellation index. Meanwhile, the proposed demodulation network can compensate for the time and frequency offsets induced by the channel and the receiver oscillator. We evaluate the symbol error rate (SER) of the proposed demodulation network for BPSK, QPSK, and 8QAM signals in AWGN, Nakagami, and Rician channels under different constellation mapping schemes. Simulation results show that, even when the constellation diagrams of the BPSK, QPSK, and 8QAM signals overlap, the SER of the proposed unified demodulation network approaches the theoretical bound in the AWGN channel, while it also delivers competitive SER performance in the Nakagami and Rician channels.
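
For illustration, the sketch below shows one way a network of the kind described above could be organized, assuming PyTorch. The block composition (a convolution block, a residual identity block, and a self-attention block feeding two classification heads for modulation type and symbol index) follows the abstract, but all class names, layer sizes, and hyperparameters here are hypothetical and are not taken from the paper.

```python
# Minimal, illustrative sketch of a joint modulation-type / symbol-index
# classifier over complex (I/Q) input. All dimensions are assumptions.
import torch
import torch.nn as nn


class IdentityBlock(nn.Module):
    """Residual block: adds the input back to the convolved output."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return torch.relu(x + self.body(x))


class UnifiedDemodNet(nn.Module):
    """Maps an I/Q symbol window (2 channels) to modulation-type logits
    and constellation-index logits (hypothetical architecture)."""
    def __init__(self, num_mod_types=3, max_constellation_size=8, channels=64):
        super().__init__()
        # Convolution block: lift the 2-channel I/Q input to a feature map.
        self.conv_block = nn.Sequential(
            nn.Conv1d(2, channels, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.identity_block = IdentityBlock(channels)
        # Self-attention over the time dimension.
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=4,
                                          batch_first=True)
        self.mod_head = nn.Linear(channels, num_mod_types)
        self.sym_head = nn.Linear(channels, max_constellation_size)

    def forward(self, iq):               # iq: (batch, 2, seq_len)
        h = self.conv_block(iq)          # (batch, channels, seq_len)
        h = self.identity_block(h)
        h = h.transpose(1, 2)            # (batch, seq_len, channels)
        h, _ = self.attn(h, h, h)
        feat = h.mean(dim=1)             # pool over time
        return self.mod_head(feat), self.sym_head(feat)


# Usage: 16 windows of 128 complex samples (I and Q stacked as channels).
x = torch.randn(16, 2, 128)
mod_logits, sym_logits = UnifiedDemodNet()(x)
print(mod_logits.shape, sym_logits.shape)  # (16, 3) and (16, 8)
```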


    Title: Unified Deep Neural Demodulation Network Design for QAM Signal Recovery
    Contributors: Xiao, Bowen (author) / Zheng, Shilian (author) / Zhu, Jiawei (author) / Zhang, Ziyi (author) / Long, Yan (author) / Ju, Honghao (author)
    Publication date: 2023-10-10
    Size: 988925 bytes
    Type of media: Conference paper
    Type of material: Electronic Resource
    Language: English



    LoRa Signal Demodulation Using Deep Learning, a Time-Domain Approach

    Dakic, Kosta / Al Homssi, Bassel / Al-Hourani, Akram et al. | IEEE | 2021


    On a Unified Deep Neural Network Decoding Architecture

    Artemasov, Dmitry / Andreev, Kirill / Frolov, Alexey | IEEE | 2023


    Wavelet-Based Demodulation Design for Vehicular Communication Network

    Ge, Yao / Shu, Zhan / Daut, David | Springer Verlag | 2017