In this paper, we propose a robust federated meta-learning framework for training end-to-end communication systems in a cell-free scenario, addressing the challenges posed by channel variations and adversarial attacks. By combining federated learning (FL) and meta-learning, the framework enables collaborative training of distributed learnable transceivers to obtain a common initial model that rapidly adapts to new channel conditions with only a few training pilots. Specifically, the base stations (BSs) and their associated user devices perform local updates to compute gradients, which are then aggregated at a central processing unit (CPU) to update the common initialization. To enhance the system's resilience against channel-aware attacks, we integrate into the framework a smoothness-enhancing strategy inspired by defensive distillation; this strategy mitigates the impact of adversarial perturbations without incurring additional computational overhead. Numerical results validate the approach, demonstrating a significant improvement in convergence speed and accuracy while exhibiting notable robustness against attacks.
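As an illustration of the training loop described above, the following is a minimal sketch of one federated meta-learning round, assuming a Reptile-style meta-update: each BS/user pair adapts the shared initialization on its local pilots, and the CPU averages the resulting meta-gradients to update the common initialization. The function names, step sizes, and the toy quadratic losses are hypothetical stand-ins, not the paper's actual transceiver models or channel data.

```python
import numpy as np

def local_adapt(theta, local_grad_fn, inner_steps=5, inner_lr=0.01):
    """Few inner SGD steps on one client's local pilot data (hypothetical loss)."""
    phi = theta.copy()
    for _ in range(inner_steps):
        phi -= inner_lr * local_grad_fn(phi)
    # Reptile-style meta-gradient: direction from the shared initialization
    # to the locally adapted parameters.
    return theta - phi

def federated_meta_round(theta, client_grad_fns, outer_lr=0.1):
    """CPU aggregates the clients' meta-gradients and updates the common initialization."""
    meta_grads = [local_adapt(theta, g) for g in client_grad_fns]
    return theta - outer_lr * np.mean(meta_grads, axis=0)

if __name__ == "__main__":
    # Toy usage: quadratic local losses with client-specific optima, standing in
    # for per-cell channel conditions; grad of 0.5*||theta - c||^2 is (theta - c).
    rng = np.random.default_rng(0)
    centers = [rng.normal(size=4) for _ in range(3)]       # one per BS/user pair
    clients = [lambda th, c=c: th - c for c in centers]
    theta = np.zeros(4)
    for _ in range(100):
        theta = federated_meta_round(theta, clients)
    print("learned common initialization:", theta)
```

Under these assumptions, the learned initialization settles near the centroid of the clients' optima, which is the behavior one expects from a common initial model that can be adapted to any single cell with a few local gradient steps.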
Federated Meta-Learning based End-to-End Communication Against Channel-Aware Adversaries