Hierarchical federated learning (HFL) allows edge devices to aggregate trained parameters locally before global aggregation. However, both the data distribution and the service or application types of the edge devices should be considered at the same time. In this paper, we propose a selective HFL (SHFL) algorithm that aims not only at accelerating model convergence but also at improving model robustness for independent and identically distributed (IID) and non-IID data. SHFL first clusters clients based on their data distributions; each cluster then performs its own local FL training. After that, an extra round of FL training is performed to fine-tune the aggregated model. Simulation results demonstrate that the SHFL algorithm outperforms classic FL algorithms and an existing FL algorithm with a hierarchical clustering scheme, especially on non-IID data.
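As a rough illustration of the workflow the abstract describes (cluster clients by data distribution, run FL inside each cluster, aggregate the cluster models, then fine-tune with an extra FL round), the minimal Python sketch below walks through those steps on synthetic non-IID data. All helper names (`kmeans`, `local_update`, `fed_avg`), the toy linear-regression objective, and the label-histogram clustering criterion are illustrative assumptions, not the paper's actual model, clustering method, or selection rule.

```python
# Hypothetical sketch of the SHFL-style workflow from the abstract.
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=20):
    """Plain k-means on client label histograms (stand-in for the paper's
    distribution-based clustering step)."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def local_update(weights, client_data, lr=0.1):
    """Toy 'training': one gradient step of linear regression on client data."""
    X, y = client_data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def fed_avg(weight_list, sizes):
    """Size-weighted average of client/cluster models (FedAvg-style)."""
    return np.average(np.stack(weight_list), axis=0, weights=np.asarray(sizes, float))

# Synthetic non-IID clients: each client's targets skew toward one class.
n_clients, n_classes, dim = 12, 4, 5
clients, histograms = [], []
for c in range(n_clients):
    n = int(rng.integers(50, 100))
    X = rng.normal(size=(n, dim))
    y = rng.normal(loc=c % n_classes, size=n)
    clients.append((X, y))
    hist, _ = np.histogram(y, bins=n_classes, range=(-1, n_classes))
    histograms.append(hist / hist.sum())

# Step 1: cluster clients by their normalized label histograms.
cluster_ids = kmeans(np.stack(histograms), k=3)

# Step 2: hierarchical FL -- local aggregation rounds inside each cluster.
global_w = np.zeros(dim)
cluster_models, cluster_sizes = [], []
for j in np.unique(cluster_ids):
    members = [i for i in range(n_clients) if cluster_ids[i] == j]
    w = global_w.copy()
    for _ in range(5):  # intra-cluster FL rounds
        updates = [local_update(w, clients[i]) for i in members]
        w = fed_avg(updates, [len(clients[i][1]) for i in members])
    cluster_models.append(w)
    cluster_sizes.append(sum(len(clients[i][1]) for i in members))

# Step 3: global aggregation of the cluster models.
global_w = fed_avg(cluster_models, cluster_sizes)

# Step 4: extra FL round over all clients to fine-tune the aggregated model.
updates = [local_update(global_w, d) for d in clients]
global_w = fed_avg(updates, [len(d[1]) for d in clients])
print("fine-tuned global model:", np.round(global_w, 3))
```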
SHFL: Selective Hierarchical Federated Learning for Non-IID Data Distribution
2024-06-24
799677 bytes
Conference paper
Electronic Resource
English