
A dynamic global backbone updating for communication-efficient personalised federated learning

Pages 2240-2264 | Received 08 Jun 2022, Accepted 12 Aug 2022, Published online: 25 Aug 2022
 

ABSTRACT

Federated learning (FL) is an emerging distributed machine learning technique. However, when the data are heterogeneous, a single shared global model cannot generalise to every device's local data. Furthermore, FL training requires frequent parameter communication, which strains the limited bandwidth and unstable connections of participating devices. These two issues significantly affect FL's effectiveness and efficiency. In this paper, an enhanced communication-efficient personalised FL technique, FedGB, is proposed. Unlike existing approaches, FedGB is built on the premise that exchanging only the common information produced by training on different devices improves local personalised training more effectively. FedGB dynamically selects backbone structures in the local models to represent the dynamically determined backbone information (common features) in the global model for aggregation. Exchanging only common features between nodes reduces the impact of heterogeneous data to a certain extent, and the dynamic, adaptive sub-model selection avoids the need to set the sub-model scale manually. FedGB can thus reduce communication overhead while maintaining inference accuracy. Results obtained in a variety of experimental settings show that FedGB effectively improves both communication efficiency and inference accuracy.
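To make the idea in the abstract concrete, the following is a minimal sketch of the general pattern it describes, not the authors' actual FedGB algorithm: each client keeps a full personalised model but shares only a dynamically selected "backbone" subset of parameters, and the server averages only the shared entries. The magnitude-based selection criterion, the fixed `keep_ratio`, and the function names (`select_backbone`, `aggregate_backbones`) are all illustrative assumptions; the paper determines the sub-model scale adaptively.

```python
# Hedged sketch of backbone-only aggregation in personalised FL.
# All names and the magnitude criterion are hypothetical; the paper's
# FedGB selects the sub-model scale dynamically rather than by ratio.

import numpy as np

def select_backbone(weights: np.ndarray, keep_ratio: float) -> np.ndarray:
    """Return a boolean mask marking the highest-magnitude weights.
    A fixed keep_ratio stands in for the paper's adaptive selection."""
    k = max(1, int(keep_ratio * weights.size))
    threshold = np.partition(np.abs(weights).ravel(), -k)[-k]
    return np.abs(weights) >= threshold

def aggregate_backbones(client_weights, client_masks):
    """Average each weight over the clients that shared it, so only
    'common' backbone information enters the global model."""
    stacked = np.stack(client_weights)            # (n_clients, ...)
    masks = np.stack(client_masks).astype(float)  # 1 where shared
    counts = masks.sum(axis=0)
    summed = (stacked * masks).sum(axis=0)
    # Positions shared by no client stay at zero.
    return np.divide(summed, counts,
                     out=np.zeros_like(summed), where=counts > 0)

# Toy round: three clients, one weight matrix each.
rng = np.random.default_rng(0)
clients = [rng.normal(size=(4, 4)) for _ in range(3)]
masks = [select_backbone(w, keep_ratio=0.5) for w in clients]
global_backbone = aggregate_backbones(clients, masks)
print(global_backbone)
```

Communicating only the masked entries is what yields the communication savings the abstract claims, since each client uploads roughly `keep_ratio` of its parameters per round while keeping the remaining, personalised weights local.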

Disclosure statement

No potential conflict of interest was reported by the author(s).