SAFA: A Semi-Asynchronous Protocol for Fast Federated Learning With Low Overhead
Journal article   Peer reviewed

Wentai Wu, Ligang He, Weiwei Lin, Rui Mao, Carsten Maple and Stephen Jarvis
IEEE transactions on computers, Vol.70(5), pp.655-668
01/05/2021

Abstract

Index Terms: Convergence; Data models; Distributed computing; Distributed databases; Edge intelligence; Federated learning; Machine learning; Optimization; Protocols; Training
Federated learning (FL) has attracted increasing attention as a promising approach to driving a vast number of end devices with artificial intelligence. However, it is very challenging to guarantee the efficiency of FL given the unreliable nature of end devices and the non-negligible cost of device-server communication. In this article, we propose SAFA, a semi-asynchronous FL protocol, to address problems in federated learning such as low round efficiency and poor convergence rate under extreme conditions (e.g., clients frequently dropping offline). We introduce novel designs in the steps of model distribution, client selection, and global aggregation to mitigate the impact of stragglers, crashes, and model staleness, thereby boosting efficiency and improving the quality of the global model. We have conducted extensive experiments with typical machine learning tasks. The results demonstrate that the proposed protocol is effective in shortening federated round duration, reducing local resource wastage, and improving the accuracy of the global model at an acceptable communication cost.
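To make the semi-asynchronous idea concrete, the sketch below shows one federated round in which each client update carries a staleness lag (how many rounds old its base model version is): updates within a hypothetical tolerance `tau` are averaged into the global model, while staler ones are dropped rather than allowed to block the round. This is only an illustrative sketch with assumed names (`semi_async_round`, `tau`) and a scalar "model"; it is not SAFA's exact aggregation rule.

```python
def semi_async_round(global_w, client_updates, tau):
    """One federated round under a simple semi-asynchronous rule.

    global_w       -- current global model (a scalar here, for illustration)
    client_updates -- list of (weight, lag) pairs, where lag is the number
                      of rounds by which the client's base model is stale
    tau            -- staleness tolerance (hypothetical parameter)
    """
    # Keep only updates whose base model is at most tau rounds stale.
    usable = [w for w, lag in client_updates if lag <= tau]
    if not usable:
        # No fresh-enough updates arrived: carry the global model forward
        # instead of waiting on stragglers.
        return global_w
    # Average the usable updates into the new global model.
    return sum(usable) / len(usable)
```

For example, with `tau = 2`, an update with lag 5 is discarded while updates with lags 0 and 1 are averaged, so the round completes without waiting for the straggler.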
