Towards Mitigating Device Heterogeneity in Federated Learning via Adaptive Model Quantization
Type
Conference Paper
KAUST Department
Computer Science Program
Computer, Electrical and Mathematical Science and Engineering (CEMSE) Division
Date
2021-04-26
Online Publication Date
2021-04-26
Print Publication Date
2021-04-26
Permanent link to this record
http://hdl.handle.net/10754/669058
Abstract
Federated learning (FL) is increasingly becoming the norm for training models over distributed and private datasets. Major service providers rely on FL to improve services such as text auto-completion, virtual keyboards, and item recommendations. Nonetheless, training models with FL in practice requires a significant amount of time (days or even weeks) because FL tasks execute in highly heterogeneous environments where devices have widespread yet limited computing capabilities and network connectivity conditions. In this paper, we focus on mitigating the extent of device heterogeneity, which is a major contributor to training time in FL. We propose AQFL, a simple and practical approach that leverages adaptive model quantization to homogenize the computing resources of the clients. We evaluate AQFL on five common FL benchmarks. The results show that, in heterogeneous settings, AQFL obtains nearly the same quality and fairness as the model trained in homogeneous settings.
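The abstract only names the mechanism. As a rough illustration of the idea, a minimal Python sketch of per-client adaptive quantization might look like the following, assuming uniform symmetric quantization and a hypothetical mapping from relative device capability to bit-width. The functions uniform_quantize and pick_bit_width, the bit-width ladder, and the capability scores are all illustrative assumptions, not the authors' AQFL implementation.

# Illustrative sketch only, not the paper's AQFL implementation:
# each client quantizes the global model to a bit-width chosen from its
# relative compute capability, so slower devices train on a cheaper,
# lower-precision model and clients finish a round in roughly equal time.

import numpy as np

def uniform_quantize(weights: np.ndarray, bits: int) -> np.ndarray:
    """Uniform symmetric quantization of a weight tensor to `bits` bits."""
    levels = 2 ** (bits - 1) - 1                 # e.g. 127 for 8-bit
    scale = np.max(np.abs(weights)) / levels
    if scale == 0:
        return weights.copy()
    q = np.round(weights / scale)                # snap to the integer grid
    return np.clip(q, -levels, levels) * scale   # dequantized values

def pick_bit_width(capability: float) -> int:
    """Map a client's relative compute capability (0..1] to a bit-width.
    The ladder below is a hypothetical policy, not from the paper."""
    ladder = [(0.75, 32), (0.5, 16), (0.25, 8)]
    for threshold, bits in ladder:
        if capability >= threshold:
            return bits
    return 4                                     # weakest devices

# Example: three heterogeneous clients receive differently quantized
# copies of the same global model before local training.
global_model = {"w": np.random.randn(256, 128).astype(np.float32)}
for client_id, capability in [("fast", 0.9), ("mid", 0.5), ("slow", 0.1)]:
    bits = pick_bit_width(capability)
    local = {k: uniform_quantize(v, bits) for k, v in global_model.items()}
    print(client_id, "->", bits, "bits")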
Citation
Abdelmoniem, A. M., & Canini, M. (2021). Towards Mitigating Device Heterogeneity in Federated Learning via Adaptive Model Quantization. Proceedings of the 1st Workshop on Machine Learning and Systems. doi:10.1145/3437984.3458839
Publisher
ACM
Conference/Event name
The 1st Workshop on Machine Learning and Systems (EuroMLSys)
ISBN
9781450382984
Additional Links
https://dl.acm.org/doi/10.1145/3437984.3458839
https://mcanini.github.io/papers/aqfl.euromlsys21.pdf
DOI
10.1145/3437984.3458839