Information-theoretic perspective of federated learning

Linara Adilova, Julia Rosenzweig, Michael Kamp

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

An approach to distributed machine learning is to train models on local datasets and aggregate them into a single, stronger model. A popular instance of this form of parallelization is federated learning, where the nodes periodically send their local models to a coordinator that aggregates them and redistributes the aggregate so that training can continue from it. The most common form of aggregation is averaging the model parameters, e.g., the weights of a neural network. However, due to the non-convexity of the loss surface of neural networks, averaging can be detrimental, and it remains an open question under which conditions it is beneficial. In this paper, we study this problem from the perspective of information theory: we measure the mutual information between the representation and the inputs, as well as between the representation and the labels, in the local models, and compare it to the corresponding information contained in the representation of the averaged model. Our empirical results confirm previous observations about the practical usefulness of averaging for neural networks, even when local dataset distributions vary strongly. Furthermore, we gain further insight into how the aggregation frequency affects the information flow and thus the success of distributed learning. These insights will be helpful both for improving the current synchronization process and for further understanding the effects of model aggregation.
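The record carries no code, but the abstract names two concrete ingredients: averaging of model parameters and estimation of the mutual information between a representation and the inputs or labels. The sketch below is a minimal illustration of both under stated assumptions; the function names (`federated_average`, `binned_mutual_information`) and the equal-width binning scheme are choices made here for illustration, not the authors' implementation, and the paper does not specify which MI estimator it uses.

```python
import numpy as np

def federated_average(local_params):
    """FedAvg-style aggregation: element-wise mean of the parameter
    vectors sent by the local nodes."""
    return np.mean(np.stack(local_params), axis=0)

def binned_mutual_information(t, y, n_bins=30):
    """Estimate I(T; Y) in bits by discretising the real-valued
    representation T into equal-width bins and computing MI from the
    empirical joint distribution. Pass input identifiers as y to
    estimate I(T; X) instead."""
    t = np.asarray(t).reshape(len(t), -1)
    edges = np.linspace(t.min(), t.max(), n_bins + 1)
    binned = np.digitize(t, edges[1:-1])            # per-unit bin index
    # Collapse each sample's row of bin indices to one discrete symbol.
    _, t_ids = np.unique(binned, axis=0, return_inverse=True)
    _, y_ids = np.unique(y, return_inverse=True)
    joint = np.zeros((t_ids.max() + 1, y_ids.max() + 1))
    np.add.at(joint, (t_ids, y_ids), 1.0)           # empirical counts
    p = joint / joint.sum()
    marg = p.sum(axis=1, keepdims=True) @ p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / marg[nz])))

# Toy usage with stand-ins, not the paper's experiment: average fake
# weight vectors from 3 nodes, then estimate I(T; Y) for a synthetic
# label-correlated representation.
rng = np.random.default_rng(0)
avg_w = federated_average([rng.normal(size=100) for _ in range(3)])
y = rng.integers(0, 2, size=500)
t = y[:, None] + rng.normal(scale=0.5, size=(500, 4))
print(round(binned_mutual_information(t, y), 3))    # close to H(Y) = 1 bit
```

Binning-based estimates of this kind depend on the bin count and can saturate for high-dimensional representations, which is one reason the choice of estimator matters when comparing local and averaged models.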
Original language: English
Title of host publication: ITML 2019 - NeurIPS 2019 Workshop on Information Theory and Machine Learning
Editors: Shengjia Zhao, Jiaming Song, Kristy Choi, Pratyusha Kalluri, Yanjun Han, Jiantao Jiao, Alex Dimakis, Ben Poole, Tsachy Weissman, Stefano Ermon
Place of publication: Atlanta, Georgia, USA
Publisher: Association for Information Systems
Number of pages: 5
Publication status: Published - 2019
Event: Workshop on Information Theory and Machine Learning 2019 - Vancouver, Canada
Duration: 13 Dec 2019 - 13 Dec 2019
https://sites.google.com/view/itml19/home

Conference

Conference: Workshop on Information Theory and Machine Learning 2019
Abbreviated title: ITML 2019
Country: Canada
City: Vancouver
Period: 13/12/19 - 13/12/19
Internet address: https://sites.google.com/view/itml19/home

Cite this

Adilova, L., Rosenzweig, J., & Kamp, M. (2019). Information-theoretic perspective of federated learning. In S. Zhao, J. Song, K. Choi, P. Kalluri, Y. Han, J. Jiao, A. Dimakis, B. Poole, T. Weissman, & S. Ermon (Eds.), ITML 2019 - NeurIPS 2019 Workshop on Information Theory and Machine Learning. Association for Information Systems. https://sites.google.com/view/itml19/home