Learning private neural language modeling with attentive aggregation

Shaoxiong Ji, Shirui Pan, Guodong Long, Xue Li, Jing Jiang, Zi Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

14 Citations (Scopus)

Abstract

Mobile keyboard suggestion is typically regarded as a word-level language modeling problem. Centralized machine learning techniques require the collection of massive user data for training purposes, which may raise privacy concerns in relation to users' sensitive data. Federated learning (FL) provides a promising approach to learning private language modeling for intelligent personalized keyboard suggestions by training models on distributed clients rather than training them on a central server. To obtain a global model for prediction, existing FL algorithms simply average the client models and ignore the importance of each client during model aggregation. Furthermore, there is no optimization for learning a well-generalized global model on the central server. To solve these problems, we propose a novel model aggregation with an attention mechanism considering the contribution of client models to the global model, together with an optimization technique during server aggregation. Our proposed attentive aggregation method minimizes the weighted distance between the server model and client models by iteratively updating parameters while attending to the distance between the server model and client models. Experiments on two popular language modeling datasets and a social media dataset show that our proposed method outperforms its counterparts in terms of perplexity and communication cost in most settings of comparison.
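The abstract describes server-side aggregation that attends to the distance between the server model and each client model. The sketch below is one plausible reading of that idea, not the paper's exact algorithm: attention weights are taken as a softmax over the L2 distances between server and client parameters, and the server steps toward the attention-weighted clients with a step size `epsilon` (both the softmax scoring and `epsilon` are assumptions for illustration).

```python
import numpy as np

def attentive_aggregate(server_params, client_params, epsilon=1.0):
    """Hypothetical sketch of attention-weighted federated aggregation.

    server_params: 1-D array of flattened server model parameters.
    client_params: list of 1-D arrays, one per client.
    epsilon: server step size (assumed hyperparameter).
    """
    # L2 distance between the server model and each client model
    dists = np.array([np.linalg.norm(server_params - c) for c in client_params])
    # Softmax over distances -> one attention weight per client (assumption:
    # clients farther from the server receive larger weight)
    exp = np.exp(dists - dists.max())
    alphas = exp / exp.sum()
    # Gradient-style update: move the server toward the weighted clients
    update = sum(a * (server_params - c) for a, c in zip(alphas, client_params))
    return server_params - epsilon * update

# Toy usage: a zero-initialized server and three divergent clients
server = np.zeros(4)
clients = [np.ones(4), 2 * np.ones(4), -np.ones(4)]
new_server = attentive_aggregate(server, clients)
```

With a single client and `epsilon=1.0` the update reduces to copying that client, and when all clients equal the server the update is zero, which matches the intuition of minimizing the weighted server-client distance.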

Original language: English
Title of host publication: International Joint Conference on Neural Networks (IJCNN) 2019
Editors: Plamen Angelov, Manuel Roveri
Place of Publication: Piscataway NJ USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 8
ISBN (Electronic): 9781728119854
ISBN (Print): 9781728119861
DOIs
Publication status: Published - 2019
Event: IEEE International Joint Conference on Neural Networks 2019 - Budapest, Hungary
Duration: 14 Jul 2019 – 19 Jul 2019
https://ieeexplore.ieee.org/xpl/conhome/8840768/proceeding (Proceedings)

Conference

Conference: IEEE International Joint Conference on Neural Networks 2019
Abbreviated title: IJCNN 2019
Country: Hungary
City: Budapest
Period: 14/07/19 – 19/07/19

Keywords

  • attentive aggregation
  • federated learning
  • language modeling
