Resource-constrained on-device learning by dynamic averaging

Lukas Heppe, Michael Kamp, Linara Adilova, Danny Heinrich, Nico Piatkowski, Katharina Morik

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

1 Citation (Scopus)

Abstract

The communication between data-generating devices is partially responsible for a growing portion of the world’s power consumption. Reducing communication is therefore vital, both from an economic and an ecological perspective. For machine learning, on-device learning avoids sending raw data, which can reduce communication substantially. Furthermore, not centralizing the data protects privacy-sensitive data. However, most learning algorithms require hardware with high computation power and thus high energy consumption. In contrast, ultra-low-power processors, like FPGAs or micro-controllers, allow for energy-efficient learning of local models. Combined with communication-efficient distributed learning strategies, this reduces the overall energy consumption and enables applications that were previously impossible due to the limited energy on local devices. The major challenge, however, is that low-power processors typically offer only integer processing capabilities. This paper investigates an approach to communication-efficient on-device learning of integer exponential families that can be executed on low-power processors, is privacy-preserving, and effectively minimizes communication. The empirical evaluation shows that the approach can reach a model quality comparable to a centrally learned regular model with an order of magnitude less communication. Comparing the overall energy consumption, this reduces the energy required to solve the machine learning task significantly.
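The core idea behind dynamic averaging is that devices learn locally on integer hardware and only communicate when their models have drifted sufficiently far from the last commonly agreed reference model, at which point the models are averaged. The Python sketch below illustrates this idea under stated assumptions: the function and parameter names (dynamic_averaging_round, sync_threshold), the squared-distance divergence criterion, and the rounding of the average back to integers are illustrative choices, not the paper's exact protocol.

import numpy as np

def dynamic_averaging_round(local_models, reference, sync_threshold):
    """One communication round of a dynamic-averaging sketch (illustrative, not the paper's protocol).

    Devices keep learning locally; only if the mean squared distance of the
    local (integer) parameter vectors to the last synchronized reference model
    exceeds sync_threshold do they communicate and replace their models with
    the rounded, integer-valued average.
    """
    divergence = np.mean([np.sum((m - reference) ** 2) for m in local_models])
    if divergence <= sync_threshold:
        return local_models, reference, False  # no communication this round
    # Synchronize: rounding keeps the averaged model executable on integer-only hardware.
    averaged = np.rint(np.mean(local_models, axis=0)).astype(np.int64)
    return [averaged.copy() for _ in local_models], averaged, True

# Toy usage: three devices drift apart after local updates, triggering one synchronization.
rng = np.random.default_rng(0)
reference = np.zeros(8, dtype=np.int64)
local_models = [reference + rng.integers(-3, 4, size=8) for _ in range(3)]
local_models, reference, synced = dynamic_averaging_round(local_models, reference, sync_threshold=5.0)
print("synchronized this round:", synced)

Because only parameter vectors are exchanged, and only when the divergence check triggers, raw data never leaves a device; this is what yields both the communication savings and the privacy benefit described in the abstract.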

Original language: English
Title of host publication: ECML PKDD 2020 Workshops
Subtitle of host publication: Workshops of the European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2020): SoGood 2020, PDFL 2020, MLCS 2020, NFMCP 2020, DINA 2020, EDML 2020, XKDD 2020 and INRA 2020, Ghent, Belgium, September 14–18, 2020, Proceedings
Editors: Irena Koprinska, Michael Kamp, Annalisa Appice, Corrado Loglisci, Luiza Antonie, Albrecht Zimmermann, Riccardo Guidotti, Özlem Özgöbek
Place of publication: Cham, Switzerland
Publisher: Springer
Pages: 129-144
Number of pages: 16
ISBN (Electronic): 978-3-030-65965-3
ISBN (Print): 978-3-030-65964-6
DOIs
Publication status: Published - 2020
Event: Parallel, Distributed, and Federated Learning Workshop 2020 - Virtual, Ghent, Belgium
Duration: 14 Sept 2020 → 14 Sept 2020
Conference number: 3rd
https://link.springer.com/book/10.1007/978-3-030-65965-3 (Proceedings)
https://pdfl.iais.fraunhofer.de/ (Website)

Publication series

Name: Communications in Computer and Information Science
Publisher: Springer
Volume: 1323
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Workshop

Workshop: Parallel, Distributed, and Federated Learning Workshop 2020
Abbreviated title: PDFL 2020
Country/Territory: Belgium
City: Ghent
Period: 14/09/20 → 14/09/20
Other: Part of ECML PKDD 2020
Internet address: https://pdfl.iais.fraunhofer.de/
