Resource-constrained on-device learning by dynamic averaging

Lukas Heppe, Michael Kamp, Linara Adilova, Danny Heinrich, Nico Piatkowski, Katharina Morik

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

1 Citation (Scopus)


The communication between data-generating devices is partially responsible for a growing portion of the world's power consumption. Reducing communication is therefore vital from both an economic and an ecological perspective. For machine learning, on-device learning avoids sending raw data, which can reduce communication substantially. Furthermore, keeping the data decentralized protects privacy-sensitive data. However, most learning algorithms require hardware with high computational power and thus high energy consumption. In contrast, ultra-low-power processors, such as FPGAs or microcontrollers, allow for energy-efficient learning of local models. Combined with communication-efficient distributed learning strategies, this reduces the overall energy consumption and enables applications that were previously impossible due to the limited energy budget of local devices. The major challenge, then, is that low-power processors typically offer only integer processing capabilities. This paper investigates an approach to communication-efficient on-device learning of integer exponential families that can be executed on low-power processors, is privacy-preserving, and effectively minimizes communication. The empirical evaluation shows that the approach can reach a model quality comparable to a centrally learned regular model with an order of magnitude less communication. Comparing the overall energy consumption, this reduces the energy required for solving the machine learning task by a significant amount.
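The dynamic-averaging idea named in the title can be illustrated with a minimal sketch: each device updates a local integer parameter vector and only communicates when its model drifts too far from a shared reference, so synchronization rounds with little drift cost no communication. This is a simplified illustration, not the paper's implementation; the function names, the bit-shift learning rate, and the coordinator logic are assumptions, and a full protocol would also balance the averaged set of nodes.

```python
def divergence(model, reference):
    # Squared Euclidean distance between a local model and the shared
    # reference model, computed in integer arithmetic only.
    return sum((w - r) ** 2 for w, r in zip(model, reference))

def local_update(model, grad, lr_shift=4):
    # Hypothetical integer gradient step: a bit shift stands in for a
    # fractional learning rate so no floating-point unit is needed.
    return [w - (g >> lr_shift) for w, g in zip(model, grad)]

def dynamic_averaging_round(models, reference, threshold):
    """One communication round of a dynamic-averaging protocol:
    only nodes whose local model diverged more than `threshold`
    from the reference send their parameters to the coordinator."""
    violators = [i for i, m in enumerate(models)
                 if divergence(m, reference) > threshold]
    if not violators:
        # No local condition violated: zero messages this round.
        return models, reference, 0
    # Coordinator averages the violating models (integer division
    # keeps the result representable on integer-only hardware).
    dim = len(reference)
    avg = [sum(models[i][d] for i in violators) // len(violators)
           for d in range(dim)]
    for i in violators:
        models[i] = list(avg)
    return models, avg, len(violators)
```

For example, with three nodes holding `[10, 10]`, `[10, 10]`, and `[50, 50]`, a reference of `[10, 10]`, and a threshold of 100, only the third node violates its local condition and communicates; with a sufficiently large threshold, no node communicates at all.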

Original language: English
Title of host publication: ECML PKDD 2020 Workshops
Subtitle of host publication: Workshops of the European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2020): SoGood 2020, PDFL 2020, MLCS 2020, NFMCP 2020, DINA 2020, EDML 2020, XKDD 2020 and INRA 2020, Ghent, Belgium, September 14–18, 2020, Proceedings
Editors: Irena Koprinska, Michael Kamp, Annalisa Appice, Corrado Loglisci, Luiza Antonie, Albrecht Zimmermann, Riccardo Guidotti, Özlem Özgöbek
Place of Publication: Cham, Switzerland
Number of pages: 16
ISBN (Electronic): 9783030659653
ISBN (Print): 9783030659646
Publication status: Published - 2020
Event: Parallel, Distributed, and Federated Learning Workshop 2020 - Virtual, Ghent, Belgium
Duration: 14 Sep 2020 – 14 Sep 2020
Conference number: 3rd

Publication series

Name: Communications in Computer and Information Science
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937


Workshop: Parallel, Distributed, and Federated Learning Workshop 2020
Abbreviated title: PDFL 2020
Other: Part of ECMLPKDD 2020
