Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent

Research output: Contribution to journal › Article › peer-review

Original language: English
Pages (from-to): 2180-2192
Number of pages: 13
Journal: IEEE Transactions on Signal Processing
Volume: 70
Early online date: 26 Apr 2022
DOIs
Accepted/In press: 12 Apr 2022
E-pub ahead of print: 26 Apr 2022
Published: 2022

Bibliographical note

Publisher Copyright: © 1991-2012 IEEE.


Abstract

This paper introduces Distributed Stein Variational Gradient Descent (DSVGD), a non-parametric generalized Bayesian inference framework for federated learning. DSVGD maintains a number of non-random and interacting particles at a central server to represent the current iterate of the model global posterior. The particles are iteratively downloaded and updated by a subset of agents with the end goal of minimizing the global free energy. By varying the number of particles, DSVGD enables a flexible trade-off between per-iteration communication load and number of communication rounds. DSVGD is shown to compare favorably to benchmark frequentist and Bayesian federated learning strategies in terms of accuracy and scalability with respect to the number of agents, while also providing well-calibrated, and hence trustworthy, predictions.
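For context, the sketch below illustrates the standard (centralized) Stein Variational Gradient Descent particle update that DSVGD builds on: a set of deterministic, interacting particles is moved in the direction of the kernelized Stein gradient of the target log-density. It is a minimal illustration under assumed choices (an RBF kernel, a toy Gaussian target, and the particle count and step size shown), not the paper's federated protocol, which additionally shuttles the particles between a central server and scheduled agents.

```python
# Minimal sketch of a standard SVGD update step (Liu & Wang, 2016),
# the building block that DSVGD distributes across agents.
# Kernel, target, and hyperparameters here are illustrative assumptions.
import numpy as np

def rbf_kernel(particles, h=1.0):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j."""
    diffs = particles[:, None, :] - particles[None, :, :]   # (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)                  # (n, n)
    K = np.exp(-sq_dists / h)
    grad_K = -2.0 / h * diffs * K[:, :, None]                # d k(x_j, x_i) / d x_j
    return K, grad_K

def svgd_step(particles, score_fn, step_size=0.1, h=1.0):
    """One SVGD update: x_i <- x_i + eps * phi(x_i)."""
    n = particles.shape[0]
    K, grad_K = rbf_kernel(particles, h)
    scores = score_fn(particles)                             # (n, d): grad log p(x_j)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K.T @ scores + grad_K.sum(axis=0)) / n
    return particles + step_size * phi

# Toy example: transport particles toward a 2-D standard Gaussian target.
def gaussian_score(x):
    return -x                                                # grad log N(0, I)

rng = np.random.default_rng(0)
particles = rng.normal(scale=3.0, size=(50, 2))              # particle count is the tunable knob
for _ in range(200):
    particles = svgd_step(particles, gaussian_score)
print("particle mean:", particles.mean(axis=0))              # approaches [0, 0]
```

In DSVGD, the number of particles plays the role noted in the abstract: more particles increase the per-round payload exchanged with the server but can reduce the number of communication rounds needed to approximate the global posterior.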
