Towards Decentralized Parameter Servers for Secure Federated Learning
Muhammad El-Hindi, Zheguang Zhao, Carsten Binnig
2022
Abstract
Federated learning aims to protect the privacy of data owners in a collaborative machine learning setup, since the training data never needs to be revealed to any other participant in the training process. This is achieved by requiring participants to share only locally computed model updates (i.e., gradients), instead of the training data, with a centralized parameter server. However, recent papers have shown privacy attacks that allow this server to reconstruct the training data of individual data owners from the received gradients alone. To mitigate such attacks, in this paper we propose a new federated learning framework that decentralizes the parameter server. As part of this contribution, we investigate the configuration space of such a decentralized federated learning framework. Moreover, we propose three promising privacy-preserving techniques, namely model sharding, asynchronous updates, and polling intervals for stale parameters. In our evaluation on different data sets, we observe that these techniques can effectively thwart gradient-based reconstruction attacks on deep learning models, from both the client side and the server side, by reducing the attack results to close to random noise.
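To make the setup concrete, below is a minimal, illustrative sketch (not the paper's implementation) of federated training with a sharded parameter server: clients compute gradients on their private data and send each parameter server only the slice of the gradient that corresponds to its model shard, so no single server observes a full gradient. All names (e.g., local_gradient, NUM_SHARDS), the linear least-squares model, and the synthetic data are assumptions made for illustration; the asynchronous updates and polling intervals studied in the paper are omitted here.

import numpy as np

# Minimal sketch of sharded parameter servers for federated learning.
# The model, data, and names are illustrative assumptions, not the paper's code.

rng = np.random.default_rng(0)

NUM_CLIENTS = 4
NUM_SHARDS = 3    # number of decentralized parameter servers
DIM = 12          # total number of model parameters
LR = 0.1          # learning rate

# Partition the model parameters into contiguous shards, one per server.
shard_slices = np.array_split(np.arange(DIM), NUM_SHARDS)
shards = [np.zeros(len(idx)) for idx in shard_slices]

def local_gradient(weights, X, y):
    """Least-squares gradient computed locally by a client; only this
    gradient (never the raw data X, y) leaves the client."""
    residual = X @ weights - y
    return X.T @ residual / len(y)

# Synthetic per-client data standing in for private training data.
true_w = rng.normal(size=DIM)
client_data = []
for _ in range(NUM_CLIENTS):
    X = rng.normal(size=(32, DIM))
    y = X @ true_w + 0.01 * rng.normal(size=32)
    client_data.append((X, y))

for step in range(100):
    # Clients assemble the current parameters from all shards.
    weights = np.concatenate(shards)
    grads = [local_gradient(weights, X, y) for X, y in client_data]
    # Each server shard receives and averages only its own slice of every
    # client's gradient, so no single server sees a full gradient vector.
    for shard, idx in zip(shards, shard_slices):
        shard -= LR * np.mean([g[idx] for g in grads], axis=0)

final_w = np.concatenate(shards)
print("mean squared error:",
      float(np.mean([np.mean((X @ final_w - y) ** 2) for X, y in client_data])))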
Paper Citation
in Harvard Style
El-Hindi M., Zhao Z. and Binnig C. (2022). Towards Decentralized Parameter Servers for Secure Federated Learning. In Proceedings of the 11th International Conference on Data Science, Technology and Applications - Volume 1: DATA, ISBN 978-989-758-583-8, pages 257-269. DOI: 10.5220/0011146300003269
in Bibtex Style
@conference{data22,
author={Muhammad El-Hindi and Zheguang Zhao and Carsten Binnig},
title={Towards Decentralized Parameter Servers for Secure Federated Learning},
booktitle={Proceedings of the 11th International Conference on Data Science, Technology and Applications - Volume 1: DATA},
year={2022},
pages={257-269},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011146300003269},
isbn={978-989-758-583-8},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 11th International Conference on Data Science, Technology and Applications - Volume 1: DATA
TI - Towards Decentralized Parameter Servers for Secure Federated Learning
SN - 978-989-758-583-8
AU - El-Hindi M.
AU - Zhao Z.
AU - Binnig C.
PY - 2022
SP - 257
EP - 269
DO - 10.5220/0011146300003269