Authors: Maria Hartmann 1; Grégoire Danoy 2,1; Mohammed Alswaitti 1 and Pascal Bouvry 2,1
Affiliations: 1 SnT, University of Luxembourg, Esch-sur-Alzette, Luxembourg; 2 FSTM/DCS, University of Luxembourg, Esch-sur-Alzette, Luxembourg
Keyword(s):
Machine Learning, Distributed Machine Learning, Federated Learning, Vertical Federated Learning.
Abstract:
Federated learning is a particular type of distributed machine learning, designed to permit the joint training of a single machine learning model by multiple participants, each of which possesses a local dataset. A characteristic feature of federated learning strategies is that no client data is ever disclosed to the other participants in the learning scheme.
While a wealth of well-performing solutions for different scenarios exists for Horizontal Federated Learning (HFL), to date little attention has been devoted to Vertical Federated Learning (VFL). Existing approaches are limited to narrow application scenarios in which few clients participate, privacy is a primary concern, and the vertical distribution of client data is well understood. In this article, we first argue that VFL is naturally applicable to another, much broader application context in which data sharing is limited mainly by technological rather than privacy constraints, such as in sensor networks or satellite swarms. A VFL scheme applied to such a setting could unlock previously inaccessible on-device machine learning potential. We then propose the Joint-embedding Vertical Federated Learning framework (JoVe-FL), a first VFL framework designed for such settings. JoVe-FL is based on the idea of transforming the vertical federated learning problem into a horizontal one by learning a joint embedding space, allowing us to leverage existing HFL solutions. Finally, we empirically demonstrate the feasibility of the approach on instances consisting of different partitionings of the CIFAR10 dataset.
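To make the core idea more concrete, the sketch below illustrates (in PyTorch) how a vertical problem can be recast as a horizontal one via a shared embedding space: each client encodes its own feature slice into embeddings of a common dimension, after which a standard HFL step (here, plain FedAvg over classifier heads) applies. This is only a minimal illustration of the idea stated in the abstract, not the authors' JoVe-FL implementation; the names EMB_DIM, LocalEncoder, and fedavg, the layer sizes, and the toy data are all assumptions.

```python
# Minimal sketch (not the authors' implementation) of turning a vertical FL
# problem into a horizontal one via a joint embedding space.
import torch
import torch.nn as nn

EMB_DIM = 32          # assumed dimensionality of the joint embedding space
NUM_CLASSES = 10      # e.g. CIFAR10, as in the paper's experiments


class LocalEncoder(nn.Module):
    """Maps one client's vertical feature slice into the shared embedding space."""

    def __init__(self, in_features: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_features, 64), nn.ReLU(),
                                 nn.Linear(64, EMB_DIM))

    def forward(self, x):
        return self.net(x)


def fedavg(state_dicts):
    """Plain parameter averaging, as in standard horizontal FL."""
    return {key: torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
            for key in state_dicts[0]}


# Two clients holding different (vertical) feature slices of the same samples.
slices = [torch.randn(128, 40), torch.randn(128, 24)]   # toy data
labels = torch.randint(0, NUM_CLASSES, (128,))

encoders = [LocalEncoder(s.shape[1]) for s in slices]
# Because every client emits EMB_DIM-sized embeddings, a single classifier-head
# architecture can be trained horizontally across clients.
heads = [nn.Linear(EMB_DIM, NUM_CLASSES) for _ in slices]

for rnd in range(3):                                     # a few toy federated rounds
    for enc, head, x in zip(encoders, heads, slices):
        opt = torch.optim.SGD(list(enc.parameters()) + list(head.parameters()), lr=0.05)
        for _ in range(5):                               # local training steps
            opt.zero_grad()
            loss = nn.functional.cross_entropy(head(enc(x)), labels)
            loss.backward()
            opt.step()
    # Horizontal FL step: average the heads, which share one architecture
    # because they all operate in the joint embedding space.
    avg_head = fedavg([h.state_dict() for h in heads])
    for h in heads:
        h.load_state_dict(avg_head)
```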