Figure 3: CIFAR10 test accuracy vs. global epoch.
Figure 4: FashionMNIST average train loss vs. global epoch.
Figure 5: FashionMNIST average test loss vs. global epoch.
Figure 6: FashionMNIST test accuracy vs. global epoch.
method in the same FL system. On heterogeneous data, the experimental results show that our proposed algorithm outperforms FedAvg and FedProx in terms of average train loss, average test loss, and test accuracy. On CIFAR10, reaching 55% test accuracy takes FedAvg, FedProx, and our proposed method 95, 84, and 57 global epochs, respectively. On FashionMNIST, reaching 75% test accuracy takes FedAvg, FedProx, and our proposed method 25, 25, and 12 global epochs, respectively. We observed that on FashionMNIST Non-IID data, FedProx performs similarly to FedAvg.
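To make the comparison concrete, the following minimal Python sketch (not from the paper; the helper name epochs_to_reach and the variable acc_curve are illustrative assumptions) shows how such epochs-to-target numbers can be read off a recorded per-epoch test-accuracy curve.

    # Minimal sketch (assumed helper, not the authors' code) for finding
    # the first global epoch at which a recorded test-accuracy curve
    # reaches a target value, as reported in the comparison above.
    def epochs_to_reach(acc_curve, target):
        """Return the first (1-indexed) global epoch with accuracy >= target."""
        for epoch, acc in enumerate(acc_curve, start=1):
            if acc >= target:
                return epoch
        return None  # target never reached within the recorded epochs

    # Example usage with hypothetical per-method curves at the 55% CIFAR10
    # threshold used in the text:
    #   for name, curve in {"FedAvg": fedavg_acc, "FedProx": fedprox_acc,
    #                       "Ours": ours_acc}.items():
    #       print(name, epochs_to_reach(curve, 0.55))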
6 CONCLUSIONS
In federated learning, data heterogeneity across the participating clients is one of the critical challenges. Data heterogeneity causes client drift, which degrades the performance of the FL model in terms of higher loss (both train and test) and lower test accuracy. To mitigate this problem, we proposed a GMM-based approach that handles data heterogeneity by generating new local samples from globally trained GMMs. Our experimental results show that our proposed method handles data heterogeneity in an FL system better than the existing FedAvg and FedProx algorithms, improving the performance of the FL model in terms of train loss, test loss, and test accuracy.
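As an illustration of the sampling step summarized above, the following minimal Python sketch (not the authors' implementation) fits a scikit-learn GaussianMixture on a client's flattened local samples and appends synthetic draws. Names such as augment_with_gmm, local_X, and n_synthetic are hypothetical, and the sketch fits the GMM on one client's data only; it does not reproduce the paper's global GMM training.

    # Minimal sketch, assuming local_X is an (n_samples, n_features) array
    # of flattened local examples; the paper's GMMs are trained globally,
    # which this single-client sketch does not reproduce.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def augment_with_gmm(local_X, n_components=5, n_synthetic=100, seed=0):
        """Fit a GMM on local samples and append synthetic draws from it."""
        gmm = GaussianMixture(n_components=n_components, random_state=seed)
        gmm.fit(local_X)                          # estimate mixture parameters
        synthetic_X, _ = gmm.sample(n_synthetic)  # draw new local samples
        return np.vstack([local_X, synthetic_X])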