
Table 1: Comparison of model performance on the DAIC-WOZ dataset.

Paper                   Training Time (min)   Battery Used (%)   CPU Usage (%)   RAM Usage (MB)
(Ma et al., 2022)       45                    8                  85              600
(Shenaj et al., 2023)   51                    10                 90              700
(Huang et al., 2022)    44                    7                  80              600
(Lee et al., 2022)      57                    9                  88              660
Our Approach            38                    7                  80              550
in federated learning such as non-IID datasets that might occur in our use case. Moreover, we plan to deal with the challenge of heterogeneous DL models that might be deployed on the clients. Finally, we want to propose a server-less federated learning approach that does not depend on a centralized aggregation server.
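To make the server-less direction concrete, the following is a minimal sketch of gossip-style aggregation; the flat parameter vectors, random pairwise exchanges, four-client setup, and round count are illustrative assumptions, not part of our implementation.

import random
import numpy as np

# Minimal sketch (hypothetical, not our deployed system): one gossip-averaging
# round for server-less federated learning. Clients average parameters
# pairwise with a randomly chosen peer, so no central server is required.
rng = random.Random(0)

# Toy setup: 4 clients, each holding a flat 3-parameter "model".
params = {c: np.array([float(c), float(c) + 1.0, float(c) + 2.0]) for c in range(4)}

def gossip_round(params, rng):
    """Pair clients at random and replace each pair's parameters by their mean."""
    clients = list(params)
    rng.shuffle(clients)
    for a, b in zip(clients[0::2], clients[1::2]):
        mean = (params[a] + params[b]) / 2.0
        params[a], params[b] = mean, mean.copy()

for _ in range(10):
    gossip_round(params, rng)

# After enough rounds, every client approaches the global average.
print({c: p.round(3) for c, p in params.items()})

Repeated rounds drive all clients towards the global mean, i.e., the same fixed point that FedAvg (McMahan et al., 2017) reaches with a central aggregation server.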
ACKNOWLEDGEMENTS
This work is supported by the German Academic Exchange Service (DAAD) in the Ta'ziz Science Cooperations Program (AirFit Project; 57682841).
REFERENCES
American Psychiatric Association (2013). Diagnostic and Statistical Manual of Mental Disorders (DSM-5). American Psychiatric Association Publishing, Washington, D.C., 5th edition.
Dwork, C., McSherry, F., Nissim, K., and Smith, A. (2006).
Calibrating noise to sensitivity in private data analysis.
In Proceedings of the Third Conference on Theory of
Cryptography.
Furlanello, T., Lipton, Z. C., Tschannen, M., Itti, L., and Anandkumar, A. (2018). Born-again neural networks. arXiv:1805.04770.
Goodman, S. H. and Gotlib, I. H. (2002). Transmission of
risk to children of depressed parents: Integration and
conclusions. Psychological Bulletin, 128(5):768–795.
Gratch, J., Artstein, R., Lucas, G., Stratou, G., Scherer, S., Nazarian, A., Wood, R., Boberg, J., DeVault, D., Marsella, S., Traum, D., Rizzo, S., and Morency, L.-P. (2014). The distress analysis interview corpus of human and computer interviews. In Calzolari, N., Choukri, K., Declerck, T., Loftsson, H., Maegaard, B., Mariani, J., Moreno, A., Odijk, J., and Piperidis, S., editors, Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC'14), pages 3123–3128, Reykjavik, Iceland. European Language Resources Association (ELRA).
Hinton, G., Vinyals, O., and Dean, J. (2015). Distilling
the knowledge in a neural network. In NIPS Deep
Learning and Representation Learning Workshop.
Huang, W., Ye, M., and Du, B. (2022). Learn from others and be yourself in heterogeneous federated learning. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pages 10133–10143.
Lee, G., Jeong, M., Shin, Y., Bae, S., and Yun, S.-Y. (2022). Preservation of the global knowledge by not-true distillation in federated learning. In Advances in Neural Information Processing Systems (NeurIPS).
Ma, Y., Xie, Z., Wang, J., Chen, K., and Shou, L. (2022). Continual federated learning based on knowledge distillation. In Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence (IJCAI), pages 2182–2188. Main Track.
McMahan, B., Moore, E., Ramage, D., Hampson, S., and Agüera y Arcas, B. (2017). Communication-efficient learning of deep networks from decentralized data. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS), pages 1273–1282.
Mdhaffar, A., Cherif, F., Kessentini, Y., Maalej, M., Thabet, J. B., Maalej, M., Jmaiel, M., and Freisleben, B. (2019). DL4DED: Deep learning for depressive episode detection on mobile devices. In Proceedings of the 17th International Conference on Smart Homes and Health Telematics: How AI Impacts Urban Living and Public Health (ICOST), Lecture Notes in Computer Science, pages 109–121, New York City, NY, USA. Springer.
National Institute of Mental Health (2023). Bipolar disorder.
Powers, D. M. (2011). Evaluation: From precision, recall and F-measure to ROC, informedness, markedness and correlation. arXiv preprint arXiv:2010.16061.
Rosenthal, N. E., Sack, D. A., Gillin, J. C., Lewy, A. J., Goodwin, F. K., Davenport, Y., Mueller, P. S., Newsome, D. A., and Wehr, T. A. (1984). Seasonal affective disorder: A description of the syndrome and preliminary findings with light therapy. Archives of General Psychiatry, 41(1):72–80.
Shenaj, D., Toldo, M., Rigon, A., and Zanuttigh, P. (2023). Asynchronous federated continual learning. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops.
World Health Organization (2023). Depression: Key facts.
Yim, J., Joo, D., Bae, J., and Kim, J. (2017). A gift from knowledge distillation: Fast optimization, network minimization and transfer learning. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
Zhang, J., Chen, C., Zhuang, W., and Lyu, L. (2023). TARGET: Federated class-continual learning via exemplar-free distillation. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV).
Zhang, Y., Xiang, T., Hospedales, T. M., and Lu, H. (2018). Deep mutual learning. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR).