Multilingual Depression Detection Based on Speech Signals and Deep Learning: 2024 IEEE 10th International Conference on Big Data Computing Service and Machine Learning Applications (BigDataService)

Lidan Liu*, F. Tydeman, Wanqing Xie, Y. Wang

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

Abstract

Current assessments for depressive disorder are often influenced by cognitive function, making them susceptible to bias. Deep learning could provide more objective diagnoses with fewer access barriers for individuals who are unable to complete traditional assessments. In this study, we explore the relationship between speech, language, and depression to demonstrate the feasibility of multilingual speech-based depression detection, and then build deep learning models on multilingual speech samples to support depression diagnosis. We first used a newly collected Chinese speech depression dataset to build a convolutional neural network (CNN) for depression detection; its accuracy on the test set reached 0.85. We then evaluated the same CNN model on the English depression speech dataset DAIC-WOZ, where the test-set accuracy was 0.73. When the model was trained on both Chinese and English speech samples and tested on mixed-language speech, the accuracy reached 0.74. We found that the CNN model can be applied across languages with relatively stable depression-detection performance. This provides evidence that it is possible to develop a language-independent depression detection tool to support depression diagnosis and enable worldwide long-term mental health monitoring.
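For illustration, a minimal PyTorch sketch of what a speech-based CNN classifier of this kind might look like is given below. The abstract does not describe the actual architecture, so the mel-spectrogram input shape, layer sizes, and class names here are all hypothetical assumptions, not the authors' model.

import torch
import torch.nn as nn

class SpeechDepressionCNN(nn.Module):
    # Hypothetical CNN over single-channel mel-spectrograms of shape
    # (1, n_mels, n_frames); all layer sizes are illustrative only.
    def __init__(self, n_mels: int = 64, n_frames: int = 128):
        super().__init__()
        # Two conv blocks extract local time-frequency patterns.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # -> (16, n_mels/2, n_frames/2)
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # -> (32, n_mels/4, n_frames/4)
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (n_mels // 4) * (n_frames // 4), 64),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(64, 2),  # depressed vs. non-depressed logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: a batch of 8 single-channel mel-spectrograms.
model = SpeechDepressionCNN()
logits = model(torch.randn(8, 1, 64, 128))  # -> shape (8, 2)

Because the convolutional features operate on time-frequency patterns rather than on language-specific tokens, a model of this shape can in principle be trained and evaluated across languages, which is the property the study tests.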
Original language: English
Pages: 115-116
Number of pages: 2
DOIs
Publication status: Published - 29 Oct 2024
