Adaptation of Transformer-based Models for Depression Detection

Authors

  • Olaronke Oluwayemisi Adebanji, Instituto Politécnico Nacional
  • Olumide Ebenezer Ojo, Instituto Politécnico Nacional
  • Hiram Calvo, Instituto Politécnico Nacional
  • Irina Gelbukh, Instituto Politécnico Nacional
  • Grigori Sidorov, Instituto Politécnico Nacional

DOI:

https://doi.org/10.13053/cys-28-1-4691

Keywords:

Bag-of-words, Word2Vec, GloVe, Machine Learning, Deep Learning, Transformers, Sentiment Analysis

Abstract

Pre-trained language models are able to capture a broad range of knowledge and language patterns in text and can be fine-tuned for specific tasks. In this paper, we focus on evaluating the effectiveness of various traditional machine learning and pre-trained language models in identifying depression through the analysis of text from social media. We examined different feature representations with the traditional machine learning models, explored the impact of pre-training on the transformer models, and compared their performance. Using BoW, Word2Vec, and GloVe representations, the machine learning models we experimented with achieved strong accuracy in the task of detecting depression. However, the pre-trained language models exhibited outstanding performance, consistently achieving accuracy, precision, recall, and F1 scores of approximately 0.98 or higher.
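
For illustration, the sketch below shows a traditional baseline of the kind the abstract describes: bag-of-words features feeding a classical classifier, evaluated on accuracy, precision, recall, and F1. This is a minimal, hypothetical setup using scikit-learn; the dataset file, column names, and the choice of logistic regression are assumptions for demonstration, not the authors' exact configuration.

```python
# Illustrative sketch only: a bag-of-words baseline of the kind described
# in the abstract. Dataset path, column names, and classifier choice are
# assumptions, not the authors' exact setup.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Hypothetical CSV with a "text" column (social-media posts) and a binary
# "label" column (1 = depression, 0 = control).
df = pd.read_csv("depression_posts.csv")
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42, stratify=df["label"]
)

# Bag-of-words features feeding a traditional machine learning classifier.
model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Report accuracy, precision, recall, and F1 on the held-out split.
print(classification_report(y_test, model.predict(X_test)))
```

The transformer side of the comparison would analogously fine-tune a pre-trained language model on the same train/test split and report the same metrics, which is the comparison the paper draws.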

Published

2024-03-20

Issue

Vol. 28 No. 1 (2024)

Section

Articles