banglamentalBERT on mBERT (DAPT) - Depression Severity Detection

This is a Domain-Adaptive Pre-Trained (DAPT) model for detecting depression severity in Bangla social media text. Because it builds on a multilingual base model, it can also handle code-mixed (mixed-language) input.

Model Details

  • Base Architecture: google-bert/bert-base-multilingual-cased
  • Training Method: Domain-Adaptive Pre-Training (DAPT) on Bangla mental health corpora, followed by fine-tuning.
  • Task: Multi-class Classification (4 classes).
  • Language: Bengali (Bangla).
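
DAPT continues masked-language-model pretraining on in-domain text before the classification head is fine-tuned. The core of MLM pretraining is BERT-style token masking; below is a minimal, self-contained sketch of the standard 80/10/10 masking rule (the token IDs, `mask_id`, and vocabulary size are illustrative, not taken from this model's config):

```python
import random

def mask_tokens(token_ids, mask_id, vocab_size, mlm_prob=0.15, seed=0):
    """BERT-style masking: ~15% of positions are selected for prediction;
    of those, 80% become [MASK], 10% a random token, 10% stay unchanged."""
    rng = random.Random(seed)
    inputs = list(token_ids)
    labels = [-100] * len(inputs)  # -100 = position ignored by the MLM loss
    for i, tok in enumerate(inputs):
        if rng.random() < mlm_prob:
            labels[i] = tok  # the model must recover the original token here
            r = rng.random()
            if r < 0.8:
                inputs[i] = mask_id
            elif r < 0.9:
                inputs[i] = rng.randrange(vocab_size)
            # else: leave the token unchanged
    return inputs, labels
```

In practice this is handled by `DataCollatorForLanguageModeling(mlm=True)` from the transformers library when running the continued-pretraining step.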

Label Mapping

The model outputs one of the following classes:

  • 0: Minimum/None
  • 1: Mild
  • 2: Moderate
  • 3: Severe
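
The mapping above can be written as a plain dictionary for decoding predictions (a sketch; the exact label strings stored in the model's `id2label` config may differ):

```python
# Severity classes in index order, matching the model's 4 output logits
id2label = {0: "Minimum/None", 1: "Mild", 2: "Moderate", 3: "Severe"}
label2id = {v: k for k, v in id2label.items()}  # inverse mapping
```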

Usage

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Load the model
model_name = "SrothJr/banglamentalBERT-mBERT-dapt"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

# Inference
text = "আপনার বাংলা টেক্সট এখানে (Your Bengali text here)"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

prediction = torch.argmax(outputs.logits, dim=-1)
labels = ["Minimum/None", "Mild", "Moderate", "Severe"]
print(f"Prediction: {labels[prediction.item()]}")
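
If a confidence score is wanted alongside the predicted class, apply softmax to the logits. The snippet below is a sketch using made-up logits for one input; in practice, substitute `outputs.logits` from the usage example above:

```python
import torch

logits = torch.tensor([[0.1, 2.3, 0.4, -1.0]])  # hypothetical model output
probs = torch.softmax(logits, dim=-1)           # normalize to probabilities
pred = torch.argmax(probs, dim=-1).item()
labels = ["Minimum/None", "Mild", "Moderate", "Severe"]
print(f"Prediction: {labels[pred]} (confidence: {probs[0, pred].item():.2%})")
```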

Citation

If you use this model, please cite our research.

