How to use mnaylor/psychbert-finetuned-multiclass with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="mnaylor/psychbert-finetuned-multiclass")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("mnaylor/psychbert-finetuned-multiclass")
model = AutoModelForSequenceClassification.from_pretrained("mnaylor/psychbert-finetuned-multiclass")
```
# PsychBERT Fine-Tuned on a Simple Multi-class Problem
This is a version of https://huggingface.co/mnaylor/psychbert-cased, fine-tuned to illustrate performance on a multi-class classification task: detecting different types of language relating to mental health. The classes are as follows:
- 0: Negative / unrelated to mental health
- 1: Mental illnesses
- 2: Anxiety
- 3: Depression
- 4: Social anxiety
- 5: Loneliness
The dataset for this model was taken from Reddit and Twitter, and labels were assigned based on whether a post appeared in certain subreddits or contained certain hashtags. For more information, see the PsychBERT paper.
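To turn the model's raw output into the class names above, the class indices can be mapped through a label dictionary. A minimal sketch, assuming the checkpoint's label indices 0–5 correspond to the class list in this card (the `id2label` mapping below is hypothetical; the checkpoint's `config.json` is the source of truth):

```python
import math

# Hypothetical id2label mapping, taken from the class list in this card.
# Verify against the checkpoint's config.json before relying on it.
id2label = {
    0: "Negative / unrelated to mental health",
    1: "Mental illnesses",
    2: "Anxiety",
    3: "Depression",
    4: "Social anxiety",
    5: "Loneliness",
}

def predict_label(logits):
    """Map a vector of 6 class logits to (class name, probability)."""
    # Softmax over the logits; argmax of the probabilities picks the class.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return id2label[best], probs[best]

# Placeholder logits for illustration; real values come from
# model(**tokenizer(text, return_tensors="pt")).logits
label, prob = predict_label([0.1, 0.2, 2.5, 0.3, 0.1, 0.0])
print(label, round(prob, 3))
```

The `pipeline` helper performs this mapping automatically; the manual version is only needed when working with the model's logits directly.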