---
language: en
library_name: transformers
tags:
- multilabel-classification
- deberta-v3
- opp115
metrics:
- macro_f1
- micro_f1
- weighted_f1
- macro_precision
- macro_recall
---
# DeBERTaV3 Base - OPP115 Multilabel (v2)

A DeBERTaV3-base model fine-tuned for multi-label classification of privacy-policy text segments on the OPP-115 dataset.
## 📊 Evaluation Metrics
| Metric | Score |
|---|---|
| Macro F1 | 0.8092 |
| Micro F1 | 0.8565 |
| Weighted F1 | 0.8531 |
| Macro Precision | 0.8657 |
| Macro Recall | 0.7697 |
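The three F1 variants above weight labels differently: macro F1 averages per-label F1 scores equally, micro F1 pools true/false positives and negatives across all labels before computing F1, and weighted F1 weights each label's F1 by its support. A minimal NumPy sketch on toy data (illustrative only, not the evaluation script used for this model) showing how the three averages diverge:

```python
import numpy as np

def f1_scores(y_true, y_pred):
    """Macro, micro, and weighted F1 for multi-label arrays of shape (samples, labels)."""
    tp = ((y_true == 1) & (y_pred == 1)).sum(axis=0)
    fp = ((y_true == 0) & (y_pred == 1)).sum(axis=0)
    fn = ((y_true == 1) & (y_pred == 0)).sum(axis=0)
    # Per-label F1; the max() guards against a zero denominator
    per_label = 2 * tp / np.maximum(2 * tp + fp + fn, 1)
    macro = per_label.mean()  # every label counts equally
    micro = 2 * tp.sum() / (2 * tp.sum() + fp.sum() + fn.sum())  # pooled counts
    support = y_true.sum(axis=0)
    weighted = (per_label * support).sum() / support.sum()  # weighted by label frequency
    return macro, micro, weighted

# Toy multi-label matrix: 3 samples x 3 labels
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 0], [1, 0, 0]])
print(f1_scores(y_true, y_pred))  # macro 5/9, micro 0.75, weighted 2/3
```

Rare labels drag macro F1 down more than micro F1, which is why the macro score above (0.8092) sits below the micro score (0.8565).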
## 🧪 Usage
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "Hacktrix-121/deberta-v3-base-opp115-multilabel-v2"
model = AutoModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model.eval()

text = "Your input text here"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label head: apply a sigmoid per label, not a softmax over labels
probs = torch.sigmoid(logits)
```
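Because this is a multi-label head, each probability is thresholded independently to produce zero or more labels per input. A sketch of decoding predictions, where the 0.5 cutoff, the toy probability values, and the four-category `id2label` subset are illustrative assumptions; with the real model, read label names from `model.config.id2label`:

```python
import torch

# Illustrative sigmoid probabilities for a 4-label slice (the real model
# outputs one probability per OPP-115 category)
probs = torch.tensor([[0.91, 0.12, 0.55, 0.08]])

THRESHOLD = 0.5  # assumed cutoff; tune per label on a validation set
predicted = (probs >= THRESHOLD).nonzero(as_tuple=True)[1].tolist()

# Illustrative subset of the OPP-115 categories; with the real model use
# model.config.id2label instead of this hard-coded mapping
id2label = {
    0: "First Party Collection/Use",
    1: "Third Party Sharing/Collection",
    2: "Data Security",
    3: "User Choice/Control",
}
labels = [id2label[i] for i in predicted]
print(labels)  # ['First Party Collection/Use', 'Data Security']
```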