Dataset: SayedShaun/sentigold
How to use SayedShaun/bangla-classifier-multiclass with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="SayedShaun/bangla-classifier-multiclass")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("SayedShaun/bangla-classifier-multiclass")
model = AutoModelForSequenceClassification.from_pretrained("SayedShaun/bangla-classifier-multiclass")
```

This is a Bangla multiclass sentiment classification model, fine-tuned on top of csebuetnlp/banglabert. The model was trained on the SayedShaun/sentigold dataset.
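When the tokenizer and model are loaded directly, a forward pass returns raw logits rather than labeled scores. A minimal inference sketch (assumes `torch` is installed; the example sentence and rounding are arbitrary):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "SayedShaun/bangla-classifier-multiclass"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Tokenize a sample sentence and run it through the model without tracking gradients
inputs = tokenizer("ডেলিভারি ম্যান খুব যত্ন সহকারে পণ্যটি ডেলিভারি করেছে", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Softmax turns logits into probabilities; argmax picks the predicted class id
probs = torch.softmax(logits, dim=-1)[0]
pred = int(probs.argmax())
print(model.config.id2label[pred], round(float(probs[pred]), 4))
```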
Example usage:

```python
from transformers import pipeline

pipe = pipeline("text-classification", model="SayedShaun/bangla-classifier-multiclass")
# Input: "The delivery man delivered the product very carefully"
response = pipe("ডেলিভারি ম্যান খুব যত্ন সহকারে পণ্যটি ডেলিভারি করেছে")
print(response)
# >>> [{'label': 'LABEL_0', 'score': 0.9503920674324036}]
```
The label ids map to sentiment classes as follows:

```python
{"SP": 0, "WP": 1, "WN": 2, "SN": 3, "NU": 4}
```

- SP: Strongly Positive
- WP: Weakly Positive
- WN: Weakly Negative
- SN: Strongly Negative
- NU: Neutral
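Since the pipeline reports generic `LABEL_<id>` names, its output can be mapped back to the sentiment classes above. A small helper sketch built from the mapping on this card (the `decode` function name is illustrative):

```python
# Mapping from the card: class abbreviation -> label id
LABEL2ID = {"SP": 0, "WP": 1, "WN": 2, "SN": 3, "NU": 4}
ID2ABBREV = {v: k for k, v in LABEL2ID.items()}
FULL_NAME = {
    "SP": "Strongly Positive",
    "WP": "Weakly Positive",
    "WN": "Weakly Negative",
    "SN": "Strongly Negative",
    "NU": "Neutral",
}

def decode(label: str) -> str:
    """Turn a pipeline label like 'LABEL_0' into a readable class name."""
    idx = int(label.rsplit("_", 1)[1])
    return FULL_NAME[ID2ABBREV[idx]]

print(decode("LABEL_0"))  # → Strongly Positive
```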
| Training Loss | Validation Loss | Accuracy | Precision | Recall | F1 Score |
|---|---|---|---|---|---|
| 0.820600 | 0.916846 | 0.646714 | 0.649295 | 0.642749 | 0.643535 |
The training source code is available in the Files and versions tab as finetune.py.
Base model: csebuetnlp/banglabert