Dataset: google-research-datasets/go_emotions
How to use bhadresh-savani/bert-base-go-emotion with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="bhadresh-savani/bert-base-go-emotion")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bhadresh-savani/bert-base-go-emotion")
model = AutoModelForSequenceClassification.from_pretrained("bhadresh-savani/bert-base-go-emotion")
```

Training configuration:

Num examples = 169208
Num Epochs = 3
Instantaneous batch size per device = 16
Total train batch size (w. parallel, distributed & accumulation) = 16
Gradient Accumulation steps = 1
Total optimization steps = 31728
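The optimization step count follows directly from the numbers above (with gradient accumulation of 1, one batch is one step); a quick sanity check:

```python
import math

# Values from the training log above.
num_examples = 169208
batch_size = 16
epochs = 3

# The final partial batch still counts as a step, hence the ceiling.
steps_per_epoch = math.ceil(num_examples / batch_size)
total_steps = steps_per_epoch * epochs
print(total_steps)  # 31728, matching the logged value
```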
Results:

train_loss = 0.12085497042373672
eval_accuracy_thresh = 0.9614765048027039
eval_loss = 0.1164659634232521
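GoEmotions is a multi-label dataset, and the `eval_accuracy_thresh` metric above suggests predictions are decoded by thresholding per-label sigmoid probabilities rather than taking a softmax argmax. A minimal sketch of that decoding step (the threshold value, logits, and label subset are illustrative assumptions, not taken from this model):

```python
import math

def multilabel_predict(logits, labels, threshold=0.5):
    # Element-wise sigmoid, then keep every label whose probability clears
    # the threshold. A softmax would force the emotions to compete for one
    # slot, which is wrong for multi-label data: a text can carry several.
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    return [(label, round(p, 4)) for label, p in zip(labels, probs) if p >= threshold]

# Hypothetical logits for three of the 28 GoEmotions labels.
print(multilabel_predict([2.0, -1.5, 0.3], ["joy", "anger", "optimism"]))
# → [('joy', 0.8808), ('optimism', 0.5744)]
```

With a real model, the same effect is obtained by reading the raw logits from `model(**tokenizer(text, return_tensors="pt")).logits` and applying the sigmoid yourself.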