How to use JIWON/bert-base-finetuned-nli with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="JIWON/bert-base-finetuned-nli")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("JIWON/bert-base-finetuned-nli")
model = AutoModelForSequenceClassification.from_pretrained("JIWON/bert-base-finetuned-nli")
```

This model is a fine-tuned version of klue/bert-base on the klue dataset. It achieves the following results on the evaluation set:
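As a minimal sketch of end-to-end NLI inference with this checkpoint: the Korean premise/hypothesis pair below is an illustrative example (not from the card), and the use of the tokenizer's standard sentence-pair encoding is an assumption; check `model.config.id2label` for the checkpoint's actual label names.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "JIWON/bert-base-finetuned-nli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Illustrative premise/hypothesis pair (KLUE-NLI style, Korean).
premise = "그 남자는 공원에서 책을 읽고 있다."
hypothesis = "한 사람이 야외에 있다."

# Encode premise and hypothesis as a sentence pair (assumed input convention).
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Softmax over the class logits, then map the argmax to its label name.
probs = logits.softmax(dim=-1)
pred = model.config.id2label[probs.argmax(dim=-1).item()]
print(pred, probs.tolist())
```

The same pair can also go through the pipeline above as `pipe({"text": premise, "text_pair": hypothesis})`.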
Model description: More information needed

Intended uses & limitations: More information needed

Training and evaluation data: More information needed
The following hyperparameters were used during training: More information needed

Training results:
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| No log | 1.0 | 196 | 0.6210 | 0.085 |
| No log | 2.0 | 392 | 0.5421 | 0.0643 |
| 0.5048 | 3.0 | 588 | 0.5523 | 0.062 |
| 0.5048 | 4.0 | 784 | 0.5769 | 0.0533 |
| 0.5048 | 5.0 | 980 | 0.5959 | 0.052 |