BulBERT-ct21-5pochs

Part of the BulBERT-finetunes-BgGLUE collection.
How to use mor40/BulBERT-ct21-5pochs with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="mor40/BulBERT-ct21-5pochs")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("mor40/BulBERT-ct21-5pochs")
model = AutoModelForSequenceClassification.from_pretrained("mor40/BulBERT-ct21-5pochs")
```
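For a quick sanity check, the pipeline can be called directly on Bulgarian text. This is only a minimal sketch: the example sentence is made up, and the returned label names and score depend on the model's id2label configuration, so treat the printed output as illustrative.

```python
from transformers import pipeline

# Reload the classification pipeline for this checkpoint.
pipe = pipeline("text-classification", model="mor40/BulBERT-ct21-5pochs")

# Illustrative Bulgarian input; CT21 in bgGLUE is a check-worthiness estimation task.
result = pipe("Новият вариант на вируса се разпространява два пъти по-бързо от предишния.")
print(result)  # e.g. [{'label': 'LABEL_1', 'score': 0.93}] -- labels depend on id2label
```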
This model is a fine-tuned version of mor40/BulBERT-chitanka-model on the bgglue dataset. It achieves the evaluation-set results shown in the training-results table below.

Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed
Training hyperparameters: More information needed.

Training results:
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| No log | 1.0 | 163 | 0.4891 | 0.7743 |
| No log | 2.0 | 326 | 0.5475 | 0.8257 |
| No log | 3.0 | 489 | 0.7889 | 0.82 |
| 0.288 | 4.0 | 652 | 0.9438 | 0.8286 |
| 0.288 | 5.0 | 815 | 1.0051 | 0.84 |
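Since the exact hyperparameters are not listed above, the following is only a minimal fine-tuning sketch with the Trainer API under assumed settings: the dataset identifier and configuration name (`bgglue`, `ct21`), the text/label column names, the learning rate, and the batch size are assumptions rather than the values used to produce this checkpoint; only the 5 training epochs mirror the results table.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "mor40/BulBERT-chitanka-model"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=2)  # binary task assumed

# Assumed dataset id, config, and column names -- adjust to the actual bgGLUE loading code.
dataset = load_dataset("bgglue", "ct21")
dataset = dataset.map(lambda batch: tokenizer(batch["text"], truncation=True), batched=True)

args = TrainingArguments(
    output_dir="BulBERT-ct21-5pochs",
    num_train_epochs=5,              # matches the 5 epochs in the results table
    learning_rate=5e-5,              # assumed; not recorded in the card
    per_device_train_batch_size=16,  # assumed; not recorded in the card
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    tokenizer=tokenizer,             # enables dynamic padding via DataCollatorWithPadding
)
trainer.train()
print(trainer.evaluate())            # final evaluation on the validation split
```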
Base model: mor40/BulBERT-chitanka-model