# BulBERT-ner-udep-5epochs
## How to use

To use mor40/BulBERT-ner-udep-5epochs with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="mor40/BulBERT-ner-udep-5epochs")
```
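As a quick smoke test, you can call the pipeline on a Bulgarian sentence. The sentence and printed fields below are illustrative and not from the original card; the exact entity labels depend on the tag set used for fine-tuning:

```python
# Tag an example sentence; each result carries the token, its predicted
# entity label, and a confidence score. (Example input, not from the card.)
results = pipe("Иван живее в София.")
for r in results:
    print(r["word"], r["entity"], round(r["score"], 3))
```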
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("mor40/BulBERT-ner-udep-5epochs")
model = AutoModelForTokenClassification.from_pretrained("mor40/BulBERT-ner-udep-5epochs")
```

This model is a fine-tuned version of [mor40/BulBERT-chitanka-model](https://huggingface.co/mor40/BulBERT-chitanka-model) on the bgglue dataset. It achieves the following results on the evaluation set (final epoch):

- Loss: 0.1089
- Precision: 0.9753
- Recall: 0.9753
- F1: 0.9753
- Accuracy: 0.9778
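If you use the directly loaded tokenizer and model instead of the pipeline, a minimal inference sketch looks like this; the input sentence is again only an example, and the labels are read from the model config:

```python
# Run the directly loaded model on one sentence and print a label per token.
# The sentence below is an illustrative input, not from the model card.
import torch

text = "Иван живее в София."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[int(label_id)])
```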
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.1134 | 1.0 | 1114 | 0.0996 | 0.9674 | 0.9673 | 0.9673 | 0.9721 |
| 0.0578 | 2.0 | 2228 | 0.0933 | 0.9728 | 0.9722 | 0.9725 | 0.9760 |
| 0.0321 | 3.0 | 3342 | 0.0993 | 0.9739 | 0.9746 | 0.9743 | 0.9769 |
| 0.0178 | 4.0 | 4456 | 0.1054 | 0.9746 | 0.9750 | 0.9748 | 0.9776 |
| 0.0096 | 5.0 | 5570 | 0.1089 | 0.9753 | 0.9753 | 0.9753 | 0.9778 |
Base model: [mor40/BulBERT-chitanka-model](https://huggingface.co/mor40/BulBERT-chitanka-model)