How to use ViktorDo/BERT-finetuned-ner-conll2003 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="ViktorDo/BERT-finetuned-ner-conll2003")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("ViktorDo/BERT-finetuned-ner-conll2003")
model = AutoModelForTokenClassification.from_pretrained("ViktorDo/BERT-finetuned-ner-conll2003")
```

This model is a fine-tuned version of bert-base-cased on the conll2003 dataset; its results on the evaluation set are shown in the training results table below.
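As a quick usage sketch, the pipeline object defined above can be applied directly to text. The example sentence is made up, the printed fields follow the standard token-classification pipeline output, and the predicted labels (e.g. B-PER, I-LOC) come from the CoNLL-2003 tagging scheme:

```python
# Illustrative sentence; any English text works.
example = "George Washington lived in Virginia."

# Each prediction is a dict with keys such as "word", "entity", "score",
# "start" and "end"; labels follow the CoNLL-2003 IOB scheme (e.g. B-PER, I-LOC).
for prediction in pipe(example):
    print(prediction["word"], prediction["entity"], round(prediction["score"], 3))
```

With the tokenizer and model loaded directly, per-sub-token predictions can be read off the raw logits:

```python
import torch

inputs = tokenizer(example, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map each sub-token's highest-scoring class id back to its label name.
predicted_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id])
```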
Model description: more information needed.
Intended uses & limitations: more information needed.
Training and evaluation data: more information needed.
Training hyperparameters: more information needed.

Per-epoch training results:
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.0855 | 1.0 | 1756 | 0.0637 | 0.9180 | 0.9369 | 0.9274 | 0.9832 |
| 0.0354 | 2.0 | 3512 | 0.0639 | 0.9301 | 0.9473 | 0.9386 | 0.9859 |
| 0.018 | 3.0 | 5268 | 0.0607 | 0.9317 | 0.9478 | 0.9397 | 0.9862 |
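For context, below is a minimal sketch of how such a fine-tuning run is typically set up with the Trainer API. This is not the author's script: the card does not list the hyperparameters, so the values used here are assumptions (the batch size of 8 is only inferred from the 1756 steps per epoch over the roughly 14k CoNLL-2003 training sentences), and the label-alignment helper is a standard pattern rather than the original code.

```python
# Minimal fine-tuning sketch (hyperparameter values are assumptions), not the original script.
from datasets import load_dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("conll2003")
label_names = dataset["train"].features["ner_tags"].feature.names
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize_and_align(batch):
    # Tokenize pre-split words and copy each word's NER tag to its first sub-token;
    # remaining sub-tokens and special tokens get the ignore index -100.
    tokenized = tokenizer(batch["tokens"], truncation=True, is_split_into_words=True)
    labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        previous = None
        row = []
        for word_id in tokenized.word_ids(batch_index=i):
            row.append(-100 if word_id is None or word_id == previous else tags[word_id])
            previous = word_id
        labels.append(row)
    tokenized["labels"] = labels
    return tokenized

tokenized_dataset = dataset.map(tokenize_and_align, batched=True)

model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased",
    num_labels=len(label_names),
    id2label=dict(enumerate(label_names)),
    label2id={name: i for i, name in enumerate(label_names)},
)

args = TrainingArguments(
    output_dir="BERT-finetuned-ner-conll2003",
    num_train_epochs=3,              # matches the three epochs in the table above
    per_device_train_batch_size=8,   # assumption, consistent with 1756 steps per epoch
    learning_rate=2e-5,              # assumption: a common default for BERT fine-tuning
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized_dataset["train"],
    eval_dataset=tokenized_dataset["validation"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```

Evaluation scheduling and metric computation (the Precision/Recall/F1/Accuracy columns above) are omitted from the sketch; they are typically computed per epoch with seqeval over the validation split.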