Dataset: eriktks/conll2003
How to use shre-db/bert-finetuned-ner with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="shre-db/bert-finetuned-ner")
```
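For example, running the pipeline on a sample sentence (the sentence and the `aggregation_strategy` option are illustrative, not from the original card):

```python
from transformers import pipeline

# aggregation_strategy="simple" merges sub-word tokens into whole entities
ner = pipeline(
    "token-classification",
    model="shre-db/bert-finetuned-ner",
    aggregation_strategy="simple",
)
print(ner("My name is Wolfgang and I live in Berlin."))
# e.g. [{'entity_group': 'PER', 'word': 'Wolfgang', ...},
#       {'entity_group': 'LOC', 'word': 'Berlin', ...}]
```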
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("shre-db/bert-finetuned-ner")
model = AutoModelForTokenClassification.from_pretrained("shre-db/bert-finetuned-ner")
```
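A minimal sketch of tagging a sentence without the pipeline, continuing from the `tokenizer` and `model` loaded above (the example sentence is illustrative):

```python
import torch

# Tokenize, run a forward pass, and take the highest-scoring label per token
inputs = tokenizer("My name is Wolfgang and I live in Berlin.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_ids = logits.argmax(dim=-1)[0]

# Map label ids back to the BIO tag names stored in the model config
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id.item()])
```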
This model is a fine-tuned version of bert-base-cased on the conll2003 dataset. It achieves the following results on the evaluation set:

- Loss: 0.0577
- Precision: 0.9314
- Recall: 0.9504
- F1: 0.9408
- Accuracy: 0.9867

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
The following results were recorded during training:
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.0781 | 1.0 | 1756 | 0.0729 | 0.9083 | 0.9349 | 0.9214 | 0.9807 |
| 0.0413 | 2.0 | 3512 | 0.0581 | 0.9196 | 0.9465 | 0.9328 | 0.9854 |
| 0.0268 | 3.0 | 5268 | 0.0577 | 0.9314 | 0.9504 | 0.9408 | 0.9867 |
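For context, the precision, recall, and F1 columns are entity-level scores of the kind computed by the seqeval library (an assumption; the card does not name the scorer). A toy sketch of that style of scoring:

```python
# Entity-level scoring over BIO-tagged sequences (toy data, illustrative only)
from seqeval.metrics import f1_score, precision_score, recall_score

y_true = [["B-PER", "I-PER", "O", "B-LOC"], ["B-ORG", "O"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC"], ["O", "O"]]

print(precision_score(y_true, y_pred))  # 1.0: every predicted entity is correct
print(recall_score(y_true, y_pred))     # ~0.67: the ORG entity was missed
print(f1_score(y_true, y_pred))         # 0.8
```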
Base model: google-bert/bert-base-cased