How to use Sharpaxis/BERT-NER-CoNLL with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="Sharpaxis/BERT-NER-CoNLL")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("Sharpaxis/BERT-NER-CoNLL")
model = AutoModelForTokenClassification.from_pretrained("Sharpaxis/BERT-NER-CoNLL")
```

This model is a fine-tuned version of bert-large-uncased on the conll2003 dataset. On the evaluation set it reaches a validation loss of 0.1243 and an F1 score of 0.9106 (final epoch; see the training results below).
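For reference, here is a minimal inference sketch built on the pipeline above. The example sentence and the `aggregation_strategy="simple"` setting are illustrative assumptions, not part of the original card; the strategy merges word-piece tokens back into whole-entity spans.

```python
from transformers import pipeline

# aggregation_strategy="simple" groups sub-word tokens into full entity spans
pipe = pipeline(
    "token-classification",
    model="Sharpaxis/BERT-NER-CoNLL",
    aggregation_strategy="simple",
)

# Illustrative input; any English sentence works
results = pipe("Angela Merkel visited the Google office in Paris.")
for entity in results:
    # Each result carries the entity group, confidence score, and span text
    print(entity["entity_group"], round(entity["score"], 3), entity["word"])
```

Since CoNLL-2003 defines the PER, ORG, LOC, and MISC entity types, those are the entity groups this model should emit.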
Training results per epoch:
| Training Loss | Epoch | Step | Validation Loss | F1 |
|---|---|---|---|---|
| 0.115 | 1.0 | 878 | 0.1003 | 0.8983 |
| 0.0276 | 2.0 | 1756 | 0.1157 | 0.9081 |
| 0.0128 | 3.0 | 2634 | 0.1243 | 0.9106 |
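The F1 column in the table is presumably the entity-level F1 conventionally reported for CoNLL-2003 NER, as computed by the seqeval package in the standard Transformers token-classification fine-tuning scripts; the snippet below is a minimal sketch of that metric, with made-up tag sequences.

```python
# Illustration of entity-level F1 with seqeval (hypothetical tag sequences)
from seqeval.metrics import f1_score

y_true = [["B-PER", "I-PER", "O", "B-LOC", "O"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC", "O"]]

# An entity counts as correct only if its type and full span both match
print(f1_score(y_true, y_pred))  # 1.0 when all entities match exactly
```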
Base model: google-bert/bert-large-uncased