## How to use codingJacob/distilbert-base-uncased-finetuned-ner with Transformers
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="codingJacob/distilbert-base-uncased-finetuned-ner")
```
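To try the pipeline end to end, call it on a sentence; the example text and the `aggregation_strategy` argument below are illustrative additions, not from the original card:

```python
# "simple" merges sub-word tokens into whole entity spans.
results = pipe("Hugging Face is based in New York City.", aggregation_strategy="simple")
for entity in results:
    # Each prediction carries the entity group, a confidence score,
    # and the span of input text it covers.
    print(entity["entity_group"], round(float(entity["score"]), 3), entity["word"])
```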
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("codingJacob/distilbert-base-uncased-finetuned-ner")
model = AutoModelForTokenClassification.from_pretrained("codingJacob/distilbert-base-uncased-finetuned-ner")
```
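When loading the model directly, inference is a forward pass followed by an argmax over the per-token logits. A minimal sketch (the example sentence is illustrative):

```python
import torch

text = "My name is Clara and I live in Berkeley."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map each token's highest-scoring class id back to its label string.
predictions = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predictions):
    print(token, model.config.id2label[pred.item()])
```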
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the conll2003 dataset. It achieves the following results on the evaluation set (the final epoch in the training results table below):

- Loss: 0.0611
- Precision: 0.9272
- Recall: 0.9382
- F1: 0.9327
- Accuracy: 0.9843
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
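The card gives no detail on the data, but CoNLL-2003 is available through the `datasets` library. A minimal sketch of inspecting it, assuming the standard conll2003 field names (`tokens`, `ner_tags`):

```python
from datasets import load_dataset

# CoNLL-2003 token-classification data: words plus NER tags in IOB2 format.
dataset = load_dataset("eriktks/conll2003")

example = dataset["train"][0]
label_names = dataset["train"].features["ner_tags"].feature.names
print(example["tokens"])
print([label_names[tag] for tag in example["ner_tags"]])
```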
## Training procedure

### Training hyperparameters

More information needed

### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.2432 | 1.0 | 878 | 0.0689 | 0.9132 | 0.9203 | 0.9168 | 0.9813 |
| 0.0507 | 2.0 | 1756 | 0.0608 | 0.9208 | 0.9346 | 0.9276 | 0.9835 |
| 0.03 | 3.0 | 2634 | 0.0611 | 0.9272 | 0.9382 | 0.9327 | 0.9843 |
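The precision, recall, and F1 figures above are entity-level scores, as is standard for CoNLL-2003. A sketch of how such scores are computed with the `seqeval` library (the tag sequences are toy examples, and the card does not state which evaluation tool was used):

```python
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

# Gold and predicted IOB2 tag sequences for two toy sentences.
y_true = [["B-PER", "I-PER", "O", "B-LOC"], ["O", "B-ORG", "O"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC"], ["O", "B-ORG", "O"]]

# seqeval scores whole entities rather than individual tokens, which is the
# convention behind CoNLL-style precision/recall/F1.
print(precision_score(y_true, y_pred))
print(recall_score(y_true, y_pred))
print(f1_score(y_true, y_pred))
print(accuracy_score(y_true, y_pred))  # token-level accuracy
```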