How to use Gladiator/distilbert-base-uncased_ner_wikiann with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="Gladiator/distilbert-base-uncased_ner_wikiann")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("Gladiator/distilbert-base-uncased_ner_wikiann")
model = AutoModelForTokenClassification.from_pretrained("Gladiator/distilbert-base-uncased_ner_wikiann")
```

This model is a fine-tuned version of distilbert-base-uncased on the wikiann dataset. It achieves the results shown in the table below on the evaluation set.
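WikiAnn annotates PER, ORG, and LOC entities in IOB2 format, so the model emits tags like `B-PER` and `I-LOC` per token. As a minimal sketch (function name and example sentence are illustrative, not from the card), here is how raw per-token tags can be merged into entity spans, which is roughly what the pipeline's `aggregation_strategy="simple"` option does for you:

```python
def merge_iob2(tokens, tags):
    """Merge per-token IOB2 tags into (label, text) entity spans."""
    entities, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always opens a new span, closing any open one.
            if current:
                entities.append(current)
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            # An I- tag extends the open span of the same label.
            current[1].append(token)
        else:
            # "O", or an I- tag with no matching open span: close out.
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(label, " ".join(words)) for label, words in entities]

tokens = ["John", "Smith", "lives", "in", "New", "York"]
tags = ["B-PER", "I-PER", "O", "O", "B-LOC", "I-LOC"]
print(merge_iob2(tokens, tags))  # [('PER', 'John Smith'), ('LOC', 'New York')]
```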
The following results were obtained during training:
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.3325 | 1.0 | 1250 | 0.2657 | 0.7732 | 0.8175 | 0.7947 | 0.9214 |
| 0.2242 | 2.0 | 2500 | 0.2505 | 0.7942 | 0.8289 | 0.8111 | 0.9262 |
| 0.158 | 3.0 | 3750 | 0.2539 | 0.8099 | 0.8367 | 0.8231 | 0.9294 |
| 0.1155 | 4.0 | 5000 | 0.2804 | 0.8172 | 0.8373 | 0.8271 | 0.9302 |
| 0.1047 | 5.0 | 6250 | 0.2834 | 0.8139 | 0.8367 | 0.8251 | 0.9300 |
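The F1 column in the table is the harmonic mean of the precision and recall columns; a small sketch (the helper name is ours) verifying that relationship for each reported row, up to rounding:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# (precision, recall, reported F1) taken from the table above
rows = [
    (0.7732, 0.8175, 0.7947),
    (0.7942, 0.8289, 0.8111),
    (0.8099, 0.8367, 0.8231),
    (0.8172, 0.8373, 0.8271),
    (0.8139, 0.8367, 0.8251),
]
for p, r, reported in rows:
    # Each reported F1 matches 2PR/(P+R) within rounding error.
    assert abs(f1_score(p, r) - reported) < 5e-4
```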