How to use LangChain12/my_awesome_wnut_model with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="LangChain12/my_awesome_wnut_model")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("LangChain12/my_awesome_wnut_model")
model = AutoModelForTokenClassification.from_pretrained("LangChain12/my_awesome_wnut_model")
```

This model is a fine-tuned version of distilbert-base-uncased on the wnut_17 dataset. Its evaluation-set results appear in the training results table below.
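The pipeline emits one tag per token; turning those into entity spans is the job of its aggregation step. As a rough illustration of what that grouping does, here is a minimal pure-Python sketch of merging BIO tags into spans. The `group_entities` helper, the example tokens, and the tags are illustrative assumptions, not part of this model card; the `location` label does come from the wnut_17 tag set.

```python
# Sketch (illustrative, not the pipeline's actual implementation) of how
# token-level BIO tags from a wnut_17-style token classifier can be
# grouped into (entity_type, text) spans.
def group_entities(tokens, tags):
    """Merge consecutive B-/I- tags into (entity_type, text) spans."""
    spans = []
    current_type, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):          # start of a new entity
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)  # continuation of the same entity
        else:                             # "O" or an inconsistent I- tag
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_tokens:                    # flush a span ending at the last token
        spans.append((current_type, " ".join(current_tokens)))
    return spans

tokens = ["Empire", "State", "Building", "in", "New", "York"]
tags = ["B-location", "I-location", "I-location", "O", "B-location", "I-location"]
print(group_entities(tokens, tags))
# → [('location', 'Empire State Building'), ('location', 'New York')]
```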
Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed
Training results:
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 213 | 0.2976 | 0.4098 | 0.1937 | 0.2631 | 0.9349 |
| No log | 2.0 | 426 | 0.2839 | 0.4705 | 0.2660 | 0.3398 | 0.9393 |
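The F1 column above should equal the harmonic mean of the Precision and Recall columns. A small consistency check, not part of the original card, confirms both rows agree with the standard formula up to rounding:

```python
# Verify that each row's reported F1 is the harmonic mean of its
# precision and recall, up to the table's 4-decimal rounding.
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

rows = [  # (precision, recall, reported_f1) from the table above
    (0.4098, 0.1937, 0.2631),  # epoch 1
    (0.4705, 0.2660, 0.3398),  # epoch 2
]
for p, r, reported in rows:
    assert abs(f1_score(p, r) - reported) < 5e-4
```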
Base model: distilbert/distilbert-base-uncased