Dataset: Salesforce/wikitext
How to use PreyumKr/pretrained-bertbase-wikitext with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="PreyumKr/pretrained-bertbase-wikitext")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForPreTraining

tokenizer = AutoTokenizer.from_pretrained("PreyumKr/pretrained-bertbase-wikitext")
model = AutoModelForPreTraining.from_pretrained("PreyumKr/pretrained-bertbase-wikitext")
```

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
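As a sketch of what the fill-mask pipeline does under the hood: the model produces a logit for every vocabulary token at the `[MASK]` position, and the pipeline softmaxes those logits to rank candidate fills. The logits below are illustrative numbers, not outputs of this model:

```python
import math

# Hypothetical logits for three candidate tokens at the [MASK] position
# (illustrative values only, not produced by this model).
logits = {"paris": 9.1, "london": 7.3, "rome": 6.8}

# Softmax over the candidate scores, as the pipeline does over the
# full vocabulary: exp(logit) normalized to sum to 1.
denom = sum(math.exp(v) for v in logits.values())
probs = {tok: math.exp(v) / denom for tok, v in logits.items()}

# The highest-probability token is the pipeline's top fill for the mask.
best = max(probs, key=probs.get)
```

The real pipeline returns the top-k tokens with their scores; this sketch only shows the scoring step.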
More information needed
Training results (per epoch; the training hyperparameters were not recorded):
| Train Loss | Validation Loss | Epoch |
|---|---|---|
| 8.2205 | 8.4585 | 0 |
| 7.7729 | 8.5769 | 1 |
| 7.6960 | 8.4921 | 2 |
| 7.6486 | 8.5941 | 3 |
| 7.5881 | 8.5897 | 4 |
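If the reported losses are mean token-level cross-entropy (an assumption; the card does not say), the corresponding perplexity is simply `exp(loss)`, which puts these values in context:

```python
import math

# Validation losses from the table above, epochs 0 through 4.
val_losses = [8.4585, 8.5769, 8.4921, 8.5941, 8.5897]

# Assuming mean token-level cross-entropy (not stated on the card),
# perplexity is the exponential of the loss.
perplexities = [math.exp(loss) for loss in val_losses]
```

Note that validation loss does not improve across epochs even as training loss falls, which is worth checking before reusing these weights.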