How to use enoriega/rule_learning_margin_test with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, BertForRuleScoring

tokenizer = AutoTokenizer.from_pretrained("enoriega/rule_learning_margin_test")
model = BertForRuleScoring.from_pretrained("enoriega/rule_learning_margin_test")
```

This model is a fine-tuned version of bert-base-uncased on the enoriega/odinsynth_dataset dataset. It reaches a validation loss of 0.4102 on the evaluation set by the end of training (see the training results table below).
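The model name suggests a margin-based ranking objective for scoring candidate rules. As an illustration only (the actual loss and the `BertForRuleScoring` head are defined in the model's own code and are not shown here), a hinge-style margin ranking loss over positive/negative rule scores could be sketched as:

```python
def margin_ranking_loss(pos_score: float, neg_score: float, margin: float = 1.0) -> float:
    """Illustrative margin loss: zero when the positive rule's score
    beats the negative rule's score by at least `margin`."""
    return max(0.0, margin - (pos_score - neg_score))

# A well-separated pair incurs no loss...
print(margin_ranking_loss(2.0, 0.5))  # -> 0.0
# ...while an inverted pair is penalized by margin plus the gap.
print(margin_ranking_loss(0.0, 1.0))  # -> 2.0
```

During training, minimizing this quantity pushes the scores of correct rules above those of incorrect ones by at least the margin; the `margin` value used for this checkpoint is not documented here.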
The following results were logged during training:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.6468 | 0.32 | 20 | 0.6191 |
| 0.5185 | 0.64 | 40 | 0.5083 |
| 0.4590 | 0.96 | 60 | 0.4521 |
| 0.4352 | 1.29 | 80 | 0.4192 |
| 0.4427 | 1.61 | 100 | 0.4199 |
| 0.4246 | 1.93 | 120 | 0.4131 |
| 0.4301 | 2.26 | 140 | 0.4104 |
| 0.4280 | 2.58 | 160 | 0.4099 |
| 0.4161 | 2.90 | 180 | 0.4102 |
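The table above can be summarized programmatically. A minimal sketch that recomputes the headline numbers from the tabulated (step, validation loss) pairs:

```python
# (step, validation loss) pairs transcribed from the training results table
history = [
    (20, 0.6191), (40, 0.5083), (60, 0.4521), (80, 0.4192),
    (100, 0.4199), (120, 0.4131), (140, 0.4104), (160, 0.4099),
    (180, 0.4102),
]

best_step, best_loss = min(history, key=lambda row: row[1])
final_step, final_loss = history[-1]

print(f"best: {best_loss} @ step {best_step}")     # best: 0.4099 @ step 160
print(f"final: {final_loss} @ step {final_step}")  # final: 0.4102 @ step 180
```

Note that the lowest validation loss occurs at step 160, slightly before the final checkpoint at step 180.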