# rule_learning_margin_1mm_spanpred

This model is a fine-tuned version of [enoriega/rule_softmatching](https://huggingface.co/enoriega/rule_softmatching) on the [enoriega/odinsynth_dataset](https://huggingface.co/datasets/enoriega/odinsynth_dataset) dataset. It achieves the results shown in the training table below on the evaluation set (final logged checkpoint: validation loss 0.3252, margin accuracy 0.8517).

## How to use

Load the tokenizer and model with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, BertForRuleScoring

tokenizer = AutoTokenizer.from_pretrained("enoriega/rule_learning_margin_1mm_spanpred")
model = BertForRuleScoring.from_pretrained("enoriega/rule_learning_margin_1mm_spanpred")
```

Note that `BertForRuleScoring` is a custom head for this model rather than a stock Transformers class, so loading it may depend on the class definition shipped with the model repository.
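Once loaded, scoring might look like the sketch below. This is an assumption, not documented usage: the rule string, the sentence, the paired-segment encoding, and the shape of the output are all hypothetical placeholders, since the card does not specify the input format the custom head expects.

```python
import torch

# Hypothetical inputs: a rule and a candidate sentence to score against it.
rule = "[word=cat] [tag=/V.*/]"
sentence = "The cat sat on the mat ."

# Assumption: the pair is encoded as two BERT segments,
# i.e. [CLS] rule [SEP] sentence [SEP].
inputs = tokenizer(rule, sentence, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Assumption: the head emits a single compatibility score per pair.
print(outputs)
```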
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training results

The following results were logged during training, with evaluation every 20 steps:
| Training Loss | Epoch | Step | Validation Loss | Margin Accuracy |
|---|---|---|---|---|
| 0.5448 | 0.16 | 20 | 0.5229 | 0.7717 |
| 0.4571 | 0.32 | 40 | 0.4292 | 0.8109 |
| 0.4296 | 0.48 | 60 | 0.4009 | 0.8193 |
| 0.4028 | 0.64 | 80 | 0.3855 | 0.8296 |
| 0.3878 | 0.8 | 100 | 0.3757 | 0.8334 |
| 0.3831 | 0.96 | 120 | 0.3643 | 0.8367 |
| 0.3591 | 1.12 | 140 | 0.3582 | 0.8393 |
| 0.3598 | 1.28 | 160 | 0.3533 | 0.8401 |
| 0.3635 | 1.44 | 180 | 0.3442 | 0.8427 |
| 0.3478 | 1.6 | 200 | 0.3406 | 0.8472 |
| 0.342 | 1.76 | 220 | 0.3352 | 0.8479 |
| 0.3327 | 1.92 | 240 | 0.3352 | 0.8486 |
| 0.3487 | 2.08 | 260 | 0.3293 | 0.8487 |
| 0.3387 | 2.24 | 280 | 0.3298 | 0.8496 |
| 0.3457 | 2.4 | 300 | 0.3279 | 0.8505 |
| 0.3483 | 2.56 | 320 | 0.3286 | 0.8510 |
| 0.3421 | 2.72 | 340 | 0.3245 | 0.8517 |
| 0.3332 | 2.88 | 360 | 0.3252 | 0.8517 |
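The card does not define the Margin Accuracy metric. Given the margin-based training setup suggested by the model name, a plausible reading is the fraction of (preferred rule, alternative rule) pairs whose scores are ordered correctly, optionally requiring separation by the training margin. A minimal sketch under that assumption, with all names and example scores hypothetical:

```python
import torch

def margin_accuracy(pos_scores: torch.Tensor,
                    neg_scores: torch.Tensor,
                    margin: float = 0.0) -> float:
    """Fraction of pairs where the preferred rule outscores the alternative
    by more than `margin`.

    This mirrors how a margin-ranking objective is typically evaluated;
    the model card does not spell out the metric, so treat this as an
    assumption rather than the card's definition.
    """
    return (pos_scores - neg_scores > margin).float().mean().item()

# Hypothetical scores for four rule pairs.
pos = torch.tensor([0.9, 0.4, 0.7, 0.8])
neg = torch.tensor([0.1, 0.5, 0.2, 0.3])
print(margin_accuracy(pos, neg))  # 0.75: three of four pairs ordered correctly
```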