# train_qqp_1744902602
This model is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.3 on the qqp dataset. It achieves the following results on the evaluation set:
- Loss: 0.0540
- Num Input Tokens Seen: 50647232
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 123
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- training_steps: 40000
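As a rough illustration of the schedule the hyperparameters above select (a minimal sketch of plain cosine decay, not the trainer's exact warmup-aware implementation), the learning rate falls from 5e-05 toward zero over the 40,000 steps, and the effective batch size is the per-device batch size times the gradient accumulation steps:

```python
import math

def cosine_lr(step: int, total_steps: int = 40000, peak_lr: float = 5e-5) -> float:
    """Plain cosine decay from peak_lr at step 0 to ~0 at total_steps
    (the shape selected by lr_scheduler_type: cosine, ignoring warmup)."""
    return 0.5 * peak_lr * (1 + math.cos(math.pi * step / total_steps))

# Effective batch size: train_batch_size * gradient_accumulation_steps.
total_train_batch_size = 4 * 4  # = 16, matching the value listed above

print(cosine_lr(0))      # peak learning rate, 5e-05
print(cosine_lr(20000))  # halfway point, 2.5e-05
print(cosine_lr(40000))  # decayed to ~0
```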
### Training results
| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|---|---|---|---|---|
| 0.0992 | 0.0098 | 200 | 0.0987 | 254944 |
| 0.0554 | 0.0195 | 400 | 0.1044 | 507488 |
| 0.0773 | 0.0293 | 600 | 0.0894 | 764128 |
| 0.0961 | 0.0391 | 800 | 0.0825 | 1016160 |
| 0.101 | 0.0489 | 1000 | 0.0868 | 1268384 |
| 0.0645 | 0.0586 | 1200 | 0.0808 | 1520384 |
| 0.0761 | 0.0684 | 1400 | 0.0762 | 1773408 |
| 0.0288 | 0.0782 | 1600 | 0.0754 | 2028480 |
| 0.0561 | 0.0879 | 1800 | 0.0716 | 2280960 |
| 0.0591 | 0.0977 | 2000 | 0.0757 | 2536512 |
| 0.0805 | 0.1075 | 2200 | 0.0815 | 2790752 |
| 0.105 | 0.1173 | 2400 | 0.0756 | 3043136 |
| 0.0619 | 0.1270 | 2600 | 0.0697 | 3298464 |
| 0.0595 | 0.1368 | 2800 | 0.0688 | 3556256 |
| 0.0654 | 0.1466 | 3000 | 0.0680 | 3809536 |
| 0.0977 | 0.1564 | 3200 | 0.0667 | 4064256 |
| 0.0857 | 0.1661 | 3400 | 0.0708 | 4319712 |
| 0.0552 | 0.1759 | 3600 | 0.0673 | 4571104 |
| 0.0794 | 0.1857 | 3800 | 0.0696 | 4822176 |
| 0.0882 | 0.1954 | 4000 | 0.0715 | 5073216 |
| 0.0528 | 0.2052 | 4200 | 0.0655 | 5327680 |
| 0.058 | 0.2150 | 4400 | 0.0675 | 5582272 |
| 0.0528 | 0.2248 | 4600 | 0.0653 | 5834624 |
| 0.0803 | 0.2345 | 4800 | 0.0663 | 6085600 |
| 0.0328 | 0.2443 | 5000 | 0.0688 | 6339520 |
| 0.0874 | 0.2541 | 5200 | 0.0653 | 6592160 |
| 0.044 | 0.2638 | 5400 | 0.0648 | 6847232 |
| 0.0692 | 0.2736 | 5600 | 0.0667 | 7101984 |
| 0.0691 | 0.2834 | 5800 | 0.0663 | 7357536 |
| 0.0491 | 0.2932 | 6000 | 0.0679 | 7610208 |
| 0.0394 | 0.3029 | 6200 | 0.0658 | 7868832 |
| 0.0686 | 0.3127 | 6400 | 0.0640 | 8121856 |
| 0.1004 | 0.3225 | 6600 | 0.0649 | 8372096 |
| 0.0561 | 0.3323 | 6800 | 0.0624 | 8628064 |
| 0.054 | 0.3420 | 7000 | 0.0659 | 8882496 |
| 0.0572 | 0.3518 | 7200 | 0.0633 | 9135616 |
| 0.0827 | 0.3616 | 7400 | 0.0672 | 9389184 |
| 0.0667 | 0.3713 | 7600 | 0.0671 | 9641344 |
| 0.0629 | 0.3811 | 7800 | 0.0635 | 9894624 |
| 0.059 | 0.3909 | 8000 | 0.0620 | 10144480 |
| 0.0487 | 0.4007 | 8200 | 0.0610 | 10398432 |
| 0.0729 | 0.4104 | 8400 | 0.0632 | 10650624 |
| 0.0621 | 0.4202 | 8600 | 0.0629 | 10901824 |
| 0.0724 | 0.4300 | 8800 | 0.0616 | 11155840 |
| 0.0663 | 0.4397 | 9000 | 0.0613 | 11413440 |
| 0.0701 | 0.4495 | 9200 | 0.0629 | 11666624 |
| 0.0417 | 0.4593 | 9400 | 0.0605 | 11917248 |
| 0.0707 | 0.4691 | 9600 | 0.0605 | 12168736 |
| 0.0564 | 0.4788 | 9800 | 0.0602 | 12423520 |
| 0.037 | 0.4886 | 10000 | 0.0604 | 12673888 |
| 0.0593 | 0.4984 | 10200 | 0.0604 | 12924960 |
| 0.0418 | 0.5081 | 10400 | 0.0611 | 13176640 |
| 0.0645 | 0.5179 | 10600 | 0.0604 | 13430912 |
| 0.0595 | 0.5277 | 10800 | 0.0606 | 13684288 |
| 0.0623 | 0.5375 | 11000 | 0.0589 | 13937504 |
| 0.0628 | 0.5472 | 11200 | 0.0599 | 14190304 |
| 0.064 | 0.5570 | 11400 | 0.0589 | 14445312 |
| 0.0341 | 0.5668 | 11600 | 0.0599 | 14697344 |
| 0.029 | 0.5766 | 11800 | 0.0600 | 14951296 |
| 0.0424 | 0.5863 | 12000 | 0.0592 | 15205152 |
| 0.0733 | 0.5961 | 12200 | 0.0580 | 15457696 |
| 0.086 | 0.6059 | 12400 | 0.0594 | 15709984 |
| 0.0461 | 0.6156 | 12600 | 0.0585 | 15964384 |
| 0.0445 | 0.6254 | 12800 | 0.0615 | 16216768 |
| 0.0632 | 0.6352 | 13000 | 0.0611 | 16469792 |
| 0.0463 | 0.6450 | 13200 | 0.0592 | 16721536 |
| 0.0669 | 0.6547 | 13400 | 0.0599 | 16976192 |
| 0.0374 | 0.6645 | 13600 | 0.0625 | 17230496 |
| 0.0332 | 0.6743 | 13800 | 0.0595 | 17485120 |
| 0.0435 | 0.6840 | 14000 | 0.0600 | 17739872 |
| 0.0964 | 0.6938 | 14200 | 0.0575 | 17994144 |
| 0.063 | 0.7036 | 14400 | 0.0570 | 18248736 |
| 0.0467 | 0.7134 | 14600 | 0.0568 | 18504672 |
| 0.0843 | 0.7231 | 14800 | 0.0614 | 18754208 |
| 0.0731 | 0.7329 | 15000 | 0.0564 | 19005696 |
| 0.0491 | 0.7427 | 15200 | 0.0567 | 19260320 |
| 0.0601 | 0.7524 | 15400 | 0.0625 | 19514944 |
| 0.063 | 0.7622 | 15600 | 0.0570 | 19766912 |
| 0.0349 | 0.7720 | 15800 | 0.0579 | 20018240 |
| 0.0512 | 0.7818 | 16000 | 0.0561 | 20269632 |
| 0.0509 | 0.7915 | 16200 | 0.0560 | 20523232 |
| 0.0726 | 0.8013 | 16400 | 0.0585 | 20777376 |
| 0.0895 | 0.8111 | 16600 | 0.0569 | 21031776 |
| 0.048 | 0.8209 | 16800 | 0.0605 | 21283328 |
| 0.0303 | 0.8306 | 17000 | 0.0561 | 21535072 |
| 0.0566 | 0.8404 | 17200 | 0.0582 | 21786304 |
| 0.052 | 0.8502 | 17400 | 0.0556 | 22039232 |
| 0.0379 | 0.8599 | 17600 | 0.0554 | 22290976 |
| 0.0634 | 0.8697 | 17800 | 0.0604 | 22543904 |
| 0.0456 | 0.8795 | 18000 | 0.0562 | 22796480 |
| 0.0692 | 0.8893 | 18200 | 0.0572 | 23050080 |
| 0.0341 | 0.8990 | 18400 | 0.0563 | 23304192 |
| 0.0266 | 0.9088 | 18600 | 0.0559 | 23557152 |
| 0.0728 | 0.9186 | 18800 | 0.0548 | 23808960 |
| 0.0706 | 0.9283 | 19000 | 0.0560 | 24063776 |
| 0.0468 | 0.9381 | 19200 | 0.0552 | 24317280 |
| 0.0401 | 0.9479 | 19400 | 0.0551 | 24573184 |
| 0.0889 | 0.9577 | 19600 | 0.0549 | 24826560 |
| 0.039 | 0.9674 | 19800 | 0.0541 | 25081792 |
| 0.0226 | 0.9772 | 20000 | 0.0547 | 25332672 |
| 0.0603 | 0.9870 | 20200 | 0.0555 | 25584672 |
| 0.0547 | 0.9968 | 20400 | 0.0540 | 25834336 |
| 0.0438 | 1.0065 | 20600 | 0.0573 | 26090080 |
| 0.0828 | 1.0163 | 20800 | 0.0583 | 26343008 |
| 0.036 | 1.0261 | 21000 | 0.0557 | 26598784 |
| 0.0135 | 1.0359 | 21200 | 0.0604 | 26851648 |
| 0.0536 | 1.0456 | 21400 | 0.0605 | 27103392 |
| 0.0486 | 1.0554 | 21600 | 0.0595 | 27361312 |
| 0.0264 | 1.0652 | 21800 | 0.0564 | 27616640 |
| 0.0136 | 1.0750 | 22000 | 0.0607 | 27874656 |
| 0.0387 | 1.0847 | 22200 | 0.0575 | 28122656 |
| 0.0367 | 1.0945 | 22400 | 0.0593 | 28376640 |
| 0.0466 | 1.1043 | 22600 | 0.0605 | 28629632 |
| 0.0561 | 1.1140 | 22800 | 0.0583 | 28884480 |
| 0.0502 | 1.1238 | 23000 | 0.0562 | 29140832 |
| 0.0184 | 1.1336 | 23200 | 0.0569 | 29396960 |
| 0.041 | 1.1434 | 23400 | 0.0591 | 29648032 |
| 0.0283 | 1.1531 | 23600 | 0.0595 | 29897312 |
| 0.0296 | 1.1629 | 23800 | 0.0591 | 30153920 |
| 0.0427 | 1.1727 | 24000 | 0.0582 | 30407616 |
| 0.0452 | 1.1824 | 24200 | 0.0567 | 30656768 |
| 0.0711 | 1.1922 | 24400 | 0.0604 | 30908480 |
| 0.0688 | 1.2020 | 24600 | 0.0572 | 31162176 |
| 0.0279 | 1.2118 | 24800 | 0.0572 | 31412736 |
| 0.0474 | 1.2215 | 25000 | 0.0575 | 31668000 |
| 0.0378 | 1.2313 | 25200 | 0.0593 | 31919712 |
| 0.0335 | 1.2411 | 25400 | 0.0557 | 32172256 |
| 0.0353 | 1.2508 | 25600 | 0.0603 | 32424512 |
| 0.0228 | 1.2606 | 25800 | 0.0593 | 32678176 |
| 0.0664 | 1.2704 | 26000 | 0.0591 | 32931456 |
| 0.0237 | 1.2802 | 26200 | 0.0601 | 33184096 |
| 0.0514 | 1.2899 | 26400 | 0.0593 | 33436864 |
| 0.0632 | 1.2997 | 26600 | 0.0564 | 33691232 |
| 0.0192 | 1.3095 | 26800 | 0.0564 | 33944640 |
| 0.0461 | 1.3193 | 27000 | 0.0585 | 34193536 |
| 0.034 | 1.3290 | 27200 | 0.0584 | 34445952 |
| 0.0303 | 1.3388 | 27400 | 0.0582 | 34698784 |
| 0.038 | 1.3486 | 27600 | 0.0582 | 34950976 |
| 0.04 | 1.3583 | 27800 | 0.0578 | 35204128 |
| 0.0314 | 1.3681 | 28000 | 0.0559 | 35455296 |
| 0.0697 | 1.3779 | 28200 | 0.0575 | 35708160 |
| 0.0255 | 1.3877 | 28400 | 0.0587 | 35960608 |
| 0.035 | 1.3974 | 28600 | 0.0575 | 36214944 |
| 0.0509 | 1.4072 | 28800 | 0.0565 | 36466336 |
| 0.0239 | 1.4170 | 29000 | 0.0562 | 36720160 |
| 0.0161 | 1.4267 | 29200 | 0.0577 | 36971744 |
| 0.0326 | 1.4365 | 29400 | 0.0575 | 37226208 |
| 0.0312 | 1.4463 | 29600 | 0.0589 | 37479008 |
| 0.034 | 1.4561 | 29800 | 0.0569 | 37732672 |
| 0.0209 | 1.4658 | 30000 | 0.0589 | 37984768 |
| 0.0261 | 1.4756 | 30200 | 0.0584 | 38237120 |
| 0.029 | 1.4854 | 30400 | 0.0590 | 38490112 |
| 0.0394 | 1.4952 | 30600 | 0.0572 | 38742560 |
| 0.015 | 1.5049 | 30800 | 0.0579 | 38994368 |
| 0.0603 | 1.5147 | 31000 | 0.0594 | 39248416 |
| 0.0464 | 1.5245 | 31200 | 0.0566 | 39501152 |
| 0.0146 | 1.5342 | 31400 | 0.0566 | 39756224 |
| 0.0503 | 1.5440 | 31600 | 0.0573 | 40012896 |
| 0.024 | 1.5538 | 31800 | 0.0578 | 40268416 |
| 0.0219 | 1.5636 | 32000 | 0.0584 | 40522848 |
| 0.0498 | 1.5733 | 32200 | 0.0576 | 40775072 |
| 0.0504 | 1.5831 | 32400 | 0.0576 | 41031296 |
| 0.0329 | 1.5929 | 32600 | 0.0581 | 41287200 |
| 0.0172 | 1.6026 | 32800 | 0.0583 | 41541664 |
| 0.0435 | 1.6124 | 33000 | 0.0575 | 41793376 |
| 0.0123 | 1.6222 | 33200 | 0.0579 | 42044352 |
| 0.024 | 1.6320 | 33400 | 0.0584 | 42295520 |
| 0.0248 | 1.6417 | 33600 | 0.0584 | 42547680 |
| 0.0378 | 1.6515 | 33800 | 0.0581 | 42796992 |
| 0.0176 | 1.6613 | 34000 | 0.0585 | 43049888 |
| 0.0226 | 1.6710 | 34200 | 0.0577 | 43303328 |
| 0.0497 | 1.6808 | 34400 | 0.0570 | 43556672 |
| 0.0258 | 1.6906 | 34600 | 0.0571 | 43809088 |
| 0.0228 | 1.7004 | 34800 | 0.0574 | 44059712 |
| 0.023 | 1.7101 | 35000 | 0.0572 | 44313216 |
| 0.0263 | 1.7199 | 35200 | 0.0567 | 44566336 |
| 0.0523 | 1.7297 | 35400 | 0.0569 | 44817984 |
| 0.0263 | 1.7395 | 35600 | 0.0562 | 45072416 |
| 0.0506 | 1.7492 | 35800 | 0.0566 | 45330336 |
| 0.0256 | 1.7590 | 36000 | 0.0572 | 45584800 |
| 0.0456 | 1.7688 | 36200 | 0.0572 | 45838848 |
| 0.0229 | 1.7785 | 36400 | 0.0571 | 46091136 |
| 0.0398 | 1.7883 | 36600 | 0.0570 | 46343744 |
| 0.0242 | 1.7981 | 36800 | 0.0572 | 46597920 |
| 0.0487 | 1.8079 | 37000 | 0.0572 | 46850336 |
| 0.0307 | 1.8176 | 37200 | 0.0571 | 47104992 |
| 0.0433 | 1.8274 | 37400 | 0.0569 | 47356992 |
| 0.0126 | 1.8372 | 37600 | 0.0570 | 47609472 |
| 0.0316 | 1.8469 | 37800 | 0.0571 | 47859360 |
| 0.0462 | 1.8567 | 38000 | 0.0570 | 48110912 |
| 0.0375 | 1.8665 | 38200 | 0.0569 | 48362560 |
| 0.0194 | 1.8763 | 38400 | 0.0567 | 48613184 |
| 0.0445 | 1.8860 | 38600 | 0.0567 | 48868096 |
| 0.0183 | 1.8958 | 38800 | 0.0568 | 49125344 |
| 0.0304 | 1.9056 | 39000 | 0.0568 | 49380224 |
| 0.0326 | 1.9153 | 39200 | 0.0568 | 49633664 |
| 0.0402 | 1.9251 | 39400 | 0.0568 | 49887680 |
| 0.0233 | 1.9349 | 39600 | 0.0568 | 50139296 |
| 0.0313 | 1.9447 | 39800 | 0.0569 | 50391680 |
| 0.0289 | 1.9544 | 40000 | 0.0569 | 50647232 |
### Framework versions
- PEFT 0.15.1
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
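Since the framework versions list PEFT, the checkpoint is presumably a PEFT adapter rather than full model weights. The sketch below shows one way to load it on top of the base model; it is illustrative, assumes the adapter is published under `rbelanec/train_qqp_1744902602` in the standard PEFT layout, and the QQP-style prompt format is a guess, not taken from the training script (running it also requires downloading the 7B base weights):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.3"
adapter_id = "rbelanec/train_qqp_1744902602"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(model, adapter_id)  # attach the adapter

# QQP is duplicate-question detection; this prompt format is hypothetical.
prompt = (
    "Are these questions duplicates?\n"
    "Q1: How do I learn Python?\n"
    "Q2: What is the best way to learn Python?"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```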
## Model tree for rbelanec/train_qqp_1744902602

- Base model: mistralai/Mistral-7B-v0.3
- Fine-tuned intermediate: mistralai/Mistral-7B-Instruct-v0.3