suku9 committed
Commit f3270e1 (verified) · 1 Parent(s): cf18495

Model save

Files changed (3):
  1. README.md +123 -0
  2. generation_config.json +7 -0
  3. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,123 @@
+ ---
+ library_name: transformers
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: pretrain_spdl_
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # pretrain_spdl_
+
+ This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.5261
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-05
+ - train_batch_size: 1024
+ - eval_batch_size: 1024
+ - seed: 42
+ - optimizer: adamw_torch with betas=(0.9, 0.95) and epsilon=1e-08 (no additional optimizer arguments)
+ - lr_scheduler_type: cosine
+ - lr_scheduler_warmup_steps: 156250
+ - num_epochs: 25
+ - mixed_precision_training: Native AMP
+
+ ### Training results
+
+ | Training Loss | Epoch   | Step  | Validation Loss |
+ |:-------------:|:-------:|:-----:|:---------------:|
+ | 0.4617        | 0.3774  | 500   | 1.7630          |
+ | 0.3667        | 0.7547  | 1000  | 1.3955          |
+ | 0.3332        | 1.1321  | 1500  | 1.2666          |
+ | 0.3104        | 1.5094  | 2000  | 1.1712          |
+ | 0.2941        | 1.8868  | 2500  | 1.0907          |
+ | 0.2766        | 2.2642  | 3000  | 1.0196          |
+ | 0.2606        | 2.6415  | 3500  | 0.9639          |
+ | 0.2505        | 3.0189  | 4000  | 0.9049          |
+ | 0.2359        | 3.3962  | 4500  | 0.8557          |
+ | 0.2251        | 3.7736  | 5000  | 0.8215          |
+ | 0.2169        | 4.1509  | 5500  | 0.7879          |
+ | 0.2126        | 4.5283  | 6000  | 0.7684          |
+ | 0.2049        | 4.9057  | 6500  | 0.7461          |
+ | 0.1997        | 5.2830  | 7000  | 0.7286          |
+ | 0.1962        | 5.6604  | 7500  | 0.7119          |
+ | 0.19          | 6.0377  | 8000  | 0.7050          |
+ | 0.1878        | 6.4151  | 8500  | 0.6906          |
+ | 0.1825        | 6.7925  | 9000  | 0.6831          |
+ | 0.1816        | 7.1698  | 9500  | 0.6669          |
+ | 0.1748        | 7.5472  | 10000 | 0.6604          |
+ | 0.1741        | 7.9245  | 10500 | 0.6518          |
+ | 0.1707        | 8.3019  | 11000 | 0.6429          |
+ | 0.1727        | 8.6792  | 11500 | 0.6378          |
+ | 0.1688        | 9.0566  | 12000 | 0.6326          |
+ | 0.166         | 9.4340  | 12500 | 0.6273          |
+ | 0.1652        | 9.8113  | 13000 | 0.6163          |
+ | 0.1642        | 10.1887 | 13500 | 0.6110          |
+ | 0.162         | 10.5660 | 14000 | 0.6068          |
+ | 0.1634        | 10.9434 | 14500 | 0.6044          |
+ | 0.1601        | 11.3208 | 15000 | 0.5986          |
+ | 0.1599        | 11.6981 | 15500 | 0.5942          |
+ | 0.1584        | 12.0755 | 16000 | 0.5909          |
+ | 0.1562        | 12.4528 | 16500 | 0.5863          |
+ | 0.1544        | 12.8302 | 17000 | 0.5832          |
+ | 0.1553        | 13.2075 | 17500 | 0.5801          |
+ | 0.1518        | 13.5849 | 18000 | 0.5778          |
+ | 0.1535        | 13.9623 | 18500 | 0.5737          |
+ | 0.1525        | 14.3396 | 19000 | 0.5727          |
+ | 0.1512        | 14.7170 | 19500 | 0.5710          |
+ | 0.1511        | 15.0943 | 20000 | 0.5675          |
+ | 0.1503        | 15.4717 | 20500 | 0.5671          |
+ | 0.1504        | 15.8491 | 21000 | 0.5640          |
+ | 0.1499        | 16.2264 | 21500 | 0.5618          |
+ | 0.1497        | 16.6038 | 22000 | 0.5584          |
+ | 0.1464        | 16.9811 | 22500 | 0.5552          |
+ | 0.148         | 17.3585 | 23000 | 0.5557          |
+ | 0.1465        | 17.7358 | 23500 | 0.5523          |
+ | 0.1465        | 18.1132 | 24000 | 0.5513          |
+ | 0.1447        | 18.4906 | 24500 | 0.5487          |
+ | 0.1452        | 18.8679 | 25000 | 0.5479          |
+ | 0.1447        | 19.2453 | 25500 | 0.5452          |
+ | 0.1431        | 19.6226 | 26000 | 0.5439          |
+ | 0.1438        | 20.0    | 26500 | 0.5430          |
+ | 0.1437        | 20.3774 | 27000 | 0.5411          |
+ | 0.1428        | 20.7547 | 27500 | 0.5395          |
+ | 0.1434        | 21.1321 | 28000 | 0.5388          |
+ | 0.142         | 21.5094 | 28500 | 0.5362          |
+ | 0.1418        | 21.8868 | 29000 | 0.5347          |
+ | 0.1419        | 22.2642 | 29500 | 0.5345          |
+ | 0.1418        | 22.6415 | 30000 | 0.5321          |
+ | 0.1407        | 23.0189 | 30500 | 0.5312          |
+ | 0.141         | 23.3962 | 31000 | 0.5303          |
+ | 0.1392        | 23.7736 | 31500 | 0.5293          |
+ | 0.1384        | 24.1509 | 32000 | 0.5289          |
+ | 0.1372        | 24.5283 | 32500 | 0.5274          |
+ | 0.1392        | 24.9057 | 33000 | 0.5261          |
+
+ ### Framework versions
+
+ - Transformers 4.51.1
+ - Pytorch 2.6.0+cu124
+ - Datasets 3.5.0
+ - Tokenizers 0.21.1
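
The hyperparameter list in the card above maps directly onto `transformers` `TrainingArguments`. A minimal sketch of an equivalent configuration follows; the `output_dir` is a placeholder, and the card's global `train_batch_size` of 1024 is assumed here to be the per-device value (adjust for multi-accelerator setups).

```python
from transformers import TrainingArguments

# Sketch reconstructing the hyperparameters listed in the card above.
# output_dir is a placeholder; the card reports a global batch size of
# 1024, assumed here to equal the per-device batch size.
training_args = TrainingArguments(
    output_dir="pretrain_spdl_",
    learning_rate=5e-05,
    per_device_train_batch_size=1024,
    per_device_eval_batch_size=1024,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.95,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    warmup_steps=156250,
    num_train_epochs=25,
    fp16=True,  # "Native AMP" mixed precision; could equally be bf16=True
)
```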
generation_config.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 3,
+   "eos_token_id": 1,
+   "pad_token_id": 0,
+   "transformers_version": "4.51.1"
+ }
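
Because `generation_config.json` ships with the checkpoint, `generate()` picks up these token ids automatically. A minimal usage sketch, assuming the checkpoint is a causal language model and lives at the hypothetical repo id `suku9/pretrain_spdl_`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id assembled from the committer and model name above;
# replace with the actual checkpoint path. Assumes a causal-LM head.
repo_id = "suku9/pretrain_spdl_"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# generation_config.json is loaded automatically, so bos_token_id=3,
# eos_token_id=1, and pad_token_id=0 apply without passing them explicitly.
inputs = tokenizer("example input", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```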
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:9548214d560eadc13d7f087db738e4e7dbced99fe4f2e2dae22f3400f15ad849
+ oid sha256:994ab1b3c4696b203a014d15df3ffcfd516e27a85b8964beb3aa5f1aecb0192c
  size 340724352
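
The entry above is a Git LFS pointer rather than the weights themselves: the `oid` is the SHA-256 of the actual 340,724,352-byte file. A minimal sketch for verifying a downloaded copy against the updated pointer (the local filename is an assumption):

```python
import hashlib

# SHA-256 recorded in the updated LFS pointer above.
EXPECTED_OID = "994ab1b3c4696b203a014d15df3ffcfd516e27a85b8964beb3aa5f1aecb0192c"

h = hashlib.sha256()
with open("model.safetensors", "rb") as f:  # assumed local download path
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

assert h.hexdigest() == EXPECTED_OID, "model.safetensors does not match the pointer oid"
print("checksum OK")
```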