zera09 committed 21b5308 (verified) · 1 parent: a5e7539

End of training
README.md ADDED
@@ -0,0 +1,94 @@
---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: T5_small_sum_30_epoch
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# T5_small_sum_30_epoch

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9724
- Rouge1: 0.4391
- Rouge2: 0.2715
- Rougel: 0.4056
- Rougelsum: 0.4053
- Gen Len: 17.5469
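The Rouge1/Rouge2 figures above are n-gram-overlap F1 scores (the Trainer typically computes them via the `evaluate`/`rouge_score` tooling). Purely as an illustration of what the unigram variant measures — not the exact stemmed/tokenized implementation used here — a minimal pure-Python sketch:

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Unigram-overlap ROUGE-1 F1 between two whitespace-tokenized strings."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# identical summaries score 1.0; disjoint summaries score 0.0
print(rouge1_f1("the cat sat on the mat", "the cat sat on the mat"))  # 1.0
```

The real metric additionally applies Porter stemming and smarter tokenization, so scores from this sketch will not match the table exactly.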
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
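With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 2e-05 at step 0 to zero at the final step — 12,000 steps here (30 epochs × 400 steps/epoch, matching the results table). A minimal sketch of that schedule, assuming zero warmup:

```python
def linear_lr(step: int, base_lr: float = 2e-5, total_steps: int = 12000) -> float:
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))      # 2e-05 (start of training)
print(linear_lr(6000))   # 1e-05 (halfway, end of epoch 15)
print(linear_lr(12000))  # 0.0   (end of epoch 30)
```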
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log | 1.0 | 400 | 2.3489 | 0.3892 | 0.2234 | 0.3514 | 0.3512 | 18.2775 |
| 2.8157 | 2.0 | 800 | 2.2388 | 0.4043 | 0.2353 | 0.3675 | 0.3672 | 17.8419 |
| 2.5008 | 3.0 | 1200 | 2.1871 | 0.4146 | 0.2446 | 0.3782 | 0.378 | 17.8312 |
| 2.4062 | 4.0 | 1600 | 2.1500 | 0.416 | 0.2475 | 0.3808 | 0.3806 | 17.7606 |
| 2.3515 | 5.0 | 2000 | 2.1213 | 0.4182 | 0.2479 | 0.3821 | 0.3817 | 17.83 |
| 2.3515 | 6.0 | 2400 | 2.0984 | 0.4236 | 0.2531 | 0.3889 | 0.3886 | 17.7031 |
| 2.2997 | 7.0 | 2800 | 2.0788 | 0.4245 | 0.2555 | 0.3906 | 0.3905 | 17.6712 |
| 2.2606 | 8.0 | 3200 | 2.0643 | 0.4271 | 0.2569 | 0.3922 | 0.3921 | 17.6825 |
| 2.2363 | 9.0 | 3600 | 2.0530 | 0.4291 | 0.2581 | 0.394 | 0.3939 | 17.6062 |
| 2.2016 | 10.0 | 4000 | 2.0378 | 0.4315 | 0.2618 | 0.3958 | 0.3957 | 17.5869 |
| 2.2016 | 11.0 | 4400 | 2.0287 | 0.4326 | 0.2629 | 0.3982 | 0.398 | 17.5612 |
| 2.1758 | 12.0 | 4800 | 2.0241 | 0.4328 | 0.2634 | 0.398 | 0.3978 | 17.5962 |
| 2.1502 | 13.0 | 5200 | 2.0145 | 0.4341 | 0.2651 | 0.3995 | 0.3994 | 17.56 |
| 2.1444 | 14.0 | 5600 | 2.0094 | 0.4346 | 0.2659 | 0.3994 | 0.3995 | 17.5831 |
| 2.1183 | 15.0 | 6000 | 2.0039 | 0.4351 | 0.2678 | 0.4008 | 0.4006 | 17.5812 |
| 2.1183 | 16.0 | 6400 | 1.9987 | 0.4343 | 0.2667 | 0.3998 | 0.3997 | 17.5225 |
| 2.1133 | 17.0 | 6800 | 1.9967 | 0.4342 | 0.2674 | 0.4004 | 0.4005 | 17.5544 |
| 2.0918 | 18.0 | 7200 | 1.9900 | 0.4357 | 0.2681 | 0.4014 | 0.4013 | 17.5419 |
| 2.0739 | 19.0 | 7600 | 1.9879 | 0.4365 | 0.2686 | 0.4029 | 0.4026 | 17.5469 |
| 2.0733 | 20.0 | 8000 | 1.9831 | 0.4378 | 0.2699 | 0.403 | 0.4029 | 17.5481 |
| 2.0733 | 21.0 | 8400 | 1.9818 | 0.4378 | 0.2705 | 0.4037 | 0.4037 | 17.5319 |
| 2.0657 | 22.0 | 8800 | 1.9791 | 0.4375 | 0.2703 | 0.4037 | 0.4037 | 17.5225 |
| 2.0412 | 23.0 | 9200 | 1.9792 | 0.4363 | 0.27 | 0.4026 | 0.4023 | 17.5581 |
| 2.0514 | 24.0 | 9600 | 1.9765 | 0.4381 | 0.2703 | 0.4041 | 0.4039 | 17.5262 |
| 2.047 | 25.0 | 10000 | 1.9764 | 0.4396 | 0.2716 | 0.4056 | 0.4055 | 17.5525 |
| 2.047 | 26.0 | 10400 | 1.9744 | 0.4388 | 0.2716 | 0.4054 | 0.4051 | 17.5675 |
| 2.0279 | 27.0 | 10800 | 1.9733 | 0.4397 | 0.2715 | 0.4057 | 0.4054 | 17.5494 |
| 2.0503 | 28.0 | 11200 | 1.9730 | 0.4391 | 0.2711 | 0.4055 | 0.4052 | 17.5456 |
| 2.0278 | 29.0 | 11600 | 1.9726 | 0.439 | 0.2712 | 0.4056 | 0.4053 | 17.5388 |
| 2.0322 | 30.0 | 12000 | 1.9724 | 0.4391 | 0.2715 | 0.4056 | 0.4053 | 17.5469 |


### Framework versions

- Transformers 4.41.1
- Pytorch 1.13.1+cu117
- Datasets 2.19.1
- Tokenizers 0.19.1
generation_config.json ADDED
@@ -0,0 +1,6 @@
{
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.41.1"
}
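The config above follows T5's convention of reusing the pad token (id 0) as the decoder start token. A quick sanity check over the file's contents, with the JSON embedded inline for illustration:

```python
import json

# the generation_config.json contents from this commit, inlined for the example
config = json.loads("""
{
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.41.1"
}
""")

# T5 starts decoding from the pad token, so the two ids coincide
print(config["decoder_start_token_id"] == config["pad_token_id"])  # True
print(config["eos_token_id"])  # 1
```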
runs/May27_14-41-26_iit-p/events.out.tfevents.1716801089.iit-p.12670.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2400a7c839f5b476a687df97555ae3a6224a63fe10b1f524bc5fc6ada61e5665
- size 26085
+ oid sha256:a4b2758feead2dce094efbba576aff96a60cd591de8b28d34e74b322a898da7a
+ size 26964