floflodebilbao committed · verified
Commit 133dd1e · 1 Parent(s): 3b57b06

End of training
README.md CHANGED
@@ -23,20 +23,20 @@ should probably proofread and complete it, then remove this comment. -->
  This model is a fine-tuned version of [google/long-t5-tglobal-base](https://huggingface.co/google/long-t5-tglobal-base) on an unknown dataset.
  It achieves the following results on the evaluation set:
  - Loss: 4.2146
- - Rouge1: 0.136
- - Rouge2: 0.0285
- - Rougel: 0.1045
- - Rougelsum: 0.1042
- - Gen Len: 20.0
- - Bleu: 0.0065
- - Precisions: 0.0567
- - Brevity Penalty: 0.3576
- - Length Ratio: 0.493
- - Translation Length: 602.0
+ - Rouge1: 0.1397
+ - Rouge2: 0.0273
+ - Rougel: 0.0982
+ - Rougelsum: 0.097
+ - Gen Len: 62.6
+ - Bleu: 0.0075
+ - Precisions: 0.0304
+ - Brevity Penalty: 1.0
+ - Length Ratio: 1.6642
+ - Translation Length: 2032.0
  - Reference Length: 1221.0
- - Precision: 0.817
- - Recall: 0.8329
- - F1: 0.8246
+ - Precision: 0.7816
+ - Recall: 0.8361
+ - F1: 0.8076
  - Hashcode: roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1)

  ## Model description
@@ -70,21 +70,21 @@ The following hyperparameters were used during training:

  | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | Bleu | Precisions | Brevity Penalty | Length Ratio | Translation Length | Reference Length | Precision | Recall | F1 | Hashcode |
  |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|:------:|:----------:|:---------------:|:------------:|:------------------:|:----------------:|:---------:|:------:|:------:|:---------------------------------------------------------:|
- | No log | 1.0 | 7 | 26.0662 | 0.2073 | 0.057 | 0.1688 | 0.1691 | 20.0 | 0.0198 | 0.0737 | 0.5394 | 0.6183 | 755.0 | 1221.0 | 0.8581 | 0.8517 | 0.8548 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 2.0 | 14 | 23.4955 | 0.185 | 0.0471 | 0.1522 | 0.152 | 20.0 | 0.0162 | 0.0646 | 0.5325 | 0.6134 | 749.0 | 1221.0 | 0.8547 | 0.8489 | 0.8517 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 3.0 | 21 | 21.3631 | 0.1825 | 0.0477 | 0.1523 | 0.1522 | 20.0 | 0.0157 | 0.0622 | 0.529 | 0.611 | 746.0 | 1221.0 | 0.8555 | 0.8485 | 0.8519 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 4.0 | 28 | 19.5442 | 0.1915 | 0.0533 | 0.1588 | 0.1592 | 20.0 | 0.0153 | 0.0655 | 0.5337 | 0.6143 | 750.0 | 1221.0 | 0.8575 | 0.8506 | 0.8539 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 5.0 | 35 | 17.8808 | 0.19 | 0.0507 | 0.1588 | 0.159 | 20.0 | 0.0153 | 0.0668 | 0.529 | 0.611 | 746.0 | 1221.0 | 0.8569 | 0.8499 | 0.8533 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 6.0 | 42 | 16.2547 | 0.1844 | 0.0457 | 0.1508 | 0.1509 | 20.0 | 0.0127 | 0.0626 | 0.529 | 0.611 | 746.0 | 1221.0 | 0.8548 | 0.8482 | 0.8514 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 7.0 | 49 | 14.4184 | 0.1887 | 0.0511 | 0.1559 | 0.1562 | 20.0 | 0.0198 | 0.0693 | 0.529 | 0.611 | 746.0 | 1221.0 | 0.8545 | 0.8486 | 0.8515 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 8.0 | 56 | 11.9307 | 0.193 | 0.0529 | 0.1567 | 0.156 | 20.0 | 0.0263 | 0.0774 | 0.5232 | 0.6069 | 741.0 | 1221.0 | 0.8526 | 0.8467 | 0.8496 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 9.0 | 63 | 8.0749 | 0.2082 | 0.0606 | 0.1658 | 0.1642 | 20.0 | 0.0287 | 0.0825 | 0.5162 | 0.602 | 735.0 | 1221.0 | 0.8521 | 0.848 | 0.85 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 10.0 | 70 | 4.2463 | 0.1488 | 0.0238 | 0.1176 | 0.1169 | 20.0 | 0.0 | 0.0454 | 0.4798 | 0.5766 | 704.0 | 1221.0 | 0.8255 | 0.8365 | 0.8308 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 11.0 | 77 | 4.1345 | 0.1271 | 0.0302 | 0.0991 | 0.0988 | 20.0 | 0.0073 | 0.0573 | 0.3697 | 0.5012 | 612.0 | 1221.0 | 0.8102 | 0.8285 | 0.819 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 12.0 | 84 | 4.2879 | 0.1217 | 0.0281 | 0.0991 | 0.0991 | 20.0 | 0.006 | 0.0582 | 0.2952 | 0.4505 | 550.0 | 1221.0 | 0.808 | 0.8265 | 0.8169 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 13.0 | 91 | 4.3117 | 0.1414 | 0.0341 | 0.1079 | 0.1072 | 20.0 | 0.0069 | 0.061 | 0.3203 | 0.4676 | 571.0 | 1221.0 | 0.8151 | 0.8311 | 0.8228 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 14.0 | 98 | 4.2535 | 0.1306 | 0.0252 | 0.0951 | 0.095 | 20.0 | 0.0063 | 0.0541 | 0.354 | 0.4906 | 599.0 | 1221.0 | 0.8155 | 0.8314 | 0.8232 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
- | No log | 15.0 | 105 | 4.2146 | 0.136 | 0.0285 | 0.1045 | 0.1042 | 20.0 | 0.0065 | 0.0567 | 0.3576 | 0.493 | 602.0 | 1221.0 | 0.817 | 0.8329 | 0.8246 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 1.0 | 7 | 26.0662 | 0.2775 | 0.0911 | 0.2084 | 0.2082 | 62.06 | 0.0413 | 0.0659 | 1.0 | 1.8026 | 2201.0 | 1221.0 | 0.8359 | 0.8722 | 0.8536 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 2.0 | 14 | 23.4955 | 0.273 | 0.0899 | 0.202 | 0.2032 | 61.62 | 0.0408 | 0.0663 | 1.0 | 1.7772 | 2170.0 | 1221.0 | 0.8376 | 0.8724 | 0.8546 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 3.0 | 21 | 21.3631 | 0.2652 | 0.0914 | 0.2003 | 0.2016 | 60.82 | 0.0412 | 0.0657 | 1.0 | 1.7609 | 2150.0 | 1221.0 | 0.8383 | 0.8706 | 0.854 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 4.0 | 28 | 19.5442 | 0.2774 | 0.098 | 0.2076 | 0.2081 | 59.64 | 0.0445 | 0.07 | 1.0 | 1.7387 | 2123.0 | 1221.0 | 0.8413 | 0.8726 | 0.8566 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 5.0 | 35 | 17.8808 | 0.2684 | 0.0934 | 0.1991 | 0.2008 | 59.64 | 0.0437 | 0.0681 | 1.0 | 1.7363 | 2120.0 | 1221.0 | 0.8391 | 0.8697 | 0.854 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 6.0 | 42 | 16.2547 | 0.265 | 0.088 | 0.1911 | 0.1937 | 59.42 | 0.0399 | 0.0642 | 1.0 | 1.7248 | 2106.0 | 1221.0 | 0.838 | 0.8682 | 0.8528 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 7.0 | 49 | 14.4184 | 0.2621 | 0.0856 | 0.185 | 0.1849 | 59.4 | 0.0412 | 0.0648 | 1.0 | 1.7224 | 2103.0 | 1221.0 | 0.8373 | 0.8689 | 0.8527 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 8.0 | 56 | 11.9307 | 0.2634 | 0.0904 | 0.1875 | 0.1875 | 59.06 | 0.0478 | 0.0696 | 1.0 | 1.7191 | 2099.0 | 1221.0 | 0.8369 | 0.8688 | 0.8525 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 9.0 | 63 | 8.0749 | 0.2548 | 0.0802 | 0.1823 | 0.1821 | 56.46 | 0.0471 | 0.0686 | 1.0 | 1.6355 | 1997.0 | 1221.0 | 0.837 | 0.864 | 0.8502 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 10.0 | 70 | 4.2463 | 0.1698 | 0.0315 | 0.122 | 0.1209 | 57.5 | 0.0065 | 0.0337 | 1.0 | 1.5995 | 1953.0 | 1221.0 | 0.8068 | 0.8444 | 0.8249 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 11.0 | 77 | 4.1345 | 0.1154 | 0.0265 | 0.0863 | 0.0865 | 59.86 | 0.007 | 0.0268 | 1.0 | 1.5119 | 1846.0 | 1221.0 | 0.7703 | 0.828 | 0.7976 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 12.0 | 84 | 4.2879 | 0.1033 | 0.0242 | 0.0844 | 0.0842 | 63.0 | 0.0081 | 0.0261 | 1.0 | 1.5111 | 1845.0 | 1221.0 | 0.7695 | 0.8276 | 0.797 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 13.0 | 91 | 4.3117 | 0.1271 | 0.0306 | 0.0961 | 0.0956 | 63.0 | 0.0086 | 0.0297 | 1.0 | 1.6126 | 1969.0 | 1221.0 | 0.7731 | 0.8332 | 0.8015 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 14.0 | 98 | 4.2535 | 0.1291 | 0.0277 | 0.089 | 0.0884 | 63.0 | 0.0077 | 0.0286 | 1.0 | 1.6847 | 2057.0 | 1221.0 | 0.7804 | 0.835 | 0.8064 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |
+ | No log | 15.0 | 105 | 4.2146 | 0.1397 | 0.0273 | 0.0982 | 0.097 | 62.6 | 0.0075 | 0.0304 | 1.0 | 1.6642 | 2032.0 | 1221.0 | 0.7816 | 0.8361 | 0.8076 | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.53.1) |


  ### Framework versions
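
The metric fields in both versions of the card match the output keys of the Hugging Face `evaluate` implementations of ROUGE, BLEU, and BERTScore (the `Hashcode` line is BERTScore's configuration string: `roberta-large`, layer 17, no IDF weighting, bert-score 0.3.12, transformers 4.53.1). The `Brevity Penalty` and `Length Ratio` values are internally consistent with the standard BLEU definition,

$$
\mathrm{BP} = \begin{cases} 1 & \text{if } c > r \\ e^{\,1 - r/c} & \text{if } c \le r \end{cases}
$$

where $c$ is the translation length and $r$ the reference length: the old card has $c = 602$, $r = 1221$, giving $e^{1 - 1221/602} \approx 0.3576$, while the new card's $c = 2032 > r = 1221$ gives $\mathrm{BP} = 1.0$. A minimal sketch of how these fields could be recomputed, assuming the `evaluate` library; `preds` and `refs` are placeholders, not the card's actual evaluation data:

```python
# Sketch: recompute the card's metric fields with Hugging Face `evaluate`.
# `preds`/`refs` are hypothetical stand-ins for the real evaluation split.
import evaluate

preds = ["a generated summary"]    # placeholder model outputs
refs = ["the reference summary"]   # placeholder gold summaries

rouge = evaluate.load("rouge")          # rouge1, rouge2, rougeL, rougeLsum
bleu = evaluate.load("bleu")            # bleu, precisions, brevity_penalty,
                                        # length_ratio, translation_length,
                                        # reference_length
bertscore = evaluate.load("bertscore")  # precision, recall, f1, hashcode

print(rouge.compute(predictions=preds, references=refs))
# `precisions` is a list of four n-gram precisions; the card appears to
# report a single averaged value.
print(bleu.compute(predictions=preds, references=refs))
# model_type matches the card's hashcode (roberta-large, layer 17, no IDF);
# BERTScore returns per-example lists, which the card reports as averages.
print(bertscore.compute(predictions=preds, references=refs,
                        model_type="roberta-large"))
```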
runs/Jul09_12-42-54_tardis/events.out.tfevents.1752057775.tardis.55893.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:708129342fde718ac318e28e221654123853513dff996ec5e4906a411c82be26
+ size 5350

runs/Jul09_12-46-15_tardis/events.out.tfevents.1752057976.tardis.56897.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f6149a17ac843cc8d44669c6c68e432dc9d465d80f127dedc9faae8e46f8241c
+ size 5349

runs/Jul09_12-48-29_tardis/events.out.tfevents.1752058110.tardis.57396.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f87496f82d2389c914f8ab117dd1540ca44fabf8ffa9c4b14b708fbde5e65c18
+ size 5349

runs/Jul09_12-51-58_tardis/events.out.tfevents.1752058320.tardis.58114.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8af930fbd2a497349eb278598f3b1ed52e6220f81907c786702edc12bef50134
+ size 8745

runs/Jul09_12-57-59_tardis/events.out.tfevents.1752058680.tardis.59856.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:65d00de34bcf3cca583698adab9bc9d88344bb0875771baa258b64c22fc3acc7
+ size 22677
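
The five `tfevents` files added above are TensorBoard event logs written during training (one per run directory on the `tardis` host). A minimal sketch of inspecting one locally, assuming the `tensorboard` package is installed and the repository has been cloned with its LFS blobs; the tag names in the comments are illustrative:

```python
# Sketch: read scalar metrics from one of the added TensorBoard logs.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Jul09_12-57-59_tardis")
acc.Reload()                        # parse the events.out.tfevents.* file
print(acc.Tags()["scalars"])        # e.g. train/loss, eval/loss, ...
for event in acc.Scalars(acc.Tags()["scalars"][0]):
    print(event.step, event.value)  # (step, scalar value) pairs
```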
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:394cf6c493731e254648f86939f024c0fc44bbb2fd113aab5a6cdbff1a3eda41
+ oid sha256:324f5823a00f4ac08d38a72d3895756b980b286a3b984c65ffaaa8e138c1221b
  size 5905
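
Each binary file above is stored as a Git LFS pointer: a three-line text stub recording the spec version, the SHA-256 of the actual blob, and its size in bytes, so the diffs show pointer changes rather than the binaries themselves. A minimal sketch of verifying a downloaded blob against its pointer (the file path is illustrative):

```python
# Sketch: check a downloaded blob against its Git LFS pointer stub.
# The pointer format is the three-line "key value" text shown above.
import hashlib
from pathlib import Path

def parse_pointer(text: str) -> dict:
    # Each pointer line is "key value"; split on the first space only.
    return dict(line.split(" ", 1) for line in text.strip().splitlines())

def matches_pointer(blob_path: str, pointer_text: str) -> bool:
    fields = parse_pointer(pointer_text)
    data = Path(blob_path).read_bytes()
    return (fields["oid"] == "sha256:" + hashlib.sha256(data).hexdigest()
            and int(fields["size"]) == len(data))

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:324f5823a00f4ac08d38a72d3895756b980b286a3b984c65ffaaa8e138c1221b
size 5905"""
print(matches_pointer("training_args.bin", pointer))  # True if blob matches
```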