CocoRoF committed
Commit 943642f · verified · 1 parent: d2589ec
Files changed (1):
  1. README.md +45 -45
README.md CHANGED
@@ -1,7 +1,7 @@
 ---
 library_name: transformers
 license: apache-2.0
-base_model: x2bee/KoModernBERT-base-mlm-v03-retry-ckp03
+base_model: CocoRoF/KoModernBERT-chp-11
 tags:
 - generated_from_trainer
 model-index:
@@ -14,17 +14,17 @@ should probably proofread and complete it, then remove this comment. -->
 
 # KMB_SimCSE_test
 
-This model is a fine-tuned version of [x2bee/KoModernBERT-base-mlm-v03-retry-ckp03](https://huggingface.co/x2bee/KoModernBERT-base-mlm-v03-retry-ckp03) on an unknown dataset.
+This model is a fine-tuned version of [CocoRoF/KoModernBERT-chp-11](https://huggingface.co/CocoRoF/KoModernBERT-chp-11) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0306
-- Pearson Cosine: 0.8211
-- Spearman Cosine: 0.8198
-- Pearson Manhattan: 0.7909
-- Spearman Manhattan: 0.7991
-- Pearson Euclidean: 0.7883
-- Spearman Euclidean: 0.7968
-- Pearson Dot: 0.7578
-- Spearman Dot: 0.7578
+- Loss: 0.0438
+- Pearson Cosine: 0.7947
+- Spearman Cosine: 0.7992
+- Pearson Manhattan: 0.7493
+- Spearman Manhattan: 0.7655
+- Pearson Euclidean: 0.7507
+- Spearman Euclidean: 0.7666
+- Pearson Dot: 0.6408
+- Spearman Dot: 0.6472
 
 ## Model description
 
@@ -59,40 +59,40 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Pearson Cosine | Spearman Cosine | Pearson Manhattan | Spearman Manhattan | Pearson Euclidean | Spearman Euclidean | Pearson Dot | Spearman Dot |
 |:-------------:|:------:|:----:|:---------------:|:--------------:|:---------------:|:-----------------:|:------------------:|:-----------------:|:------------------:|:-----------:|:------------:|
-| 0.4859 | 0.1172 | 250 | 0.0753 | 0.7923 | 0.7923 | 0.7833 | 0.7911 | 0.7825 | 0.7907 | 0.6785 | 0.6757 |
-| 0.4421 | 0.2343 | 500 | 0.0699 | 0.7956 | 0.7989 | 0.7894 | 0.7987 | 0.7887 | 0.7980 | 0.6754 | 0.6702 |
-| 0.3553 | 0.3515 | 750 | 0.0556 | 0.8076 | 0.8088 | 0.8036 | 0.8096 | 0.8024 | 0.8090 | 0.7051 | 0.7031 |
-| 0.3311 | 0.4686 | 1000 | 0.0558 | 0.8114 | 0.8143 | 0.8050 | 0.8126 | 0.8040 | 0.8118 | 0.7185 | 0.7185 |
-| 0.3541 | 0.5858 | 1250 | 0.0556 | 0.8070 | 0.8099 | 0.8135 | 0.8183 | 0.8126 | 0.8180 | 0.7040 | 0.7018 |
-| 0.344 | 0.7029 | 1500 | 0.0549 | 0.8153 | 0.8197 | 0.8109 | 0.8202 | 0.8097 | 0.8188 | 0.7054 | 0.7078 |
-| 0.3268 | 0.8201 | 1750 | 0.0535 | 0.8172 | 0.8210 | 0.8138 | 0.8211 | 0.8128 | 0.8202 | 0.7224 | 0.7208 |
-| 0.3399 | 0.9372 | 2000 | 0.0569 | 0.8113 | 0.8163 | 0.8073 | 0.8162 | 0.8066 | 0.8152 | 0.7242 | 0.7226 |
-| 0.2473 | 1.0544 | 2250 | 0.0453 | 0.8124 | 0.8143 | 0.8031 | 0.8103 | 0.8020 | 0.8093 | 0.7271 | 0.7261 |
-| 0.2563 | 1.1715 | 2500 | 0.0408 | 0.8178 | 0.8195 | 0.8043 | 0.8132 | 0.8032 | 0.8120 | 0.7518 | 0.7504 |
-| 0.2841 | 1.2887 | 2750 | 0.0437 | 0.8074 | 0.8100 | 0.8063 | 0.8138 | 0.8053 | 0.8130 | 0.7237 | 0.7204 |
-| 0.2462 | 1.4058 | 3000 | 0.0419 | 0.8164 | 0.8192 | 0.8050 | 0.8143 | 0.8039 | 0.8132 | 0.7395 | 0.7393 |
-| 0.2328 | 1.5230 | 3250 | 0.0404 | 0.8187 | 0.8203 | 0.8084 | 0.8165 | 0.8070 | 0.8154 | 0.7426 | 0.7414 |
-| 0.2052 | 1.6401 | 3500 | 0.0390 | 0.8147 | 0.8164 | 0.8045 | 0.8129 | 0.8035 | 0.8122 | 0.7426 | 0.7422 |
-| 0.262 | 1.7573 | 3750 | 0.0419 | 0.8188 | 0.8204 | 0.8080 | 0.8170 | 0.8067 | 0.8158 | 0.7306 | 0.7294 |
-| 0.2269 | 1.8744 | 4000 | 0.0393 | 0.8218 | 0.8235 | 0.8002 | 0.8112 | 0.7985 | 0.8094 | 0.7384 | 0.7375 |
-| 0.2472 | 1.9916 | 4250 | 0.0400 | 0.8203 | 0.8224 | 0.8053 | 0.8160 | 0.8040 | 0.8147 | 0.7317 | 0.7308 |
-| 0.1838 | 2.1087 | 4500 | 0.0348 | 0.8184 | 0.8191 | 0.8023 | 0.8099 | 0.8005 | 0.8085 | 0.7495 | 0.7481 |
-| 0.1509 | 2.2259 | 4750 | 0.0359 | 0.8117 | 0.8120 | 0.7977 | 0.8054 | 0.7958 | 0.8036 | 0.7344 | 0.7343 |
-| 0.1816 | 2.3430 | 5000 | 0.0330 | 0.8185 | 0.8181 | 0.8000 | 0.8079 | 0.7978 | 0.8060 | 0.7507 | 0.7501 |
-| 0.166 | 2.4602 | 5250 | 0.0335 | 0.8183 | 0.8188 | 0.8015 | 0.8107 | 0.7997 | 0.8091 | 0.7450 | 0.7445 |
-| 0.1572 | 2.5773 | 5500 | 0.0352 | 0.8123 | 0.8135 | 0.8021 | 0.8100 | 0.8003 | 0.8084 | 0.7368 | 0.7336 |
-| 0.1353 | 2.6945 | 5750 | 0.0333 | 0.8210 | 0.8211 | 0.8045 | 0.8123 | 0.8024 | 0.8103 | 0.7463 | 0.7463 |
-| 0.1555 | 2.8116 | 6000 | 0.0325 | 0.8185 | 0.8183 | 0.7959 | 0.8036 | 0.7939 | 0.8019 | 0.7526 | 0.7538 |
-| 0.152 | 2.9288 | 6250 | 0.0326 | 0.8154 | 0.8151 | 0.7929 | 0.8018 | 0.7908 | 0.8001 | 0.7415 | 0.7427 |
-| 0.1 | 3.0459 | 6500 | 0.0312 | 0.8194 | 0.8190 | 0.7908 | 0.7990 | 0.7886 | 0.7972 | 0.7565 | 0.7571 |
-| 0.1075 | 3.1631 | 6750 | 0.0318 | 0.8184 | 0.8181 | 0.7949 | 0.8031 | 0.7928 | 0.8016 | 0.7567 | 0.7583 |
-| 0.0971 | 3.2802 | 7000 | 0.0312 | 0.8183 | 0.8176 | 0.7905 | 0.7992 | 0.7882 | 0.7970 | 0.7561 | 0.7572 |
-| 0.12 | 3.3974 | 7250 | 0.0303 | 0.8237 | 0.8230 | 0.7953 | 0.8035 | 0.7930 | 0.8016 | 0.7683 | 0.7690 |
-| 0.1003 | 3.5145 | 7500 | 0.0315 | 0.8181 | 0.8172 | 0.7964 | 0.8047 | 0.7941 | 0.8028 | 0.7502 | 0.7505 |
-| 0.1237 | 3.6317 | 7750 | 0.0308 | 0.8190 | 0.8178 | 0.7915 | 0.7990 | 0.7886 | 0.7969 | 0.7589 | 0.7583 |
-| 0.0991 | 3.7488 | 8000 | 0.0315 | 0.8186 | 0.8172 | 0.7952 | 0.8024 | 0.7925 | 0.8000 | 0.7540 | 0.7531 |
-| 0.1017 | 3.8660 | 8250 | 0.0311 | 0.8182 | 0.8174 | 0.7925 | 0.8007 | 0.7900 | 0.7986 | 0.7532 | 0.7523 |
-| 0.1132 | 3.9831 | 8500 | 0.0306 | 0.8211 | 0.8198 | 0.7909 | 0.7991 | 0.7883 | 0.7968 | 0.7578 | 0.7578 |
+| 0.761 | 0.1172 | 250 | 0.1397 | 0.7191 | 0.7366 | 0.7129 | 0.7205 | 0.7135 | 0.7210 | 0.4342 | 0.4302 |
+| 0.6275 | 0.2343 | 500 | 0.1240 | 0.7535 | 0.7638 | 0.7442 | 0.7505 | 0.7442 | 0.7506 | 0.4527 | 0.4533 |
+| 0.5326 | 0.3515 | 750 | 0.1149 | 0.7540 | 0.7698 | 0.7320 | 0.7461 | 0.7327 | 0.7466 | 0.4786 | 0.4737 |
+| 0.4917 | 0.4686 | 1000 | 0.1028 | 0.7630 | 0.7778 | 0.7395 | 0.7532 | 0.7395 | 0.7531 | 0.5428 | 0.5404 |
+| 0.4451 | 0.5858 | 1250 | 0.0959 | 0.7634 | 0.7803 | 0.7505 | 0.7649 | 0.7508 | 0.7652 | 0.5909 | 0.5929 |
+| 0.4682 | 0.7029 | 1500 | 0.1057 | 0.7687 | 0.7855 | 0.7541 | 0.7681 | 0.7545 | 0.7685 | 0.5271 | 0.5190 |
+| 0.4489 | 0.8201 | 1750 | 0.0994 | 0.7658 | 0.7800 | 0.7505 | 0.7624 | 0.7514 | 0.7627 | 0.5765 | 0.5760 |
+| 0.4696 | 0.9372 | 2000 | 0.1055 | 0.7618 | 0.7835 | 0.7514 | 0.7669 | 0.7526 | 0.7675 | 0.5910 | 0.5835 |
+| 0.3474 | 1.0544 | 2250 | 0.0818 | 0.7663 | 0.7777 | 0.7527 | 0.7636 | 0.7536 | 0.7642 | 0.5774 | 0.5748 |
+| 0.319 | 1.1715 | 2500 | 0.0752 | 0.7753 | 0.7858 | 0.7589 | 0.7692 | 0.7592 | 0.7692 | 0.5929 | 0.5919 |
+| 0.3682 | 1.2887 | 2750 | 0.0767 | 0.7736 | 0.7851 | 0.7556 | 0.7667 | 0.7564 | 0.7671 | 0.5784 | 0.5785 |
+| 0.3033 | 1.4058 | 3000 | 0.0716 | 0.7836 | 0.7962 | 0.7590 | 0.7723 | 0.7600 | 0.7727 | 0.5987 | 0.5976 |
+| 0.3247 | 1.5230 | 3250 | 0.0768 | 0.7779 | 0.7911 | 0.7613 | 0.7731 | 0.7621 | 0.7735 | 0.5638 | 0.5623 |
+| 0.26 | 1.6401 | 3500 | 0.0686 | 0.7792 | 0.7902 | 0.7615 | 0.7733 | 0.7623 | 0.7734 | 0.6004 | 0.5998 |
+| 0.3216 | 1.7573 | 3750 | 0.0707 | 0.7851 | 0.7950 | 0.7668 | 0.7787 | 0.7677 | 0.7791 | 0.6098 | 0.6136 |
+| 0.3166 | 1.8744 | 4000 | 0.0719 | 0.7799 | 0.7911 | 0.7550 | 0.7693 | 0.7563 | 0.7701 | 0.5737 | 0.5754 |
+| 0.315 | 1.9916 | 4250 | 0.0710 | 0.7818 | 0.7925 | 0.7657 | 0.7780 | 0.7672 | 0.7790 | 0.5918 | 0.5930 |
+| 0.2117 | 2.1087 | 4500 | 0.0545 | 0.7772 | 0.7890 | 0.7551 | 0.7702 | 0.7567 | 0.7712 | 0.6059 | 0.6096 |
+| 0.1725 | 2.2259 | 4750 | 0.0544 | 0.7780 | 0.7868 | 0.7593 | 0.7714 | 0.7605 | 0.7721 | 0.6065 | 0.6128 |
+| 0.1985 | 2.3430 | 5000 | 0.0540 | 0.7818 | 0.7916 | 0.7621 | 0.7733 | 0.7626 | 0.7734 | 0.6017 | 0.6078 |
+| 0.1871 | 2.4602 | 5250 | 0.0527 | 0.7830 | 0.7898 | 0.7576 | 0.7718 | 0.7587 | 0.7724 | 0.5843 | 0.5894 |
+| 0.17 | 2.5773 | 5500 | 0.0521 | 0.7877 | 0.7959 | 0.7621 | 0.7746 | 0.7633 | 0.7753 | 0.6240 | 0.6246 |
+| 0.174 | 2.6945 | 5750 | 0.0528 | 0.7876 | 0.7949 | 0.7594 | 0.7713 | 0.7603 | 0.7716 | 0.6196 | 0.6234 |
+| 0.1896 | 2.8116 | 6000 | 0.0506 | 0.7848 | 0.7891 | 0.7595 | 0.7712 | 0.7606 | 0.7718 | 0.6052 | 0.6083 |
+| 0.1897 | 2.9288 | 6250 | 0.0549 | 0.7819 | 0.7902 | 0.7521 | 0.7664 | 0.7533 | 0.7667 | 0.5957 | 0.5981 |
+| 0.105 | 3.0459 | 6500 | 0.0450 | 0.7887 | 0.7931 | 0.7516 | 0.7669 | 0.7527 | 0.7675 | 0.6385 | 0.6450 |
+| 0.1055 | 3.1631 | 6750 | 0.0460 | 0.7875 | 0.7927 | 0.7515 | 0.7652 | 0.7525 | 0.7657 | 0.6256 | 0.6332 |
+| 0.1145 | 3.2802 | 7000 | 0.0453 | 0.7925 | 0.7977 | 0.7548 | 0.7671 | 0.7559 | 0.7678 | 0.6316 | 0.6408 |
+| 0.1252 | 3.3974 | 7250 | 0.0470 | 0.7889 | 0.7947 | 0.7561 | 0.7683 | 0.7571 | 0.7693 | 0.6257 | 0.6283 |
+| 0.1058 | 3.5145 | 7500 | 0.0446 | 0.7913 | 0.7958 | 0.7572 | 0.7714 | 0.7578 | 0.7715 | 0.6221 | 0.6338 |
+| 0.1144 | 3.6317 | 7750 | 0.0433 | 0.7939 | 0.7989 | 0.7534 | 0.7673 | 0.7542 | 0.7677 | 0.6519 | 0.6583 |
+| 0.0971 | 3.7488 | 8000 | 0.0438 | 0.7952 | 0.7993 | 0.7537 | 0.7675 | 0.7547 | 0.7679 | 0.6345 | 0.6383 |
+| 0.1107 | 3.8660 | 8250 | 0.0432 | 0.7953 | 0.7992 | 0.7507 | 0.7673 | 0.7518 | 0.7675 | 0.6355 | 0.6411 |
+| 0.1232 | 3.9831 | 8500 | 0.0438 | 0.7947 | 0.7992 | 0.7493 | 0.7655 | 0.7507 | 0.7666 | 0.6408 | 0.6472 |
 
 
 ### Framework versions
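For context on the metric columns: each Pearson/Spearman pair is the correlation between gold STS similarity scores and the model's pairwise score under one measure (cosine, Manhattan, Euclidean, dot). Below is a minimal sketch of the cosine variant, assuming mean pooling over the last hidden state and a hypothetical repo id `CocoRoF/KMB_SimCSE_test` inferred from this commit page; the card confirms neither.

```python
# Sketch of an STS-style "Pearson Cosine" / "Spearman Cosine" eval.
# ASSUMPTIONS (not stated in the card): mean pooling, and the repo id below.
import torch
import torch.nn.functional as F
from scipy.stats import pearsonr, spearmanr
from transformers import AutoModel, AutoTokenizer

repo_id = "CocoRoF/KMB_SimCSE_test"  # hypothetical id for illustration
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id).eval()

def embed(sentences: list[str]) -> torch.Tensor:
    """Mean-pool the last hidden state, ignoring padding tokens."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)            # (B, H)

# Toy Korean sentence pairs with made-up gold scores (0-5), for illustration only.
pairs = [
    ("한 남자가 기타를 치고 있다.", "남자가 악기를 연주하고 있다."),
    ("아이들이 공원에서 놀고 있다.", "아이들이 밖에서 뛰어논다."),
    ("그는 아침에 커피를 마셨다.", "고양이가 소파 위에서 잔다."),
    ("비가 내리고 있다.", "날씨가 맑고 화창하다."),
]
gold = [4.5, 4.0, 0.5, 1.0]

emb_a = embed([a for a, _ in pairs])
emb_b = embed([b for _, b in pairs])
cos_scores = F.cosine_similarity(emb_a, emb_b).tolist()    # one score per pair

print("Pearson Cosine: ", pearsonr(cos_scores, gold)[0])
print("Spearman Cosine:", spearmanr(cos_scores, gold)[0])
```

The Manhattan, Euclidean, and Dot columns follow the same recipe, substituting the corresponding distance or inner product for cosine similarity (distances are typically negated so that higher still means more similar).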