test
README.md CHANGED

@@ -1,7 +1,7 @@
 ---
 library_name: transformers
 license: apache-2.0
-base_model: x2bee/KoModernBERT-base-mlm-v03-retry-
+base_model: x2bee/KoModernBERT-base-mlm-v03-retry-ckp03
 tags:
 - generated_from_trainer
 model-index:
@@ -14,17 +14,17 @@ should probably proofread and complete it, then remove this comment. -->
 
 # KMB_SimCSE_test
 
-This model is a fine-tuned version of [x2bee/KoModernBERT-base-mlm-v03-retry-
+This model is a fine-tuned version of [x2bee/KoModernBERT-base-mlm-v03-retry-ckp03](https://huggingface.co/x2bee/KoModernBERT-base-mlm-v03-retry-ckp03) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Pearson Cosine: 0.
-- Spearman Cosine: 0.
-- Pearson Manhattan: 0.
-- Spearman Manhattan: 0.
-- Pearson Euclidean: 0.
-- Spearman Euclidean: 0.
-- Pearson Dot: 0.
-- Spearman Dot: 0.
+- Loss: 0.0306
+- Pearson Cosine: 0.8211
+- Spearman Cosine: 0.8198
+- Pearson Manhattan: 0.7909
+- Spearman Manhattan: 0.7991
+- Pearson Euclidean: 0.7883
+- Spearman Euclidean: 0.7968
+- Pearson Dot: 0.7578
+- Spearman Dot: 0.7578
 
 ## Model description
 
@@ -43,7 +43,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate:
+- learning_rate: 2e-05
 - train_batch_size: 16
 - eval_batch_size: 16
 - seed: 42
@@ -53,54 +53,46 @@ The following hyperparameters were used during training:
 - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.1
-- num_epochs:
+- num_epochs: 4.0
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Pearson Cosine | Spearman Cosine | Pearson Manhattan | Spearman Manhattan | Pearson Euclidean | Spearman Euclidean | Pearson Dot | Spearman Dot |
 |:-------------:|:------:|:----:|:---------------:|:--------------:|:---------------:|:-----------------:|:------------------:|:-----------------:|:------------------:|:-----------:|:------------:|
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.1606 | 1.6401 | 3500 | 0.0345 | 0.8219 | 0.8230 | 0.8146 | 0.8223 | 0.8128 | 0.8213 | 0.7629 | 0.7622 |
-| 0.1982 | 1.6870 | 3600 | 0.0380 | 0.8220 | 0.8233 | 0.8196 | 0.8257 | 0.8182 | 0.8249 | 0.7552 | 0.7535 |
-| 0.1824 | 1.7338 | 3700 | 0.0352 | 0.8246 | 0.8252 | 0.8181 | 0.8242 | 0.8166 | 0.8233 | 0.7567 | 0.7554 |
-| 0.2009 | 1.7807 | 3800 | 0.0358 | 0.8270 | 0.8278 | 0.8105 | 0.8181 | 0.8090 | 0.8164 | 0.7669 | 0.7655 |
-| 0.1899 | 1.8276 | 3900 | 0.0385 | 0.8240 | 0.8252 | 0.8133 | 0.8202 | 0.8111 | 0.8180 | 0.7418 | 0.7383 |
-| 0.1858 | 1.8744 | 4000 | 0.0337 | 0.8281 | 0.8274 | 0.8122 | 0.8198 | 0.8102 | 0.8180 | 0.7620 | 0.7590 |
-| 0.1679 | 1.9213 | 4100 | 0.0349 | 0.8238 | 0.8249 | 0.8109 | 0.8200 | 0.8097 | 0.8187 | 0.7561 | 0.7551 |
-| 0.1699 | 1.9681 | 4200 | 0.0355 | 0.8274 | 0.8298 | 0.8125 | 0.8227 | 0.8113 | 0.8215 | 0.7647 | 0.7648 |
+| 0.4859 | 0.1172 | 250 | 0.0753 | 0.7923 | 0.7923 | 0.7833 | 0.7911 | 0.7825 | 0.7907 | 0.6785 | 0.6757 |
+| 0.4421 | 0.2343 | 500 | 0.0699 | 0.7956 | 0.7989 | 0.7894 | 0.7987 | 0.7887 | 0.7980 | 0.6754 | 0.6702 |
+| 0.3553 | 0.3515 | 750 | 0.0556 | 0.8076 | 0.8088 | 0.8036 | 0.8096 | 0.8024 | 0.8090 | 0.7051 | 0.7031 |
+| 0.3311 | 0.4686 | 1000 | 0.0558 | 0.8114 | 0.8143 | 0.8050 | 0.8126 | 0.8040 | 0.8118 | 0.7185 | 0.7185 |
+| 0.3541 | 0.5858 | 1250 | 0.0556 | 0.8070 | 0.8099 | 0.8135 | 0.8183 | 0.8126 | 0.8180 | 0.7040 | 0.7018 |
+| 0.344 | 0.7029 | 1500 | 0.0549 | 0.8153 | 0.8197 | 0.8109 | 0.8202 | 0.8097 | 0.8188 | 0.7054 | 0.7078 |
+| 0.3268 | 0.8201 | 1750 | 0.0535 | 0.8172 | 0.8210 | 0.8138 | 0.8211 | 0.8128 | 0.8202 | 0.7224 | 0.7208 |
+| 0.3399 | 0.9372 | 2000 | 0.0569 | 0.8113 | 0.8163 | 0.8073 | 0.8162 | 0.8066 | 0.8152 | 0.7242 | 0.7226 |
+| 0.2473 | 1.0544 | 2250 | 0.0453 | 0.8124 | 0.8143 | 0.8031 | 0.8103 | 0.8020 | 0.8093 | 0.7271 | 0.7261 |
+| 0.2563 | 1.1715 | 2500 | 0.0408 | 0.8178 | 0.8195 | 0.8043 | 0.8132 | 0.8032 | 0.8120 | 0.7518 | 0.7504 |
+| 0.2841 | 1.2887 | 2750 | 0.0437 | 0.8074 | 0.8100 | 0.8063 | 0.8138 | 0.8053 | 0.8130 | 0.7237 | 0.7204 |
+| 0.2462 | 1.4058 | 3000 | 0.0419 | 0.8164 | 0.8192 | 0.8050 | 0.8143 | 0.8039 | 0.8132 | 0.7395 | 0.7393 |
+| 0.2328 | 1.5230 | 3250 | 0.0404 | 0.8187 | 0.8203 | 0.8084 | 0.8165 | 0.8070 | 0.8154 | 0.7426 | 0.7414 |
+| 0.2052 | 1.6401 | 3500 | 0.0390 | 0.8147 | 0.8164 | 0.8045 | 0.8129 | 0.8035 | 0.8122 | 0.7426 | 0.7422 |
+| 0.262 | 1.7573 | 3750 | 0.0419 | 0.8188 | 0.8204 | 0.8080 | 0.8170 | 0.8067 | 0.8158 | 0.7306 | 0.7294 |
+| 0.2269 | 1.8744 | 4000 | 0.0393 | 0.8218 | 0.8235 | 0.8002 | 0.8112 | 0.7985 | 0.8094 | 0.7384 | 0.7375 |
+| 0.2472 | 1.9916 | 4250 | 0.0400 | 0.8203 | 0.8224 | 0.8053 | 0.8160 | 0.8040 | 0.8147 | 0.7317 | 0.7308 |
+| 0.1838 | 2.1087 | 4500 | 0.0348 | 0.8184 | 0.8191 | 0.8023 | 0.8099 | 0.8005 | 0.8085 | 0.7495 | 0.7481 |
+| 0.1509 | 2.2259 | 4750 | 0.0359 | 0.8117 | 0.8120 | 0.7977 | 0.8054 | 0.7958 | 0.8036 | 0.7344 | 0.7343 |
+| 0.1816 | 2.3430 | 5000 | 0.0330 | 0.8185 | 0.8181 | 0.8000 | 0.8079 | 0.7978 | 0.8060 | 0.7507 | 0.7501 |
+| 0.166 | 2.4602 | 5250 | 0.0335 | 0.8183 | 0.8188 | 0.8015 | 0.8107 | 0.7997 | 0.8091 | 0.7450 | 0.7445 |
+| 0.1572 | 2.5773 | 5500 | 0.0352 | 0.8123 | 0.8135 | 0.8021 | 0.8100 | 0.8003 | 0.8084 | 0.7368 | 0.7336 |
+| 0.1353 | 2.6945 | 5750 | 0.0333 | 0.8210 | 0.8211 | 0.8045 | 0.8123 | 0.8024 | 0.8103 | 0.7463 | 0.7463 |
+| 0.1555 | 2.8116 | 6000 | 0.0325 | 0.8185 | 0.8183 | 0.7959 | 0.8036 | 0.7939 | 0.8019 | 0.7526 | 0.7538 |
+| 0.152 | 2.9288 | 6250 | 0.0326 | 0.8154 | 0.8151 | 0.7929 | 0.8018 | 0.7908 | 0.8001 | 0.7415 | 0.7427 |
+| 0.1 | 3.0459 | 6500 | 0.0312 | 0.8194 | 0.8190 | 0.7908 | 0.7990 | 0.7886 | 0.7972 | 0.7565 | 0.7571 |
+| 0.1075 | 3.1631 | 6750 | 0.0318 | 0.8184 | 0.8181 | 0.7949 | 0.8031 | 0.7928 | 0.8016 | 0.7567 | 0.7583 |
+| 0.0971 | 3.2802 | 7000 | 0.0312 | 0.8183 | 0.8176 | 0.7905 | 0.7992 | 0.7882 | 0.7970 | 0.7561 | 0.7572 |
+| 0.12 | 3.3974 | 7250 | 0.0303 | 0.8237 | 0.8230 | 0.7953 | 0.8035 | 0.7930 | 0.8016 | 0.7683 | 0.7690 |
+| 0.1003 | 3.5145 | 7500 | 0.0315 | 0.8181 | 0.8172 | 0.7964 | 0.8047 | 0.7941 | 0.8028 | 0.7502 | 0.7505 |
+| 0.1237 | 3.6317 | 7750 | 0.0308 | 0.8190 | 0.8178 | 0.7915 | 0.7990 | 0.7886 | 0.7969 | 0.7589 | 0.7583 |
+| 0.0991 | 3.7488 | 8000 | 0.0315 | 0.8186 | 0.8172 | 0.7952 | 0.8024 | 0.7925 | 0.8000 | 0.7540 | 0.7531 |
+| 0.1017 | 3.8660 | 8250 | 0.0311 | 0.8182 | 0.8174 | 0.7925 | 0.8007 | 0.7900 | 0.7986 | 0.7532 | 0.7523 |
+| 0.1132 | 3.9831 | 8500 | 0.0306 | 0.8211 | 0.8198 | 0.7909 | 0.7991 | 0.7883 | 0.7968 | 0.7578 | 0.7578 |
 
 
 ### Framework versions
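The card ends without usage instructions. As a minimal sketch (not the authors' code): it assumes the checkpoint is published under the card's title as `x2bee/KMB_SimCSE_test` (inferred, not stated in the diff), that it loads as a standard `transformers` encoder, and that `[CLS]` pooling applies, which is common for SimCSE-style models but not documented here.

```python
# Minimal sketch: embed two Korean sentences with the fine-tuned encoder and
# compare them by cosine similarity. The repo id and the [CLS] pooling choice
# are assumptions; ModernBERT checkpoints also need a recent transformers release.
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "x2bee/KMB_SimCSE_test"  # assumed from the card title; verify on the Hub

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)
model.eval()

sentences = ["오늘 날씨가 정말 좋다.", "날씨가 참 맑고 화창하다."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # (batch, seq_len, hidden)
    embeddings = hidden[:, 0]                  # [CLS] token as sentence embedding

score = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {score.item():.4f}")
```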
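For reading the results table: each metric column pairs a correlation (Pearson or Spearman) with a similarity function applied to the embedding pairs (cosine, negated Manhattan distance, negated Euclidean distance, dot product), correlated against gold similarity labels. This is the usual output of an embedding-similarity evaluation such as sentence-transformers' `EmbeddingSimilarityEvaluator`; whether that exact evaluator produced these numbers is an assumption. A toy sketch of the computation on random data (not the card's evaluation set):

```python
# Toy sketch of the eight correlation columns: correlate model similarity
# scores with gold labels, once per similarity function. Random data only.
import numpy as np
from scipy.stats import pearsonr, spearmanr

def similarity_scores(emb1: np.ndarray, emb2: np.ndarray) -> dict:
    """One score per sentence pair for each similarity function."""
    cos = np.sum(emb1 * emb2, axis=1) / (
        np.linalg.norm(emb1, axis=1) * np.linalg.norm(emb2, axis=1)
    )
    return {
        "Cosine": cos,
        "Manhattan": -np.sum(np.abs(emb1 - emb2), axis=1),  # negated: closer = more similar
        "Euclidean": -np.linalg.norm(emb1 - emb2, axis=1),
        "Dot": np.sum(emb1 * emb2, axis=1),
    }

rng = np.random.default_rng(42)
emb1 = rng.normal(size=(50, 768))      # toy sentence embeddings, side 1
emb2 = rng.normal(size=(50, 768))      # toy sentence embeddings, side 2
gold = rng.uniform(0.0, 5.0, size=50)  # toy gold similarity labels (e.g. STS 0-5)

for name, s in similarity_scores(emb1, emb2).items():
    print(f"Pearson {name}: {pearsonr(gold, s)[0]:.4f}  "
          f"Spearman {name}: {spearmanr(gold, s)[0]:.4f}")
```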