Model save
Browse files
- README.md +33 -41
- model.safetensors +1 -1

README.md CHANGED
@@ -16,13 +16,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Subset Accuracy: 0.
-- F1 Macro: 0.
-- F1 Micro: 0.
-- Precision Macro: 0.
-- Recall Macro: 0.
-- Roc Auc: 0.
+- Loss: 0.3514
+- Subset Accuracy: 0.2902
+- F1 Macro: 0.3370
+- F1 Micro: 0.3898
+- Precision Macro: 0.3762
+- Recall Macro: 0.3140
+- Roc Auc: 0.7933
 
 ## Model description
 
@@ -57,40 +57,32 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Subset Accuracy | F1 Macro | F1 Micro | Precision Macro | Recall Macro | Roc Auc |
 |:-------------:|:-------:|:-----:|:---------------:|:---------------:|:--------:|:--------:|:---------------:|:------------:|:-------:|
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.1032 | 3.5002 | 5369 | 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.0345 | 7.0 | 10738 | 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.0039 | 13.5002 | 20709 | 0.3598 | 0.2977 | 0.3415 | 0.4025 | 0.3681 | 0.3291 | 0.7955 |
-| 0.0036 | 14.0 | 21476 | 0.3600 | 0.2993 | 0.3419 | 0.4061 | 0.3644 | 0.3282 | 0.7902 |
-| 0.0025 | 14.5002 | 22243 | 0.3717 | 0.3023 | 0.3465 | 0.4098 | 0.3655 | 0.3327 | 0.7904 |
-| 0.003 | 15.0 | 23010 | 0.3783 | 0.3030 | 0.3373 | 0.3982 | 0.3687 | 0.3141 | 0.7914 |
-| 0.002 | 15.5002 | 23777 | 0.3835 | 0.3011 | 0.3317 | 0.3985 | 0.3687 | 0.3089 | 0.7906 |
-| 0.0016 | 16.0 | 24544 | 0.3909 | 0.3099 | 0.3430 | 0.4099 | 0.3712 | 0.3232 | 0.7894 |
-| 0.0016 | 16.5002 | 25311 | 0.3900 | 0.2987 | 0.3449 | 0.4073 | 0.3616 | 0.3352 | 0.7935 |
-| 0.0013 | 17.0 | 26078 | 0.3960 | 0.3047 | 0.3430 | 0.4073 | 0.3609 | 0.3304 | 0.7914 |
+| 0.4274 | 0.5002 | 767 | 0.2090 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6092 |
+| 0.1875 | 1.0 | 1534 | 0.1773 | 0.0816 | 0.0680 | 0.1479 | 0.2599 | 0.0476 | 0.7795 |
+| 0.1682 | 1.5002 | 2301 | 0.1681 | 0.1630 | 0.1275 | 0.2611 | 0.2788 | 0.1014 | 0.8039 |
+| 0.161 | 2.0 | 3068 | 0.1631 | 0.2076 | 0.1940 | 0.3133 | 0.4626 | 0.1538 | 0.8256 |
+| 0.1379 | 2.5002 | 3835 | 0.1674 | 0.2572 | 0.2415 | 0.3613 | 0.4434 | 0.1922 | 0.8235 |
+| 0.1323 | 3.0 | 4602 | 0.1634 | 0.2604 | 0.2566 | 0.3641 | 0.4828 | 0.1999 | 0.8349 |
+| 0.1032 | 3.5002 | 5369 | 0.1855 | 0.2953 | 0.2878 | 0.3803 | 0.3958 | 0.2422 | 0.8211 |
+| 0.0961 | 4.0 | 6136 | 0.1858 | 0.3151 | 0.3092 | 0.4045 | 0.4284 | 0.2670 | 0.8231 |
+| 0.0737 | 4.5002 | 6903 | 0.2082 | 0.3121 | 0.3140 | 0.3941 | 0.3975 | 0.2748 | 0.8120 |
+| 0.0651 | 5.0 | 7670 | 0.2108 | 0.3082 | 0.2990 | 0.3935 | 0.4146 | 0.2605 | 0.8106 |
+| 0.0541 | 5.5002 | 8437 | 0.2241 | 0.2995 | 0.3174 | 0.3851 | 0.3851 | 0.2861 | 0.8055 |
+| 0.0465 | 6.0 | 9204 | 0.2386 | 0.3039 | 0.3123 | 0.3871 | 0.3757 | 0.2779 | 0.8026 |
+| 0.0399 | 6.5002 | 9971 | 0.2458 | 0.3020 | 0.3240 | 0.3894 | 0.3745 | 0.2979 | 0.8032 |
+| 0.0345 | 7.0 | 10738 | 0.2539 | 0.3078 | 0.3288 | 0.4012 | 0.3615 | 0.3105 | 0.8039 |
+| 0.0251 | 7.5002 | 11505 | 0.2663 | 0.2951 | 0.3301 | 0.3912 | 0.3619 | 0.3140 | 0.7993 |
+| 0.0254 | 8.0 | 12272 | 0.2737 | 0.2944 | 0.3322 | 0.3920 | 0.3709 | 0.3109 | 0.7998 |
+| 0.0189 | 8.5002 | 13039 | 0.2791 | 0.2844 | 0.3388 | 0.3984 | 0.3574 | 0.3310 | 0.8029 |
+| 0.0195 | 9.0 | 13806 | 0.2838 | 0.2913 | 0.3273 | 0.3896 | 0.3615 | 0.3064 | 0.7989 |
+| 0.014 | 9.5002 | 14573 | 0.3037 | 0.2925 | 0.3336 | 0.3987 | 0.3680 | 0.3201 | 0.7971 |
+| 0.0139 | 10.0 | 15340 | 0.3015 | 0.2903 | 0.3401 | 0.3979 | 0.3648 | 0.3239 | 0.7950 |
+| 0.0101 | 10.5002 | 16107 | 0.3192 | 0.2846 | 0.3428 | 0.4032 | 0.3598 | 0.3409 | 0.7934 |
+| 0.0103 | 11.0 | 16874 | 0.3257 | 0.2866 | 0.3376 | 0.3989 | 0.3566 | 0.3274 | 0.7928 |
+| 0.0073 | 11.5002 | 17641 | 0.3275 | 0.3004 | 0.3334 | 0.4008 | 0.3828 | 0.3077 | 0.7941 |
+| 0.0074 | 12.0 | 18408 | 0.3378 | 0.2868 | 0.3361 | 0.3999 | 0.3646 | 0.3217 | 0.7911 |
+| 0.0056 | 12.5002 | 19175 | 0.3424 | 0.3010 | 0.3419 | 0.4036 | 0.3733 | 0.3215 | 0.7926 |
+| 0.0052 | 13.0 | 19942 | 0.3514 | 0.2902 | 0.3370 | 0.3898 | 0.3762 | 0.3140 | 0.7933 |
 
 
 ### Framework versions
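The model card above reports multi-label metrics (Subset Accuracy, F1 Macro, F1 Micro) without defining them. The following is a minimal sketch of how these are conventionally computed; the toy label matrices are illustrative placeholders, not data from this model.

```python
# Hypothetical multi-label predictions: 4 samples, 3 labels.
# These values are made up for illustration only.
y_true = [[1, 0, 1], [0, 1, 0], [1, 1, 0], [0, 0, 1]]
y_pred = [[1, 0, 1], [0, 1, 1], [1, 0, 0], [0, 0, 1]]

def subset_accuracy(y_true, y_pred):
    # Exact-match ratio: a sample counts only if every label matches.
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def f1_per_label(y_true, y_pred, label):
    # F1 for one label column from its TP/FP/FN counts.
    tp = sum(t[label] and p[label] for t, p in zip(y_true, y_pred))
    fp = sum((not t[label]) and p[label] for t, p in zip(y_true, y_pred))
    fn = sum(t[label] and not p[label] for t, p in zip(y_true, y_pred))
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

def f1_macro(y_true, y_pred):
    # Unweighted mean of per-label F1 scores.
    n_labels = len(y_true[0])
    return sum(f1_per_label(y_true, y_pred, l) for l in range(n_labels)) / n_labels

def f1_micro(y_true, y_pred):
    # F1 over TP/FP/FN pooled across all labels and samples.
    tp = fp = fn = 0
    for label in range(len(y_true[0])):
        for t, p in zip(y_true, y_pred):
            tp += t[label] and p[label]
            fp += (not t[label]) and p[label]
            fn += t[label] and not p[label]
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

print(subset_accuracy(y_true, y_pred))  # → 0.5
```

Subset accuracy is the strictest of the three (all labels must match per sample), which is why it sits well below F1 Micro throughout the table.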
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:330a57b119eec750afdcc7c58c7a9d7d2d6420877fe9a8e85349786c9ba2467a
 size 438010940
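The model.safetensors entry in the repository is a Git LFS pointer file like the one shown above: a `version` line, an `oid` line with the hash of the real weights, and a `size` line in bytes. A small sketch of parsing such a pointer (the pointer text is copied from this diff; verifying a downloaded file would mean hashing it with `hashlib.sha256` and comparing against the digest):

```python
# Git LFS pointer as it appears in the diff above.
pointer_text = """version https://git-lfs.github.com/spec/v1
oid sha256:330a57b119eec750afdcc7c58c7a9d7d2d6420877fe9a8e85349786c9ba2467a
size 438010940
"""

def parse_lfs_pointer(text):
    # Each line is "key value"; oid is "algorithm:hex-digest".
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],
        "algo": algo,
        "digest": digest,
        "size": int(fields["size"]),
    }

p = parse_lfs_pointer(pointer_text)
print(p["algo"], p["size"])  # → sha256 438010940
# To verify a local copy (path is a placeholder):
#   hashlib.sha256(open("model.safetensors", "rb").read()).hexdigest() == p["digest"]
```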