End of training
README.md
CHANGED
@@ -20,11 +20,11 @@ should probably proofread and complete it, then remove this comment. -->
 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Precision: 0.
-- Recall: 0.
-- F1: 0.
-- Accuracy: 0.

 ## Model description

@@ -44,37 +44,87 @@ More information needed
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size:
-- eval_batch_size:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs:

 ### Training results

 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
-| No log | 1.0 |
-| No log | 2.0 |
-| No log | 3.0 |
-| No log | 4.0 |
-| No log | 5.0 |
-| No log | 6.0 |
-| No log | 7.0 |
-| No log | 8.0 |
-| No log | 9.0 |
-| No log | 10.0 |
-| No log | 11.0 |
-| No log | 12.0 |
-
-
-
-
-
-
-
-

 ### Framework versions
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5025
- Precision: 0.6763
- Recall: 0.6572
- F1: 0.6666
- Accuracy: 0.7599

## Model description

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 70

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 40 | 1.8742 | 0.2828 | 0.3325 | 0.3056 | 0.5265 |
| No log | 2.0 | 80 | 1.3440 | 0.4424 | 0.4724 | 0.4569 | 0.6392 |
| No log | 3.0 | 120 | 1.0792 | 0.5232 | 0.5486 | 0.5356 | 0.6919 |
| No log | 4.0 | 160 | 0.9092 | 0.5655 | 0.5861 | 0.5756 | 0.7183 |
| No log | 5.0 | 200 | 0.8110 | 0.6024 | 0.6073 | 0.6048 | 0.7436 |
| No log | 6.0 | 240 | 0.7321 | 0.6253 | 0.6593 | 0.6419 | 0.7663 |
| No log | 7.0 | 280 | 0.6677 | 0.6049 | 0.6346 | 0.6194 | 0.7600 |
| No log | 8.0 | 320 | 0.6198 | 0.6257 | 0.6486 | 0.6370 | 0.7701 |
| No log | 9.0 | 360 | 0.5884 | 0.6660 | 0.6599 | 0.6629 | 0.7824 |
| No log | 10.0 | 400 | 0.5586 | 0.7004 | 0.6888 | 0.6945 | 0.8011 |
| No log | 11.0 | 440 | 0.5278 | 0.6896 | 0.6851 | 0.6873 | 0.8014 |
| No log | 12.0 | 480 | 0.4915 | 0.7125 | 0.6840 | 0.6979 | 0.8045 |
| 0.8976 | 13.0 | 520 | 0.4698 | 0.6954 | 0.7050 | 0.7001 | 0.8053 |
| 0.8976 | 14.0 | 560 | 0.4551 | 0.7044 | 0.7180 | 0.7111 | 0.8088 |
| 0.8976 | 15.0 | 600 | 0.4464 | 0.6775 | 0.6784 | 0.6779 | 0.7918 |
| 0.8976 | 16.0 | 640 | 0.4366 | 0.7130 | 0.6981 | 0.7055 | 0.8097 |
| 0.8976 | 17.0 | 680 | 0.4353 | 0.7457 | 0.7465 | 0.7461 | 0.8284 |
| 0.8976 | 18.0 | 720 | 0.4276 | 0.7333 | 0.7320 | 0.7327 | 0.8250 |
| 0.8976 | 19.0 | 760 | 0.4125 | 0.6961 | 0.6948 | 0.6955 | 0.8055 |
| 0.8976 | 20.0 | 800 | 0.4047 | 0.7335 | 0.7179 | 0.7256 | 0.8130 |
| 0.8976 | 21.0 | 840 | 0.4059 | 0.7059 | 0.6892 | 0.6974 | 0.8012 |
| 0.8976 | 22.0 | 880 | 0.3943 | 0.6979 | 0.6965 | 0.6972 | 0.8000 |
| 0.8976 | 23.0 | 920 | 0.3933 | 0.6869 | 0.6864 | 0.6866 | 0.7937 |
| 0.8976 | 24.0 | 960 | 0.3948 | 0.6973 | 0.6895 | 0.6934 | 0.7958 |
| 0.2879 | 25.0 | 1000 | 0.3971 | 0.7468 | 0.7119 | 0.7289 | 0.8249 |
| 0.2879 | 26.0 | 1040 | 0.3854 | 0.7031 | 0.6968 | 0.6999 | 0.8052 |
| 0.2879 | 27.0 | 1080 | 0.3863 | 0.7088 | 0.7099 | 0.7093 | 0.8113 |
| 0.2879 | 28.0 | 1120 | 0.3950 | 0.7179 | 0.6919 | 0.7047 | 0.8029 |
| 0.2879 | 29.0 | 1160 | 0.3947 | 0.7193 | 0.7005 | 0.7097 | 0.8063 |
| 0.2879 | 30.0 | 1200 | 0.3857 | 0.6953 | 0.6927 | 0.6940 | 0.7888 |
| 0.2879 | 31.0 | 1240 | 0.3909 | 0.7149 | 0.6937 | 0.7042 | 0.7948 |
| 0.2879 | 32.0 | 1280 | 0.3883 | 0.6915 | 0.6799 | 0.6856 | 0.7788 |
| 0.2879 | 33.0 | 1320 | 0.3841 | 0.7249 | 0.7133 | 0.7190 | 0.8184 |
| 0.2879 | 34.0 | 1360 | 0.3869 | 0.7228 | 0.6954 | 0.7088 | 0.8040 |
| 0.2879 | 35.0 | 1400 | 0.3909 | 0.7259 | 0.7038 | 0.7147 | 0.8056 |
| 0.2879 | 36.0 | 1440 | 0.3880 | 0.6991 | 0.6931 | 0.6961 | 0.7974 |
| 0.2879 | 37.0 | 1480 | 0.3902 | 0.7010 | 0.6902 | 0.6956 | 0.7958 |
| 0.2133 | 38.0 | 1520 | 0.3939 | 0.6921 | 0.6816 | 0.6868 | 0.7837 |
| 0.2133 | 39.0 | 1560 | 0.4051 | 0.6967 | 0.6837 | 0.6902 | 0.7836 |
| 0.2133 | 40.0 | 1600 | 0.4018 | 0.7136 | 0.6972 | 0.7053 | 0.8036 |
| 0.2133 | 41.0 | 1640 | 0.4064 | 0.6815 | 0.6693 | 0.6754 | 0.7794 |
| 0.2133 | 42.0 | 1680 | 0.4076 | 0.6809 | 0.6741 | 0.6775 | 0.7776 |
| 0.2133 | 43.0 | 1720 | 0.4106 | 0.6855 | 0.6723 | 0.6788 | 0.7713 |
| 0.2133 | 44.0 | 1760 | 0.4145 | 0.6967 | 0.6799 | 0.6882 | 0.7823 |
| 0.2133 | 45.0 | 1800 | 0.4142 | 0.6848 | 0.6748 | 0.6798 | 0.7750 |
| 0.2133 | 46.0 | 1840 | 0.4195 | 0.6787 | 0.6703 | 0.6745 | 0.7703 |
| 0.2133 | 47.0 | 1880 | 0.4250 | 0.6899 | 0.6807 | 0.6853 | 0.7828 |
| 0.2133 | 48.0 | 1920 | 0.4285 | 0.6907 | 0.6745 | 0.6825 | 0.7752 |
| 0.2133 | 49.0 | 1960 | 0.4372 | 0.6866 | 0.6757 | 0.6811 | 0.7787 |
| 0.1892 | 50.0 | 2000 | 0.4330 | 0.6815 | 0.6664 | 0.6739 | 0.7675 |
| 0.1892 | 51.0 | 2040 | 0.4486 | 0.6863 | 0.6653 | 0.6756 | 0.7658 |
| 0.1892 | 52.0 | 2080 | 0.4464 | 0.6802 | 0.6664 | 0.6732 | 0.7710 |
| 0.1892 | 53.0 | 2120 | 0.4542 | 0.6762 | 0.6633 | 0.6697 | 0.7635 |
| 0.1892 | 54.0 | 2160 | 0.4483 | 0.6743 | 0.6591 | 0.6666 | 0.7624 |
| 0.1892 | 55.0 | 2200 | 0.4559 | 0.6848 | 0.6707 | 0.6777 | 0.7735 |
| 0.1892 | 56.0 | 2240 | 0.4608 | 0.6790 | 0.6606 | 0.6697 | 0.7627 |
| 0.1892 | 57.0 | 2280 | 0.4578 | 0.6837 | 0.6631 | 0.6733 | 0.7677 |
| 0.1892 | 58.0 | 2320 | 0.4683 | 0.6749 | 0.6596 | 0.6672 | 0.7626 |
| 0.1892 | 59.0 | 2360 | 0.4693 | 0.6773 | 0.6620 | 0.6696 | 0.7643 |
| 0.1892 | 60.0 | 2400 | 0.4788 | 0.6777 | 0.6582 | 0.6678 | 0.7610 |
| 0.1892 | 61.0 | 2440 | 0.4796 | 0.6750 | 0.6599 | 0.6674 | 0.7613 |
| 0.1892 | 62.0 | 2480 | 0.4827 | 0.6764 | 0.6595 | 0.6679 | 0.7605 |
| 0.1772 | 63.0 | 2520 | 0.4878 | 0.6765 | 0.6592 | 0.6678 | 0.7608 |
| 0.1772 | 64.0 | 2560 | 0.4926 | 0.6765 | 0.6572 | 0.6667 | 0.7602 |
| 0.1772 | 65.0 | 2600 | 0.4946 | 0.6761 | 0.6570 | 0.6664 | 0.7600 |
| 0.1772 | 66.0 | 2640 | 0.4931 | 0.6749 | 0.6586 | 0.6667 | 0.7614 |
| 0.1772 | 67.0 | 2680 | 0.4977 | 0.6767 | 0.6574 | 0.6669 | 0.7600 |
| 0.1772 | 68.0 | 2720 | 0.5003 | 0.6762 | 0.6575 | 0.6667 | 0.7601 |
| 0.1772 | 69.0 | 2760 | 0.5018 | 0.6760 | 0.6570 | 0.6663 | 0.7598 |
| 0.1772 | 70.0 | 2800 | 0.5025 | 0.6763 | 0.6572 | 0.6666 | 0.7599 |

### Framework versions
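The Precision, Recall, F1 and Accuracy columns above suggest a token-classification (e.g. NER) fine-tune of bert-base-uncased. The card does not include its evaluation code, so purely as an illustrative sketch, here is how micro-averaged token-level metrics of this shape can be computed; the label names are hypothetical, and entity-level scorers such as seqeval may report different numbers:

```python
def token_metrics(true_labels, pred_labels, ignore="O"):
    """Micro-averaged precision/recall/F1 over non-ignored labels,
    plus plain token accuracy, for two aligned label sequences."""
    tp = fp = fn = correct = total = 0
    for t, p in zip(true_labels, pred_labels):
        total += 1
        correct += t == p
        if p != ignore:
            if p == t:
                tp += 1   # predicted a label and got it right
            else:
                fp += 1   # predicted a label that is wrong
        if t != ignore and p != t:
            fn += 1       # missed (or mislabeled) a true label
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1, correct / total
```

For example, `token_metrics(["O", "B-X", "I-X", "O"], ["O", "B-X", "O", "O"])` yields precision 1.0, recall 0.5, F1 ≈ 0.667 and accuracy 0.75.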
|
model.safetensors
CHANGED
|
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:60b29be23c5dac461a0e6624a5dee77fc11d0d6cc3ea90355cbb8db02d6e23d2
 size 435839092
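The blocks above are Git LFS pointer files: each records the SHA-256 digest (`oid`) and byte size of the real artifact, which is stored out of band. A downloaded copy of model.safetensors can be checked against its pointer with a small helper; this is a sketch for verification, not code from the repository:

```python
import hashlib

def lfs_oid(path, chunk_size=1 << 20):
    """SHA-256 of a file, streamed in chunks, matching the
    `oid sha256:<hex>` field of a Git LFS pointer."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

A correct download should satisfy `lfs_oid("model.safetensors") == "60b29be23c5dac461a0e6624a5dee77fc11d0d6cc3ea90355cbb8db02d6e23d2"`, i.e. the `oid` in the pointer above.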
|
runs/Apr06_14-51-51_09459640fcae/events.out.tfevents.1712415315.09459640fcae.789.0
CHANGED
|
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:87dae275dc44b4b98fb1656179e3cd15fd265f5e737c9094c8a62587883cb246
+size 42355