zakariyafirachine committed on
Commit 607bc70 · verified · 1 Parent(s): 3ea870f

End of training

README.md CHANGED
@@ -20,11 +20,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.5025
-- Precision: 0.6763
-- Recall: 0.6572
-- F1: 0.6666
-- Accuracy: 0.7599
+- Loss: 0.7020
+- Precision: 0.6057
+- Recall: 0.6438
+- F1: 0.6242
+- Accuracy: 0.7418
 
 ## Model description
 
@@ -44,87 +44,37 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size: 8
-- eval_batch_size: 8
+- train_batch_size: 16
+- eval_batch_size: 16
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 70
+- num_epochs: 20
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
-| No log        | 1.0   | 40   | 1.8742          | 0.2828    | 0.3325 | 0.3056 | 0.5265   |
-| No log        | 2.0   | 80   | 1.3440          | 0.4424    | 0.4724 | 0.4569 | 0.6392   |
-| No log        | 3.0   | 120  | 1.0792          | 0.5232    | 0.5486 | 0.5356 | 0.6919   |
-| No log        | 4.0   | 160  | 0.9092          | 0.5655    | 0.5861 | 0.5756 | 0.7183   |
-| No log        | 5.0   | 200  | 0.8110          | 0.6024    | 0.6073 | 0.6048 | 0.7436   |
-| No log        | 6.0   | 240  | 0.7321          | 0.6253    | 0.6593 | 0.6419 | 0.7663   |
-| No log        | 7.0   | 280  | 0.6677          | 0.6049    | 0.6346 | 0.6194 | 0.7600   |
-| No log        | 8.0   | 320  | 0.6198          | 0.6257    | 0.6486 | 0.6370 | 0.7701   |
-| No log        | 9.0   | 360  | 0.5884          | 0.6660    | 0.6599 | 0.6629 | 0.7824   |
-| No log        | 10.0  | 400  | 0.5586          | 0.7004    | 0.6888 | 0.6945 | 0.8011   |
-| No log        | 11.0  | 440  | 0.5278          | 0.6896    | 0.6851 | 0.6873 | 0.8014   |
-| No log        | 12.0  | 480  | 0.4915          | 0.7125    | 0.6840 | 0.6979 | 0.8045   |
-| 0.8976        | 13.0  | 520  | 0.4698          | 0.6954    | 0.7050 | 0.7001 | 0.8053   |
-| 0.8976        | 14.0  | 560  | 0.4551          | 0.7044    | 0.7180 | 0.7111 | 0.8088   |
-| 0.8976        | 15.0  | 600  | 0.4464          | 0.6775    | 0.6784 | 0.6779 | 0.7918   |
-| 0.8976        | 16.0  | 640  | 0.4366          | 0.7130    | 0.6981 | 0.7055 | 0.8097   |
-| 0.8976        | 17.0  | 680  | 0.4353          | 0.7457    | 0.7465 | 0.7461 | 0.8284   |
-| 0.8976        | 18.0  | 720  | 0.4276          | 0.7333    | 0.7320 | 0.7327 | 0.8250   |
-| 0.8976        | 19.0  | 760  | 0.4125          | 0.6961    | 0.6948 | 0.6955 | 0.8055   |
-| 0.8976        | 20.0  | 800  | 0.4047          | 0.7335    | 0.7179 | 0.7256 | 0.8130   |
-| 0.8976        | 21.0  | 840  | 0.4059          | 0.7059    | 0.6892 | 0.6974 | 0.8012   |
-| 0.8976        | 22.0  | 880  | 0.3943          | 0.6979    | 0.6965 | 0.6972 | 0.8000   |
-| 0.8976        | 23.0  | 920  | 0.3933          | 0.6869    | 0.6864 | 0.6866 | 0.7937   |
-| 0.8976        | 24.0  | 960  | 0.3948          | 0.6973    | 0.6895 | 0.6934 | 0.7958   |
-| 0.2879        | 25.0  | 1000 | 0.3971          | 0.7468    | 0.7119 | 0.7289 | 0.8249   |
-| 0.2879        | 26.0  | 1040 | 0.3854          | 0.7031    | 0.6968 | 0.6999 | 0.8052   |
-| 0.2879        | 27.0  | 1080 | 0.3863          | 0.7088    | 0.7099 | 0.7093 | 0.8113   |
-| 0.2879        | 28.0  | 1120 | 0.3950          | 0.7179    | 0.6919 | 0.7047 | 0.8029   |
-| 0.2879        | 29.0  | 1160 | 0.3947          | 0.7193    | 0.7005 | 0.7097 | 0.8063   |
-| 0.2879        | 30.0  | 1200 | 0.3857          | 0.6953    | 0.6927 | 0.6940 | 0.7888   |
-| 0.2879        | 31.0  | 1240 | 0.3909          | 0.7149    | 0.6937 | 0.7042 | 0.7948   |
-| 0.2879        | 32.0  | 1280 | 0.3883          | 0.6915    | 0.6799 | 0.6856 | 0.7788   |
-| 0.2879        | 33.0  | 1320 | 0.3841          | 0.7249    | 0.7133 | 0.7190 | 0.8184   |
-| 0.2879        | 34.0  | 1360 | 0.3869          | 0.7228    | 0.6954 | 0.7088 | 0.8040   |
-| 0.2879        | 35.0  | 1400 | 0.3909          | 0.7259    | 0.7038 | 0.7147 | 0.8056   |
-| 0.2879        | 36.0  | 1440 | 0.3880          | 0.6991    | 0.6931 | 0.6961 | 0.7974   |
-| 0.2879        | 37.0  | 1480 | 0.3902          | 0.7010    | 0.6902 | 0.6956 | 0.7958   |
-| 0.2133        | 38.0  | 1520 | 0.3939          | 0.6921    | 0.6816 | 0.6868 | 0.7837   |
-| 0.2133        | 39.0  | 1560 | 0.4051          | 0.6967    | 0.6837 | 0.6902 | 0.7836   |
-| 0.2133        | 40.0  | 1600 | 0.4018          | 0.7136    | 0.6972 | 0.7053 | 0.8036   |
-| 0.2133        | 41.0  | 1640 | 0.4064          | 0.6815    | 0.6693 | 0.6754 | 0.7794   |
-| 0.2133        | 42.0  | 1680 | 0.4076          | 0.6809    | 0.6741 | 0.6775 | 0.7776   |
-| 0.2133        | 43.0  | 1720 | 0.4106          | 0.6855    | 0.6723 | 0.6788 | 0.7713   |
-| 0.2133        | 44.0  | 1760 | 0.4145          | 0.6967    | 0.6799 | 0.6882 | 0.7823   |
-| 0.2133        | 45.0  | 1800 | 0.4142          | 0.6848    | 0.6748 | 0.6798 | 0.7750   |
-| 0.2133        | 46.0  | 1840 | 0.4195          | 0.6787    | 0.6703 | 0.6745 | 0.7703   |
-| 0.2133        | 47.0  | 1880 | 0.4250          | 0.6899    | 0.6807 | 0.6853 | 0.7828   |
-| 0.2133        | 48.0  | 1920 | 0.4285          | 0.6907    | 0.6745 | 0.6825 | 0.7752   |
-| 0.2133        | 49.0  | 1960 | 0.4372          | 0.6866    | 0.6757 | 0.6811 | 0.7787   |
-| 0.1892        | 50.0  | 2000 | 0.4330          | 0.6815    | 0.6664 | 0.6739 | 0.7675   |
-| 0.1892        | 51.0  | 2040 | 0.4486          | 0.6863    | 0.6653 | 0.6756 | 0.7658   |
-| 0.1892        | 52.0  | 2080 | 0.4464          | 0.6802    | 0.6664 | 0.6732 | 0.7710   |
-| 0.1892        | 53.0  | 2120 | 0.4542          | 0.6762    | 0.6633 | 0.6697 | 0.7635   |
-| 0.1892        | 54.0  | 2160 | 0.4483          | 0.6743    | 0.6591 | 0.6666 | 0.7624   |
-| 0.1892        | 55.0  | 2200 | 0.4559          | 0.6848    | 0.6707 | 0.6777 | 0.7735   |
-| 0.1892        | 56.0  | 2240 | 0.4608          | 0.6790    | 0.6606 | 0.6697 | 0.7627   |
-| 0.1892        | 57.0  | 2280 | 0.4578          | 0.6837    | 0.6631 | 0.6733 | 0.7677   |
-| 0.1892        | 58.0  | 2320 | 0.4683          | 0.6749    | 0.6596 | 0.6672 | 0.7626   |
-| 0.1892        | 59.0  | 2360 | 0.4693          | 0.6773    | 0.6620 | 0.6696 | 0.7643   |
-| 0.1892        | 60.0  | 2400 | 0.4788          | 0.6777    | 0.6582 | 0.6678 | 0.7610   |
-| 0.1892        | 61.0  | 2440 | 0.4796          | 0.6750    | 0.6599 | 0.6674 | 0.7613   |
-| 0.1892        | 62.0  | 2480 | 0.4827          | 0.6764    | 0.6595 | 0.6679 | 0.7605   |
-| 0.1772        | 63.0  | 2520 | 0.4878          | 0.6765    | 0.6592 | 0.6678 | 0.7608   |
-| 0.1772        | 64.0  | 2560 | 0.4926          | 0.6765    | 0.6572 | 0.6667 | 0.7602   |
-| 0.1772        | 65.0  | 2600 | 0.4946          | 0.6761    | 0.6570 | 0.6664 | 0.7600   |
-| 0.1772        | 66.0  | 2640 | 0.4931          | 0.6749    | 0.6586 | 0.6667 | 0.7614   |
-| 0.1772        | 67.0  | 2680 | 0.4977          | 0.6767    | 0.6574 | 0.6669 | 0.7600   |
-| 0.1772        | 68.0  | 2720 | 0.5003          | 0.6762    | 0.6575 | 0.6667 | 0.7601   |
-| 0.1772        | 69.0  | 2760 | 0.5018          | 0.6760    | 0.6570 | 0.6663 | 0.7598   |
-| 0.1772        | 70.0  | 2800 | 0.5025          | 0.6763    | 0.6572 | 0.6666 | 0.7599   |
+| No log        | 1.0   | 20   | 2.2734          | 0.2443    | 0.2113 | 0.2266 | 0.4657   |
+| No log        | 2.0   | 40   | 1.8002          | 0.3464    | 0.3955 | 0.3693 | 0.5525   |
+| No log        | 3.0   | 60   | 1.4474          | 0.4147    | 0.4754 | 0.4430 | 0.6278   |
+| No log        | 4.0   | 80   | 1.2530          | 0.5030    | 0.5156 | 0.5092 | 0.6619   |
+| No log        | 5.0   | 100  | 1.1036          | 0.5513    | 0.5499 | 0.5506 | 0.6947   |
+| No log        | 6.0   | 120  | 1.0047          | 0.5901    | 0.5734 | 0.5816 | 0.7131   |
+| No log        | 7.0   | 140  | 0.9380          | 0.5797    | 0.5935 | 0.5865 | 0.7162   |
+| No log        | 8.0   | 160  | 0.8928          | 0.5585    | 0.5995 | 0.5782 | 0.7137   |
+| No log        | 9.0   | 180  | 0.8506          | 0.5855    | 0.6074 | 0.5962 | 0.7216   |
+| No log        | 10.0  | 200  | 0.8208          | 0.5944    | 0.6239 | 0.6088 | 0.7372   |
+| No log        | 11.0  | 220  | 0.7940          | 0.6082    | 0.6385 | 0.6230 | 0.7394   |
+| No log        | 12.0  | 240  | 0.7752          | 0.5911    | 0.6241 | 0.6072 | 0.7350   |
+| No log        | 13.0  | 260  | 0.7520          | 0.5896    | 0.6258 | 0.6071 | 0.7336   |
+| No log        | 14.0  | 280  | 0.7375          | 0.5987    | 0.6352 | 0.6164 | 0.7370   |
+| No log        | 15.0  | 300  | 0.7306          | 0.5913    | 0.6372 | 0.6134 | 0.7365   |
+| No log        | 16.0  | 320  | 0.7181          | 0.5969    | 0.6374 | 0.6165 | 0.7372   |
+| No log        | 17.0  | 340  | 0.7109          | 0.6107    | 0.6525 | 0.6309 | 0.7482   |
+| No log        | 18.0  | 360  | 0.7059          | 0.6009    | 0.6413 | 0.6205 | 0.7417   |
+| No log        | 19.0  | 380  | 0.7039          | 0.6011    | 0.6404 | 0.6201 | 0.7396   |
+| No log        | 20.0  | 400  | 0.7020          | 0.6057    | 0.6438 | 0.6242 | 0.7418   |
 
 
 ### Framework versions
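The new hyperparameters imply a training run of 400 optimizer steps: doubling the batch size from 8 to 16 halves the steps per epoch (40 to 20, matching the Step column), and 20 epochs × 20 steps = 400. The card lists `lr_scheduler_type: linear` but no warmup, so `warmup_steps=0` below is an assumption; this is a minimal stdlib sketch of the schedule (linear warmup, then linear decay to zero by the final step, as in `transformers.get_linear_schedule_with_warmup`), not the exact code the Trainer ran.

```python
def linear_lr(step, base_lr=2e-05, total_steps=400, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule:
    ramp up over warmup_steps, then decay linearly to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)

print(linear_lr(0))    # 2e-05 at the start (no warmup assumed)
print(linear_lr(200))  # 1e-05 halfway through (epoch 10 of 20)
print(linear_lr(400))  # 0.0 at the final step
```

This also explains the steadily slowing improvement late in the table: by epoch 17 the learning rate is already below 4e-06.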
config.json CHANGED
@@ -178,7 +178,7 @@
   "LABEL_9": 9
 },
 "layer_norm_eps": 1e-12,
-"max_position_embeddings": 512,
+"max_position_embeddings": 1024,
 "model_type": "bert",
 "num_attention_heads": 12,
 "num_hidden_layers": 12,
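This config change doubles `max_position_embeddings` from 512 to 1024. Note that bert-base-uncased ships with only 512 learned position embeddings, so growing this value requires resizing the position-embedding table; the new rows have no pretrained weights. A quick sketch of what the change means for the table's size, assuming `hidden_size=768` (the published bert-base-uncased value):

```python
HIDDEN_SIZE = 768  # hidden_size from the bert-base-uncased config

def pos_embedding_params(max_positions, hidden=HIDDEN_SIZE):
    """Parameter count of BERT's learned position-embedding table,
    which has shape (max_position_embeddings, hidden_size)."""
    return max_positions * hidden

print(pos_embedding_params(512))   # 393216 parameters before the change
print(pos_embedding_params(1024))  # 786432 parameters after
```

The extra 512 rows (~393k parameters) start untrained, so inputs longer than 512 tokens rely on positions the base model never saw during pretraining.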
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:60b29be23c5dac461a0e6624a5dee77fc11d0d6cc3ea90355cbb8db02d6e23d2
+oid sha256:32fa44ea6a5c4400ce7171f13f7eb1b33f738ee7dfdf423bfdf2d4dd8245d291
 size 435839092
runs/Apr08_21-30-20_91ae4e8de78a/events.out.tfevents.1712611910.91ae4e8de78a.193.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7f3687ab03f808a07c0b6ceeb85ae15eea52513cad809a353696b36732003a2f
+size 7823
runs/Apr08_21-30-20_91ae4e8de78a/events.out.tfevents.1712612271.91ae4e8de78a.193.1 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ead19046ffbd36839e11dd62cef78952601e05e3d54c38bce4153ffa580162aa
+size 7824
runs/Apr08_21-42-30_91ae4e8de78a/events.out.tfevents.1712612574.91ae4e8de78a.193.2 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9f186699bf7ac148187692ebfe211e2ec1e8b765802043ea1fe9fee1c38cd6f8
+size 17676
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:4328212aee4642ada640338f0726147f4fc3c66eabe28dffdea3eb7fb14edd02
+oid sha256:dbdd11cbacdb5cd4218c410d9c0afbb7e3491ffafaedf12d82056ac14bbbfade
+size 4856
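The `model.safetensors`, `training_args.bin`, and event-log entries in this commit are Git LFS pointer files, not the binaries themselves: three `key value` lines (`version`, `oid`, `size`), with the real blob fetched by its SHA-256 digest at checkout. A minimal stdlib parser for that three-line format (the pointer text below is copied from the `training_args.bin` diff above):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a {key: value} dict.
    Each non-empty line is 'key value', split on the first space."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:dbdd11cbacdb5cd4218c410d9c0afbb7e3491ffafaedf12d82056ac14bbbfade
size 4856"""

info = parse_lfs_pointer(pointer)
print(info["oid"])        # the sha256 digest that identifies the stored blob
print(int(info["size"]))  # 4856, the blob size in bytes
```

This is why `model.safetensors` shows the same `size 435839092` on both sides of its diff: only the `oid` (content hash) changed, while the tensor shapes, and hence the file size, stayed identical.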