# V2-bert-text-classification-model

This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set (see the metric sketch after this list):
- Loss: 0.2017
- Accuracy: 0.9601
- F1: 0.8264
- Precision: 0.8214
- Recall: 0.8331
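
The card does not state how the F1, precision, and recall values above are aggregated across classes. The snippet below is a hypothetical illustration (not part of the original card) of computing such metrics with scikit-learn, assuming macro averaging over the label set.

```python
# Hypothetical illustration only: how metrics like those above are commonly
# computed. The averaging strategy (macro assumed here) and the label set are
# not documented in the card.
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

y_true = [0, 1, 2, 2, 1, 0]  # placeholder gold labels
y_pred = [0, 1, 2, 1, 1, 0]  # placeholder model predictions

print("Accuracy :", accuracy_score(y_true, y_pred))
print("F1       :", f1_score(y_true, y_pred, average="macro"))
print("Precision:", precision_score(y_true, y_pred, average="macro"))
print("Recall   :", recall_score(y_true, y_pred, average="macro"))
```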
## Model description

More information needed
## Intended uses & limitations

More information needed
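
No usage example is included in the card; the sketch below is one plausible way to run inference with the Transformers pipeline API. The predicted label names depend on the `id2label` mapping saved with the checkpoint, since the underlying dataset is not documented.

```python
# Minimal inference sketch (not part of the original card).
# Label names come from whatever id2label mapping was saved with the
# checkpoint; the training dataset itself is not documented.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="AmirlyPhd/V2-bert-text-classification-model",
)

print(classifier("Replace this with the text you want to classify."))
```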
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 10
- mixed_precision_training: Native AMP
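
The training script itself is not included in the card. The following is a hedged sketch of a `TrainingArguments`/`Trainer` setup that mirrors the hyperparameters above; the dataset, tokenization, and number of labels are placeholders, since the training data is not documented.

```python
# Sketch only: reproduces the listed hyperparameters with the Hugging Face
# Trainer. Dataset and num_labels are placeholders (not documented in the card).
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "google-bert/bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model, num_labels=2  # num_labels is not documented in the card
)

args = TrainingArguments(
    output_dir="V2-bert-text-classification-model",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=10,
    fp16=True,  # Native AMP mixed precision
    evaluation_strategy="steps",
    eval_steps=50,  # matches the 50-step evaluation interval in the results table
)

# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```

Note that the Adam betas and epsilon listed above are the `Trainer` defaults, so passing them explicitly is optional.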
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 1.5342 | 0.11 | 50 | 1.6906 | 0.3486 | 0.1581 | 0.1874 | 0.1879 |
| 0.7232 | 0.22 | 100 | 0.7529 | 0.8296 | 0.5057 | 0.5008 | 0.5124 |
| 0.2933 | 0.33 | 150 | 0.4824 | 0.9018 | 0.6709 | 0.6673 | 0.6756 |
| 0.2774 | 0.44 | 200 | 0.4746 | 0.8772 | 0.6543 | 0.6423 | 0.6686 |
| 0.1739 | 0.55 | 250 | 0.4650 | 0.9103 | 0.6760 | 0.6636 | 0.6892 |
| 0.1757 | 0.66 | 300 | 0.3614 | 0.9166 | 0.7175 | 0.7823 | 0.7127 |
| 0.177 | 0.76 | 350 | 0.2602 | 0.9111 | 0.7284 | 0.7568 | 0.7163 |
| 0.1019 | 0.87 | 400 | 0.3053 | 0.9223 | 0.7301 | 0.7881 | 0.7203 |
| 0.1067 | 0.98 | 450 | 0.4436 | 0.9095 | 0.7255 | 0.7598 | 0.7197 |
| 0.1577 | 1.09 | 500 | 0.2348 | 0.9532 | 0.8227 | 0.8285 | 0.8171 |
| 0.0792 | 1.2 | 550 | 0.2429 | 0.9519 | 0.8190 | 0.8218 | 0.8175 |
| 0.086 | 1.31 | 600 | 0.1858 | 0.9595 | 0.8264 | 0.8282 | 0.8258 |
| 0.091 | 1.42 | 650 | 0.1868 | 0.9625 | 0.8279 | 0.8259 | 0.8308 |
| 0.0909 | 1.53 | 700 | 0.2091 | 0.9549 | 0.8244 | 0.8284 | 0.8217 |
| 0.0434 | 1.64 | 750 | 0.1942 | 0.9628 | 0.8303 | 0.8294 | 0.8315 |
| 0.1175 | 1.75 | 800 | 0.1572 | 0.9650 | 0.8317 | 0.8304 | 0.8333 |
| 0.092 | 1.86 | 850 | 0.2515 | 0.9300 | 0.7489 | 0.7995 | 0.7346 |
| 0.06 | 1.97 | 900 | 0.4890 | 0.9136 | 0.7334 | 0.7694 | 0.7261 |
| 0.0652 | 2.07 | 950 | 0.2258 | 0.9541 | 0.8218 | 0.8143 | 0.8309 |
| 0.0436 | 2.18 | 1000 | 0.2224 | 0.9587 | 0.8245 | 0.8184 | 0.8326 |
| 0.0524 | 2.29 | 1050 | 0.2476 | 0.9546 | 0.8193 | 0.8118 | 0.8283 |
| 0.0598 | 2.4 | 1100 | 0.1913 | 0.9669 | 0.8317 | 0.8312 | 0.8328 |
| 0.0503 | 2.51 | 1150 | 0.2179 | 0.9612 | 0.8230 | 0.8298 | 0.8175 |
| 0.0258 | 2.62 | 1200 | 0.2204 | 0.9631 | 0.8298 | 0.8280 | 0.8323 |
| 0.0091 | 2.73 | 1250 | 0.5198 | 0.9218 | 0.7127 | 0.8107 | 0.7092 |
| 0.1076 | 2.84 | 1300 | 0.1853 | 0.9642 | 0.8323 | 0.8338 | 0.8310 |
| 0.0356 | 2.95 | 1350 | 0.2162 | 0.9612 | 0.8273 | 0.8220 | 0.8338 |
| 0.0492 | 3.06 | 1400 | 0.2382 | 0.9573 | 0.8245 | 0.8201 | 0.8296 |
| 0.0088 | 3.17 | 1450 | 0.2252 | 0.9636 | 0.8303 | 0.8285 | 0.8329 |
| 0.0275 | 3.28 | 1500 | 0.3000 | 0.9543 | 0.8234 | 0.8207 | 0.8279 |
| 0.0215 | 3.38 | 1550 | 0.3234 | 0.9497 | 0.8191 | 0.8152 | 0.8255 |
| 0.0294 | 3.49 | 1600 | 0.3486 | 0.9311 | 0.7500 | 0.8114 | 0.7338 |
| 0.0393 | 3.6 | 1650 | 0.2357 | 0.9595 | 0.8291 | 0.8274 | 0.8311 |
| 0.008 | 3.71 | 1700 | 0.2762 | 0.9587 | 0.8277 | 0.8260 | 0.8297 |
| 0.0042 | 3.82 | 1750 | 0.2393 | 0.9666 | 0.8330 | 0.8348 | 0.8312 |
| 0.0329 | 3.93 | 1800 | 0.3012 | 0.9584 | 0.8290 | 0.8267 | 0.8325 |
| 0.0185 | 4.04 | 1850 | 0.2400 | 0.9653 | 0.8324 | 0.8331 | 0.8319 |
| 0.019 | 4.15 | 1900 | 0.3604 | 0.9314 | 0.7489 | 0.8084 | 0.7324 |
| 0.0205 | 4.26 | 1950 | 0.2451 | 0.9653 | 0.8346 | 0.8365 | 0.8328 |
| 0.0202 | 4.37 | 2000 | 0.3619 | 0.9483 | 0.8190 | 0.8174 | 0.8237 |
| 0.019 | 4.48 | 2050 | 0.2573 | 0.9628 | 0.8315 | 0.8332 | 0.8306 |
| 0.0087 | 4.59 | 2100 | 0.2661 | 0.9634 | 0.8316 | 0.8319 | 0.8322 |
| 0.0212 | 4.69 | 2150 | 0.3671 | 0.9311 | 0.7497 | 0.8091 | 0.7378 |
| 0.0087 | 4.8 | 2200 | 0.3005 | 0.9305 | 0.7582 | 0.8108 | 0.7431 |
| 0.0005 | 4.91 | 2250 | 0.2772 | 0.9584 | 0.8257 | 0.8223 | 0.8297 |
| 0.0231 | 5.02 | 2300 | 0.2556 | 0.9634 | 0.8290 | 0.8269 | 0.8318 |
| 0.0006 | 5.13 | 2350 | 0.2798 | 0.9595 | 0.8253 | 0.8219 | 0.8298 |
| 0.0012 | 5.24 | 2400 | 0.2777 | 0.9625 | 0.8305 | 0.8278 | 0.8334 |
| 0.0096 | 5.35 | 2450 | 0.2818 | 0.9614 | 0.8280 | 0.8259 | 0.8308 |
| 0.0145 | 5.46 | 2500 | 0.2449 | 0.9628 | 0.8311 | 0.8286 | 0.8341 |
| 0.032 | 5.57 | 2550 | 0.2480 | 0.9653 | 0.8322 | 0.8296 | 0.8355 |
| 0.0075 | 5.68 | 2600 | 0.2241 | 0.9661 | 0.8341 | 0.8324 | 0.8360 |
| 0.0058 | 5.79 | 2650 | 0.2349 | 0.9645 | 0.8309 | 0.8290 | 0.8332 |
| 0.0079 | 5.9 | 2700 | 0.4499 | 0.9325 | 0.7515 | 0.8158 | 0.7383 |
| 0.0003 | 6.0 | 2750 | 0.2890 | 0.9590 | 0.8268 | 0.8252 | 0.8296 |
| 0.0109 | 6.11 | 2800 | 0.2298 | 0.9669 | 0.8337 | 0.8331 | 0.8346 |
| 0.0004 | 6.22 | 2850 | 0.2356 | 0.9669 | 0.8341 | 0.8334 | 0.8351 |
| 0.0003 | 6.33 | 2900 | 0.2272 | 0.9691 | 0.8364 | 0.8364 | 0.8366 |
| 0.0003 | 6.44 | 2950 | 0.2389 | 0.9669 | 0.8350 | 0.8342 | 0.8362 |
| 0.0201 | 6.55 | 3000 | 0.2427 | 0.9661 | 0.8346 | 0.8343 | 0.8354 |
| 0.0003 | 6.66 | 3050 | 0.2382 | 0.9677 | 0.8347 | 0.8352 | 0.8344 |
| 0.0095 | 6.77 | 3100 | 0.2004 | 0.9705 | 0.8367 | 0.8379 | 0.8354 |
| 0.0187 | 6.88 | 3150 | 0.2470 | 0.9677 | 0.8335 | 0.8332 | 0.8341 |
| 0.0086 | 6.99 | 3200 | 0.2243 | 0.9688 | 0.8348 | 0.8340 | 0.8358 |
| 0.0003 | 7.1 | 3250 | 0.2424 | 0.9677 | 0.8342 | 0.8329 | 0.8357 |
| 0.0067 | 7.21 | 3300 | 0.2754 | 0.9623 | 0.8287 | 0.8268 | 0.8314 |
| 0.0003 | 7.31 | 3350 | 0.2302 | 0.9686 | 0.8348 | 0.8340 | 0.8358 |
| 0.0002 | 7.42 | 3400 | 0.2318 | 0.9688 | 0.8350 | 0.8342 | 0.8359 |
| 0.0002 | 7.53 | 3450 | 0.2327 | 0.9686 | 0.8349 | 0.8342 | 0.8358 |
| 0.0002 | 7.64 | 3500 | 0.2376 | 0.9680 | 0.8346 | 0.8339 | 0.8355 |
| 0.0002 | 7.75 | 3550 | 0.2391 | 0.9680 | 0.8346 | 0.8339 | 0.8355 |
| 0.0002 | 7.86 | 3600 | 0.2435 | 0.9683 | 0.8358 | 0.8349 | 0.8370 |
| 0.0164 | 7.97 | 3650 | 0.2196 | 0.9705 | 0.8359 | 0.8358 | 0.8361 |
| 0.0003 | 8.08 | 3700 | 0.2116 | 0.9718 | 0.8380 | 0.8390 | 0.8369 |
| 0.004 | 8.19 | 3750 | 0.2192 | 0.9702 | 0.8364 | 0.8367 | 0.8362 |
| 0.0002 | 8.3 | 3800 | 0.2213 | 0.9699 | 0.8357 | 0.8356 | 0.8358 |
| 0.0002 | 8.41 | 3850 | 0.2232 | 0.9699 | 0.8357 | 0.8356 | 0.8358 |
| 0.0001 | 8.52 | 3900 | 0.2242 | 0.9699 | 0.8357 | 0.8356 | 0.8358 |
| 0.0001 | 8.62 | 3950 | 0.2230 | 0.9705 | 0.8360 | 0.8357 | 0.8362 |
| 0.0001 | 8.73 | 4000 | 0.2240 | 0.9705 | 0.8360 | 0.8357 | 0.8362 |
| 0.0001 | 8.84 | 4050 | 0.2254 | 0.9705 | 0.8361 | 0.8359 | 0.8364 |
| 0.0001 | 8.95 | 4100 | 0.2265 | 0.9705 | 0.8361 | 0.8359 | 0.8364 |
| 0.0002 | 9.06 | 4150 | 0.2280 | 0.9705 | 0.8364 | 0.8359 | 0.8369 |
| 0.0071 | 9.17 | 4200 | 0.2393 | 0.9694 | 0.8357 | 0.8355 | 0.8362 |
| 0.0001 | 9.28 | 4250 | 0.2564 | 0.9680 | 0.8355 | 0.8347 | 0.8367 |
| 0.0002 | 9.39 | 4300 | 0.2442 | 0.9688 | 0.8354 | 0.8352 | 0.8358 |
| 0.0002 | 9.5 | 4350 | 0.2363 | 0.9699 | 0.8361 | 0.8359 | 0.8365 |
| 0.0001 | 9.61 | 4400 | 0.2365 | 0.9699 | 0.8361 | 0.8359 | 0.8365 |
| 0.0001 | 9.72 | 4450 | 0.2366 | 0.9699 | 0.8361 | 0.8359 | 0.8365 |
| 0.0001 | 9.83 | 4500 | 0.2372 | 0.9699 | 0.8361 | 0.8359 | 0.8365 |
| 0.0001 | 9.93 | 4550 | 0.2372 | 0.9699 | 0.8361 | 0.8359 | 0.8365 |
### Framework versions

- Transformers 4.39.3
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.2