# Intent-classification-BERT-cased-Ashu
This model is a fine-tuned version of [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1960
- Accuracy: 0.9321
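Since the dataset and label set are not documented, the snippet below is only a usage sketch: it assumes the checkpoint is available on the Hub under the repo id shown on this card, and that the intent names are stored in the model config's `id2label` mapping. The example input is hypothetical.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub (repo id taken from this card).
classifier = pipeline(
    "text-classification",
    model="Narkantak/Intent-classification-BERT-cased-Ashu",
)

# The returned label comes from the id2label mapping saved with the model;
# the score is the softmax probability of the predicted intent.
result = classifier("I would like to cancel my order")[0]
print(result["label"], round(result["score"], 3))
```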
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 1.6354 | 0.24 | 10 | 1.3680 | 0.3478 |
| 1.2714 | 0.49 | 20 | 1.0836 | 0.5590 |
| 1.1593 | 0.73 | 30 | 0.7338 | 0.8012 |
| 0.6734 | 0.98 | 40 | 0.4365 | 0.8882 |
| 0.554 | 1.22 | 50 | 0.6694 | 0.8385 |
| 0.5863 | 1.46 | 60 | 0.4113 | 0.8385 |
| 0.3894 | 1.71 | 70 | 0.2667 | 0.9006 |
| 0.3458 | 1.95 | 80 | 0.3299 | 0.8882 |
| 0.3338 | 2.2 | 90 | 0.3247 | 0.8882 |
| 0.2073 | 2.44 | 100 | 0.2303 | 0.8944 |
| 0.2844 | 2.68 | 110 | 0.2886 | 0.8944 |
| 0.4828 | 2.93 | 120 | 0.2475 | 0.9006 |
| 0.2294 | 3.17 | 130 | 0.2751 | 0.8820 |
| 0.3103 | 3.41 | 140 | 0.2409 | 0.8696 |
| 0.1972 | 3.66 | 150 | 0.2038 | 0.9130 |
| 0.1808 | 3.9 | 160 | 0.2399 | 0.9068 |
| 0.1904 | 4.15 | 170 | 0.2559 | 0.9068 |
| 0.2458 | 4.39 | 180 | 0.5942 | 0.8634 |
| 0.1777 | 4.63 | 190 | 0.3048 | 0.8820 |
| 0.2233 | 4.88 | 200 | 0.2170 | 0.9130 |
| 0.2446 | 5.12 | 210 | 0.3414 | 0.8758 |
| 0.1631 | 5.37 | 220 | 0.3601 | 0.8882 |
| 0.3007 | 5.61 | 230 | 0.4856 | 0.8571 |
| 0.2979 | 5.85 | 240 | 0.7301 | 0.8447 |
| 0.3355 | 6.1 | 250 | 0.3030 | 0.8944 |
| 0.2137 | 6.34 | 260 | 0.4366 | 0.8820 |
| 0.3109 | 6.59 | 270 | 0.3117 | 0.8944 |
| 0.1863 | 6.83 | 280 | 0.4387 | 0.8758 |
| 0.2912 | 7.07 | 290 | 0.2516 | 0.9006 |
| 0.1621 | 7.32 | 300 | 0.3174 | 0.9006 |
| 0.2598 | 7.56 | 310 | 0.6467 | 0.8385 |
| 0.351 | 7.8 | 320 | 0.2943 | 0.8820 |
| 0.3232 | 8.05 | 330 | 0.2908 | 0.8944 |
| 0.1534 | 8.29 | 340 | 0.3321 | 0.8944 |
| 0.208 | 8.54 | 350 | 0.4615 | 0.8447 |
| 0.3685 | 8.78 | 360 | 0.4475 | 0.8696 |
| 0.1663 | 9.02 | 370 | 0.4067 | 0.8696 |
| 0.2267 | 9.27 | 380 | 0.4081 | 0.8758 |
| 0.2486 | 9.51 | 390 | 0.5971 | 0.8509 |
| 0.4295 | 9.76 | 400 | 0.2917 | 0.8882 |
| 0.2481 | 10.0 | 410 | 0.3792 | 0.8820 |
| 0.1681 | 10.24 | 420 | 0.3793 | 0.8882 |
| 0.1313 | 10.49 | 430 | 0.3035 | 0.9006 |
| 0.3188 | 10.73 | 440 | 0.3317 | 0.8758 |
| 0.2266 | 10.98 | 450 | 0.4534 | 0.8696 |
| 0.1728 | 11.22 | 460 | 0.3922 | 0.8882 |
| 0.1655 | 11.46 | 470 | 0.2906 | 0.8882 |
| 0.1764 | 11.71 | 480 | 0.3753 | 0.8758 |
| 0.1654 | 11.95 | 490 | 0.3411 | 0.8820 |
| 0.114 | 12.2 | 500 | 0.3693 | 0.8758 |
| 0.2119 | 12.44 | 510 | 0.4721 | 0.8820 |
| 0.1655 | 12.68 | 520 | 0.5551 | 0.8758 |
| 0.2329 | 12.93 | 530 | 0.4987 | 0.8758 |
| 0.2048 | 13.17 | 540 | 0.4264 | 0.8758 |
| 0.2365 | 13.41 | 550 | 0.3291 | 0.9006 |
| 0.1067 | 13.66 | 560 | 0.3542 | 0.9006 |
| 0.1939 | 13.9 | 570 | 0.3957 | 0.9006 |
| 0.2257 | 14.15 | 580 | 0.3690 | 0.8882 |
| 0.1853 | 14.39 | 590 | 0.3377 | 0.9006 |
| 0.2486 | 14.63 | 600 | 0.2423 | 0.9068 |
| 0.147 | 14.88 | 610 | 0.3141 | 0.8882 |
| 0.1639 | 15.12 | 620 | 0.4718 | 0.8758 |
| 0.151 | 15.37 | 630 | 0.6900 | 0.8571 |
| 0.2909 | 15.61 | 640 | 0.3900 | 0.9006 |
| 0.2273 | 15.85 | 650 | 0.3972 | 0.8820 |
| 0.2317 | 16.1 | 660 | 0.3208 | 0.8944 |
| 0.2005 | 16.34 | 670 | 0.3355 | 0.8882 |
| 0.1807 | 16.59 | 680 | 0.4310 | 0.8882 |
| 0.216 | 16.83 | 690 | 0.4881 | 0.8882 |
| 0.1307 | 17.07 | 700 | 0.3590 | 0.8882 |
| 0.234 | 17.32 | 710 | 0.3503 | 0.8758 |
| 0.224 | 17.56 | 720 | 0.3790 | 0.8758 |
| 0.1708 | 17.8 | 730 | 0.2696 | 0.8944 |
| 0.1848 | 18.05 | 740 | 0.2631 | 0.8944 |
| 0.1799 | 18.29 | 750 | 0.2867 | 0.9006 |
| 0.1882 | 18.54 | 760 | 0.4595 | 0.8758 |
| 0.1072 | 18.78 | 770 | 0.3914 | 0.8944 |
| 0.2072 | 19.02 | 780 | 0.3018 | 0.9006 |
| 0.2289 | 19.27 | 790 | 0.2462 | 0.9006 |
| 0.1597 | 19.51 | 800 | 0.2632 | 0.9068 |
| 0.1475 | 19.76 | 810 | 0.3012 | 0.8944 |
| 0.1691 | 20.0 | 820 | 0.2272 | 0.9006 |
| 0.1339 | 20.24 | 830 | 0.2947 | 0.8882 |
| 0.1247 | 20.49 | 840 | 0.3514 | 0.9068 |
| 0.2072 | 20.73 | 850 | 0.3281 | 0.8758 |
| 0.1379 | 20.98 | 860 | 0.3696 | 0.9006 |
| 0.123 | 21.22 | 870 | 0.4604 | 0.8944 |
| 0.1697 | 21.46 | 880 | 0.4491 | 0.8820 |
| 0.1613 | 21.71 | 890 | 0.3338 | 0.9006 |
| 0.1816 | 21.95 | 900 | 0.3421 | 0.9006 |
| 0.2516 | 22.2 | 910 | 0.3395 | 0.9006 |
| 0.1367 | 22.44 | 920 | 0.3416 | 0.9006 |
| 0.1148 | 22.68 | 930 | 0.3901 | 0.8944 |
| 0.123 | 22.93 | 940 | 0.4092 | 0.8944 |
| 0.0922 | 23.17 | 950 | 0.4680 | 0.8820 |
| 0.1294 | 23.41 | 960 | 0.4898 | 0.8944 |
| 0.1986 | 23.66 | 970 | 0.4286 | 0.8882 |
| 0.175 | 23.9 | 980 | 0.4919 | 0.8882 |
| 0.1264 | 24.15 | 990 | 0.5121 | 0.8944 |
| 0.1454 | 24.39 | 1000 | 0.5529 | 0.8944 |
| 0.1986 | 24.63 | 1010 | 0.4504 | 0.8944 |
| 0.2549 | 24.88 | 1020 | 0.3442 | 0.8944 |
| 0.1878 | 25.12 | 1030 | 0.3414 | 0.8882 |
| 0.1313 | 25.37 | 1040 | 0.3944 | 0.8758 |
| 0.0957 | 25.61 | 1050 | 0.4231 | 0.8820 |
| 0.1751 | 25.85 | 1060 | 0.4765 | 0.8820 |
| 0.1389 | 26.1 | 1070 | 0.4927 | 0.8820 |
| 0.1038 | 26.34 | 1080 | 0.4923 | 0.8820 |
| 0.1371 | 26.59 | 1090 | 0.4848 | 0.8820 |
| 0.1576 | 26.83 | 1100 | 0.4765 | 0.8820 |
| 0.1539 | 27.07 | 1110 | 0.4926 | 0.8820 |
| 0.1742 | 27.32 | 1120 | 0.4749 | 0.8820 |
| 0.1365 | 27.56 | 1130 | 0.4717 | 0.8820 |
| 0.1284 | 27.8 | 1140 | 0.4754 | 0.8820 |
| 0.0794 | 28.05 | 1150 | 0.4871 | 0.8820 |
| 0.0934 | 28.29 | 1160 | 0.5052 | 0.8820 |
| 0.1384 | 28.54 | 1170 | 0.5146 | 0.8820 |
| 0.1323 | 28.78 | 1180 | 0.5140 | 0.8820 |
| 0.1352 | 29.02 | 1190 | 0.5068 | 0.8820 |
| 0.1576 | 29.27 | 1200 | 0.5068 | 0.8758 |
| 0.1278 | 29.51 | 1210 | 0.5067 | 0.8758 |
| 0.1256 | 29.76 | 1220 | 0.5071 | 0.8758 |
| 0.0999 | 30.0 | 1230 | 0.5078 | 0.8758 |
### Framework versions
- Transformers 4.38.2
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.2
## Model tree for Narkantak/Intent-classification-BERT-cased-Ashu

Base model: [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased)