ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k13_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5058
  • QWK (quadratic weighted kappa): 0.5133
  • MSE: 0.5058
  • RMSE: 0.7112
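
These metrics can be reproduced from raw predictions. Below is a minimal, dependency-free sketch of quadratic weighted kappa, MSE, and RMSE; the labels and class count in the example are illustrative, not taken from this model's (unpublished) evaluation data. RMSE is the square root of MSE, and the reported Loss equals MSE exactly, consistent with an MSE training objective.

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights (standard QWK)."""
    # Observed confusion matrix: O[true][pred]
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms of true and predicted labels
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    n = len(y_true)
    # Expected matrix under independence, scaled to the same total count
    E = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
         for i in range(n_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, grows with distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)

# Illustrative example: hypothetical labels, 3 score classes
qwk = quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 1, 2], 3)  # → 0.8
mse, rmse = mse_rmse([0, 1, 2, 2], [0, 1, 1, 2])               # → 0.25, 0.5
```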

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
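
The list above maps directly onto a `transformers.TrainingArguments` configuration. A sketch follows; the output directory is an assumption, and the evaluation/logging cadence is inferred from the results table (evaluation every 2 steps, training loss first logged at step 500) rather than stated in this card.

```python
from transformers import TrainingArguments

# Sketch reconstructing the listed hyperparameters; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="./results",        # assumption: not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",         # inferred: the table evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,             # inferred: "No log" until step 500
)
```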

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0308 2 2.6066 -0.0262 2.6066 1.6145
No log 0.0615 4 1.2320 0.0495 1.2320 1.1100
No log 0.0923 6 0.7361 0.1372 0.7361 0.8580
No log 0.1231 8 0.6351 0.2676 0.6351 0.7970
No log 0.1538 10 0.5903 0.3523 0.5903 0.7683
No log 0.1846 12 0.5783 0.4354 0.5783 0.7604
No log 0.2154 14 0.5505 0.4494 0.5505 0.7420
No log 0.2462 16 0.6381 0.4209 0.6381 0.7988
No log 0.2769 18 0.7837 0.4152 0.7837 0.8853
No log 0.3077 20 1.2075 0.2386 1.2075 1.0989
No log 0.3385 22 1.3071 0.2386 1.3071 1.1433
No log 0.3692 24 0.7191 0.5124 0.7191 0.8480
No log 0.4 26 0.5849 0.5543 0.5849 0.7648
No log 0.4308 28 0.6715 0.5310 0.6715 0.8194
No log 0.4615 30 0.4443 0.6114 0.4443 0.6665
No log 0.4923 32 0.4311 0.6184 0.4311 0.6566
No log 0.5231 34 0.6039 0.5295 0.6039 0.7771
No log 0.5538 36 0.6735 0.5295 0.6735 0.8207
No log 0.5846 38 0.6298 0.5295 0.6298 0.7936
No log 0.6154 40 0.4297 0.6092 0.4297 0.6555
No log 0.6462 42 0.5743 0.5243 0.5743 0.7578
No log 0.6769 44 0.6221 0.5278 0.6221 0.7887
No log 0.7077 46 0.4690 0.5081 0.4690 0.6848
No log 0.7385 48 0.5526 0.2776 0.5526 0.7434
No log 0.7692 50 0.5583 0.3931 0.5583 0.7472
No log 0.8 52 0.5367 0.5119 0.5367 0.7326
No log 0.8308 54 0.4260 0.6052 0.4260 0.6527
No log 0.8615 56 0.5081 0.5939 0.5081 0.7128
No log 0.8923 58 0.4218 0.6263 0.4218 0.6494
No log 0.9231 60 0.5680 0.6180 0.5680 0.7536
No log 0.9538 62 0.8621 0.4903 0.8621 0.9285
No log 0.9846 64 0.8088 0.4903 0.8088 0.8993
No log 1.0154 66 0.5161 0.6610 0.5161 0.7184
No log 1.0462 68 0.4769 0.6455 0.4769 0.6906
No log 1.0769 70 0.4681 0.5889 0.4681 0.6841
No log 1.1077 72 0.4749 0.5949 0.4749 0.6891
No log 1.1385 74 0.4586 0.5840 0.4586 0.6772
No log 1.1692 76 0.4771 0.5421 0.4771 0.6907
No log 1.2 78 0.4988 0.5160 0.4988 0.7062
No log 1.2308 80 0.4584 0.5604 0.4584 0.6770
No log 1.2615 82 0.5619 0.6154 0.5619 0.7496
No log 1.2923 84 0.6674 0.5596 0.6674 0.8170
No log 1.3231 86 0.4836 0.6186 0.4836 0.6954
No log 1.3538 88 0.6615 0.5340 0.6615 0.8133
No log 1.3846 90 0.7746 0.4321 0.7746 0.8801
No log 1.4154 92 0.5904 0.6300 0.5904 0.7684
No log 1.4462 94 0.5546 0.6285 0.5546 0.7447
No log 1.4769 96 0.7811 0.5272 0.7811 0.8838
No log 1.5077 98 0.6912 0.5207 0.6912 0.8314
No log 1.5385 100 0.4464 0.6228 0.4464 0.6681
No log 1.5692 102 0.5349 0.4825 0.5349 0.7314
No log 1.6 104 0.5579 0.4613 0.5579 0.7469
No log 1.6308 106 0.5832 0.5363 0.5832 0.7637
No log 1.6615 108 0.4615 0.5796 0.4615 0.6793
No log 1.6923 110 0.4563 0.6121 0.4563 0.6755
No log 1.7231 112 0.4816 0.6133 0.4816 0.6939
No log 1.7538 114 0.4968 0.6245 0.4968 0.7048
No log 1.7846 116 0.4962 0.6438 0.4962 0.7044
No log 1.8154 118 0.4799 0.6438 0.4799 0.6928
No log 1.8462 120 0.4640 0.6009 0.4640 0.6812
No log 1.8769 122 0.5218 0.5524 0.5218 0.7223
No log 1.9077 124 0.4854 0.5765 0.4854 0.6967
No log 1.9385 126 0.4869 0.5765 0.4869 0.6978
No log 1.9692 128 0.4289 0.5939 0.4289 0.6549
No log 2.0 130 0.4091 0.5711 0.4091 0.6396
No log 2.0308 132 0.4448 0.6201 0.4448 0.6670
No log 2.0615 134 0.5093 0.6475 0.5093 0.7136
No log 2.0923 136 0.4155 0.6530 0.4155 0.6446
No log 2.1231 138 0.4113 0.6446 0.4113 0.6413
No log 2.1538 140 0.4189 0.6641 0.4189 0.6472
No log 2.1846 142 0.4460 0.6782 0.4460 0.6679
No log 2.2154 144 0.4872 0.6624 0.4872 0.6980
No log 2.2462 146 0.4858 0.6548 0.4858 0.6970
No log 2.2769 148 0.4526 0.6612 0.4526 0.6727
No log 2.3077 150 0.4494 0.6537 0.4494 0.6704
No log 2.3385 152 0.4148 0.6672 0.4148 0.6441
No log 2.3692 154 0.4831 0.6074 0.4831 0.6950
No log 2.4 156 0.4939 0.6354 0.4939 0.7028
No log 2.4308 158 0.4440 0.5796 0.4440 0.6663
No log 2.4615 160 0.5319 0.5918 0.5319 0.7293
No log 2.4923 162 0.5761 0.5481 0.5761 0.7590
No log 2.5231 164 0.5038 0.4997 0.5038 0.7098
No log 2.5538 166 0.4395 0.6517 0.4395 0.6630
No log 2.5846 168 0.4422 0.6344 0.4422 0.6650
No log 2.6154 170 0.4578 0.6047 0.4578 0.6766
No log 2.6462 172 0.4200 0.6741 0.4200 0.6481
No log 2.6769 174 0.4411 0.6552 0.4411 0.6642
No log 2.7077 176 0.4601 0.6360 0.4601 0.6783
No log 2.7385 178 0.4335 0.6771 0.4335 0.6584
No log 2.7692 180 0.5454 0.6283 0.5454 0.7385
No log 2.8 182 0.4955 0.6079 0.4955 0.7039
No log 2.8308 184 0.4271 0.6010 0.4271 0.6535
No log 2.8615 186 0.5032 0.5291 0.5032 0.7093
No log 2.8923 188 0.4844 0.6058 0.4844 0.6960
No log 2.9231 190 0.4474 0.6038 0.4474 0.6689
No log 2.9538 192 0.4462 0.6477 0.4462 0.6680
No log 2.9846 194 0.4696 0.6431 0.4696 0.6853
No log 3.0154 196 0.4415 0.7208 0.4415 0.6645
No log 3.0462 198 0.4936 0.5546 0.4936 0.7026
No log 3.0769 200 0.4544 0.6623 0.4544 0.6741
No log 3.1077 202 0.5013 0.6442 0.5013 0.7080
No log 3.1385 204 0.4937 0.6330 0.4937 0.7026
No log 3.1692 206 0.4408 0.6001 0.4408 0.6640
No log 3.2 208 0.5072 0.5307 0.5072 0.7122
No log 3.2308 210 0.5343 0.5307 0.5343 0.7310
No log 3.2615 212 0.4584 0.6038 0.4584 0.6771
No log 3.2923 214 0.4223 0.6770 0.4223 0.6499
No log 3.3231 216 0.6031 0.5749 0.6031 0.7766
No log 3.3538 218 0.6679 0.4852 0.6679 0.8172
No log 3.3846 220 0.5380 0.5922 0.5380 0.7335
No log 3.4154 222 0.4308 0.7004 0.4308 0.6564
No log 3.4462 224 0.4529 0.6438 0.4529 0.6730
No log 3.4769 226 0.4383 0.6434 0.4383 0.6620
No log 3.5077 228 0.4163 0.7496 0.4163 0.6452
No log 3.5385 230 0.5234 0.6074 0.5234 0.7234
No log 3.5692 232 0.6044 0.5529 0.6044 0.7775
No log 3.6 234 0.5106 0.6074 0.5106 0.7146
No log 3.6308 236 0.4244 0.7496 0.4244 0.6515
No log 3.6615 238 0.4890 0.5908 0.4890 0.6993
No log 3.6923 240 0.4974 0.6025 0.4974 0.7053
No log 3.7231 242 0.4344 0.6214 0.4344 0.6591
No log 3.7538 244 0.4312 0.6964 0.4312 0.6567
No log 3.7846 246 0.5090 0.6061 0.5090 0.7135
No log 3.8154 248 0.5436 0.6181 0.5436 0.7373
No log 3.8462 250 0.4793 0.5678 0.4793 0.6923
No log 3.8769 252 0.4625 0.6918 0.4625 0.6801
No log 3.9077 254 0.4821 0.6038 0.4821 0.6943
No log 3.9385 256 0.4754 0.6018 0.4754 0.6895
No log 3.9692 258 0.4551 0.5703 0.4551 0.6746
No log 4.0 260 0.4585 0.6214 0.4585 0.6771
No log 4.0308 262 0.4773 0.6032 0.4773 0.6909
No log 4.0615 264 0.4701 0.6197 0.4701 0.6856
No log 4.0923 266 0.4854 0.6688 0.4854 0.6967
No log 4.1231 268 0.5010 0.6156 0.5010 0.7078
No log 4.1538 270 0.5129 0.6079 0.5129 0.7162
No log 4.1846 272 0.5219 0.6141 0.5219 0.7225
No log 4.2154 274 0.5012 0.6182 0.5012 0.7079
No log 4.2462 276 0.4742 0.5125 0.4742 0.6886
No log 4.2769 278 0.4808 0.5770 0.4808 0.6934
No log 4.3077 280 0.5728 0.5814 0.5728 0.7568
No log 4.3385 282 0.5765 0.5895 0.5765 0.7593
No log 4.3692 284 0.4838 0.5619 0.4838 0.6955
No log 4.4 286 0.4281 0.5965 0.4281 0.6543
No log 4.4308 288 0.4726 0.6141 0.4726 0.6874
No log 4.4615 290 0.4958 0.5657 0.4958 0.7042
No log 4.4923 292 0.4498 0.6330 0.4498 0.6707
No log 4.5231 294 0.4341 0.6228 0.4341 0.6589
No log 4.5538 296 0.4946 0.6004 0.4946 0.7033
No log 4.5846 298 0.5434 0.5275 0.5434 0.7371
No log 4.6154 300 0.4919 0.6004 0.4919 0.7014
No log 4.6462 302 0.4420 0.5831 0.4420 0.6648
No log 4.6769 304 0.4654 0.5702 0.4654 0.6822
No log 4.7077 306 0.4883 0.6248 0.4883 0.6988
No log 4.7385 308 0.4702 0.5966 0.4702 0.6857
No log 4.7692 310 0.4817 0.6559 0.4817 0.6941
No log 4.8 312 0.5190 0.5962 0.5190 0.7204
No log 4.8308 314 0.4492 0.6438 0.4492 0.6702
No log 4.8615 316 0.4211 0.6924 0.4211 0.6489
No log 4.8923 318 0.4221 0.7415 0.4221 0.6497
No log 4.9231 320 0.4312 0.7302 0.4312 0.6567
No log 4.9538 322 0.4469 0.6720 0.4469 0.6685
No log 4.9846 324 0.4441 0.6923 0.4441 0.6664
No log 5.0154 326 0.4535 0.7475 0.4535 0.6734
No log 5.0462 328 0.4920 0.6434 0.4920 0.7014
No log 5.0769 330 0.4913 0.6434 0.4913 0.7010
No log 5.1077 332 0.4879 0.6570 0.4879 0.6985
No log 5.1385 334 0.5271 0.6633 0.5271 0.7260
No log 5.1692 336 0.5720 0.6450 0.5720 0.7563
No log 5.2 338 0.5303 0.6304 0.5303 0.7282
No log 5.2308 340 0.5093 0.6245 0.5093 0.7137
No log 5.2615 342 0.5338 0.5724 0.5338 0.7306
No log 5.2923 344 0.5236 0.5795 0.5236 0.7236
No log 5.3231 346 0.4807 0.6481 0.4807 0.6933
No log 5.3538 348 0.4966 0.6052 0.4966 0.7047
No log 5.3846 350 0.5363 0.5756 0.5363 0.7323
No log 5.4154 352 0.5202 0.5845 0.5202 0.7213
No log 5.4462 354 0.4961 0.6046 0.4961 0.7043
No log 5.4769 356 0.4968 0.5549 0.4968 0.7048
No log 5.5077 358 0.4968 0.5831 0.4968 0.7048
No log 5.5385 360 0.5252 0.5923 0.5252 0.7247
No log 5.5692 362 0.6245 0.5342 0.6245 0.7902
No log 5.6 364 0.6472 0.5748 0.6472 0.8045
No log 5.6308 366 0.5735 0.5942 0.5735 0.7573
No log 5.6615 368 0.5153 0.6298 0.5153 0.7178
No log 5.6923 370 0.4948 0.5966 0.4948 0.7034
No log 5.7231 372 0.4767 0.6020 0.4767 0.6904
No log 5.7538 374 0.4688 0.5751 0.4688 0.6847
No log 5.7846 376 0.4568 0.5826 0.4568 0.6758
No log 5.8154 378 0.4471 0.6156 0.4471 0.6687
No log 5.8462 380 0.4442 0.6024 0.4442 0.6665
No log 5.8769 382 0.4603 0.6414 0.4603 0.6784
No log 5.9077 384 0.4701 0.6414 0.4701 0.6857
No log 5.9385 386 0.4608 0.6517 0.4608 0.6788
No log 5.9692 388 0.4585 0.5457 0.4585 0.6772
No log 6.0 390 0.4959 0.5153 0.4959 0.7042
No log 6.0308 392 0.5528 0.5178 0.5528 0.7435
No log 6.0615 394 0.5790 0.5721 0.5790 0.7609
No log 6.0923 396 0.5580 0.5822 0.5580 0.7470
No log 6.1231 398 0.5789 0.5892 0.5789 0.7608
No log 6.1538 400 0.6070 0.6626 0.6070 0.7791
No log 6.1846 402 0.5610 0.6411 0.5610 0.7490
No log 6.2154 404 0.5106 0.6837 0.5106 0.7146
No log 6.2462 406 0.4931 0.6827 0.4931 0.7022
No log 6.2769 408 0.5240 0.5872 0.5240 0.7239
No log 6.3077 410 0.5893 0.6032 0.5893 0.7677
No log 6.3385 412 0.6352 0.5944 0.6352 0.7970
No log 6.3692 414 0.5991 0.5944 0.5991 0.7740
No log 6.4 416 0.5032 0.6214 0.5032 0.7093
No log 6.4308 418 0.4543 0.5677 0.4543 0.6740
No log 6.4615 420 0.4496 0.5979 0.4496 0.6706
No log 6.4923 422 0.4483 0.6184 0.4483 0.6696
No log 6.5231 424 0.4563 0.6541 0.4563 0.6755
No log 6.5538 426 0.4613 0.6730 0.4613 0.6792
No log 6.5846 428 0.4562 0.6837 0.4562 0.6755
No log 6.6154 430 0.4706 0.6683 0.4706 0.6860
No log 6.6462 432 0.4611 0.6860 0.4611 0.6791
No log 6.6769 434 0.4643 0.6651 0.4643 0.6814
No log 6.7077 436 0.4930 0.6152 0.4930 0.7021
No log 6.7385 438 0.4751 0.6398 0.4751 0.6893
No log 6.7692 440 0.4821 0.6953 0.4821 0.6943
No log 6.8 442 0.5242 0.6169 0.5242 0.7240
No log 6.8308 444 0.5770 0.5400 0.5770 0.7596
No log 6.8615 446 0.5275 0.5470 0.5275 0.7263
No log 6.8923 448 0.4578 0.5714 0.4578 0.6766
No log 6.9231 450 0.4775 0.6517 0.4775 0.6910
No log 6.9538 452 0.5597 0.5698 0.5597 0.7482
No log 6.9846 454 0.6149 0.4815 0.6149 0.7842
No log 7.0154 456 0.5882 0.4919 0.5882 0.7669
No log 7.0462 458 0.5433 0.3976 0.5433 0.7371
No log 7.0769 460 0.5059 0.5003 0.5059 0.7113
No log 7.1077 462 0.4818 0.5267 0.4818 0.6941
No log 7.1385 464 0.4577 0.5539 0.4577 0.6765
No log 7.1692 466 0.4340 0.6017 0.4340 0.6588
No log 7.2 468 0.4348 0.6017 0.4348 0.6594
No log 7.2308 470 0.4495 0.5782 0.4495 0.6705
No log 7.2615 472 0.4616 0.5488 0.4616 0.6794
No log 7.2923 474 0.4767 0.6517 0.4767 0.6905
No log 7.3231 476 0.5062 0.5980 0.5062 0.7115
No log 7.3538 478 0.5188 0.5836 0.5188 0.7203
No log 7.3846 480 0.5237 0.4052 0.5237 0.7237
No log 7.4154 482 0.5140 0.4322 0.5140 0.7170
No log 7.4462 484 0.4822 0.5042 0.4822 0.6944
No log 7.4769 486 0.4666 0.6517 0.4666 0.6831
No log 7.5077 488 0.4925 0.6214 0.4925 0.7018
No log 7.5385 490 0.5111 0.6025 0.5111 0.7149
No log 7.5692 492 0.4723 0.6111 0.4723 0.6873
No log 7.6 494 0.4581 0.5798 0.4581 0.6769
No log 7.6308 496 0.4692 0.6082 0.4692 0.6850
No log 7.6615 498 0.4641 0.5945 0.4641 0.6813
0.2867 7.6923 500 0.4581 0.5556 0.4581 0.6768
0.2867 7.7231 502 0.4793 0.5235 0.4793 0.6924
0.2867 7.7538 504 0.5380 0.6038 0.5380 0.7335
0.2867 7.7846 506 0.6126 0.5227 0.6126 0.7827
0.2867 7.8154 508 0.5830 0.4808 0.5830 0.7636
0.2867 7.8462 510 0.4978 0.5736 0.4978 0.7056
0.2867 7.8769 512 0.4373 0.5286 0.4373 0.6613
0.2867 7.9077 514 0.4275 0.5556 0.4275 0.6539
0.2867 7.9385 516 0.4196 0.5798 0.4196 0.6478
0.2867 7.9692 518 0.4153 0.6032 0.4153 0.6445
0.2867 8.0 520 0.4158 0.6736 0.4158 0.6448
0.2867 8.0308 522 0.4435 0.6321 0.4435 0.6660
0.2867 8.0615 524 0.4536 0.6321 0.4536 0.6735
0.2867 8.0923 526 0.4279 0.6518 0.4279 0.6541
0.2867 8.1231 528 0.4067 0.7123 0.4067 0.6377
0.2867 8.1538 530 0.4188 0.6514 0.4188 0.6471
0.2867 8.1846 532 0.4381 0.6514 0.4381 0.6619
0.2867 8.2154 534 0.4191 0.6514 0.4191 0.6474
0.2867 8.2462 536 0.4145 0.7191 0.4145 0.6439
0.2867 8.2769 538 0.4167 0.6993 0.4167 0.6455
0.2867 8.3077 540 0.4107 0.7012 0.4107 0.6408
0.2867 8.3385 542 0.3918 0.7033 0.3918 0.6260
0.2867 8.3692 544 0.3896 0.6648 0.3896 0.6242
0.2867 8.4 546 0.3955 0.6648 0.3955 0.6289
0.2867 8.4308 548 0.4026 0.5930 0.4026 0.6345
0.2867 8.4615 550 0.4012 0.6745 0.4012 0.6334
0.2867 8.4923 552 0.4172 0.6721 0.4172 0.6459
0.2867 8.5231 554 0.4373 0.6503 0.4373 0.6613
0.2867 8.5538 556 0.4449 0.6503 0.4449 0.6670
0.2867 8.5846 558 0.4317 0.6200 0.4317 0.6571
0.2867 8.6154 560 0.4341 0.5556 0.4341 0.6589
0.2867 8.6462 562 0.4394 0.5556 0.4394 0.6629
0.2867 8.6769 564 0.4473 0.5556 0.4473 0.6688
0.2867 8.7077 566 0.4518 0.5765 0.4518 0.6722
0.2867 8.7385 568 0.4569 0.5556 0.4569 0.6760
0.2867 8.7692 570 0.4689 0.5475 0.4689 0.6848
0.2867 8.8 572 0.4682 0.5475 0.4682 0.6843
0.2867 8.8308 574 0.4666 0.5463 0.4666 0.6831
0.2867 8.8615 576 0.4698 0.5751 0.4698 0.6854
0.2867 8.8923 578 0.4636 0.5956 0.4636 0.6809
0.2867 8.9231 580 0.4779 0.6517 0.4779 0.6913
0.2867 8.9538 582 0.4988 0.6018 0.4988 0.7062
0.2867 8.9846 584 0.5155 0.5923 0.5155 0.7180
0.2867 9.0154 586 0.4949 0.6018 0.4949 0.7035
0.2867 9.0462 588 0.4718 0.6426 0.4718 0.6869
0.2867 9.0769 590 0.4743 0.6267 0.4743 0.6887
0.2867 9.1077 592 0.4754 0.6267 0.4754 0.6895
0.2867 9.1385 594 0.4668 0.6426 0.4668 0.6832
0.2867 9.1692 596 0.4779 0.6326 0.4779 0.6913
0.2867 9.2 598 0.4966 0.6326 0.4966 0.7047
0.2867 9.2308 600 0.5340 0.5923 0.5340 0.7308
0.2867 9.2615 602 0.5905 0.5362 0.5905 0.7685
0.2867 9.2923 604 0.6382 0.4880 0.6382 0.7989
0.2867 9.3231 606 0.6116 0.5345 0.6116 0.7820
0.2867 9.3538 608 0.5346 0.5923 0.5346 0.7312
0.2867 9.3846 610 0.5018 0.6517 0.5018 0.7084
0.2867 9.4154 612 0.4995 0.6142 0.4995 0.7068
0.2867 9.4462 614 0.5033 0.6142 0.5033 0.7094
0.2867 9.4769 616 0.5070 0.6142 0.5070 0.7121
0.2867 9.5077 618 0.5010 0.6142 0.5010 0.7078
0.2867 9.5385 620 0.4936 0.6142 0.4936 0.7026
0.2867 9.5692 622 0.4912 0.6632 0.4912 0.7009
0.2867 9.6 624 0.4864 0.6228 0.4864 0.6974
0.2867 9.6308 626 0.4840 0.6255 0.4840 0.6957
0.2867 9.6615 628 0.5040 0.5248 0.5040 0.7100
0.2867 9.6923 630 0.5285 0.4913 0.5285 0.7270
0.2867 9.7231 632 0.5221 0.5046 0.5221 0.7225
0.2867 9.7538 634 0.5080 0.5397 0.5080 0.7128
0.2867 9.7846 636 0.5058 0.5133 0.5058 0.7112
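
Note that the final, reported checkpoint (epoch ~9.78) is not the strongest row in the table: validation loss bottoms out at 0.3896 around epoch 8.37, and QWK peaks at 0.7496 around epoch 3.51. A small sketch of picking the best row, using a few rows copied from the table:

```python
# Selected rows from the table above: (epoch, validation_loss, qwk)
rows = [
    (3.5077, 0.4163, 0.7496),   # best QWK seen during training
    (8.1231, 0.4067, 0.7123),
    (8.3692, 0.3896, 0.6648),   # lowest validation loss
    (9.7846, 0.5058, 0.5133),   # final, reported checkpoint
]

# Pick the checkpoint by whichever criterion matters for the task
best_by_loss = min(rows, key=lambda r: r[1])
best_by_qwk = max(rows, key=lambda r: r[2])

print(best_by_loss)  # (8.3692, 0.3896, 0.6648)
print(best_by_qwk)   # (3.5077, 0.4163, 0.7496)
```

If checkpoint selection matters, `load_best_model_at_end=True` together with `metric_for_best_model` in `TrainingArguments` would keep the best checkpoint rather than the last one.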

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k13_task7_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02