bert-suicide-detection-hk-large

This model is a fine-tuned version of wcyat/bert-suicide-detection-hk on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0695
  • Accuracy: 0.9832

Model description

More information needed

Intended uses & limitations

More information needed
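Pending fuller documentation, here is a minimal inference sketch assuming the standard transformers `pipeline` API (the label names returned depend on this model's config, and `transformers` plus `torch` must be installed):

```python
def classify(texts, model_id="wcyat/bert-suicide-detection-hk-large"):
    """Score a list of strings with the fine-tuned classifier.

    The import is deferred so this snippet loads without transformers
    installed; calling the function downloads the model from the Hub.
    """
    from transformers import pipeline  # pip install transformers torch
    clf = pipeline("text-classification", model=model_id)
    return clf(texts)
```

Given the base model's name, inputs are presumably Hong Kong Cantonese text. As with any suicide-risk classifier, predictions should support, not replace, human review.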

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
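With `lr_scheduler_type: linear`, the learning rate decays linearly from 2e-05 to 0 over the 1,300 optimizer steps recorded in the results table. A plain-Python sketch of that schedule (mirroring the shape of transformers' `get_linear_schedule_with_warmup`; `warmup_steps=0` is an assumption, since the card does not list a warmup value):

```python
def linear_lr(step, base_lr=2e-05, total_steps=1300, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule:
    ramp up over warmup_steps, then decay linearly to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))     # base rate at the start
print(linear_lr(650))   # half the base rate at the midpoint
print(linear_lr(1300))  # zero at the end of training
```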

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---------------|-------|------|-----------------|----------|
| 0.1588 | 0.0769 | 20 | 0.0662 | 0.9832 |
| 0.257 | 0.1538 | 40 | 0.0271 | 0.9916 |
| 0.1515 | 0.2308 | 60 | 0.0256 | 0.9832 |
| 0.1104 | 0.3077 | 80 | 0.0976 | 0.9580 |
| 0.062 | 0.3846 | 100 | 0.0559 | 0.9916 |
| 0.2219 | 0.4615 | 120 | 0.0380 | 0.9916 |
| 0.292 | 0.5385 | 140 | 0.1136 | 0.9748 |
| 0.0766 | 0.6154 | 160 | 0.0473 | 0.9916 |
| 0.1286 | 0.6923 | 180 | 0.0592 | 0.9916 |
| 0.0965 | 0.7692 | 200 | 0.0525 | 0.9832 |
| 0.1859 | 0.8462 | 220 | 0.0732 | 0.9748 |
| 0.0795 | 0.9231 | 240 | 0.0039 | 1.0 |
| 0.1486 | 1.0 | 260 | 0.0572 | 0.9832 |
| 0.0656 | 1.0769 | 280 | 0.0392 | 0.9916 |
| 0.001 | 1.1538 | 300 | 0.0501 | 0.9916 |
| 0.0014 | 1.2308 | 320 | 0.0973 | 0.9832 |
| 0.1297 | 1.3077 | 340 | 0.0905 | 0.9832 |
| 0.0004 | 1.3846 | 360 | 0.0639 | 0.9916 |
| 0.0387 | 1.4615 | 380 | 0.0674 | 0.9916 |
| 0.0551 | 1.5385 | 400 | 0.0661 | 0.9916 |
| 0.1413 | 1.6154 | 420 | 0.0660 | 0.9916 |
| 0.0004 | 1.6923 | 440 | 0.0663 | 0.9916 |
| 0.0514 | 1.7692 | 460 | 0.1180 | 0.9748 |
| 0.105 | 1.8462 | 480 | 0.0699 | 0.9832 |
| 0.0599 | 1.9231 | 500 | 0.1168 | 0.9748 |
| 0.0005 | 2.0 | 520 | 0.1709 | 0.9580 |
| 0.0007 | 2.0769 | 540 | 0.0688 | 0.9916 |
| 0.0002 | 2.1538 | 560 | 0.0670 | 0.9916 |
| 0.0002 | 2.2308 | 580 | 0.0673 | 0.9916 |
| 0.0001 | 2.3077 | 600 | 0.0683 | 0.9916 |
| 0.0001 | 2.3846 | 620 | 0.0688 | 0.9916 |
| 0.0002 | 2.4615 | 640 | 0.0699 | 0.9916 |
| 0.034 | 2.5385 | 660 | 0.1193 | 0.9748 |
| 0.1757 | 2.6154 | 680 | 0.0948 | 0.9748 |
| 0.0615 | 2.6923 | 700 | 0.0344 | 0.9916 |
| 0.0015 | 2.7692 | 720 | 0.0559 | 0.9916 |
| 0.0002 | 2.8462 | 740 | 0.0615 | 0.9916 |
| 0.0001 | 2.9231 | 760 | 0.0628 | 0.9916 |
| 0.0001 | 3.0 | 780 | 0.0635 | 0.9916 |
| 0.0001 | 3.0769 | 800 | 0.0643 | 0.9916 |
| 0.0001 | 3.1538 | 820 | 0.0648 | 0.9916 |
| 0.0001 | 3.2308 | 840 | 0.0654 | 0.9916 |
| 0.0001 | 3.3077 | 860 | 0.0661 | 0.9916 |
| 0.0005 | 3.3846 | 880 | 0.0670 | 0.9916 |
| 0.0006 | 3.4615 | 900 | 0.0682 | 0.9916 |
| 0.0695 | 3.5385 | 920 | 0.0669 | 0.9916 |
| 0.0001 | 3.6154 | 940 | 0.0656 | 0.9916 |
| 0.0372 | 3.6923 | 960 | 0.0632 | 0.9916 |
| 0.0802 | 3.7692 | 980 | 0.0546 | 0.9916 |
| 0.0002 | 3.8462 | 1000 | 0.0541 | 0.9916 |
| 0.0002 | 3.9231 | 1020 | 0.0561 | 0.9916 |
| 0.0098 | 4.0 | 1040 | 0.0601 | 0.9916 |
| 0.0002 | 4.0769 | 1060 | 0.0640 | 0.9916 |
| 0.0017 | 4.1538 | 1080 | 0.0682 | 0.9832 |
| 0.0001 | 4.2308 | 1100 | 0.0688 | 0.9916 |
| 0.0159 | 4.3077 | 1120 | 0.0669 | 0.9916 |
| 0.0001 | 4.3846 | 1140 | 0.0657 | 0.9916 |
| 0.0102 | 4.4615 | 1160 | 0.0676 | 0.9916 |
| 0.0327 | 4.5385 | 1180 | 0.0730 | 0.9832 |
| 0.0182 | 4.6154 | 1200 | 0.0717 | 0.9832 |
| 0.0001 | 4.6923 | 1220 | 0.0699 | 0.9832 |
| 0.0001 | 4.7692 | 1240 | 0.0698 | 0.9832 |
| 0.0001 | 4.8462 | 1260 | 0.0698 | 0.9832 |
| 0.0557 | 4.9231 | 1280 | 0.0695 | 0.9832 |
| 0.0347 | 5.0 | 1300 | 0.0695 | 0.9832 |
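The step and epoch columns let us back out the approximate training-set size: 1,300 optimizer steps over 5 epochs is 260 steps per epoch, and at a batch size of 4 (assuming no gradient accumulation, which the card does not mention) that is roughly 1,040 training examples.

```python
# Back-of-the-envelope check from the table above.
# Assumption: one optimizer step per batch (no gradient accumulation).
total_steps = 1300     # final row of the table
num_epochs = 5         # from the hyperparameters
train_batch_size = 4   # from the hyperparameters

steps_per_epoch = total_steps // num_epochs                  # 260
approx_train_examples = steps_per_epoch * train_batch_size   # ~1040
print(steps_per_epoch, approx_train_examples)
```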

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.2
  • Tokenizers 0.19.1