aus_slang_classifier

This model is a fine-tuned version of google-bert/bert-base-cased on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0000
  • Accuracy: 0.487

Model description

More information needed

Intended uses & limitations

More information needed
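
In the absence of documented usage, the sketch below shows one plausible way to run inference, assuming this is a standard text-classification checkpoint hosted as hdabare/aus_slang_classifier; the example sentence and the labels in the comments are illustrative, not taken from the (unpublished) training data.

```python
from transformers import pipeline

# Load the checkpoint as a standard text-classification pipeline.
# Without a documented label map, outputs may use generic names
# such as LABEL_0 / LABEL_1.
classifier = pipeline("text-classification", model="hdabare/aus_slang_classifier")

# Hypothetical input containing Australian slang.
print(classifier("Chuck your togs in the ute, we'll hit Maccas this arvo."))
# Expected output shape: [{'label': 'LABEL_1', 'score': 0.99}]
```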

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch_fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 200
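
The training script itself is not published; the sketch below shows how these settings map onto transformers TrainingArguments. The output directory, label count, and the commented-out dataset wiring are placeholder assumptions.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Placeholder setup: the actual dataset and label count are undocumented.
model = AutoModelForSequenceClassification.from_pretrained(
    "google-bert/bert-base-cased", num_labels=2  # num_labels is an assumption
)
tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-cased")

# Hyperparameters transcribed from the list above.
args = TrainingArguments(
    output_dir="aus_slang_classifier",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",          # fused AdamW; betas/epsilon at defaults
    lr_scheduler_type="linear",
    num_train_epochs=200,
    eval_strategy="epoch",              # implied by the per-epoch results below
)

# trainer = Trainer(model=model, args=args, processing_class=tokenizer,
#                   train_dataset=..., eval_dataset=...)  # datasets unpublished
# trainer.train()
```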

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0005 | 1.0 | 1250 | 0.0002 | 0.487 |
| 0.001 | 2.0 | 2500 | 0.0002 | 0.487 |
| 0.0088 | 3.0 | 3750 | 0.0012 | 0.487 |
| 0.0035 | 4.0 | 5000 | 0.0027 | 0.487 |
| 0.0061 | 5.0 | 6250 | 0.0016 | 0.487 |
| 0.0003 | 6.0 | 7500 | 0.0000 | 0.487 |
| 0.0003 | 7.0 | 8750 | 0.0001 | 0.487 |
| 0.0003 | 8.0 | 10000 | 0.0000 | 0.487 |
| 0.0003 | 9.0 | 11250 | 0.0000 | 0.487 |
| 0.0016 | 10.0 | 12500 | 0.0004 | 0.487 |
| 0.0005 | 11.0 | 13750 | 0.0000 | 0.487 |
| 0.0011 | 12.0 | 15000 | 0.0000 | 0.487 |
| 0.0002 | 13.0 | 16250 | 0.0000 | 0.487 |
| 0.0002 | 14.0 | 17500 | 0.0001 | 0.487 |
| 0.0002 | 15.0 | 18750 | 0.0000 | 0.487 |
| 0.0002 | 16.0 | 20000 | 0.0002 | 0.487 |
| 0.0002 | 17.0 | 21250 | 0.0000 | 0.487 |
| 0.0002 | 18.0 | 22500 | 0.0004 | 0.487 |
| 0.0005 | 19.0 | 23750 | 0.0000 | 0.487 |
| 0.0002 | 20.0 | 25000 | 0.0001 | 0.487 |
| 0.0002 | 21.0 | 26250 | 0.0000 | 0.487 |
| 0.0001 | 22.0 | 27500 | 0.0000 | 0.487 |
| 0.0015 | 23.0 | 28750 | 0.0004 | 0.487 |
| 0.0011 | 24.0 | 30000 | 0.0001 | 0.487 |
| 0.0007 | 25.0 | 31250 | 0.0061 | 0.487 |
| 0.0012 | 26.0 | 32500 | 0.0025 | 0.487 |
| 0.0015 | 27.0 | 33750 | 0.0060 | 0.487 |
| 0.0018 | 28.0 | 35000 | 0.0051 | 0.487 |
| 0.0022 | 29.0 | 36250 | 0.0050 | 0.487 |
| 0.0024 | 30.0 | 37500 | 0.0051 | 0.487 |
| 0.0025 | 31.0 | 38750 | 0.0020 | 0.487 |
| 0.0007 | 32.0 | 40000 | 0.0021 | 0.487 |
| 0.0013 | 33.0 | 41250 | 0.0021 | 0.487 |
| 0.0018 | 34.0 | 42500 | 0.0020 | 0.487 |
| 0.0013 | 35.0 | 43750 | 0.0027 | 0.487 |
| 0.0013 | 36.0 | 45000 | 0.0020 | 0.487 |
| 0.001 | 37.0 | 46250 | 0.0020 | 0.487 |
| 0.0007 | 38.0 | 47500 | 0.0022 | 0.487 |
| 0.0017 | 39.0 | 48750 | 0.0022 | 0.487 |
| 0.0017 | 40.0 | 50000 | 0.0021 | 0.487 |
| 0.0048 | 41.0 | 51250 | 0.0041 | 0.487 |
| 0.0012 | 42.0 | 52500 | 0.0020 | 0.487 |
| 0.0015 | 43.0 | 53750 | 0.0020 | 0.487 |
| 0.0017 | 44.0 | 55000 | 0.0023 | 0.487 |
| 0.0038 | 45.0 | 56250 | 0.0021 | 0.487 |
| 0.0032 | 46.0 | 57500 | 0.0021 | 0.487 |
| 0.0343 | 47.0 | 58750 | 0.2751 | 0.487 |
| 0.0012 | 48.0 | 60000 | 0.0013 | 0.487 |
| 0.0007 | 49.0 | 61250 | 0.0005 | 0.487 |
| 0.0006 | 50.0 | 62500 | 0.0003 | 0.487 |
| 0.0008 | 51.0 | 63750 | 0.0007 | 0.487 |
| 0.0015 | 52.0 | 65000 | 0.0020 | 0.487 |
| 0.0005 | 53.0 | 66250 | 0.0011 | 0.487 |
| 0.0002 | 54.0 | 67500 | 0.0009 | 0.487 |
| 0.0002 | 55.0 | 68750 | 0.0012 | 0.487 |
| 0.0002 | 56.0 | 70000 | 0.0002 | 0.487 |
| 0.0002 | 57.0 | 71250 | 0.0014 | 0.487 |
| 0.0002 | 58.0 | 72500 | 0.0003 | 0.487 |
| 0.0002 | 59.0 | 73750 | 0.0004 | 0.487 |
| 0.0002 | 60.0 | 75000 | 0.0006 | 0.487 |
| 0.0002 | 61.0 | 76250 | 0.0007 | 0.487 |
| 0.0001 | 62.0 | 77500 | 0.0004 | 0.487 |
| 0.0002 | 63.0 | 78750 | 0.0008 | 0.487 |
| 0.0001 | 64.0 | 80000 | 0.0006 | 0.487 |
| 0.0001 | 65.0 | 81250 | 0.0007 | 0.487 |
| 0.0001 | 66.0 | 82500 | 0.0006 | 0.487 |
| 0.0001 | 67.0 | 83750 | 0.0004 | 0.487 |
| 0.0001 | 68.0 | 85000 | 0.0004 | 0.487 |
| 0.0001 | 69.0 | 86250 | 0.0003 | 0.487 |
| 0.0031 | 70.0 | 87500 | 0.0032 | 0.487 |
| 0.0155 | 71.0 | 88750 | 0.0057 | 0.487 |
| 0.0112 | 72.0 | 90000 | 0.0066 | 0.487 |
| 0.0103 | 73.0 | 91250 | 0.0064 | 0.487 |
| 0.0086 | 74.0 | 92500 | 0.0072 | 0.487 |
| 0.0029 | 75.0 | 93750 | 0.0002 | 0.487 |
| 0.0009 | 76.0 | 95000 | 0.0004 | 0.487 |
| 0.0014 | 77.0 | 96250 | 0.0006 | 0.487 |
| 0.0014 | 78.0 | 97500 | 0.0006 | 0.487 |
| 0.0009 | 79.0 | 98750 | 0.0002 | 0.487 |
| 0.0014 | 80.0 | 100000 | 0.0003 | 0.487 |
| 0.0014 | 81.0 | 101250 | 0.0004 | 0.487 |
| 0.0009 | 82.0 | 102500 | 0.0001 | 0.487 |
| 0.0006 | 83.0 | 103750 | 0.0007 | 0.487 |
| 0.0004 | 84.0 | 105000 | 0.0005 | 0.487 |
| 0.0014 | 85.0 | 106250 | 0.0002 | 0.487 |
| 0.0009 | 86.0 | 107500 | 0.0005 | 0.487 |
| 0.0006 | 87.0 | 108750 | 0.0003 | 0.487 |
| 0.0004 | 88.0 | 110000 | 0.0004 | 0.487 |
| 0.0003 | 89.0 | 111250 | 0.0005 | 0.487 |
| 0.0001 | 90.0 | 112500 | 0.0004 | 0.487 |
| 0.0004 | 91.0 | 113750 | 0.0003 | 0.487 |
| 0.0001 | 92.0 | 115000 | 0.0003 | 0.487 |
| 0.0001 | 93.0 | 116250 | 0.0003 | 0.487 |
| 0.0056 | 94.0 | 117500 | 0.0053 | 0.487 |
| 0.0049 | 95.0 | 118750 | 0.0046 | 0.487 |
| 0.0036 | 96.0 | 120000 | 0.0042 | 0.487 |
| 0.0029 | 97.0 | 121250 | 0.0002 | 0.487 |
| 0.0021 | 98.0 | 122500 | 0.0003 | 0.487 |
| 0.0028 | 99.0 | 123750 | 0.0094 | 0.487 |
| 0.0038 | 100.0 | 125000 | 0.0074 | 0.487 |
| 0.0051 | 101.0 | 126250 | 0.0041 | 0.487 |
| 0.0046 | 102.0 | 127500 | 0.0042 | 0.487 |
| 0.0041 | 103.0 | 128750 | 0.0042 | 0.487 |
| 0.0026 | 104.0 | 130000 | 0.0023 | 0.487 |
| 0.0034 | 105.0 | 131250 | 0.0023 | 0.487 |
| 0.0041 | 106.0 | 132500 | 0.0022 | 0.487 |
| 0.0028 | 107.0 | 133750 | 0.0022 | 0.487 |
| 0.0038 | 108.0 | 135000 | 0.0022 | 0.487 |
| 0.0029 | 109.0 | 136250 | 0.0022 | 0.487 |
| 0.0026 | 110.0 | 137500 | 0.0021 | 0.487 |
| 0.0051 | 111.0 | 138750 | 0.0119 | 0.487 |
| 0.0305 | 112.0 | 140000 | 0.0091 | 0.487 |
| 0.0063 | 113.0 | 141250 | 0.0092 | 0.487 |
| 0.0073 | 114.0 | 142500 | 0.0092 | 0.487 |
| 0.008 | 115.0 | 143750 | 0.0090 | 0.487 |
| 0.0031 | 116.0 | 145000 | 0.0003 | 0.487 |
| 0.0101 | 117.0 | 146250 | 0.0148 | 0.487 |
| 0.0065 | 118.0 | 147500 | 0.0071 | 0.487 |
| 0.0042 | 119.0 | 148750 | 0.0008 | 0.487 |
| 0.0031 | 120.0 | 150000 | 0.0001 | 0.487 |
| 0.0021 | 121.0 | 151250 | 0.0011 | 0.487 |
| 0.0034 | 122.0 | 152500 | 0.0001 | 0.487 |
| 0.0014 | 123.0 | 153750 | 0.0001 | 0.487 |
| 0.0008 | 124.0 | 155000 | 0.0001 | 0.487 |
| 0.0013 | 125.0 | 156250 | 0.0001 | 0.487 |
| 0.0016 | 126.0 | 157500 | 0.0000 | 0.487 |
| 0.0022 | 127.0 | 158750 | 0.0002 | 0.487 |
| 0.0001 | 128.0 | 160000 | 0.0002 | 0.487 |
| 0.0001 | 129.0 | 161250 | 0.0000 | 0.487 |
| 0.0001 | 130.0 | 162500 | 0.0002 | 0.487 |
| 0.0001 | 131.0 | 163750 | 0.0001 | 0.487 |
| 0.0001 | 132.0 | 165000 | 0.0002 | 0.487 |
| 0.0008 | 133.0 | 166250 | 0.0001 | 0.487 |
| 0.0001 | 134.0 | 167500 | 0.0001 | 0.487 |
| 0.0001 | 135.0 | 168750 | 0.0001 | 0.487 |
| 0.0001 | 136.0 | 170000 | 0.0002 | 0.487 |
| 0.0001 | 137.0 | 171250 | 0.0001 | 0.487 |
| 0.0001 | 138.0 | 172500 | 0.0001 | 0.487 |
| 0.0001 | 139.0 | 173750 | 0.0001 | 0.487 |
| 0.0001 | 140.0 | 175000 | 0.0002 | 0.487 |
| 0.0001 | 141.0 | 176250 | 0.0001 | 0.487 |
| 0.0001 | 142.0 | 177500 | 0.0001 | 0.487 |
| 0.0001 | 143.0 | 178750 | 0.0001 | 0.487 |
| 0.0001 | 144.0 | 180000 | 0.0001 | 0.487 |
| 0.0001 | 145.0 | 181250 | 0.0000 | 0.487 |
| 0.0001 | 146.0 | 182500 | 0.0000 | 0.487 |
| 0.0001 | 147.0 | 183750 | 0.0000 | 0.487 |
| 0.0001 | 148.0 | 185000 | 0.0000 | 0.487 |
| 0.0001 | 149.0 | 186250 | 0.0001 | 0.487 |
| 0.0001 | 150.0 | 187500 | 0.0000 | 0.487 |
| 0.0001 | 151.0 | 188750 | 0.0000 | 0.487 |
| 0.0001 | 152.0 | 190000 | 0.0000 | 0.487 |
| 0.0001 | 153.0 | 191250 | 0.0000 | 0.487 |
| 0.0001 | 154.0 | 192500 | 0.0001 | 0.487 |
| 0.0001 | 155.0 | 193750 | 0.0001 | 0.487 |
| 0.0001 | 156.0 | 195000 | 0.0000 | 0.487 |
| 0.0001 | 157.0 | 196250 | 0.0001 | 0.487 |
| 0.0001 | 158.0 | 197500 | 0.0001 | 0.487 |
| 0.0001 | 159.0 | 198750 | 0.0001 | 0.487 |
| 0.0001 | 160.0 | 200000 | 0.0001 | 0.487 |
| 0.0001 | 161.0 | 201250 | 0.0001 | 0.487 |
| 0.0001 | 162.0 | 202500 | 0.0000 | 0.487 |
| 0.0001 | 163.0 | 203750 | 0.0001 | 0.487 |
| 0.0001 | 164.0 | 205000 | 0.0001 | 0.487 |
| 0.0001 | 165.0 | 206250 | 0.0001 | 0.487 |
| 0.0001 | 166.0 | 207500 | 0.0000 | 0.487 |
| 0.0001 | 167.0 | 208750 | 0.0000 | 0.487 |
| 0.0001 | 168.0 | 210000 | 0.0000 | 0.487 |
| 0.0001 | 169.0 | 211250 | 0.0000 | 0.487 |
| 0.0001 | 170.0 | 212500 | 0.0001 | 0.487 |
| 0.0001 | 171.0 | 213750 | 0.0001 | 0.487 |
| 0.0001 | 172.0 | 215000 | 0.0000 | 0.487 |
| 0.0001 | 173.0 | 216250 | 0.0001 | 0.487 |
| 0.0001 | 174.0 | 217500 | 0.0001 | 0.487 |
| 0.0001 | 175.0 | 218750 | 0.0000 | 0.487 |
| 0.0001 | 176.0 | 220000 | 0.0000 | 0.487 |
| 0.0001 | 177.0 | 221250 | 0.0001 | 0.487 |
| 0.0001 | 178.0 | 222500 | 0.0000 | 0.487 |
| 0.0001 | 179.0 | 223750 | 0.0001 | 0.487 |
| 0.0001 | 180.0 | 225000 | 0.0001 | 0.487 |
| 0.0001 | 181.0 | 226250 | 0.0000 | 0.487 |
| 0.0001 | 182.0 | 227500 | 0.0000 | 0.487 |
| 0.0001 | 183.0 | 228750 | 0.0000 | 0.487 |
| 0.0001 | 184.0 | 230000 | 0.0001 | 0.487 |
| 0.0001 | 185.0 | 231250 | 0.0000 | 0.487 |
| 0.0001 | 186.0 | 232500 | 0.0001 | 0.487 |
| 0.0001 | 187.0 | 233750 | 0.0001 | 0.487 |
| 0.0001 | 188.0 | 235000 | 0.0000 | 0.487 |
| 0.0001 | 189.0 | 236250 | 0.0000 | 0.487 |
| 0.0001 | 190.0 | 237500 | 0.0000 | 0.487 |
| 0.0001 | 191.0 | 238750 | 0.0001 | 0.487 |
| 0.0001 | 192.0 | 240000 | 0.0000 | 0.487 |
| 0.0001 | 193.0 | 241250 | 0.0000 | 0.487 |
| 0.0001 | 194.0 | 242500 | 0.0000 | 0.487 |
| 0.0001 | 195.0 | 243750 | 0.0001 | 0.487 |
| 0.0001 | 196.0 | 245000 | 0.0000 | 0.487 |
| 0.0001 | 197.0 | 246250 | 0.0000 | 0.487 |
| 0.0001 | 198.0 | 247500 | 0.0000 | 0.487 |
| 0.0001 | 199.0 | 248750 | 0.0001 | 0.487 |
| 0.0001 | 200.0 | 250000 | 0.0000 | 0.487 |
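
The metric function used for this run is not published; for reference, per-epoch accuracy of the kind reported above is typically computed with a compute_metrics hook along the lines of the sketch below. The evaluate-based implementation is an assumption, not the author's code.

```python
import numpy as np
import evaluate

# Illustrative only: the actual metric code for this run is unpublished.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # most-likely class per example
    return accuracy.compute(predictions=predictions, references=labels)
```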

Framework versions

  • Transformers 4.55.0
  • Pytorch 2.8.0+cu126
  • Datasets 4.0.0
  • Tokenizers 0.21.4