---
library_name: transformers
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: sal-base
    results: []
---

# sal-base

This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.4904
- Accuracy: 0.9313
- Precision Eol: 0.9398
- Precision Msg: 0.8710
- Precision Cmd: 0.0
- Precision Var: 1.0
- Precision Dff: 0.9469
- Precision Pgr: 0.65
- Precision Stk: 1.0
- Precision Itm: 0.9667
- Precision Hex: 0.6
- Precision Yml: 1.0
- Recall Eol: 0.9713
- Recall Msg: 0.9643
- Recall Cmd: 0.0
- Recall Var: 0.75
- Recall Dff: 0.9774
- Recall Pgr: 0.9286
- Recall Stk: 0.7671
- Recall Itm: 0.7838
- Recall Hex: 1.0
- Recall Yml: 1.0
- F1 Eol: 0.9553
- F1 Msg: 0.9153
- F1 Cmd: 0.0
- F1 Var: 0.8571
- F1 Dff: 0.9619
- F1 Pgr: 0.7647
- F1 Stk: 0.8682
- F1 Itm: 0.8657
- F1 Hex: 0.75
- F1 Yml: 1.0
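
The evaluation code for this model is not published; the following is a hypothetical sketch of how per-class metrics of this shape are typically computed with scikit-learn. The lowercase label set is inferred from the metric names above and may not match the model's actual `id2label` mapping.

```python
# Hypothetical sketch (not the published evaluation code): per-class
# precision/recall/F1 of the shape reported above.
from sklearn.metrics import precision_recall_fscore_support

# Assumed label set, inferred from the metric names in this card.
LABELS = ["eol", "msg", "cmd", "var", "dff", "pgr", "stk", "itm", "hex", "yml"]

def per_class_metrics(y_true, y_pred):
    """Map each label to its (precision, recall, f1) over integer class ids."""
    precision, recall, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=list(range(len(LABELS))), zero_division=0
    )
    return {label: (precision[i], recall[i], f1[i]) for i, label in enumerate(LABELS)}
```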

## Model description

More information needed

## Intended uses & limitations

More information needed
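
No usage notes are provided, but since the card declares `library_name: transformers` and reports per-class token metrics, a minimal inference sketch might look like the following, assuming the checkpoint loads as a standard token-classification model. The repo id and example input are illustrative, not confirmed by this card.

```python
# Minimal inference sketch; assumes a standard token-classification head.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

repo_id = "bolu61/logsegmenter"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForTokenClassification.from_pretrained(repo_id)

log = "2024-01-01 12:00:00 INFO starting build\n$ make all\n"  # illustrative input
inputs = tokenizer(log, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)

# Assign each token the highest-scoring label.
predictions = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, predictions):
    print(token, model.config.id2label[label_id])
```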

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-06
- train_batch_size: 1
- eval_batch_size: 8
- seed: 45242
- optimizer: AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 32.0
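
For reference, a minimal sketch mapping these hyperparameters onto `transformers.TrainingArguments` (Transformers 4.48.1); the dataset, model instantiation, and `Trainer` wiring are omitted because they are not documented in this card, and the `output_dir` is illustrative.

```python
# Sketch only: reproduces the listed hyperparameters, not the full training run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="sal-base",          # illustrative output path
    learning_rate=5e-6,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    seed=45242,
    optim="adamw_torch",            # AdamW defaults: betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=32.0,
)
```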

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision Eol | Precision Msg | Precision Cmd | Precision Var | Precision Dff | Precision Pgr | Precision Stk | Precision Itm | Precision Hex | Precision Yml | Recall Eol | Recall Msg | Recall Cmd | Recall Var | Recall Dff | Recall Pgr | Recall Stk | Recall Itm | Recall Hex | Recall Yml | F1 Eol | F1 Msg | F1 Cmd | F1 Var | F1 Dff | F1 Pgr | F1 Stk | F1 Itm | F1 Hex | F1 Yml |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 1.3346 | 1.0 | 807 | 1.1337 | 0.6295 | 0.4384 | 0.0 | 0.0 | 0.0 | 0.9043 | 0.0 | 0.7 | 0.0 | 0.0 | 0.0 | 0.8517 | 0.0 | 0.0 | 0.0 | 0.8226 | 0.0 | 0.0959 | 0.0 | 0.0 | 0.0 | 0.5789 | 0.0 | 0.0 | 0.0 | 0.8615 | 0.0 | 0.1687 | 0.0 | 0.0 | 0.0 |
| 1.1568 | 2.0 | 1614 | 0.6844 | 0.7868 | 0.6988 | 0.8571 | 0.0 | 0.0 | 0.9900 | 0.8 | 0.4766 | 0.8333 | 0.0 | 0.0 | 0.8325 | 0.2143 | 0.0 | 0.0 | 0.9548 | 0.5714 | 0.8356 | 0.1351 | 0.0 | 0.0 | 0.7598 | 0.3429 | 0.0 | 0.0 | 0.9721 | 0.6667 | 0.6070 | 0.2326 | 0.0 | 0.0 |
| 0.7174 | 3.0 | 2421 | 0.6051 | 0.8498 | 0.7316 | 0.9091 | 0.0 | 0.0 | 0.9966 | 0.56 | 0.7846 | 0.8387 | 0.0 | 0.0 | 0.9522 | 0.3571 | 0.0 | 0.0 | 0.9484 | 1.0 | 0.6986 | 0.7027 | 0.0 | 0.0 | 0.8274 | 0.5128 | 0.0 | 0.0 | 0.9719 | 0.7179 | 0.7391 | 0.7647 | 0.0 | 0.0 |
| 0.7051 | 4.0 | 3228 | 0.4208 | 0.9013 | 0.8417 | 0.8235 | 0.0 | 0.0 | 0.9870 | 0.5417 | 0.9 | 1.0 | 0.0 | 0.8889 | 0.9665 | 0.5 | 0.0 | 0.0 | 0.9806 | 0.9286 | 0.8630 | 0.7027 | 0.0 | 1.0 | 0.8998 | 0.6222 | 0.0 | 0.0 | 0.9838 | 0.6842 | 0.8811 | 0.8254 | 0.0 | 0.9412 |
| 0.5254 | 5.0 | 4035 | 0.3691 | 0.9142 | 0.8826 | 0.7 | 0.0 | 0.0 | 0.9902 | 0.5417 | 0.9103 | 1.0 | 0.0 | 1.0 | 0.9713 | 0.5 | 0.0 | 0.0 | 0.9806 | 0.9286 | 0.9726 | 0.7027 | 0.0 | 1.0 | 0.9248 | 0.5833 | 0.0 | 0.0 | 0.9854 | 0.6842 | 0.9404 | 0.8254 | 0.0 | 1.0 |
| 0.2888 | 6.0 | 4842 | 0.3166 | 0.9256 | 0.9615 | 0.7407 | 0.0 | 0.0 | 0.9902 | 0.56 | 0.8295 | 0.9655 | 0.0 | 0.8889 | 0.9569 | 0.7143 | 0.0 | 0.0 | 0.9806 | 1.0 | 1.0 | 0.7568 | 0.0 | 1.0 | 0.9592 | 0.7273 | 0.0 | 0.0 | 0.9854 | 0.7179 | 0.9068 | 0.8485 | 0.0 | 0.9412 |
| 0.1577 | 7.0 | 5649 | 0.3385 | 0.9299 | 0.9309 | 0.7857 | 0.0 | 0.0 | 0.9870 | 0.5652 | 0.9125 | 0.9655 | 0.0 | 1.0 | 0.9665 | 0.7857 | 0.0 | 0.0 | 0.9806 | 0.9286 | 1.0 | 0.7568 | 0.0 | 1.0 | 0.9484 | 0.7857 | 0.0 | 0.0 | 0.9838 | 0.7027 | 0.9542 | 0.8485 | 0.0 | 1.0 |
| 0.2253 | 8.0 | 6456 | 0.3227 | 0.9356 | 0.9528 | 0.8438 | 0.0 | 0.0 | 0.9967 | 0.56 | 0.8588 | 1.0 | 0.0 | 1.0 | 0.9665 | 0.9643 | 0.0 | 0.0 | 0.9774 | 1.0 | 1.0 | 0.7297 | 0.0 | 1.0 | 0.9596 | 0.9 | 0.0 | 0.0 | 0.9870 | 0.7179 | 0.9241 | 0.8438 | 0.0 | 1.0 |
| 0.2967 | 9.0 | 7263 | 0.2755 | 0.9456 | 0.9484 | 0.8710 | 0.0 | 0.0 | 0.9934 | 0.5909 | 0.9241 | 1.0 | 0.5 | 1.0 | 0.9665 | 0.9643 | 0.0 | 0.0 | 0.9774 | 0.9286 | 1.0 | 0.7838 | 0.6667 | 1.0 | 0.9573 | 0.9153 | 0.0 | 0.0 | 0.9854 | 0.7222 | 0.9605 | 0.8788 | 0.5714 | 1.0 |
| 0.0673 | 10.0 | 8070 | 0.2925 | 0.9485 | 0.9528 | 0.8710 | 0.0 | 0.0 | 0.9838 | 0.56 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9665 | 0.9643 | 0.0 | 0.0 | 0.9806 | 1.0 | 0.9863 | 0.7297 | 1.0 | 1.0 | 0.9596 | 0.9153 | 0.0 | 0.0 | 0.9822 | 0.7179 | 0.9931 | 0.8438 | 0.75 | 1.0 |
| 0.2369 | 11.0 | 8877 | 0.3057 | 0.9514 | 0.9533 | 0.9 | 0.0 | 0.0 | 0.9870 | 0.5417 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9761 | 0.9643 | 0.0 | 0.0 | 0.9806 | 0.9286 | 0.9863 | 0.7568 | 1.0 | 1.0 | 0.9645 | 0.9310 | 0.0 | 0.0 | 0.9838 | 0.6842 | 0.9931 | 0.8615 | 0.75 | 1.0 |
| 0.1307 | 12.0 | 9684 | 0.3090 | 0.9499 | 0.9531 | 0.8710 | 0.0 | 0.0 | 0.9870 | 0.5417 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.0 | 0.9774 | 0.9286 | 1.0 | 0.7568 | 1.0 | 1.0 | 0.9621 | 0.9153 | 0.0 | 0.0 | 0.9822 | 0.6842 | 1.0 | 0.8615 | 0.75 | 1.0 |
| 0.0639 | 13.0 | 10491 | 0.2790 | 0.9642 | 0.9758 | 0.875 | 0.0 | 1.0 | 0.9935 | 0.6190 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9665 | 1.0 | 0.0 | 0.75 | 0.9935 | 0.9286 | 1.0 | 0.8108 | 1.0 | 1.0 | 0.9712 | 0.9333 | 0.0 | 0.8571 | 0.9935 | 0.7429 | 1.0 | 0.8955 | 0.75 | 1.0 |
| 0.1262 | 14.0 | 11298 | 0.3562 | 0.9342 | 0.9486 | 0.9 | 0.0 | 1.0 | 0.9441 | 0.6190 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9806 | 0.9286 | 0.7671 | 0.8108 | 1.0 | 1.0 | 0.9598 | 0.9310 | 0.0 | 0.8571 | 0.9620 | 0.7429 | 0.8682 | 0.8955 | 0.75 | 1.0 |
| 0.0169 | 15.0 | 12105 | 0.3680 | 0.9557 | 0.9442 | 0.8710 | 0.0 | 1.0 | 0.9967 | 0.6190 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9774 | 0.9286 | 1.0 | 0.7838 | 1.0 | 1.0 | 0.9575 | 0.9153 | 0.0 | 0.8571 | 0.9870 | 0.7429 | 1.0 | 0.8788 | 0.75 | 1.0 |
| 0.1178 | 16.0 | 12912 | 0.3159 | 0.9571 | 0.9531 | 0.8710 | 0.0 | 1.0 | 0.9967 | 0.5909 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9839 | 0.9286 | 1.0 | 0.7568 | 1.0 | 1.0 | 0.9621 | 0.9153 | 0.0 | 0.8571 | 0.9903 | 0.7222 | 1.0 | 0.8615 | 0.75 | 1.0 |
| 0.1505 | 17.0 | 13719 | 0.3400 | 0.9571 | 0.9486 | 0.8710 | 0.0 | 1.0 | 0.9967 | 0.6364 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9774 | 1.0 | 1.0 | 0.7838 | 1.0 | 1.0 | 0.9598 | 0.9153 | 0.0 | 0.8571 | 0.9870 | 0.7778 | 1.0 | 0.8788 | 0.75 | 1.0 |
| 0.0759 | 18.0 | 14526 | 0.4036 | 0.9385 | 0.9621 | 0.8710 | 0.0 | 1.0 | 0.9474 | 0.6364 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9871 | 1.0 | 0.7671 | 0.8108 | 1.0 | 1.0 | 0.9667 | 0.9153 | 0.0 | 0.8571 | 0.9668 | 0.7778 | 0.8682 | 0.8955 | 0.75 | 1.0 |
| 0.023 | 19.0 | 15333 | 0.4086 | 0.9542 | 0.9398 | 0.8710 | 0.0 | 1.0 | 0.9967 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9774 | 0.9286 | 0.9863 | 0.7838 | 1.0 | 1.0 | 0.9553 | 0.9153 | 0.0 | 0.8571 | 0.9870 | 0.7647 | 0.9931 | 0.8657 | 0.75 | 1.0 |
| 0.0844 | 20.0 | 16140 | 0.4447 | 0.9342 | 0.9486 | 0.8710 | 0.0 | 1.0 | 0.9470 | 0.6364 | 1.0 | 1.0 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9806 | 1.0 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9598 | 0.9153 | 0.0 | 0.8571 | 0.9635 | 0.7778 | 0.8682 | 0.8788 | 0.75 | 1.0 |
| 0.1371 | 21.0 | 16947 | 0.4818 | 0.9299 | 0.9401 | 0.9 | 0.0 | 1.0 | 0.9410 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9761 | 0.9643 | 0.0 | 0.25 | 0.9774 | 0.9286 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9577 | 0.9310 | 0.0 | 0.4 | 0.9589 | 0.7647 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0311 | 22.0 | 17754 | 0.4369 | 0.9356 | 0.9531 | 0.8710 | 0.0 | 1.0 | 0.9472 | 0.6667 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9839 | 1.0 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9621 | 0.9153 | 0.0 | 0.8571 | 0.9652 | 0.8 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0343 | 23.0 | 18561 | 0.4735 | 0.9342 | 0.9486 | 0.875 | 0.0 | 1.0 | 0.9469 | 0.6667 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 1.0 | 0.0 | 0.75 | 0.9774 | 1.0 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9598 | 0.9333 | 0.0 | 0.8571 | 0.9619 | 0.8 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0752 | 24.0 | 19368 | 0.4295 | 0.9356 | 0.9531 | 0.875 | 0.0 | 1.0 | 0.9470 | 0.6667 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 1.0 | 0.0 | 0.75 | 0.9806 | 1.0 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9621 | 0.9333 | 0.0 | 0.8571 | 0.9635 | 0.8 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0074 | 25.0 | 20175 | 0.4687 | 0.9313 | 0.9398 | 0.8710 | 0.0 | 1.0 | 0.9469 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9774 | 0.9286 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9553 | 0.9153 | 0.0 | 0.8571 | 0.9619 | 0.7647 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0053 | 26.0 | 20982 | 0.4892 | 0.9328 | 0.9442 | 0.8710 | 0.0 | 1.0 | 0.9470 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9806 | 0.9286 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9575 | 0.9153 | 0.0 | 0.8571 | 0.9635 | 0.7647 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0211 | 27.0 | 21789 | 0.4765 | 0.9328 | 0.9442 | 0.8710 | 0.0 | 1.0 | 0.9470 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9806 | 0.9286 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9575 | 0.9153 | 0.0 | 0.8571 | 0.9635 | 0.7647 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0671 | 28.0 | 22596 | 0.4978 | 0.9328 | 0.9442 | 0.8710 | 0.0 | 1.0 | 0.9470 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9806 | 0.9286 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9575 | 0.9153 | 0.0 | 0.8571 | 0.9635 | 0.7647 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0065 | 29.0 | 23403 | 0.4934 | 0.9328 | 0.9442 | 0.8710 | 0.0 | 1.0 | 0.9470 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9806 | 0.9286 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9575 | 0.9153 | 0.0 | 0.8571 | 0.9635 | 0.7647 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0003 | 30.0 | 24210 | 0.4905 | 0.9328 | 0.9442 | 0.8710 | 0.0 | 1.0 | 0.9470 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9806 | 0.9286 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9575 | 0.9153 | 0.0 | 0.8571 | 0.9635 | 0.7647 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0095 | 31.0 | 25017 | 0.4895 | 0.9313 | 0.9398 | 0.8710 | 0.0 | 1.0 | 0.9469 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9774 | 0.9286 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9553 | 0.9153 | 0.0 | 0.8571 | 0.9619 | 0.7647 | 0.8682 | 0.8657 | 0.75 | 1.0 |
| 0.0665 | 32.0 | 25824 | 0.4904 | 0.9313 | 0.9398 | 0.8710 | 0.0 | 1.0 | 0.9469 | 0.65 | 1.0 | 0.9667 | 0.6 | 1.0 | 0.9713 | 0.9643 | 0.0 | 0.75 | 0.9774 | 0.9286 | 0.7671 | 0.7838 | 1.0 | 1.0 | 0.9553 | 0.9153 | 0.0 | 0.8571 | 0.9619 | 0.7647 | 0.8682 | 0.8657 | 0.75 | 1.0 |

### Framework versions

- Transformers 4.48.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0