roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2

This model is a fine-tuned version of FacebookAI/xlm-roberta-large on an unspecified dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics):

  • Loss: 0.1732
  • Tk: precision 0.9670, recall 0.7586, F1 0.8502 (support: 116)
  • Gày: precision 0.7143, recall 0.7576, F1 0.7353 (support: 33)
  • Gày trừu tượng: precision 0.9190, recall 0.9229, F1 0.9209 (support: 467)
  • Ã đơn: precision 0.8776, recall 0.8643, F1 0.8709 (support: 199)
  • Đt: precision 0.9275, recall 0.9909, F1 0.9581 (support: 878)
  • Đt trừu tượng: precision 0.7917, recall 0.8879, F1 0.8370 (support: 214)
  • Overall Precision: 0.9020
  • Overall Recall: 0.9313
  • Overall F1: 0.9164
  • Overall Accuracy: 0.9695
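
The checkpoint can be loaded with the standard Transformers token-classification pipeline. The sketch below uses the repository id shown in the model tree; the Vietnamese input sentence and phone number are made-up examples, not taken from the training data.

```python
# Minimal inference sketch (the example sentence is illustrative only).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2",
    aggregation_strategy="simple",  # merge sub-word pieces into whole-entity spans
)

print(ner("Khách hàng gọi từ số 0912345678 vào ngày 15/08 để hỏi về đơn hàng."))
```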

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2.5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
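
For reference, here is a minimal sketch of how these values map onto Hugging Face TrainingArguments. The original training script is not published, so anything not listed above (output_dir, eval_strategy) is an assumption.

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2",  # assumed
    learning_rate=2.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,          # Adam betas = (0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",   # the results table reports one evaluation per epoch
)
# `args` would then be passed to transformers.Trainer together with the tokenized
# train/eval datasets and a seqeval-based compute_metrics function.
```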

Training results

Per-label cells show precision / recall / F1, rounded to four decimals. Supports are constant across epochs: Tk 116, Gày 33, Gày trừu tượng 467, Ã đơn 199, Đt 878, Đt trừu tượng 214.

| Training Loss | Epoch | Step | Validation Loss | Tk | Gày | Gày trừu tượng | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 227 | 0.1721 | 0.7500 / 0.1810 / 0.2917 | 0.3509 / 0.6061 / 0.4444 | 0.8739 / 0.8758 / 0.8749 | 0.5769 / 0.9045 / 0.7045 | 0.8825 / 0.9579 / 0.9186 | 0.5850 / 0.8037 / 0.6772 | 0.7779 | 0.8616 | 0.8176 | 0.9333 |
| No log | 2.0 | 454 | 0.1182 | 0.7603 / 0.7931 / 0.7764 | 0.8750 / 0.6364 / 0.7368 | 0.9292 / 0.8715 / 0.8994 | 0.7500 / 0.8291 / 0.7876 | 0.9436 / 0.9909 / 0.9667 | 0.8571 / 0.7570 / 0.8040 | 0.8971 | 0.9004 | 0.8987 | 0.9584 |
| 0.1761 | 3.0 | 681 | 0.1078 | 0.8000 / 0.7931 / 0.7965 | 0.5800 / 0.8788 / 0.6988 | 0.8898 / 0.8994 / 0.8946 | 0.8028 / 0.8593 / 0.8301 | 0.9232 / 1.0000 / 0.9601 | 0.8141 / 0.7570 / 0.7845 | 0.8760 | 0.9187 | 0.8969 | 0.9636 |
| 0.1761 | 4.0 | 908 | 0.1136 | 0.8409 / 0.6379 / 0.7255 | 0.7179 / 0.8485 / 0.7778 | 0.8926 / 0.9079 / 0.9002 | 0.8870 / 0.7889 / 0.8351 | 0.9217 / 0.9920 / 0.9556 | 0.6577 / 0.9159 / 0.7656 | 0.8655 | 0.9177 | 0.8908 | 0.9624 |
| 0.0552 | 5.0 | 1135 | 0.1298 | 0.9479 / 0.7845 / 0.8585 | 0.6444 / 0.8788 / 0.7436 | 0.9275 / 0.9036 / 0.9154 | 0.8776 / 0.8643 / 0.8709 | 0.9305 / 0.9909 / 0.9597 | 0.7747 / 0.9159 / 0.8394 | 0.8990 | 0.9334 | 0.9159 | 0.9557 |
| 0.0552 | 6.0 | 1362 | 0.1594 | 0.9420 / 0.5603 / 0.7027 | 0.7879 / 0.7879 / 0.7879 | 0.8292 / 0.9251 / 0.8745 | 0.9195 / 0.8040 / 0.8579 | 0.9086 / 0.9966 / 0.9506 | 0.7791 / 0.9065 / 0.8380 | 0.8721 | 0.9187 | 0.8948 | 0.9573 |
| 0.0259 | 7.0 | 1589 | 0.1269 | 0.9717 / 0.8879 / 0.9279 | 0.7353 / 0.7576 / 0.7463 | 0.8998 / 0.9229 / 0.9112 | 0.8878 / 0.8744 / 0.8810 | 0.9391 / 0.9829 / 0.9605 | 0.8551 / 0.8551 / 0.8551 | 0.9132 | 0.9329 | 0.9230 | 0.9711 |
| 0.0259 | 8.0 | 1816 | 0.1591 | 0.9655 / 0.7241 / 0.8276 | 0.6750 / 0.8182 / 0.7397 | 0.9101 / 0.9101 / 0.9101 | 0.8404 / 0.8995 / 0.8689 | 0.9242 / 0.9863 / 0.9543 | 0.7686 / 0.8692 / 0.8158 | 0.8897 | 0.9266 | 0.9078 | 0.9680 |
| 0.0098 | 9.0 | 2043 | 0.1655 | 0.9674 / 0.7672 / 0.8558 | 0.7353 / 0.7576 / 0.7463 | 0.9079 / 0.9293 / 0.9185 | 0.8912 / 0.8643 / 0.8776 | 0.9275 / 0.9909 / 0.9581 | 0.7824 / 0.8738 / 0.8256 | 0.9002 | 0.9318 | 0.9157 | 0.9699 |
| 0.0098 | 10.0 | 2270 | 0.1732 | 0.9670 / 0.7586 / 0.8502 | 0.7143 / 0.7576 / 0.7353 | 0.9190 / 0.9229 / 0.9209 | 0.8776 / 0.8643 / 0.8709 | 0.9275 / 0.9909 / 0.9581 | 0.7917 / 0.8879 / 0.8370 | 0.9020 | 0.9313 | 0.9164 | 0.9695 |
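
Per-entity precision/recall/F1 of this kind are typically produced with the seqeval library. Below is a minimal sketch; the BIO tag scheme and the tiny example sequences are assumptions made purely for illustration, reusing two of the label names from the table.

```python
# Illustrative only: how seqeval derives per-entity precision/recall/F1
# from BIO-tagged token sequences.
from seqeval.metrics import classification_report, f1_score

y_true = [["B-Đt", "I-Đt", "O", "B-Gày", "O"]]
y_pred = [["B-Đt", "I-Đt", "O", "O", "O"]]

print(classification_report(y_true, y_pred, digits=4))
print("overall F1:", f1_score(y_true, y_pred))
```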

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
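
A quick way to confirm that a local environment matches these versions (a convenience sketch, not part of the original card):

```python
# Prints installed versions for comparison against the list above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.44.0
print("PyTorch:     ", torch.__version__)         # expected 2.3.1+cu121
print("Datasets:    ", datasets.__version__)      # expected 2.19.1
print("Tokenizers:  ", tokenizers.__version__)    # expected 0.19.1
```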