End of training

Files changed:
- README.md (+20, -20)
- logs/events.out.tfevents.1744262209.f3ec25b9a925.416.0 (+2, -2)
- logs/events.out.tfevents.1744264105.f3ec25b9a925.416.1 (+3, -0)
- model.safetensors (+1, -1)
- tokenizer.json (+16, -2)
README.md
CHANGED

@@ -16,14 +16,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.
-- Answer: {'precision': 0.
-- Header: {'precision': 0.
-- Question: {'precision': 0.
-- Overall Precision: 0.
-- Overall Recall: 0.
-- Overall F1: 0.
-- Overall Accuracy: 0.
+- Loss: 1.7701
+- Answer: {'precision': 0.8803317535545023, 'recall': 0.9094247246022031, 'f1': 0.8946417820590005, 'number': 817}
+- Header: {'precision': 0.5901639344262295, 'recall': 0.6050420168067226, 'f1': 0.5975103734439834, 'number': 119}
+- Question: {'precision': 0.896551724137931, 'recall': 0.9173630454967502, 'f1': 0.9068379990821477, 'number': 1077}
+- Overall Precision: 0.8719
+- Overall Recall: 0.8957
+- Overall F1: 0.8836
+- Overall Accuracy: 0.8000
 
 ## Model description
 
@@ -55,18 +55,18 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
 |:-------------:|:--------:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
+| 0.4168 | 10.5263 | 200 | 1.1023 | {'precision': 0.8440046565774156, 'recall': 0.8873929008567931, 'f1': 0.8651551312649164, 'number': 817} | {'precision': 0.43884892086330934, 'recall': 0.5126050420168067, 'f1': 0.4728682170542636, 'number': 119} | {'precision': 0.8705673758865248, 'recall': 0.9117920148560817, 'f1': 0.890702947845805, 'number': 1077} | 0.8316 | 0.8783 | 0.8543 | 0.7820 |
+| 0.0476 | 21.0526 | 400 | 1.2706 | {'precision': 0.8250276854928018, 'recall': 0.9118727050183598, 'f1': 0.866279069767442, 'number': 817} | {'precision': 0.5384615384615384, 'recall': 0.5294117647058824, 'f1': 0.5338983050847458, 'number': 119} | {'precision': 0.8871701546860783, 'recall': 0.9052924791086351, 'f1': 0.896139705882353, 'number': 1077} | 0.8414 | 0.8857 | 0.8630 | 0.8023 |
+| 0.014 | 31.5789 | 600 | 1.4921 | {'precision': 0.8704600484261501, 'recall': 0.8800489596083231, 'f1': 0.8752282410225197, 'number': 817} | {'precision': 0.496551724137931, 'recall': 0.6050420168067226, 'f1': 0.5454545454545454, 'number': 119} | {'precision': 0.8803339517625232, 'recall': 0.8811513463324049, 'f1': 0.8807424593967518, 'number': 1077} | 0.8492 | 0.8644 | 0.8567 | 0.8020 |
+| 0.0101 | 42.1053 | 800 | 1.4732 | {'precision': 0.8352668213457076, 'recall': 0.8812729498164015, 'f1': 0.8576533650982727, 'number': 817} | {'precision': 0.6228070175438597, 'recall': 0.5966386554621849, 'f1': 0.6094420600858369, 'number': 119} | {'precision': 0.8831521739130435, 'recall': 0.9052924791086351, 'f1': 0.8940852819807427, 'number': 1077} | 0.8490 | 0.8773 | 0.8629 | 0.7839 |
+| 0.0044 | 52.6316 | 1000 | 1.5208 | {'precision': 0.8543922984356197, 'recall': 0.8690330477356181, 'f1': 0.8616504854368933, 'number': 817} | {'precision': 0.6422018348623854, 'recall': 0.5882352941176471, 'f1': 0.6140350877192983, 'number': 119} | {'precision': 0.8970189701897019, 'recall': 0.9220055710306406, 'f1': 0.9093406593406592, 'number': 1077} | 0.8661 | 0.8808 | 0.8734 | 0.8048 |
+| 0.003 | 63.1579 | 1200 | 1.7060 | {'precision': 0.8451178451178452, 'recall': 0.9216646266829865, 'f1': 0.8817330210772834, 'number': 817} | {'precision': 0.6853932584269663, 'recall': 0.5126050420168067, 'f1': 0.5865384615384615, 'number': 119} | {'precision': 0.8949730700179533, 'recall': 0.9257195914577531, 'f1': 0.9100867183934277, 'number': 1077} | 0.8649 | 0.8997 | 0.8819 | 0.7982 |
+| 0.0012 | 73.6842 | 1400 | 1.7775 | {'precision': 0.870023419203747, 'recall': 0.9094247246022031, 'f1': 0.8892878515858766, 'number': 817} | {'precision': 0.5803571428571429, 'recall': 0.5462184873949579, 'f1': 0.5627705627705628, 'number': 119} | {'precision': 0.8925022583559169, 'recall': 0.9173630454967502, 'f1': 0.9047619047619048, 'number': 1077} | 0.8664 | 0.8922 | 0.8791 | 0.7922 |
+| 0.001 | 84.2105 | 1600 | 1.7301 | {'precision': 0.8525714285714285, 'recall': 0.9130966952264382, 'f1': 0.8817966903073285, 'number': 817} | {'precision': 0.6146788990825688, 'recall': 0.5630252100840336, 'f1': 0.5877192982456141, 'number': 119} | {'precision': 0.9037927844588344, 'recall': 0.9071494893221913, 'f1': 0.9054680259499537, 'number': 1077} | 0.8668 | 0.8892 | 0.8779 | 0.8001 |
+| 0.0007 | 94.7368 | 1800 | 1.7705 | {'precision': 0.8801897983392646, 'recall': 0.9082007343941249, 'f1': 0.8939759036144578, 'number': 817} | {'precision': 0.5581395348837209, 'recall': 0.6050420168067226, 'f1': 0.5806451612903225, 'number': 119} | {'precision': 0.9056956115779645, 'recall': 0.9006499535747446, 'f1': 0.9031657355679702, 'number': 1077} | 0.8732 | 0.8862 | 0.8797 | 0.7972 |
+| 0.0003 | 105.2632 | 2000 | 1.7701 | {'precision': 0.8803317535545023, 'recall': 0.9094247246022031, 'f1': 0.8946417820590005, 'number': 817} | {'precision': 0.5901639344262295, 'recall': 0.6050420168067226, 'f1': 0.5975103734439834, 'number': 119} | {'precision': 0.896551724137931, 'recall': 0.9173630454967502, 'f1': 0.9068379990821477, 'number': 1077} | 0.8719 | 0.8957 | 0.8836 | 0.8000 |
+| 0.0004 | 115.7895 | 2200 | 1.8060 | {'precision': 0.8546910755148741, 'recall': 0.9143206854345165, 'f1': 0.8835008870490834, 'number': 817} | {'precision': 0.6018518518518519, 'recall': 0.5462184873949579, 'f1': 0.5726872246696034, 'number': 119} | {'precision': 0.8975297346752058, 'recall': 0.9108635097493036, 'f1': 0.904147465437788, 'number': 1077} | 0.8641 | 0.8907 | 0.8772 | 0.7971 |
+| 0.0002 | 126.3158 | 2400 | 1.7987 | {'precision': 0.8646441073512252, 'recall': 0.9069767441860465, 'f1': 0.8853046594982078, 'number': 817} | {'precision': 0.544, 'recall': 0.5714285714285714, 'f1': 0.5573770491803279, 'number': 119} | {'precision': 0.9048938134810711, 'recall': 0.9099350046425255, 'f1': 0.9074074074074073, 'number': 1077} | 0.8663 | 0.8887 | 0.8774 | 0.7953 |
 
 
 ### Framework versions
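Assuming the "Overall" columns are micro-averages over the three entity types (the usual behavior of seqeval-style token-classification evaluation; the README itself does not say), they can be re-derived from the per-type precision, recall, and support in the final evaluation results. This sketch is illustrative and not part of the repository:

```python
# Re-derive the overall (micro-averaged) metrics from the reported
# per-entity results. For each entity type:
#   true positives = recall * support
#   predicted spans = true positives / precision
final = {
    "Answer":   {"precision": 0.8803317535545023, "recall": 0.9094247246022031, "number": 817},
    "Header":   {"precision": 0.5901639344262295, "recall": 0.6050420168067226, "number": 119},
    "Question": {"precision": 0.896551724137931,  "recall": 0.9173630454967502, "number": 1077},
}

tp = pred = gold = 0.0
for m in final.values():
    t = m["recall"] * m["number"]   # true positives for this type
    tp += t
    pred += t / m["precision"]      # total predicted spans of this type
    gold += m["number"]             # gold spans (support)

precision = tp / pred
recall = tp / gold
f1 = 2 * tp / (pred + gold)
print(round(precision, 4), round(recall, 4), round(f1, 4))
# Matches the reported Overall Precision 0.8719, Recall 0.8957, F1 0.8836.
```

This confirms the overall numbers are consistent with the per-type breakdown (roughly 1803 correct spans out of 2068 predicted and 2013 gold).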
|
logs/events.out.tfevents.1744262209.f3ec25b9a925.416.0
CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:9858ecf3a396559ef041674aa84ab57226159bc091792e3bc93e9c0ba8119dfc
+size 14373
logs/events.out.tfevents.1744264105.f3ec25b9a925.416.1
ADDED

@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:775bfb66411f0db218bcb7f3f7bf7629ca26c0a4924140f88dc6f44c4c2003d5
+size 592
model.safetensors
CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:788f410751fae1cbd0ef6e565a5aa3305d6b45af37e93b240a5e739986dbbc9a
 size 520727564
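The event logs and model.safetensors are stored via Git LFS, so the diff shows only their pointer files: small text stubs recording the content's SHA-256 and byte size rather than the binary itself. A minimal sketch of how such a pointer is built (the input bytes here are hypothetical; the resulting oid and size will differ from those in this commit):

```python
import hashlib


def lfs_pointer(blob: bytes) -> str:
    """Build a Git LFS pointer file (spec v1) for the given content."""
    oid = hashlib.sha256(blob).hexdigest()
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{oid}\n"
        f"size {len(blob)}\n"
    )


# Example: a pointer for some placeholder bytes, not the real weights.
print(lfs_pointer(b"example weights"))
```

Git resolves the pointer to the real file at checkout, which is why only the `oid` line changes when the weights are updated while `size` can stay identical.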
tokenizer.json
CHANGED

@@ -1,7 +1,21 @@
 {
   "version": "1.0",
-  "truncation":
-
+  "truncation": {
+    "direction": "Right",
+    "max_length": 512,
+    "strategy": "LongestFirst",
+    "stride": 0
+  },
+  "padding": {
+    "strategy": {
+      "Fixed": 512
+    },
+    "direction": "Right",
+    "pad_to_multiple_of": null,
+    "pad_id": 1,
+    "pad_type_id": 0,
+    "pad_token": "<pad>"
+  },
   "added_tokens": [
     {
       "id": 0,