Daniil Larionov committed
Commit 4c9978d · 1 Parent(s): 218dbd6

update model card README.md

Files changed (1): README.md (+36, −32)
README.md CHANGED
@@ -13,27 +13,31 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [./ruBert-base/](https://huggingface.co/./ruBert-base/) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.2417
-- Predicate Precision: 0.9323
-- Predicate Recall: 0.9612
-- Predicate F1: 0.9466
+- Loss: 0.1537
+- Causator Precision: 0.7677
+- Causator Recall: 0.8352
+- Causator F1: 0.8
+- Causator Number: 91
+- Expiriencer Precision: 0.9048
+- Expiriencer Recall: 0.9694
+- Expiriencer F1: 0.9360
+- Expiriencer Number: 98
+- Instrument Precision: 0.6
+- Instrument Recall: 1.0
+- Instrument F1: 0.7500
+- Instrument Number: 6
+- Other Precision: 0.0
+- Other Recall: 0.0
+- Other F1: 0.0
+- Other Number: 1
+- Predicate Precision: 0.9137
+- Predicate Recall: 0.9845
+- Predicate F1: 0.9478
 - Predicate Number: 129
-- Инструмент Precision: 0.0
-- Инструмент Recall: 0.0
-- Инструмент F1: 0.0
-- Инструмент Number: 1
-- Каузатор Precision: 0.7667
-- Каузатор Recall: 0.6301
-- Каузатор F1: 0.6917
-- Каузатор Number: 73
-- Экспериенцер Precision: 0.6939
-- Экспериенцер Recall: 0.8293
-- Экспериенцер F1: 0.7556
-- Экспериенцер Number: 41
-- Overall Precision: 0.8430
-- Overall Recall: 0.8361
-- Overall F1: 0.8395
-- Overall Accuracy: 0.9584
+- Overall Precision: 0.8612
+- Overall Recall: 0.9354
+- Overall F1: 0.8968
+- Overall Accuracy: 0.9661
 
 ## Model description
 
@@ -63,18 +67,18 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Predicate Precision | Predicate Recall | Predicate F1 | Predicate Number | Инструмент Precision | Инструмент Recall | Инструмент F1 | Инструмент Number | Каузатор Precision | Каузатор Recall | Каузатор F1 | Каузатор Number | Экспериенцер Precision | Экспериенцер Recall | Экспериенцер F1 | Экспериенцер Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
-|:-------------:|:-----:|:----:|:---------------:|:-------------------:|:----------------:|:------------:|:----------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
-| 0.2462 | 1.0 | 54 | 0.1554 | 0.8897 | 1.0 | 0.9416 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.7708 | 0.5068 | 0.6116 | 73 | 0.6047 | 0.6341 | 0.6190 | 41 | 0.8136 | 0.7869 | 0.8 | 0.9486 |
-| 0.1863 | 2.0 | 108 | 0.1268 | 0.9014 | 0.9922 | 0.9446 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.8444 | 0.5205 | 0.6441 | 73 | 0.6829 | 0.6829 | 0.6829 | 41 | 0.8509 | 0.7951 | 0.8220 | 0.9557 |
-| 0.0668 | 3.0 | 162 | 0.1288 | 0.9338 | 0.9845 | 0.9585 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.8148 | 0.6027 | 0.6929 | 73 | 0.6957 | 0.7805 | 0.7356 | 41 | 0.8602 | 0.8320 | 0.8458 | 0.9600 |
-| 0.039 | 4.0 | 216 | 0.1695 | 0.9007 | 0.9845 | 0.9407 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.8298 | 0.5342 | 0.6500 | 73 | 0.6441 | 0.9268 | 0.76 | 41 | 0.8259 | 0.8361 | 0.8310 | 0.9557 |
-| 0.0187 | 5.0 | 270 | 0.1955 | 0.9323 | 0.9612 | 0.9466 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.75 | 0.5753 | 0.6512 | 73 | 0.7105 | 0.6585 | 0.6835 | 41 | 0.8502 | 0.7910 | 0.8195 | 0.9551 |
-| 0.0216 | 6.0 | 324 | 0.2083 | 0.9394 | 0.9612 | 0.9502 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.7586 | 0.6027 | 0.6718 | 73 | 0.6829 | 0.6829 | 0.6829 | 41 | 0.8485 | 0.8033 | 0.8253 | 0.9562 |
-| 0.0176 | 7.0 | 378 | 0.2203 | 0.9323 | 0.9612 | 0.9466 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.7273 | 0.6575 | 0.6906 | 73 | 0.68 | 0.8293 | 0.7473 | 41 | 0.8273 | 0.8443 | 0.8357 | 0.9578 |
-| 0.0037 | 8.0 | 432 | 0.2313 | 0.9323 | 0.9612 | 0.9466 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.7541 | 0.6301 | 0.6866 | 73 | 0.6809 | 0.7805 | 0.7273 | 41 | 0.8382 | 0.8279 | 0.8330 | 0.9567 |
-| 0.0089 | 9.0 | 486 | 0.2409 | 0.9323 | 0.9612 | 0.9466 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.7705 | 0.6438 | 0.7015 | 73 | 0.6939 | 0.8293 | 0.7556 | 41 | 0.8436 | 0.8402 | 0.8419 | 0.9589 |
-| 0.0043 | 10.0 | 540 | 0.2417 | 0.9323 | 0.9612 | 0.9466 | 129 | 0.0 | 0.0 | 0.0 | 1 | 0.7667 | 0.6301 | 0.6917 | 73 | 0.6939 | 0.8293 | 0.7556 | 41 | 0.8430 | 0.8361 | 0.8395 | 0.9584 |
+| Training Loss | Epoch | Step | Validation Loss | Causator Precision | Causator Recall | Causator F1 | Causator Number | Expiriencer Precision | Expiriencer Recall | Expiriencer F1 | Expiriencer Number | Instrument Precision | Instrument Recall | Instrument F1 | Instrument Number | Other Precision | Other Recall | Other F1 | Other Number | Predicate Precision | Predicate Recall | Predicate F1 | Predicate Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:---------------:|:------------:|:--------:|:------------:|:-------------------:|:----------------:|:------------:|:----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 0.3043 | 1.0 | 56 | 0.3538 | 0.75 | 0.6264 | 0.6826 | 91 | 0.7981 | 0.8469 | 0.8218 | 98 | 0.0 | 0.0 | 0.0 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.8741 | 0.9690 | 0.9191 | 129 | 0.8204 | 0.8154 | 0.8179 | 0.9142 |
+| 0.2664 | 2.0 | 112 | 0.1961 | 0.8784 | 0.7143 | 0.7879 | 91 | 0.9175 | 0.9082 | 0.9128 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9398 | 0.9690 | 0.9542 | 129 | 0.9076 | 0.8769 | 0.8920 | 0.9399 |
+| 0.0373 | 3.0 | 168 | 0.1275 | 0.8706 | 0.8132 | 0.8409 | 91 | 0.9223 | 0.9694 | 0.9453 | 98 | 0.625 | 0.8333 | 0.7143 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9338 | 0.9845 | 0.9585 | 129 | 0.9066 | 0.9262 | 0.9163 | 0.9641 |
+| 0.0496 | 4.0 | 224 | 0.1683 | 0.8 | 0.8352 | 0.8172 | 91 | 0.9143 | 0.9796 | 0.9458 | 98 | 0.6667 | 1.0 | 0.8 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9270 | 0.9845 | 0.9549 | 129 | 0.8815 | 0.9385 | 0.9091 | 0.9608 |
+| 0.0529 | 5.0 | 280 | 0.1526 | 0.7917 | 0.8352 | 0.8128 | 91 | 0.8991 | 1.0 | 0.9469 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9203 | 0.9845 | 0.9513 | 129 | 0.8697 | 0.9446 | 0.9056 | 0.9627 |
+| 0.0419 | 6.0 | 336 | 0.1402 | 0.7755 | 0.8352 | 0.8042 | 91 | 0.8962 | 0.9694 | 0.9314 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9203 | 0.9845 | 0.9513 | 129 | 0.8636 | 0.9354 | 0.8981 | 0.9651 |
+| 0.0156 | 7.0 | 392 | 0.1498 | 0.8105 | 0.8462 | 0.8280 | 91 | 0.9048 | 0.9694 | 0.9360 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9137 | 0.9845 | 0.9478 | 129 | 0.8739 | 0.9385 | 0.9050 | 0.9661 |
+| 0.0066 | 8.0 | 448 | 0.1509 | 0.7835 | 0.8352 | 0.8085 | 91 | 0.9057 | 0.9796 | 0.9412 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9137 | 0.9845 | 0.9478 | 129 | 0.8665 | 0.9385 | 0.9010 | 0.9680 |
+| 0.0084 | 9.0 | 504 | 0.1548 | 0.7755 | 0.8352 | 0.8042 | 91 | 0.9048 | 0.9694 | 0.9360 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9137 | 0.9845 | 0.9478 | 129 | 0.8636 | 0.9354 | 0.8981 | 0.9656 |
+| 0.0083 | 10.0 | 560 | 0.1537 | 0.7677 | 0.8352 | 0.8 | 91 | 0.9048 | 0.9694 | 0.9360 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9137 | 0.9845 | 0.9478 | 129 | 0.8612 | 0.9354 | 0.8968 | 0.9661 |
 
 
 ### Framework versions
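
The "Overall" row in the updated metrics is consistent with a micro average over the per-label rows: total true positives divided by total gold spans. A minimal sketch of that check, assuming the final-epoch (10.0) recalls and supports from the new table; the overall precision is taken as reported on the card rather than recomputed, since predicted-span counts are not listed:

```python
# Sketch: verify that the "Overall" recall and F1 in the new model card
# are the micro average of the per-label rows (final epoch, 10.0).

def f1_score(p: float, r: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r) if (p + r) else 0.0

# (recall, support) per label, copied from the epoch-10 row of the table
labels = {
    "Causator":    (0.8352, 91),
    "Expiriencer": (0.9694, 98),
    "Instrument":  (1.0,     6),
    "Other":       (0.0,     1),
    "Predicate":   (0.9845, 129),
}

# Recover true-positive counts from recall * support, then micro-average.
tp = sum(round(r * n) for r, n in labels.values())        # 76+95+6+0+127 = 304
support = sum(n for _, n in labels.values())              # 325 gold spans
overall_recall = tp / support                             # ≈ 0.9354

overall_precision = 0.8612                                # as reported
overall_f1 = f1_score(overall_precision, overall_recall)  # ≈ 0.8968
```

This reproduces the card's Overall Recall (0.9354) and Overall F1 (0.8968) to four decimal places, confirming the aggregation is micro- rather than macro-averaged.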