---
license: mit
tags:
- generated_from_trainer
datasets:
- rotten_tomatoes
metrics:
- accuracy
model-index:
- name: rtm_roBERTa_5E
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: rotten_tomatoes
      type: rotten_tomatoes
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8666666666666667
---

# rtm_roBERTa_5E

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the rotten_tomatoes dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6545
- Accuracy: 0.8667
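
The model can be exercised with the 🤗 Transformers `pipeline` API. A minimal inference sketch follows; the hub ID `Etelis/rtm_roBERTa_5E` is an assumption pieced together from the author and model name, so substitute the actual checkpoint path if it differs.

```python
# Minimal inference sketch; the hub ID below is an assumption, not confirmed.
from transformers import pipeline

classifier = pipeline("text-classification", model="Etelis/rtm_roBERTa_5E")

# rotten_tomatoes uses label 0 = negative, 1 = positive; unless a custom
# id2label mapping was saved, the pipeline reports generic LABEL_0 / LABEL_1 names.
print(classifier("A gripping, beautifully shot film with a career-best performance."))
```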

## Model description

RoBERTa (`roberta-base`) with a sequence-classification head, fine-tuned for five epochs on Rotten Tomatoes movie-review sentences. Given a short English review snippet, the model predicts whether the sentiment is positive or negative.

## Intended uses & limitations

The model is intended for binary sentiment classification of short English movie-review text. Performance on other domains (product reviews, social media, long documents) has not been evaluated and may degrade; the model also inherits any biases present in the roberta-base pretraining corpus and in the Rotten Tomatoes data.

## Training and evaluation data

The model was fine-tuned and evaluated on the [rotten_tomatoes](https://huggingface.co/datasets/rotten_tomatoes) dataset, a balanced corpus of 5,331 positive and 5,331 negative movie-review sentences.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `Trainer` setup is sketched after the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
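
A minimal sketch of an equivalent fine-tuning run is shown below. The preprocessing (plain truncation with dynamic padding), single-GPU execution, and the 50-step evaluation cadence are assumptions inferred from the results table rather than details recorded in the original card.

```python
# Sketch of an equivalent fine-tuning run; preprocessing details, single-GPU
# execution, and the 50-step evaluation cadence are assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("rotten_tomatoes")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="rtm_roBERTa_5E",
    learning_rate=1e-5,
    per_device_train_batch_size=16,   # equals the listed sizes on a single device
    per_device_eval_batch_size=8,
    num_train_epochs=5,
    seed=42,
    lr_scheduler_type="linear",       # Adam betas/epsilon above are the defaults
    evaluation_strategy="steps",      # assumption: matches the 50-step cadence
    eval_steps=50,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,              # enables dynamic padding via the default collator
)
trainer.train()
```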

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6955 | 0.09 | 50 | 0.6752 | 0.7867 |
| 0.5362 | 0.19 | 100 | 0.4314 | 0.8333 |
| 0.4065 | 0.28 | 150 | 0.4476 | 0.8533 |
| 0.3563 | 0.37 | 200 | 0.3454 | 0.8467 |
| 0.3729 | 0.47 | 250 | 0.3421 | 0.86 |
| 0.3355 | 0.56 | 300 | 0.3253 | 0.8467 |
| 0.338 | 0.66 | 350 | 0.3859 | 0.8733 |
| 0.2875 | 0.75 | 400 | 0.3537 | 0.8533 |
| 0.3477 | 0.84 | 450 | 0.3636 | 0.8467 |
| 0.3259 | 0.94 | 500 | 0.3115 | 0.88 |
| 0.3204 | 1.03 | 550 | 0.4295 | 0.8333 |
| 0.2673 | 1.12 | 600 | 0.3369 | 0.88 |
| 0.2479 | 1.22 | 650 | 0.3620 | 0.8667 |
| 0.2821 | 1.31 | 700 | 0.3582 | 0.8733 |
| 0.2355 | 1.4 | 750 | 0.3130 | 0.8867 |
| 0.2357 | 1.5 | 800 | 0.3229 | 0.86 |
| 0.2725 | 1.59 | 850 | 0.3035 | 0.88 |
| 0.2425 | 1.69 | 900 | 0.3146 | 0.8533 |
| 0.1977 | 1.78 | 950 | 0.4079 | 0.86 |
| 0.2557 | 1.87 | 1000 | 0.4132 | 0.8733 |
| 0.2395 | 1.97 | 1050 | 0.3336 | 0.86 |
| 0.1951 | 2.06 | 1100 | 0.5068 | 0.84 |
| 0.1631 | 2.15 | 1150 | 0.5209 | 0.8867 |
| 0.2192 | 2.25 | 1200 | 0.4766 | 0.8733 |
| 0.1725 | 2.34 | 1250 | 0.3962 | 0.8667 |
| 0.2215 | 2.43 | 1300 | 0.4133 | 0.8867 |
| 0.1602 | 2.53 | 1350 | 0.5564 | 0.8533 |
| 0.1986 | 2.62 | 1400 | 0.5826 | 0.86 |
| 0.1972 | 2.72 | 1450 | 0.5412 | 0.8667 |
| 0.2299 | 2.81 | 1500 | 0.4636 | 0.8733 |
| 0.2028 | 2.9 | 1550 | 0.5096 | 0.8667 |
| 0.2591 | 3.0 | 1600 | 0.3790 | 0.8467 |
| 0.1197 | 3.09 | 1650 | 0.5704 | 0.8467 |
| 0.174 | 3.18 | 1700 | 0.5904 | 0.8467 |
| 0.1499 | 3.28 | 1750 | 0.6066 | 0.86 |
| 0.1687 | 3.37 | 1800 | 0.6353 | 0.8533 |
| 0.1463 | 3.46 | 1850 | 0.6434 | 0.8467 |
| 0.1373 | 3.56 | 1900 | 0.6507 | 0.8533 |
| 0.1339 | 3.65 | 1950 | 0.6014 | 0.86 |
| 0.1488 | 3.75 | 2000 | 0.7245 | 0.84 |
| 0.1725 | 3.84 | 2050 | 0.6214 | 0.86 |
| 0.1443 | 3.93 | 2100 | 0.6446 | 0.8533 |
| 0.1619 | 4.03 | 2150 | 0.6223 | 0.8533 |
| 0.1153 | 4.12 | 2200 | 0.6579 | 0.8333 |
| 0.1159 | 4.21 | 2250 | 0.6760 | 0.8667 |
| 0.0948 | 4.31 | 2300 | 0.7172 | 0.8467 |
| 0.1373 | 4.4 | 2350 | 0.7346 | 0.8467 |
| 0.1463 | 4.49 | 2400 | 0.6453 | 0.8533 |
| 0.0758 | 4.59 | 2450 | 0.6579 | 0.86 |
| 0.16 | 4.68 | 2500 | 0.6556 | 0.8667 |
| 0.112 | 4.78 | 2550 | 0.6490 | 0.88 |
| 0.1151 | 4.87 | 2600 | 0.6525 | 0.8667 |
| 0.2152 | 4.96 | 2650 | 0.6545 | 0.8667 |
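
The final evaluation (step 2650) matches the headline loss and accuracy above. A sketch for re-checking the accuracy with the `evaluate` library follows; the hub ID and the choice of the `validation` split are assumptions:

```python
# Re-check accuracy; the hub ID and the validation split are assumptions.
import evaluate
from datasets import load_dataset
from transformers import pipeline

dataset = load_dataset("rotten_tomatoes", split="validation")
classifier = pipeline("text-classification", model="Etelis/rtm_roBERTa_5E")

predictions = [
    int(output["label"].split("_")[-1])  # "LABEL_1" -> 1
    for output in classifier(dataset["text"], batch_size=8, truncation=True)
]

metric = evaluate.load("accuracy")
print(metric.compute(predictions=predictions, references=dataset["label"]))
```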

### Framework versions

- Transformers 4.24.0
- PyTorch 1.13.0
- Datasets 2.7.1
- Tokenizers 0.13.2