---
library_name: transformers
license: mit
base_model: microsoft/deberta-v3-base
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: my-polarization-model
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# my-polarization-model

This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1047
- Accuracy: 0.6357
- F1: 0.4941
- Precision: 0.4041
- Recall: 0.6357
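
The snippet below is a minimal usage sketch for running inference with this checkpoint. The repository id `my-polarization-model` is taken from the model name above and is a placeholder; substitute the actual Hub path or a local checkpoint directory, and note that the returned labels depend on the `id2label` mapping saved with the model.

```python
from transformers import pipeline

# Placeholder model id; replace with the actual Hub repo or local directory.
classifier = pipeline(
    "text-classification",
    model="my-polarization-model",
    tokenizer="my-polarization-model",
)

# Returns a list of {"label": ..., "score": ...} dicts per input text.
print(classifier("Example sentence to score for polarization."))
```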

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-07
- train_batch_size: 100
- eval_batch_size: 100
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 200
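
As a rough sketch, these hyperparameters correspond to a `TrainingArguments` configuration along the following lines. Dataset loading and the metric callback are omitted, the output path is a placeholder, and `num_labels=3` is an assumption (the evaluation loss of roughly ln(3) ≈ 1.10 and the initial accuracy near 1/3 suggest a three-class label set); adjust to the actual data.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/deberta-v3-base",
    num_labels=3,  # assumed three classes; set to the actual number of labels
)

args = TrainingArguments(
    output_dir="./my-polarization-model",  # placeholder output path
    learning_rate=2e-7,
    per_device_train_batch_size=100,
    per_device_eval_batch_size=100,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    eval_strategy="steps",  # evaluations in the table below were logged every 100 steps
    eval_steps=100,
)

# Train/eval datasets and a compute_metrics function must still be supplied.
trainer = Trainer(model=model, args=args, processing_class=tokenizer)
```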

### Training results

| Training Loss | Epoch    | Step | Accuracy | F1     | Validation Loss | Precision | Recall |
|:-------------:|:--------:|:----:|:--------:|:------:|:---------------:|:---------:|:------:|
| 1.2002        | 3.8462   | 100  | 0.3643   | 0.1946 | 1.1760          | 0.1327    | 0.3643 |
| 1.1958        | 7.6923   | 200  | 0.3643   | 0.1946 | 1.1699          | 0.1327    | 0.3643 |
| 1.1865        | 11.5385  | 300  | 0.3643   | 0.1946 | 1.1643          | 0.1327    | 0.3643 |
| 1.1827        | 15.3846  | 400  | 0.3643   | 0.1946 | 1.1590          | 0.1327    | 0.3643 |
| 1.1763        | 19.2308  | 500  | 0.3643   | 0.1946 | 1.1541          | 0.1327    | 0.3643 |
| 1.1697        | 23.0769  | 600  | 0.3814   | 0.2352 | 1.1496          | 0.6857    | 0.3814 |
| 1.1686        | 26.9231  | 700  | 0.5566   | 0.5531 | 1.1454          | 0.6590    | 0.5566 |
| 1.1653        | 30.7692  | 800  | 0.6496   | 0.5773 | 1.1415          | 0.6270    | 0.6496 |
| 1.1614        | 34.6154  | 900  | 0.6372   | 0.5031 | 1.1379          | 0.6238    | 0.6372 |
| 1.1589        | 38.4615  | 1000 | 0.6341   | 0.4933 | 1.1347          | 0.4037    | 0.6341 |
| 1.1524        | 42.3077  | 1100 | 0.6357   | 0.4941 | 1.1316          | 0.4041    | 0.6357 |
| 1.1472        | 46.1538  | 1200 | 0.6357   | 0.4941 | 1.1288          | 0.4041    | 0.6357 |
| 1.1465        | 50.0     | 1300 | 0.6357   | 0.4941 | 1.1263          | 0.4041    | 0.6357 |
| 1.1479        | 53.8462  | 1400 | 0.6357   | 0.4941 | 1.1240          | 0.4041    | 0.6357 |
| 1.147         | 57.6923  | 1500 | 0.6357   | 0.4941 | 1.1219          | 0.4041    | 0.6357 |
| 1.1489        | 61.5385  | 1600 | 0.6357   | 0.4941 | 1.1201          | 0.4041    | 0.6357 |
| 1.1421        | 65.3846  | 1700 | 0.6357   | 0.4941 | 1.1184          | 0.4041    | 0.6357 |
| 1.1424        | 69.2308  | 1800 | 0.6357   | 0.4941 | 1.1169          | 0.4041    | 0.6357 |
| 1.1363        | 73.0769  | 1900 | 0.6357   | 0.4941 | 1.1156          | 0.4041    | 0.6357 |
| 1.14          | 76.9231  | 2000 | 0.6357   | 0.4941 | 1.1144          | 0.4041    | 0.6357 |
| 1.1399        | 80.7692  | 2100 | 0.6357   | 0.4941 | 1.1134          | 0.4041    | 0.6357 |
| 1.1403        | 84.6154  | 2200 | 0.6357   | 0.4941 | 1.1124          | 0.4041    | 0.6357 |
| 1.1417        | 88.4615  | 2300 | 0.6357   | 0.4941 | 1.1115          | 0.4041    | 0.6357 |
| 1.1352        | 92.3077  | 2400 | 0.6357   | 0.4941 | 1.1108          | 0.4041    | 0.6357 |
| 1.127         | 96.1538  | 2500 | 0.6357   | 0.4941 | 1.1101          | 0.4041    | 0.6357 |
| 1.1245        | 100.0    | 2600 | 0.6357   | 0.4941 | 1.1095          | 0.4041    | 0.6357 |
| 1.1309        | 103.8462 | 2700 | 0.6357   | 0.4941 | 1.1090          | 0.4041    | 0.6357 |
| 1.1318        | 107.6923 | 2800 | 0.6357   | 0.4941 | 1.1085          | 0.4041    | 0.6357 |
| 1.1293        | 111.5385 | 2900 | 0.6357   | 0.4941 | 1.1080          | 0.4041    | 0.6357 |
| 1.1315        | 115.3846 | 3000 | 0.6357   | 0.4941 | 1.1076          | 0.4041    | 0.6357 |
| 1.1299        | 119.2308 | 3100 | 0.6357   | 0.4941 | 1.1073          | 0.4041    | 0.6357 |
| 1.1314        | 123.0769 | 3200 | 0.6357   | 0.4941 | 1.1070          | 0.4041    | 0.6357 |
| 1.1309        | 126.9231 | 3300 | 0.6357   | 0.4941 | 1.1067          | 0.4041    | 0.6357 |
| 1.1235        | 130.7692 | 3400 | 0.6357   | 0.4941 | 1.1064          | 0.4041    | 0.6357 |
| 1.1367        | 134.6154 | 3500 | 0.6357   | 0.4941 | 1.1062          | 0.4041    | 0.6357 |
| 1.1362        | 138.4615 | 3600 | 0.6357   | 0.4941 | 1.1060          | 0.4041    | 0.6357 |
| 1.1194        | 142.3077 | 3700 | 0.6357   | 0.4941 | 1.1058          | 0.4041    | 0.6357 |
| 1.1283        | 146.1538 | 3800 | 0.6357   | 0.4941 | 1.1057          | 0.4041    | 0.6357 |
| 1.1183        | 150.0    | 3900 | 0.6357   | 0.4941 | 1.1055          | 0.4041    | 0.6357 |
| 1.1252        | 153.8462 | 4000 | 0.6357   | 0.4941 | 1.1054          | 0.4041    | 0.6357 |
| 1.1357        | 157.6923 | 4100 | 0.6357   | 0.4941 | 1.1053          | 0.4041    | 0.6357 |
| 1.132         | 161.5385 | 4200 | 0.6357   | 0.4941 | 1.1052          | 0.4041    | 0.6357 |
| 1.1292        | 165.3846 | 4300 | 0.6357   | 0.4941 | 1.1051          | 0.4041    | 0.6357 |
| 1.1302        | 169.2308 | 4400 | 0.6357   | 0.4941 | 1.1050          | 0.4041    | 0.6357 |
| 1.1282        | 173.0769 | 4500 | 0.6357   | 0.4941 | 1.1049          | 0.4041    | 0.6357 |
| 1.1323        | 176.9231 | 4600 | 0.6357   | 0.4941 | 1.1049          | 0.4041    | 0.6357 |
| 1.1368        | 180.7692 | 4700 | 0.6357   | 0.4941 | 1.1048          | 0.4041    | 0.6357 |
| 1.1364        | 184.6154 | 4800 | 0.6357   | 0.4941 | 1.1048          | 0.4041    | 0.6357 |
| 1.1207        | 188.4615 | 4900 | 0.6357   | 0.4941 | 1.1048          | 0.4041    | 0.6357 |
| 1.1302        | 192.3077 | 5000 | 0.6357   | 0.4941 | 1.1048          | 0.4041    | 0.6357 |
| 1.1235        | 196.1538 | 5100 | 0.6357   | 0.4941 | 1.1047          | 0.4041    | 0.6357 |
| 1.1236        | 200.0    | 5200 | 0.6357   | 0.4941 | 1.1047          | 0.4041    | 0.6357 |
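
The accuracy, F1, precision, and recall columns above appear to use weighted averaging (recall equals accuracy at every step, which is characteristic of `average="weighted"`). The exact metric function is not documented in this card; a hedged reconstruction using scikit-learn might look like this:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support


def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair passed by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```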


### Framework versions

- Transformers 4.57.1
- Pytorch 2.8.0+cu126
- Datasets 4.4.1
- Tokenizers 0.22.1