---
library_name: transformers
base_model: microsoft/mpnet-base
tags:
- generated_from_trainer
metrics:
- f1
- precision
- recall
model-index:
- name: mpnet_token_cls_model
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# mpnet_token_cls_model

This model is a fine-tuned version of [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1318
- F1: 0.8327
- Precision: 0.8373
- Recall: 0.8281
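
Since the card does not include a usage snippet, here is a minimal inference sketch with the `transformers` token-classification pipeline. The repo id `mpnet_token_cls_model` and the tag set are assumptions; replace them with the actual published repo id or a local checkpoint path.

```python
# Hypothetical usage sketch: the model id below is an assumption,
# since the card does not state where the checkpoint is published.
from transformers import pipeline

token_classifier = pipeline(
    "token-classification",
    model="mpnet_token_cls_model",  # replace with the actual repo id or local path
    aggregation_strategy="simple",  # merge sub-word pieces into word-level spans
)

print(token_classifier("Hugging Face is based in New York City."))
```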

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
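
For reference, these settings correspond roughly to the `TrainingArguments` below. This is a reconstruction, not the original training script; `output_dir` is a placeholder, and the evaluation cadence (every 1000 steps) is inferred from the results table.

```python
# Approximate reconstruction of the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mpnet_token_cls_model",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=4,
    fp16=True,                # "Native AMP" mixed-precision training
    eval_strategy="steps",    # inferred from the per-1000-step results table
    eval_steps=1000,
)
```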

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | F1     | Precision | Recall |
|:-------------:|:------:|:-----:|:---------------:|:------:|:---------:|:------:|
| 0.174         | 0.1553 | 1000  | 0.1766          | 0.7930 | 0.8137    | 0.7734 |
| 0.1408        | 0.3105 | 2000  | 0.1469          | 0.8030 | 0.8143    | 0.7920 |
| 0.1326        | 0.4658 | 3000  | 0.1313          | 0.8187 | 0.8399    | 0.7985 |
| 0.1283        | 0.6210 | 4000  | 0.1308          | 0.8169 | 0.8188    | 0.8150 |
| 0.1259        | 0.7763 | 5000  | 0.1270          | 0.8195 | 0.8325    | 0.8069 |
| 0.1200        | 0.9315 | 6000  | 0.1224          | 0.8162 | 0.8272    | 0.8055 |
| 0.1072        | 1.0868 | 7000  | 0.1221          | 0.8215 | 0.8200    | 0.8230 |
| 0.1068        | 1.2420 | 8000  | 0.1216          | 0.8208 | 0.8234    | 0.8182 |
| 0.1022        | 1.3973 | 9000  | 0.1256          | 0.8234 | 0.8188    | 0.8281 |
| 0.1034        | 1.5526 | 10000 | 0.1217          | 0.8267 | 0.8292    | 0.8241 |
| 0.1051        | 1.7078 | 11000 | 0.1203          | 0.8288 | 0.8435    | 0.8146 |
| 0.1011        | 1.8631 | 12000 | 0.1246          | 0.8299 | 0.8284    | 0.8314 |
| 0.0917        | 2.0183 | 13000 | 0.1266          | 0.8248 | 0.8274    | 0.8223 |
| 0.0887        | 2.1736 | 14000 | 0.1213          | 0.8261 | 0.8260    | 0.8263 |
| 0.0863        | 2.3288 | 15000 | 0.1255          | 0.8272 | 0.8263    | 0.8281 |
| 0.0897        | 2.4841 | 16000 | 0.1265          | 0.8210 | 0.8302    | 0.8120 |
| 0.0835        | 2.6393 | 17000 | 0.1233          | 0.8299 | 0.8284    | 0.8314 |
| 0.0833        | 2.7946 | 18000 | 0.1259          | 0.8341 | 0.8398    | 0.8285 |
| 0.0829        | 2.9499 | 19000 | 0.1189          | 0.8328 | 0.8397    | 0.8259 |
| 0.0704        | 3.1051 | 20000 | 0.1308          | 0.8302 | 0.8290    | 0.8314 |
| 0.0730        | 3.2604 | 21000 | 0.1273          | 0.8296 | 0.8330    | 0.8263 |
| 0.0711        | 3.4156 | 22000 | 0.1335          | 0.8304 | 0.8399    | 0.8212 |
| 0.0695        | 3.5709 | 23000 | 0.1325          | 0.8283 | 0.8353    | 0.8215 |
| 0.0708        | 3.7261 | 24000 | 0.1316          | 0.8319 | 0.8384    | 0.8255 |
| 0.0706        | 3.8814 | 25000 | 0.1318          | 0.8327 | 0.8373    | 0.8281 |
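
The F1, precision, and recall above are presumably span-level token-classification metrics. A typical way to compute them, e.g. with `seqeval` via the `evaluate` library, is sketched below; the label list is a placeholder, since the card does not document the tag set.

```python
# Hypothetical metric computation; the label list is an assumption,
# since the card does not document the tag set used for training.
import evaluate
import numpy as np

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENT", "I-ENT"]  # placeholder tag set

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Drop special/padded tokens, which the tokenizer labels -100
    true_labels = [
        [label_list[l] for l in row if l != -100] for row in labels
    ]
    true_preds = [
        [label_list[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "f1": results["overall_f1"],
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
    }
```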


### Framework versions

- Transformers 4.50.0
- PyTorch 2.5.1+cu121
- Datasets 2.21.0
- Tokenizers 0.21.4