---
library_name: transformers
license: agpl-3.0
base_model: vinai/phobert-base-v2
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: phobert-v2_v2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# phobert-v2_v2

This model is a fine-tuned version of [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3130
- Accuracy: 0.9507
- Precision Macro: 0.8904
- Recall Macro: 0.8541
- F1 Macro: 0.8704
- F1 Weighted: 0.9497
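
The macro and weighted scores above could be produced by a `compute_metrics` function along the following lines (a minimal sketch using scikit-learn; the metric code actually used during training is not included in this card):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_recall_fscore_support


def compute_metrics(eval_pred):
    """Illustrative metric function matching the metric names reported above."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision_macro": precision,
        "recall_macro": recall,
        "f1_macro": f1,
        "f1_weighted": f1_score(labels, preds, average="weighted", zero_division=0),
    }
```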

## Model description

More information needed

## Intended uses & limitations

More information needed
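
The intended task and label set are not documented here. Assuming the checkpoint carries a sequence-classification head, it could be loaded for inference roughly as follows (a hedged sketch; the repository id and example text are placeholders, and PhoBERT expects word-segmented Vietnamese input, e.g. from VnCoreNLP):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical Hub path; replace with the actual repository id of this checkpoint.
model_id = "your-username/phobert-v2_v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Word-segmented Vietnamese input (syllables of a word joined by underscores).
text = "Tôi rất thích sản_phẩm này ."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_id, predicted_id))
```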

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
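
A minimal sketch of `TrainingArguments` reproducing the configuration listed above (the output directory name, evaluation/logging cadence, and label count are assumptions; the training script itself is not part of this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="phobert-v2_v2",       # assumed output directory
    learning_rate=3e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,    # effective train batch size of 128
    num_train_epochs=20,
    lr_scheduler_type="linear",
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    fp16=True,                        # Native AMP; requires a CUDA device
    eval_strategy="epoch",            # assumed: the table below reports per-epoch results
    logging_strategy="epoch",
)
```

These arguments would then be passed to a `Trainer` together with the fine-tuned model head, the tokenized training and evaluation datasets, and a metric function such as the sketch shown earlier.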

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision Macro | Recall Macro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------------:|:------------:|:--------:|:-----------:|
| 0.4636        | 1.0   | 90   | 0.2180          | 0.9419   | 0.9099          | 0.7652       | 0.8049   | 0.9359      |
| 0.1882        | 2.0   | 180  | 0.1916          | 0.9419   | 0.8351          | 0.8649       | 0.8485   | 0.9433      |
| 0.1453        | 3.0   | 270  | 0.1898          | 0.9488   | 0.8743          | 0.8402       | 0.8555   | 0.9476      |
| 0.1175        | 4.0   | 360  | 0.1932          | 0.9526   | 0.9141          | 0.8267       | 0.8597   | 0.9500      |
| 0.0856        | 5.0   | 450  | 0.2092          | 0.9514   | 0.8708          | 0.8711       | 0.8709   | 0.9514      |
| 0.0826        | 6.0   | 540  | 0.2221          | 0.9526   | 0.9063          | 0.8516       | 0.8748   | 0.9512      |
| 0.0675        | 7.0   | 630  | 0.2342          | 0.9438   | 0.8419          | 0.8696       | 0.8545   | 0.9450      |
| 0.0618        | 8.0   | 720  | 0.2402          | 0.9469   | 0.8890          | 0.8430       | 0.8630   | 0.9456      |
| 0.0426        | 9.0   | 810  | 0.2503          | 0.9507   | 0.8797          | 0.8543       | 0.8660   | 0.9499      |
| 0.038         | 10.0  | 900  | 0.2786          | 0.9514   | 0.8999          | 0.8467       | 0.8692   | 0.9499      |
| 0.039         | 11.0  | 990  | 0.2795          | 0.9463   | 0.8628          | 0.8554       | 0.8589   | 0.9460      |
| 0.0263        | 12.0  | 1080 | 0.2817          | 0.9488   | 0.8733          | 0.8571       | 0.8648   | 0.9483      |
| 0.0209        | 13.0  | 1170 | 0.2840          | 0.9495   | 0.8802          | 0.8576       | 0.8681   | 0.9488      |
| 0.0221        | 14.0  | 1260 | 0.2769          | 0.9526   | 0.8904          | 0.8639       | 0.8761   | 0.9519      |
| 0.0172        | 15.0  | 1350 | 0.2861          | 0.9514   | 0.8985          | 0.8546       | 0.8739   | 0.9502      |
| 0.0159        | 16.0  | 1440 | 0.3031          | 0.9482   | 0.8850          | 0.8523       | 0.8671   | 0.9472      |
| 0.0124        | 17.0  | 1530 | 0.3119          | 0.9501   | 0.8792          | 0.8538       | 0.8655   | 0.9493      |
| 0.0107        | 18.0  | 1620 | 0.3173          | 0.9476   | 0.8948          | 0.8476       | 0.8681   | 0.9463      |
| 0.0106        | 19.0  | 1710 | 0.3094          | 0.9545   | 0.8979          | 0.8611       | 0.8776   | 0.9535      |
| 0.0108        | 20.0  | 1800 | 0.3130          | 0.9507   | 0.8904          | 0.8541       | 0.8704   | 0.9497      |


### Framework versions

- Transformers 4.55.0
- Pytorch 2.7.0+cu126
- Datasets 4.0.0
- Tokenizers 0.21.4