---
base_model: bert-base-chinese
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: classifier_adapter
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# classifier_adapter

This model is a fine-tuned version of [bert-base-chinese](https://huggingface.co/bert-base-chinese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0386
- Accuracy: 0.9875
- Precision: 0.8841
- Recall: 0.7947
- F1: 0.8283
- Ap: 0.8850
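
The card does not say how these metrics were computed. For reference, below is a hypothetical reconstruction of a `Trainer` `compute_metrics` callback that would produce this set of values, assuming a binary classifier whose positive class is index 1 and that "Ap" denotes average precision (neither is stated here):

```python
import numpy as np
from scipy.special import softmax
from sklearn.metrics import (
    accuracy_score,
    average_precision_score,
    precision_recall_fscore_support,
)

def compute_metrics(eval_pred):
    """Hypothetical reconstruction of the reported metrics; assumes a
    binary classifier with the positive class at index 1."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    pos_probs = softmax(logits, axis=-1)[:, 1]
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
        "ap": average_precision_score(labels, pos_probs),
    }
```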

## Model description

More information needed

## Intended uses & limitations

More information needed
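
Pending a fuller description, a minimal inference sketch follows; the checkpoint path, the tokenizer choice (`bert-base-chinese`), and treating index 1 as the positive class are assumptions, not documented facts:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder path: point this at the actual checkpoint directory or Hub repo.
model_path = "classifier_adapter"

# Tokenizer of the base model; the fine-tuned checkpoint may ship its own.
tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForSequenceClassification.from_pretrained(model_path)
model.eval()

# Score one Chinese sentence; positive class at index 1 is an assumption.
inputs = tokenizer("这是一条测试文本。", return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)
print(probs[0, 1].item())
```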

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 0
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 12
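
For illustration, a `TrainingArguments` sketch equivalent to the list above; the output directory and the 100-step evaluation cadence (inferred from the results table below) are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="classifier_adapter",   # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=0,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=12,
    evaluation_strategy="steps",       # assumed; matches the 100-step table below
    eval_steps=100,
)
```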

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Ap     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log        | 0.38  | 100  | 0.1590          | 0.9571   | 0.0       | 0.0    | 0.0    | 0.1046 |
| No log        | 0.75  | 200  | 0.1578          | 0.9571   | 0.0       | 0.0    | 0.0    | 0.1808 |
| No log        | 1.13  | 300  | 0.1185          | 0.9653   | 0.0899    | 0.0599 | 0.0680 | 0.4391 |
| No log        | 1.51  | 400  | 0.0898          | 0.9724   | 0.2199    | 0.1409 | 0.1617 | 0.6479 |
| 0.1405        | 1.89  | 500  | 0.0774          | 0.9750   | 0.3319    | 0.2273 | 0.2575 | 0.7417 |
| 0.1405        | 2.26  | 600  | 0.0683          | 0.9771   | 0.4118    | 0.3002 | 0.3294 | 0.7791 |
| 0.1405        | 2.64  | 700  | 0.0616          | 0.9804   | 0.6207    | 0.4336 | 0.4810 | 0.8187 |
| 0.1405        | 3.02  | 800  | 0.0556          | 0.9821   | 0.7210    | 0.4875 | 0.5435 | 0.8380 |
| 0.1405        | 3.4   | 900  | 0.0519          | 0.9830   | 0.7329    | 0.5224 | 0.5839 | 0.8566 |
| 0.0598        | 3.77  | 1000 | 0.0486          | 0.9846   | 0.7818    | 0.6063 | 0.6615 | 0.8629 |
| 0.0598        | 4.15  | 1100 | 0.0469          | 0.9853   | 0.8223    | 0.6807 | 0.7248 | 0.8633 |
| 0.0598        | 4.53  | 1200 | 0.0457          | 0.9856   | 0.8521    | 0.7235 | 0.7663 | 0.8666 |
| 0.0598        | 4.91  | 1300 | 0.0439          | 0.9859   | 0.8436    | 0.6955 | 0.7435 | 0.8753 |
| 0.0598        | 5.28  | 1400 | 0.0424          | 0.9862   | 0.8715    | 0.6964 | 0.7496 | 0.8739 |
| 0.0399        | 5.66  | 1500 | 0.0415          | 0.9869   | 0.8695    | 0.7621 | 0.7994 | 0.8772 |
| 0.0399        | 6.04  | 1600 | 0.0416          | 0.9865   | 0.8700    | 0.7670 | 0.8039 | 0.8853 |
| 0.0399        | 6.42  | 1700 | 0.0401          | 0.9871   | 0.8687    | 0.7686 | 0.8047 | 0.8846 |
| 0.0399        | 6.79  | 1800 | 0.0405          | 0.9867   | 0.8734    | 0.7851 | 0.8167 | 0.8848 |
| 0.0399        | 7.17  | 1900 | 0.0410          | 0.9865   | 0.8600    | 0.7708 | 0.8057 | 0.8770 |
| 0.0315        | 7.55  | 2000 | 0.0393          | 0.9873   | 0.8869    | 0.7718 | 0.8158 | 0.8819 |
| 0.0315        | 7.92  | 2100 | 0.0385          | 0.9871   | 0.8747    | 0.7861 | 0.8196 | 0.8856 |
| 0.0315        | 8.3   | 2200 | 0.0386          | 0.9877   | 0.8863    | 0.7856 | 0.8227 | 0.8857 |
| 0.0315        | 8.68  | 2300 | 0.0390          | 0.9869   | 0.8695    | 0.7949 | 0.8221 | 0.8830 |
| 0.0315        | 9.06  | 2400 | 0.0391          | 0.9872   | 0.8685    | 0.8081 | 0.8311 | 0.8830 |
| 0.026         | 9.43  | 2500 | 0.0386          | 0.9875   | 0.8841    | 0.7947 | 0.8283 | 0.8850 |
| 0.026         | 9.81  | 2600 | 0.0390          | 0.9871   | 0.8615    | 0.8064 | 0.8264 | 0.8840 |
| 0.026         | 10.19 | 2700 | 0.0386          | 0.9873   | 0.8689    | 0.8023 | 0.8264 | 0.8859 |
| 0.026         | 10.57 | 2800 | 0.0386          | 0.9873   | 0.8737    | 0.7986 | 0.8265 | 0.8860 |


### Framework versions

- Transformers 4.36.2
- Pytorch 2.2.1+cu121
- Tokenizers 0.15.2