---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-Multi_classification
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# distilbert-base-uncased-finetuned-Multi_classification

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set (these match the epoch 2 row in the training table below, the epoch with the lowest validation loss):
- Loss: 0.5588
- Accuracy: 0.7266
- Macro Averaged Precision: 0.6830
- Micro Averaged Precision: 0.7266
- Macro Averaged Recall: 0.5652
- Micro Averaged Recall: 0.7266
- Macro Averaged F1: 0.5513
- Micro Averaged F1: 0.7266
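
The macro- and micro-averaged figures are standard multi-class metrics; below is a minimal sketch of a `compute_metrics` function in the style typically passed to the Hugging Face `Trainer`, using `scikit-learn` (the actual metric code for this run is not included in the card):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer provides.
    # This is a plausible reconstruction, not the original training code.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    metrics = {"accuracy": accuracy_score(labels, preds)}
    for avg in ("macro", "micro"):
        precision, recall, f1, _ = precision_recall_fscore_support(
            labels, preds, average=avg, zero_division=0
        )
        metrics[f"{avg}_averaged_precision"] = precision
        metrics[f"{avg}_averaged_recall"] = recall
        metrics[f"{avg}_averaged_f1"] = f1
    return metrics
```

Note that for single-label classification, micro-averaged precision, recall, and F1 all reduce to accuracy, which is why the micro-averaged numbers above coincide with the accuracy figure.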

## Model description

More information needed

## Intended uses & limitations

More information needed
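
As an illustration only (the task, label set, and hosting path are not documented here), the checkpoint can be loaded like any sequence-classification model. The repository id below is assumed from the model name:

```python
from transformers import pipeline

# Hypothetical Hub id inferred from the model name; replace with the
# actual repository path where this checkpoint is hosted.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-Multi_classification",
)

# Without a configured id2label mapping, outputs use generic labels
# such as LABEL_0, LABEL_1, ...
print(classifier("Example input text to classify."))
```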

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
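
For reference, a minimal sketch of how these hyperparameters map onto `TrainingArguments` (the `output_dir` and the evaluation strategy are assumptions; the original training script is not part of this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-Multi_classification",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption: the table below reports per-epoch eval
)
```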

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro Averaged Precision | Micro Averaged Precision | Macro Averaged Recall | Micro Averaged Recall | Macro Averaged F1 | Micro Averaged F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------------------------:|:------------------------:|:---------------------:|:---------------------:|:-----------------:|:-----------------:|
| 0.5811        | 1.0   | 635  | 0.5745          | 0.7055   | 0.3527                   | 0.7055                   | 0.5                   | 0.7055                | 0.4137            | 0.7055            |
| 0.5467        | 2.0   | 1270 | 0.5588          | 0.7266   | 0.6830                   | 0.7266                   | 0.5652                | 0.7266                | 0.5513            | 0.7266            |
| 0.4724        | 3.0   | 1905 | 0.6347          | 0.7109   | 0.6328                   | 0.7109                   | 0.5873                | 0.7109                | 0.5906            | 0.7109            |
| 0.2379        | 4.0   | 2540 | 0.9110          | 0.7078   | 0.6281                   | 0.7078                   | 0.5874                | 0.7078                | 0.5910            | 0.7078            |
| 0.1511        | 5.0   | 3175 | 1.2270          | 0.6953   | 0.6168                   | 0.6953                   | 0.5963                | 0.6953                | 0.6011            | 0.6953            |
| 0.1074        | 6.0   | 3810 | 1.6106          | 0.7188   | 0.6470                   | 0.7188                   | 0.5859                | 0.7188                | 0.5875            | 0.7188            |
| 0.0935        | 7.0   | 4445 | 1.8533          | 0.7070   | 0.6266                   | 0.7070                   | 0.5861                | 0.7070                | 0.5895            | 0.7070            |
| 0.037         | 8.0   | 5080 | 2.0315          | 0.6875   | 0.6082                   | 0.6875                   | 0.5923                | 0.6875                | 0.5964            | 0.6875            |
| 0.0294        | 9.0   | 5715 | 2.0726          | 0.7078   | 0.6295                   | 0.7078                   | 0.5928                | 0.7078                | 0.5975            | 0.7078            |
| 0.0238        | 10.0  | 6350 | 2.1236          | 0.7086   | 0.6303                   | 0.7086                   | 0.5918                | 0.7086                | 0.5963            | 0.7086            |


### Framework versions

- Transformers 4.28.1
- Pytorch 2.0.1+cu117
- Datasets 1.18.4
- Tokenizers 0.12.1