---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: Team_Gryffindor_NER
  results: []
---


# Team-Gryffindor-distilbert-base-finetuned-NER-creditcardcontract-100epoch

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on a credit card agreement dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0470
- Precision: 0.7319
- Recall: 0.7064
- F1: 0.7190
- Accuracy: 0.9920
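
The reported F1 is the harmonic mean of the precision and recall above; a quick check (the last decimal place differs slightly because the logged precision and recall are themselves rounded):

```python
# F1 as the harmonic mean of precision and recall, recomputed from the
# rounded values reported on this card.
precision, recall = 0.7319, 0.7064
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # ~0.7189; the card logs 0.7190 from unrounded inputs
```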

## Model description

More information needed

## Intended uses & limitations

More information needed
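
Pending a fuller description, a minimal inference sketch using the `transformers` token-classification pipeline. The repo id `Team-Gryffindor/Team_Gryffindor_NER` is an assumption taken from the model name on this card; substitute the actual Hub id or a local checkpoint path. The entity labels in the output depend on the fine-tuning label set, which is not documented here.

```python
def format_entities(entities):
    """Flatten pipeline output dicts into (label, text, score) tuples."""
    return [(e["entity_group"], e["word"], round(float(e["score"]), 3))
            for e in entities]

if __name__ == "__main__":
    # transformers is only needed for the live demo, so it is imported here.
    from transformers import pipeline

    ner = pipeline(
        "token-classification",
        model="Team-Gryffindor/Team_Gryffindor_NER",  # assumed repo id
        aggregation_strategy="simple",  # merge word pieces into entity spans
    )
    text = "The annual percentage rate for purchases is 19.99%."
    print(format_entities(ner(text)))
```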

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 11

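With `lr_scheduler_type: linear` and no warmup steps logged, the learning rate decays linearly from the peak of 2e-05 to zero over training. A minimal sketch of that schedule; the `total_steps` value below is illustrative (the results table ends around step 16,000):

```python
# Sketch of a warmup-free linear decay schedule (lr_scheduler_type: linear).
def linear_lr(step, total_steps, peak_lr=2e-05):
    """Learning rate at `step`: linear decay from peak_lr to 0."""
    return peak_lr * max(0.0, 1.0 - step / total_steps)
```

For example, halfway through training the rate is half the peak, and it reaches zero at the final step.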
### Training results

| Training Loss | Epoch | Step   | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0113        | 0.33  | 500    | 0.0443          | 0.6547    | 0.7028 | 0.6779 | 0.9908   |
| 0.0118        | 0.67  | 1000   | 0.0435          | 0.7207    | 0.6440 | 0.6802 | 0.9916   |
| 0.013         | 1.0   | 1500   | 0.0449          | 0.7113    | 0.6826 | 0.6966 | 0.9918   |
| 0.0113        | 1.34  | 2000   | 0.0434          | 0.7213    | 0.6697 | 0.6946 | 0.9915   |
| 0.0121        | 1.67  | 2500   | 0.0467          | 0.6955    | 0.6789 | 0.6871 | 0.9914   |
| 0.0125        | 2.01  | 3000   | 0.0417          | 0.7095    | 0.6991 | 0.7043 | 0.9920   |
| 0.0106        | 2.34  | 3500   | 0.0437          | 0.7191    | 0.6624 | 0.6896 | 0.9918   |
| 0.0114        | 2.68  | 4000   | 0.0468          | 0.7165    | 0.6679 | 0.6914 | 0.9920   |
| 0.0125        | 3.01  | 4500   | 0.0431          | 0.6888    | 0.6862 | 0.6875 | 0.9917   |
| 0.0107        | 3.35  | 5000   | 0.0446          | 0.7184    | 0.6459 | 0.6802 | 0.9913   |
| 0.0096        | 3.68  | 5500   | 0.0485          | 0.6926    | 0.6532 | 0.6723 | 0.9912   |
| 0.013         | 4.02  | 6000   | 0.0448          | 0.6134    | 0.6697 | 0.6404 | 0.9907   |
| 0.0102        | 4.35  | 6500   | 0.0497          | 0.6895    | 0.6642 | 0.6766 | 0.9913   |
| 0.0112        | 4.69  | 7000   | 0.0464          | 0.6759    | 0.6697 | 0.6728 | 0.9910   |
| 0.0117        | 5.02  | 7500   | 0.0484          | 0.7451    | 0.6275 | 0.6813 | 0.9916   |
| 0.0114        | 5.36  | 8000   | 0.0411          | 0.7086    | 0.6826 | 0.6953 | 0.9919   |
| 0.0108        | 5.69  | 8500   | 0.0443          | 0.7041    | 0.6679 | 0.6855 | 0.9916   |
| 0.0109        | 6.03  | 9000   | 0.0470          | 0.7228    | 0.6697 | 0.6952 | 0.9916   |
| 0.0099        | 6.36  | 9500   | 0.0471          | 0.7253    | 0.6881 | 0.7062 | 0.9913   |
| 0.0103        | 6.7   | 10000  | 0.0430          | 0.6986    | 0.7101 | 0.7043 | 0.9914   |
| 0.0117        | 7.03  | 10500  | 0.0462          | 0.7327    | 0.6991 | 0.7155 | 0.9918   |
| 0.0098        | 7.37  | 11000  | 0.0483          | 0.6910    | 0.6771 | 0.6840 | 0.9914   |
| 0.0107        | 7.7   | 11500  | 0.0468          | 0.7189    | 0.6899 | 0.7041 | 0.9916   |
| 0.0119        | 8.04  | 12000  | 0.0434          | 0.6970    | 0.6881 | 0.6925 | 0.9918   |
| 0.0112        | 8.37  | 12500  | 0.0469          | 0.7007    | 0.6917 | 0.6962 | 0.9918   |
| 0.011         | 8.71  | 13000  | 0.0469          | 0.6736    | 0.6514 | 0.6623 | 0.9914   |
| 0.0101        | 9.04  | 13500  | 0.0451          | 0.6691    | 0.6606 | 0.6648 | 0.9913   |
| 0.0099        | 9.38  | 14000  | 0.0462          | 0.7006    | 0.6826 | 0.6914 | 0.9918   |
| 0.0107        | 9.71  | 14500  | 0.0444          | 0.6840    | 0.6752 | 0.6796 | 0.9915   |
| 0.0118        | 10.05 | 15000  | 0.0457          | 0.7015    | 0.6771 | 0.6891 | 0.9918   |
| 0.0102        | 10.38 | 15500  | 0.0500          | 0.7413    | 0.6679 | 0.7027 | 0.9919   |
| 0.0107        | 10.72 | 16000  | 0.0470          | 0.7319    | 0.7064 | 0.7190 | 0.9920   |



### Framework versions

- Transformers 4.18.0
- Pytorch 1.10.0+cu111
- Datasets 2.0.0
- Tokenizers 0.11.6