Ethan615 committed
Commit 97a453b · 1 Parent(s): 556a565

End of training
README.md CHANGED
@@ -3,26 +3,11 @@ license: apache-2.0
 base_model: distilbert-base-uncased
 tags:
 - generated_from_trainer
-datasets:
-- glue
 metrics:
 - matthews_correlation
 model-index:
 - name: try
-  results:
-  - task:
-      name: Text Classification
-      type: text-classification
-    dataset:
-      name: glue
-      type: glue
-      config: cola
-      split: validation
-      args: cola
-    metrics:
-    - name: Matthews Correlation
-      type: matthews_correlation
-      value: 0.5469892544665611
+  results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -30,10 +15,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # try
 
-This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
+This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.7595
-- Matthews Correlation: 0.5470
+- Loss: 0.8553
+- Matthews Correlation: 0.5332
 
 ## Model description
 
@@ -64,16 +49,16 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
 |:-------------:|:-----:|:----:|:---------------:|:--------------------:|
-| 0.5165        | 1.0   | 535  | 0.4557          | 0.4389               |
-| 0.34          | 2.0   | 1070 | 0.4728          | 0.5038               |
-| 0.2325        | 3.0   | 1605 | 0.6235          | 0.5159               |
-| 0.1636        | 4.0   | 2140 | 0.7595          | 0.5470               |
-| 0.1206        | 5.0   | 2675 | 0.8536          | 0.5356               |
+| 0.5175        | 1.0   | 535  | 0.4515          | 0.4511               |
+| 0.3429        | 2.0   | 1070 | 0.4648          | 0.5162               |
+| 0.2373        | 3.0   | 1605 | 0.6564          | 0.5173               |
+| 0.1631        | 4.0   | 2140 | 0.7276          | 0.5265               |
+| 0.122         | 5.0   | 2675 | 0.8553          | 0.5332               |
 
 
 ### Framework versions
 
 - Transformers 4.36.2
 - Pytorch 2.1.0+cu121
-- Datasets 2.15.0
+- Datasets 2.16.0
 - Tokenizers 0.15.0
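The evaluation metric reported in the card above is the Matthews correlation coefficient. The training run presumably used a library implementation, but as a minimal pure-Python sketch of the formula (not the Trainer's own code), it can be computed directly from the binary confusion matrix:

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """Matthews correlation for binary 0/1 labels, from confusion-matrix counts."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Degenerate case (a row or column of the confusion matrix is empty):
    # conventionally reported as 0.
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Toy example with four predictions against gold labels.
print(round(matthews_corrcoef([1, 1, 0, 0], [1, 0, 0, 0]), 4))  # → 0.5774
```

The metric ranges from -1 to +1, with 0 meaning chance-level agreement, which makes the 0.5332 in the final epoch a moderate correlation rather than an accuracy-style percentage.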
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:b404bb120042af38f76eefeb75f16126b9c6b6431ff3c5068b8778a694105697
+oid sha256:b508e1af35173ae3f802f74a9e81434c4401fafb38281b6896fa211683516b75
 size 267832560
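The `model.safetensors` entry above is a Git LFS pointer file, not the weights themselves: only the `oid` (content hash) changed in this commit, while the `size` stayed identical. A small sketch of reading such a pointer's key/value fields (field names taken from the LFS pointer shown above):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a dict of its space-separated fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new pointer contents from the diff above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:b508e1af35173ae3f802f74a9e81434c4401fafb38281b6896fa211683516b75
size 267832560
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # → 267832560
```

An unchanged `size` with a changed `oid` is expected here: retraining produced a weights file of the same shape and dtype, so the byte count is the same while the contents differ.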
runs/Dec23_01-09-39_cab176ec49ea/events.out.tfevents.1703293789.cab176ec49ea.681.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f9ed8c4b84135e6c25872aac1b9381cd7743606089871142cc518182144b4cc9
-size 6293
+oid sha256:b1efaedbbe5bf8a757ecc144c0d82b6c20b192722b05c168a5d6b00f3113d3d4
+size 7139
runs/Dec23_01-09-39_cab176ec49ea/events.out.tfevents.1703293973.cab176ec49ea.681.1 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:43f1d9497ffd7fa9cf9a73f189e7af81cf39922d1cc70dd67c249cca3a1e8883
+size 423