29thDay committed
Commit cb72dd4 · verified · 1 parent: 3af60c9

Delete README.md

Files changed (1): README.md +0 -72
README.md DELETED
@@ -1,72 +0,0 @@
---
library_name: transformers
license: apache-2.0
base_model: distilbert/distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-classifier
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# distilbert-base-uncased-classifier

This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2548
- Accuracy: 0.9035
- F1: 0.8134
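A minimal inference sketch using the `transformers` pipeline API. The repo id below is a placeholder: the card does not state where on the Hub the checkpoint is hosted, so substitute the actual `user/model` path before running.

```python
from transformers import pipeline

# Placeholder repo id: replace with the actual Hub path of this checkpoint.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-classifier",
)

# Returns a list of dicts with "label" and "score" keys.
print(classifier("An example sentence to classify."))
```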
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure
### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|
| No log        | 0      | 0    | 0.6325          | 0.7464   | 0.0    |
| No log        | 0.2020 | 79   | 0.3489          | 0.8487   | 0.6624 |
| No log        | 0.4041 | 158  | 0.3422          | 0.8559   | 0.7462 |
| No log        | 0.6061 | 237  | 0.2983          | 0.8674   | 0.7444 |
| No log        | 0.8082 | 316  | 0.2837          | 0.8862   | 0.7871 |
| No log        | 1.0102 | 395  | 0.2743          | 0.8919   | 0.7967 |
| No log        | 1.2123 | 474  | 0.2772          | 0.8934   | 0.7956 |
| 0.3453        | 1.4143 | 553  | 0.2552          | 0.9092   | 0.8245 |
| 0.3453        | 1.6164 | 632  | 0.2486          | 0.9006   | 0.8056 |
| 0.3453        | 1.8184 | 711  | 0.2548          | 0.9035   | 0.8134 |

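One detail that can be read off the table (an inference from the numbers, not a figure stated in the card): evaluation at step 79 corresponds to epoch 0.2020, so one epoch is roughly 391 optimizer steps, which with a train batch size of 32 suggests a training set of about 12,500 examples.

```python
# Back-of-the-envelope estimate from the training-results table above.
steps = 79            # step of the first logged evaluation
epoch_fraction = 0.2020  # epoch reported at that step
batch_size = 32       # train_batch_size from the hyperparameters

steps_per_epoch = steps / epoch_fraction              # ~391 steps per epoch
approx_train_examples = steps_per_epoch * batch_size  # ~12,500 examples

print(round(steps_per_epoch), round(approx_train_examples))
```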
### Framework versions

- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1