---
library_name: transformers
license: apache-2.0
base_model: timm/tf_efficientnetv2_s.in21k
tags:
- timm
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: EfficientNetV2_Small_v1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# EfficientNetV2_Small_v1

This model is a fine-tuned version of [timm/tf_efficientnetv2_s.in21k](https://huggingface.co/timm/tf_efficientnetv2_s.in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0340
- Accuracy: 0.9935
- Precision: 0.9981
- Recall: 0.9878
- F1: 0.9929
- True positives (TP): 1618
- True negatives (TN): 1907
- False positives (FP): 3
- False negatives (FN): 20
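
These counts are internally consistent with the reported metrics (precision = 1618 / (1618 + 3) ≈ 0.9981, recall = 1618 / (1618 + 20) ≈ 0.9878). The sketch below shows a `compute_metrics` function that would produce values in this shape; the function actually used for this run is not included in the card, so treat it purely as an illustration.

```python
# Hypothetical compute_metrics yielding the metrics reported above;
# the function actually used for this run is not part of the card.
import numpy as np
from sklearn.metrics import confusion_matrix, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary"
    )
    tn, fp, fn, tp = confusion_matrix(labels, preds).ravel()
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return {
        "accuracy": accuracy,
        "precision": precision,
        "recall": recall,
        "f1": f1,
        "tp": int(tp), "tn": int(tn), "fp": int(fp), "fn": int(fn),
    }
```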

## Model description

More information needed

## Intended uses & limitations

More information needed
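
Pending a fuller description, the following is a minimal inference sketch using the `image-classification` pipeline. The repository id shown is a placeholder; substitute the actual location of this checkpoint.

```python
# Minimal inference sketch; the repo id below is hypothetical.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="your-username/EfficientNetV2_Small_v1",  # placeholder repo id
)
print(classifier("example.jpg"))  # local path or URL of an input image
```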

## Training and evaluation data

More information needed. The confusion-matrix counts above imply a binary classification task evaluated on 1618 + 1907 + 3 + 20 = 3,548 examples.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 442
- num_epochs: 20
- mixed_precision_training: Native AMP
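
Expressed as Hugging Face `TrainingArguments`, the list above corresponds roughly to the sketch below. This is a reconstruction from the card, not the original training script; model and dataset setup are omitted.

```python
# Reconstruction of the hyperparameters above; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="EfficientNetV2_Small_v1",
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=442,
    num_train_epochs=20,
    fp16=True,  # Native AMP mixed-precision training
)
```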

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | TP   | TN   | FP | FN |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:----:|:----:|:--:|:--:|
| 0.1995        | 1.0   | 222  | 0.1349          | 0.9628   | 0.9575    | 0.9621 | 0.9598 | 1576 | 1840 | 70 | 62 |
| 0.1442        | 2.0   | 444  | 0.0940          | 0.9789   | 0.9956    | 0.9585 | 0.9767 | 1570 | 1903 | 7  | 68 |
| 0.1625        | 3.0   | 666  | 0.0827          | 0.9837   | 0.9925    | 0.9719 | 0.9821 | 1592 | 1898 | 12 | 46 |
| 0.1592        | 4.0   | 888  | 0.0926          | 0.9752   | 0.9708    | 0.9756 | 0.9732 | 1598 | 1862 | 48 | 40 |
| 0.1100        | 5.0   | 1110 | 0.0544          | 0.9876   | 0.9950    | 0.9780 | 0.9865 | 1602 | 1902 | 8  | 36 |
| 0.1497        | 6.0   | 1332 | 0.0635          | 0.9868   | 0.9877    | 0.9835 | 0.9856 | 1611 | 1890 | 20 | 27 |
| 0.1125        | 7.0   | 1554 | 0.0485          | 0.9896   | 0.9957    | 0.9817 | 0.9886 | 1608 | 1903 | 7  | 30 |
| 0.1202        | 8.0   | 1776 | 0.0774          | 0.9794   | 0.9740    | 0.9817 | 0.9778 | 1608 | 1867 | 43 | 30 |
| 0.1031        | 9.0   | 1998 | 0.0507          | 0.9893   | 0.9938    | 0.9829 | 0.9883 | 1610 | 1900 | 10 | 28 |
| 0.1211        | 10.0  | 2220 | 0.0434          | 0.9915   | 0.9975    | 0.9841 | 0.9908 | 1612 | 1906 | 4  | 26 |
| 0.1239        | 11.0  | 2442 | 0.0400          | 0.9918   | 0.9975    | 0.9847 | 0.9911 | 1613 | 1906 | 4  | 25 |
| 0.1066        | 12.0  | 2664 | 0.0403          | 0.9927   | 0.9988    | 0.9853 | 0.9920 | 1614 | 1908 | 2  | 24 |
| 0.1065        | 13.0  | 2886 | 0.0363          | 0.9927   | 0.9994    | 0.9847 | 0.9920 | 1613 | 1909 | 1  | 25 |
| 0.1074        | 14.0  | 3108 | 0.0378          | 0.9930   | 0.9988    | 0.9860 | 0.9923 | 1615 | 1908 | 2  | 23 |
| 0.1128        | 15.0  | 3330 | 0.0327          | 0.9924   | 0.9981    | 0.9853 | 0.9917 | 1614 | 1907 | 3  | 24 |
| 0.0963        | 16.0  | 3552 | 0.0309          | 0.9930   | 0.9988    | 0.9860 | 0.9923 | 1615 | 1908 | 2  | 23 |
| 0.1379        | 17.0  | 3774 | 0.0366          | 0.9927   | 0.9969    | 0.9872 | 0.9920 | 1617 | 1905 | 5  | 21 |
| 0.1070        | 18.0  | 3996 | 0.0331          | 0.9930   | 0.9981    | 0.9866 | 0.9923 | 1616 | 1907 | 3  | 22 |
| 0.1332        | 19.0  | 4218 | 0.0343          | 0.9930   | 0.9981    | 0.9866 | 0.9923 | 1616 | 1907 | 3  | 22 |
| 0.1294        | 20.0  | 4440 | 0.0340          | 0.9935   | 0.9981    | 0.9878 | 0.9929 | 1618 | 1907 | 3  | 20 |


### Framework versions

- Transformers 5.2.0
- Pytorch 2.9.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.2