---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: tiny-audio-qformer
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# tiny-audio-qformer

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2513

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 18
- optimizer: AdamW (torch fused) with betas=(0.9, 0.95) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 1
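
For reference, here is a minimal `TrainingArguments` sketch that mirrors the configuration above. The output directory is an assumption, and the model/dataset wiring (which is not part of this card) is omitted:

```python
from transformers import TrainingArguments

# Minimal sketch of the hyperparameters listed above.
# "tiny-audio-qformer" as output_dir is a hypothetical choice.
training_args = TrainingArguments(
    output_dir="tiny-audio-qformer",
    learning_rate=3e-4,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    seed=42,
    gradient_accumulation_steps=3,  # effective train batch size: 6 * 3 = 18
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.95,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    num_train_epochs=1,
)
```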

### Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 3.6078        | 0.0168 | 1000  | 3.3166          |
| 1.9204        | 0.0335 | 2000  | 1.3223          |
| 0.5808        | 0.0503 | 3000  | 0.3887          |
| 0.4371        | 0.0670 | 4000  | 0.3685          |
| 0.4322        | 0.0838 | 5000  | 0.3489          |
| 0.4205        | 0.1005 | 6000  | 0.3477          |
| 0.4579        | 0.1173 | 7000  | 0.3328          |
| 0.4638        | 0.1341 | 8000  | 0.3205          |
| 0.4099        | 0.1508 | 9000  | 0.3320          |
| 0.3847        | 0.1676 | 10000 | 0.3239          |
| 0.4399        | 0.1843 | 11000 | 0.3229          |
| 0.3890        | 0.2011 | 12000 | 0.3203          |
| 0.4119        | 0.2179 | 13000 | 0.3127          |
| 0.3982        | 0.2346 | 14000 | 0.3168          |
| 0.4111        | 0.2514 | 15000 | 0.3098          |
| 0.384         | 0.2681 | 16000 | 0.3187          |
| 0.3788        | 0.2849 | 17000 | 0.2990          |
| 0.3883        | 0.3016 | 18000 | 0.2989          |
| 0.3853        | 0.3184 | 19000 | 0.2907          |
| 0.353         | 0.3352 | 20000 | 0.2921          |
| 0.3598        | 0.3519 | 21000 | 0.2864          |
| 0.3581        | 0.3687 | 22000 | 0.2887          |
| 0.3985        | 0.3854 | 23000 | 0.2859          |
| 0.3500        | 0.4022 | 24000 | 0.2767          |
| 0.3463        | 0.4189 | 25000 | 0.2793          |
| 0.4123        | 0.4357 | 26000 | 0.2819          |
| 0.3481        | 0.4525 | 27000 | 0.2757          |
| 0.3205        | 0.4692 | 28000 | 0.2728          |
| 0.3538        | 0.4860 | 29000 | 0.2726          |
| 0.3485        | 0.5027 | 30000 | 0.2763          |
| 0.3865        | 0.5195 | 31000 | 0.2724          |
| 0.3744        | 0.5363 | 32000 | 0.2671          |
| 0.3458        | 0.5530 | 33000 | 0.2702          |
| 0.3151        | 0.5698 | 34000 | 0.2622          |
| 0.3505        | 0.5865 | 35000 | 0.2632          |
| 0.339         | 0.6033 | 36000 | 0.2632          |
| 0.3511        | 0.6200 | 37000 | 0.2606          |
| 0.3205        | 0.6368 | 38000 | 0.2598          |
| 0.3586        | 0.6536 | 39000 | 0.2593          |
| 0.3196        | 0.6703 | 40000 | 0.2592          |
| 0.3499        | 0.6871 | 41000 | 0.2567          |
| 0.3773        | 0.7038 | 42000 | 0.2552          |
| 0.3271        | 0.7206 | 43000 | 0.2547          |
| 0.3329        | 0.7374 | 44000 | 0.2546          |
| 0.3539        | 0.7541 | 45000 | 0.2536          |
| 0.3616        | 0.7709 | 46000 | 0.2515          |
| 0.3242        | 0.7876 | 47000 | 0.2527          |
| 0.3248        | 0.8044 | 48000 | 0.2534          |
| 0.3105        | 0.8211 | 49000 | 0.2520          |
| 0.3311        | 0.8379 | 50000 | 0.2515          |
| 0.3074        | 0.8547 | 51000 | 0.2512          |
| 0.3085        | 0.8714 | 52000 | 0.2513          |
| 0.3233        | 0.8882 | 53000 | 0.2515          |
| 0.3161        | 0.9049 | 54000 | 0.2513          |
| 0.3405        | 0.9217 | 55000 | 0.2516          |
| 0.3169        | 0.9384 | 56000 | 0.2513          |
| 0.3281        | 0.9552 | 57000 | 0.2514          |
| 0.3278        | 0.9720 | 58000 | 0.2512          |
| 0.3054        | 0.9887 | 59000 | 0.2513          |
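
The table above reflects the Trainer's log history. If a checkpoint from this run is available, the same numbers can be recovered from its `trainer_state.json`; the checkpoint path below is an assumption:

```python
from transformers import TrainerState

# Hypothetical checkpoint path; the Trainer writes trainer_state.json
# into each checkpoint directory it saves.
state = TrainerState.load_from_json("checkpoint-59000/trainer_state.json")

# log_history interleaves training logs ("loss") and evaluation logs ("eval_loss").
for entry in state.log_history:
    if "eval_loss" in entry:
        print(entry["step"], round(entry["epoch"], 4), entry["eval_loss"])
```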


### Framework versions

- Transformers 4.57.3
- Pytorch 2.8.0+cu128
- Datasets 3.6.0
- Tokenizers 0.22.1
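
To reproduce the training environment, pin the versions above. A quick sanity check:

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions for this run (see the list above).
print(transformers.__version__)  # 4.57.3
print(torch.__version__)         # 2.8.0+cu128
print(datasets.__version__)      # 3.6.0
print(tokenizers.__version__)    # 0.22.1
```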