---
library_name: transformers
base_model: openai/clip-vit-large-patch14
tags:
- generated_from_trainer
model-index:
- name: sail-clip
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# sail-clip

This model is a fine-tuned version of [openai/clip-vit-large-patch14](https://huggingface.co/openai/clip-vit-large-patch14) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0815

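For context on the loss scale: CLIP models are trained with a symmetric contrastive (InfoNCE) loss over image–text pairs, so with a batch of 64 an untrained model scores about ln(64) ≈ 4.16, and lower is better. A minimal NumPy sketch of that loss, using random unit-norm embeddings to stand in for encoder outputs (the embedding dimension and the omitted temperature are illustrative, not the model's actual values):

```python
import numpy as np

rng = np.random.default_rng(0)
batch = 64  # matches eval_batch_size below

# Random unit-norm image/text embeddings stand in for untrained encoder outputs.
img = rng.standard_normal((batch, 768))
txt = rng.standard_normal((batch, 768))
img /= np.linalg.norm(img, axis=1, keepdims=True)
txt /= np.linalg.norm(txt, axis=1, keepdims=True)

logits = img @ txt.T  # pairwise cosine similarities (temperature omitted)

def cross_entropy(logits, targets):
    # Row-wise softmax cross-entropy; the matching pair sits on the diagonal.
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

targets = np.arange(batch)
# Symmetric loss: image-to-text plus text-to-image, averaged.
loss = (cross_entropy(logits, targets) + cross_entropy(logits.T, targets)) / 2
print(round(float(loss), 3))  # close to ln(64) ≈ 4.159 for random embeddings
```

Against that ≈4.16 chance baseline, the final evaluation loss of 2.0815 reflects substantially better-than-random image–text matching.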
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1

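With `lr_scheduler_type: linear` and no warmup steps reported, the Trainer's linear schedule decays the learning rate from 5e-05 toward 0 over the total number of training steps — roughly 181 here, inferred from the results table (step 180 lands at epoch 0.9945), not stated on the card. A sketch of that schedule under those assumptions:

```python
def linear_lr(step, total_steps=181, base_lr=5e-05, warmup_steps=0):
    """Linear warmup (none here) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at a few of the logged steps.
for s in (0, 90, 180):
    print(s, linear_lr(s))
```

By the last logged step the learning rate has decayed to a small fraction of the initial 5e-05, which is consistent with the slowing loss improvements near the end of the table.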
### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 4.8985        | 0.0552 | 10   | 4.0903          |
| 4.1607        | 0.1105 | 20   | 3.9977          |
| 3.9999        | 0.1657 | 30   | 3.8481          |
| 3.7439        | 0.2210 | 40   | 3.6745          |
| 3.6873        | 0.2762 | 50   | 3.5240          |
| 3.4241        | 0.3315 | 60   | 3.2912          |
| 3.2521        | 0.3867 | 70   | 3.1707          |
| 3.0498        | 0.4420 | 80   | 2.9794          |
| 2.9275        | 0.4972 | 90   | 2.8728          |
| 2.8409        | 0.5525 | 100  | 2.6969          |
| 2.6954        | 0.6077 | 110  | 2.6175          |
| 2.5344        | 0.6630 | 120  | 2.5060          |
| 2.5042        | 0.7182 | 130  | 2.4477          |
| 2.2965        | 0.7735 | 140  | 2.3057          |
| 2.3179        | 0.8287 | 150  | 2.2107          |
| 2.2797        | 0.8840 | 160  | 2.1689          |
| 2.0838        | 0.9392 | 170  | 2.1016          |
| 1.9926        | 0.9945 | 180  | 2.0815          |

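The epoch column also pins down the run length: 10 steps correspond to 0.0552 epochs, so one epoch is roughly 181 optimizer steps, which at a train batch size of 64 implies on the order of 11,600 training examples — assuming no gradient accumulation, which the card does not mention. A quick sanity check of that arithmetic:

```python
batch_size = 64            # train_batch_size from the hyperparameters above
steps, epoch = 10, 0.0552  # first row of the results table

steps_per_epoch = steps / epoch            # ≈ 181 steps per epoch
approx_examples = steps_per_epoch * batch_size
print(round(steps_per_epoch), round(approx_examples))
```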
### Framework versions

- Transformers 4.49.0
- Pytorch 2.5.1+cu118
- Datasets 3.2.0
- Tokenizers 0.21.0