Commit 7dc2a2c · verified · committed by zer0int · 1 Parent(s): 328ca0b

Update README.md

Files changed (1): README.md (+116 −15)

README.md CHANGED
@@ -2,13 +2,119 @@
  license: mit
  ---

- Placeholder

  -------

  ### 📊 Standard Benchmark Evaluation
  🌟 = This Model

  #### Zero-Shot (CLIP Benchmark)

  | Task / Dataset | Metric | pretrained | 🌟 regr-norm | regr-brut |
@@ -59,14 +165,6 @@ Placeholder
  | Linear Probe Top-1 (%) | 72.35 | 70.94 | 65.09 |
  | Linear Probe Top-5 (%) | 93.42 | 93.29 | 89.60 |

- #### One-shot ImageNet-1k (Top-5), linear classifier on final embedding — own scripts
-
- | Representation | pretrained | 🌟 regr-norm | regr-brut |
- |---|---:|---:|---:|
- | One-shot Top-5 (FULL) | 0.437 | 0.599 | 0.553 |
- | One-shot Top-5 (PATCH-ONLY) | 0.443 | 0.603 | 0.550 |
- | One-shot Top-5 (REG-ONLY) | 0.154 | 0.010 | 0.007 |
-
  🔗 Note: 'own scripts' available at [github.com/zer0int/CLIP-fine-tune](https://github.com/zer0int/CLIP-fine-tune)

  -------
@@ -81,7 +179,8 @@ Please see the paper for more information!
  |---|---|---:|---:|---:|
  | NoSCAM (1162) | CLS | 0.9905 | 0.9897 | 0.9897 |
  | NoSCAM (1162) | CLS-PATCHSUB | 0.9544 | 0.9845 | 0.9811 |
- | NoSCAM (1162) | CLS-PATCHREG | 0.9466 | 0.9888 | 0.9888 |
  | NoSCAM (1162) | REG-L23-NOPC | 0.9380 | 0.9613 | 0.9570 |
  | NoSCAM (1162) | REG-L23-1PC | 0.9630 | 0.9802 | 0.9802 |
  | NoSCAM (1162) | REG-L23-8PC | 0.9509 | 0.9664 | 0.9604 |
@@ -89,7 +188,8 @@ Please see the paper for more information!
  | NoSCAM (1162) | PATCHΔ | 0.9690 | 0.9905 | 0.9888 |
  | SCAM (1162) | CLS | 0.4182 | 0.8038 | 0.8830 |
  | SCAM (1162) | CLS-PATCHSUB | 0.4957 | 0.8632 | 0.9002 |
- | SCAM (1162) | CLS-PATCHREG | 0.8761 | 0.8537 | 0.9174 |
  | SCAM (1162) | REG-L23-NOPC | 0.7410 | 0.8244 | 0.7719 |
  | SCAM (1162) | REG-L23-1PC | 0.7539 | 0.8726 | 0.7943 |
  | SCAM (1162) | REG-L23-8PC | 0.7057 | 0.8038 | 0.7143 |
@@ -97,7 +197,8 @@ Please see the paper for more information!
  | SCAM (1162) | PATCHΔ | 0.8778 | 0.8451 | 0.8744 |
  | SynthSCAM (1162) | CLS | 0.3219 | 0.8021 | 0.8804 |
  | SynthSCAM (1162) | CLS-PATCHSUB | 0.4406 | 0.8580 | 0.9071 |
- | SynthSCAM (1162) | CLS-PATCHREG | 0.8890 | 0.8460 | 0.9200 |
  | SynthSCAM (1162) | REG-L23-NOPC | 0.7823 | 0.8382 | 0.7771 |
  | SynthSCAM (1162) | REG-L23-1PC | 0.8055 | 0.8812 | 0.8072 |
  | SynthSCAM (1162) | REG-L23-8PC | 0.7289 | 0.8167 | 0.7126 |
@@ -105,10 +206,10 @@ Please see the paper for more information!
  | SynthSCAM (1162) | PATCHΔ | 0.9217 | 0.8614 | 0.8769 |
  | MVT (200382) | CLS | 0.8830 | 0.8730 | 0.8573 |
  | MVT (200382) | CLS-PATCHSUB | 0.4720 | 0.8246 | 0.8057 |
- | MVT (200382) | CLS-PATCHREG | 0.7166 | 0.8703 | 0.8518 |
  | MVT (200382) | REG-L23-NOPC | 0.7640 | 0.7935 | 0.7680 |
  | MVT (200382) | REG-L23-1PC | 0.7921 | 0.8193 | 0.8032 |
  | MVT (200382) | REG-L23-8PC | 0.7724 | 0.8057 | 0.7812 |
  | MVT (200382) | PATCH-L23 | 0.3414 | 0.8652 | 0.8191 |
- | MVT (200382) | PATCHΔ | 0.6881 | 0.8667 | 0.8510 |
-
 
  license: mit
  ---

+ Placeholder

  -------
+ Love ❤️ this CLIP?

+ ᐅ [Buy me a coffee](https://ko-fi.com/zer0int) on Ko-Fi ☕
+ <details>
+ <summary>Or click here for address to send 🪙₿ BTC</summary>
+
+ ```
+ 3PscBrWYvrutXedLmvpcnQbE12Py8qLqMK
+ ```
+ </details>
+
+ -------
  ### 📊 Standard Benchmark Evaluation
  🌟 = This Model

+ #### Zero-Shot (Typographic Attack)
+
+ | Task / Dataset | Metric | pretrained | 🌟 regr-norm | regr-brut |
+ |---|---|---:|---:|---:|
+ | SCAM::NoSCAM | acc | 0.9905 | 0.9897 | 0.9897 |
+ | SCAM::SCAM | acc | 0.4191 | 0.8046 | 0.8830 |
+ | SCAM::SynthSCAM | acc | 0.3227 | 0.8029 | 0.8804 |
+ | RTA100 | acc | 0.4330 | 0.7880 | 0.8930 |
+
+
+ <details>
+ <summary>👉 CLICK to reproduce: Expand SCAM typographic attack benchmark code ⚡💻</summary>
+
+ ```
+ from datasets import load_dataset
+ from transformers import CLIPModel, CLIPProcessor
+ import torch
+ from tqdm import tqdm
+
+ device = "cuda" if torch.cuda.is_available() else "cpu"
+
+ # BLISS / SCAM Typographic Attack Dataset
+ # https://huggingface.co/datasets/BLISS-e-V/SCAM
+ ds = load_dataset("BLISS-e-V/SCAM", split="train")
+
+ # Benchmark pre-trained model against my fine-tune
+ model_variants = [
+     ("OpenAI ", "openai/clip-vit-large-patch14-336", "openai/clip-vit-large-patch14-336"),
+     ("regr-norm", "zer0int/CLIP-Regression-ViT-L-14", "zer0int/CLIP-Regression-ViT-L-14"),
+     ("regr-brut", "zer0int/CLIP-Regression-BRUT-ViT-L-14", "zer0int/CLIP-Regression-BRUT-ViT-L-14"),
+ ]
+
+ models = {}
+ for name, model_path, processor_path in model_variants:
+     model = CLIPModel.from_pretrained(model_path).to(device).float()
+     processor = CLIPProcessor.from_pretrained(processor_path)
+     models[name] = (model, processor)
+
+ for variant in ["NoSCAM", "SCAM", "SynthSCAM"]:
+     print(f"\n=== Evaluating variant: {variant} ===")
+     idxs = [i for i, v in enumerate(ds['id']) if v.startswith(variant)]
+     if not idxs:
+         print(f"No samples for {variant}")
+         continue
+     subset = [ds[i] for i in idxs]
+
+     for model_name, (model, processor) in models.items():
+         results = []
+         for entry in tqdm(subset, desc=f"{model_name}", ncols=30, bar_format="{l_bar}{bar}| {n_fmt}/{total_fmt} |"):
+             img = entry['image']
+             object_label = entry['object_label']
+             attack_word = entry['attack_word']
+
+             texts = [f"a photo of a {object_label}", f"a photo of a {attack_word}"]
+             inputs = processor(
+                 text=texts,
+                 images=img,
+                 return_tensors="pt",
+                 padding=True
+             )
+             for k in inputs:
+                 if isinstance(inputs[k], torch.Tensor):
+                     inputs[k] = inputs[k].to(device)
+
+             with torch.no_grad():
+                 outputs = model(**inputs)
+                 # CLIPModel returns L2-normalized embeddings,
+                 # so the dot product below is cosine similarity
+                 image_features = outputs.image_embeds
+                 text_features = outputs.text_embeds
+
+             logits = image_features @ text_features.T
+             probs = logits.softmax(dim=-1).cpu().numpy().flatten()
+             pred_idx = probs.argmax()
+             pred_label = [object_label, attack_word][pred_idx]
+             is_correct = (pred_label == object_label)
+
+             results.append({
+                 "id": entry['id'],
+                 "object_label": object_label,
+                 "attack_word": attack_word,
+                 "pred_label": pred_label,
+                 "is_correct": is_correct,
+                 "type": entry['type'],
+                 "model": model_name
+             })
+
+         n_total = len(results)
+         n_correct = sum(r['is_correct'] for r in results)
+         acc = n_correct / n_total if n_total else float('nan')
+         print(f"| > > > > Zero-shot accuracy for {variant}, {model_name}: {n_correct}/{n_total} = {acc:.4f}")
+ ```
+ </details>
+
+
  #### Zero-Shot (CLIP Benchmark)

  | Task / Dataset | Metric | pretrained | 🌟 regr-norm | regr-brut |
 
  | Linear Probe Top-1 (%) | 72.35 | 70.94 | 65.09 |
  | Linear Probe Top-5 (%) | 93.42 | 93.29 | 89.60 |

  🔗 Note: 'own scripts' available at [github.com/zer0int/CLIP-fine-tune](https://github.com/zer0int/CLIP-fine-tune)

  -------
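The linear-probe rows above come from fitting a linear classifier on frozen embeddings. Here is a minimal sketch of that recipe, using multinomial logistic regression trained by gradient descent; the synthetic features and all names are illustrative stand-ins, not the author's actual 'own scripts':

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for frozen CLIP image embeddings:
# 3 classes, 100 samples each, 16-dim features around class means.
num_classes, dim, n_per_class = 3, 16, 100
means = rng.normal(0, 5, size=(num_classes, dim))
X = np.concatenate([means[c] + rng.normal(0, 1, size=(n_per_class, dim))
                    for c in range(num_classes)])
y = np.repeat(np.arange(num_classes), n_per_class)

# Linear probe: multinomial logistic regression, full-batch gradient descent.
W = np.zeros((dim, num_classes))
b = np.zeros(num_classes)
for _ in range(200):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    onehot = np.eye(num_classes)[y]
    grad = probs - onehot                          # d(cross-entropy)/d(logits)
    W -= 0.1 * X.T @ grad / len(X)
    b -= 0.1 * grad.mean(axis=0)

top1 = (np.argmax(X @ W + b, axis=1) == y).mean()
print(f"Linear probe top-1 on the toy features: {top1:.2%}")
```

In the real benchmark, `X` would be embeddings extracted once from the frozen model (e.g. ImageNet-1k train features), with accuracy reported on held-out features.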
 
  |---|---|---:|---:|---:|
  | NoSCAM (1162) | CLS | 0.9905 | 0.9897 | 0.9897 |
  | NoSCAM (1162) | CLS-PATCHSUB | 0.9544 | 0.9845 | 0.9811 |
+ | NoSCAM (1162) | CLS-PATCHREG-I | 0.9466 | 0.9888 | 0.9888 |
+ | NoSCAM (1162) | CLS-PATCHREG-N | 0.9871 | 0.9897 | 0.9888 |
  | NoSCAM (1162) | REG-L23-NOPC | 0.9380 | 0.9613 | 0.9570 |
  | NoSCAM (1162) | REG-L23-1PC | 0.9630 | 0.9802 | 0.9802 |
  | NoSCAM (1162) | REG-L23-8PC | 0.9509 | 0.9664 | 0.9604 |
 
  | NoSCAM (1162) | PATCHΔ | 0.9690 | 0.9905 | 0.9888 |
  | SCAM (1162) | CLS | 0.4182 | 0.8038 | 0.8830 |
  | SCAM (1162) | CLS-PATCHSUB | 0.4957 | 0.8632 | 0.9002 |
+ | SCAM (1162) | CLS-PATCHREG-I | 0.8761 | 0.8537 | 0.9174 |
+ | SCAM (1162) | CLS-PATCHREG-N | 0.9286 | 0.8537 | 0.9165 |
  | SCAM (1162) | REG-L23-NOPC | 0.7410 | 0.8244 | 0.7719 |
  | SCAM (1162) | REG-L23-1PC | 0.7539 | 0.8726 | 0.7943 |
  | SCAM (1162) | REG-L23-8PC | 0.7057 | 0.8038 | 0.7143 |
 
  | SCAM (1162) | PATCHΔ | 0.8778 | 0.8451 | 0.8744 |
  | SynthSCAM (1162) | CLS | 0.3219 | 0.8021 | 0.8804 |
  | SynthSCAM (1162) | CLS-PATCHSUB | 0.4406 | 0.8580 | 0.9071 |
+ | SynthSCAM (1162) | CLS-PATCHREG-I | 0.8890 | 0.8460 | 0.9200 |
+ | SynthSCAM (1162) | CLS-PATCHREG-N | 0.9449 | 0.8494 | 0.9200 |
  | SynthSCAM (1162) | REG-L23-NOPC | 0.7823 | 0.8382 | 0.7771 |
  | SynthSCAM (1162) | REG-L23-1PC | 0.8055 | 0.8812 | 0.8072 |
  | SynthSCAM (1162) | REG-L23-8PC | 0.7289 | 0.8167 | 0.7126 |
 
  | SynthSCAM (1162) | PATCHΔ | 0.9217 | 0.8614 | 0.8769 |
  | MVT (200382) | CLS | 0.8830 | 0.8730 | 0.8573 |
  | MVT (200382) | CLS-PATCHSUB | 0.4720 | 0.8246 | 0.8057 |
+ | MVT (200382) | CLS-PATCHREG-I | 0.7166 | 0.8703 | 0.8518 |
+ | MVT (200382) | CLS-PATCHREG-N | 0.5695 | 0.8675 | 0.8478 |
  | MVT (200382) | REG-L23-NOPC | 0.7640 | 0.7935 | 0.7680 |
  | MVT (200382) | REG-L23-1PC | 0.7921 | 0.8193 | 0.8032 |
  | MVT (200382) | REG-L23-8PC | 0.7724 | 0.8057 | 0.7812 |
  | MVT (200382) | PATCH-L23 | 0.3414 | 0.8652 | 0.8191 |
+ | MVT (200382) | PATCHΔ | 0.6881 | 0.8667 | 0.8510 |
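The CLS, PATCH, and REG rows above evaluate different slices of the vision transformer's token sequence. As a minimal sketch of the slicing itself, assuming an HF-style `last_hidden_state` of shape `[batch, 1 + num_patches, dim]` with the CLS token at index 0 (the exact PATCHΔ and REG-L23 definitions are in the paper and not reproduced here):

```python
import numpy as np

# Dummy stand-in for a ViT's last_hidden_state: [batch, 1 + num_patches, dim].
# ViT-L/14 at 336px has 24*24 = 576 patch tokens plus one CLS token.
batch, num_patches, dim = 2, 576, 1024
hidden = np.random.default_rng(0).normal(size=(batch, 1 + num_patches, dim))

cls_tokens = hidden[:, 0, :]                 # CLS token: the usual global embedding
patch_mean = hidden[:, 1:, :].mean(axis=1)   # mean-pooled patch tokens

# A CLS-minus-patch-mean contrast, purely illustrative; the paper's
# CLS-PATCHSUB / PATCHΔ variants may be defined differently.
cls_patchsub = cls_tokens - patch_mean

print(cls_tokens.shape, patch_mean.shape)  # (2, 1024) (2, 1024)
```

In the actual evaluation these slices would be taken from the model's hidden states (e.g. layer 23 for the L23 variants) before any classifier or PCA step is applied.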