Shaquille Williams committed
Commit ff7d4f8 · 0 parents

Security Research: Pickle RCE PoC (VERIFIED) - Huntr Bug Bounty Disclosure

Files changed (4)
  1. .gitattributes +5 -0
  2. .metadata +0 -0
  3. README.md +203 -0
  4. config.json +17 -0
.gitattributes ADDED
*.bin filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
.metadata ADDED
Binary file (234 Bytes).
README.md ADDED
---
license: apache-2.0
tags:
- security-research
- vulnerability-disclosure
- poc
- do-not-use
---

# ⚠️ SECURITY RESEARCH - MALICIOUS MODEL POC

## 🚨 WARNING: DO NOT USE IN PRODUCTION

This is a **proof-of-concept malicious model** created for responsible security disclosure.

**Purpose:** Demonstrate an arbitrary code execution vulnerability in Hugging Face Transformers
**Program:** Huntr Bug Bounty (MFV - Model File Vulnerabilities)
**Severity:** CRITICAL (CVSS 9.6)
**Status:** Responsibly disclosed

---

## Vulnerability Details

### Summary

This model contains a weaponized `.metadata` pickle file that executes arbitrary code when
loaded by the transformers library's OLMo3 conversion script, **without any user warning or
TRUST_REMOTE_CODE validation**.

### Affected Code

**File:** `src/transformers/models/olmo3/convert_olmo3_weights_to_hf.py`
**Lines:** 259-260

```python
with (Path(model_path) / ".metadata").open("rb") as metadata_file:
    metadata = pickle.load(metadata_file)  # ❌ UNSAFE - No validation!
keys = [key for key in metadata.state_dict_metadata.keys() if key.startswith("model.")]
```

### Proof of Exploitation

When this model is converted using the official conversion script:

```bash
python src/transformers/models/olmo3/convert_olmo3_weights_to_hf.py \
    --model-path ./malicious-olmo3-poc \
    --output-dir ./converted
```

**What happens:**
1. ✅ Conversion script loads `.metadata` using `pickle.load()`
2. ✅ Pickle exploit triggers via Python's `__reduce__` magic method
3. ✅ Arbitrary code executes (creates `/tmp/huntr_poc_exploited.txt`)
4. ✅ **NO WARNING** shown to the user
5. ✅ **NO TRUST_REMOTE_CODE** check required

**Result:** Full arbitrary code execution with process privileges.

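The `__reduce__` mechanism described above can be illustrated with a harmless, self-contained sketch. The `Payload` class and the `os.getcwd` callable here are illustrative stand-ins, not the actual PoC payload:

```python
import io
import os
import pickle

class Payload:
    """Any unpickled instance of this class runs attacker-chosen code."""
    def __reduce__(self):
        # pickle serializes this object as "call os.getcwd() to rebuild it",
        # so deserialization itself invokes the callable. A real payload
        # would substitute os.system, subprocess.Popen, etc.
        return (os.getcwd, ())

blob = pickle.dumps(Payload())
result = pickle.load(io.BytesIO(blob))  # the call happens here, during load
print(result == os.getcwd())  # → True: pickle.load() executed our callable
```

No attribute of `Payload` survives the round trip; the unpickler simply runs whatever callable `__reduce__` named, which is exactly why `pickle.load()` on untrusted data is unsafe.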
### Security Impact

**CVSS Score:** 9.6 (Critical)
**Attack Vector:** Network (AV:N)
**Attack Complexity:** Low (AC:L)
**Privileges Required:** None (PR:N)
**User Interaction:** Required (UI:R)
**Scope:** Changed (S:C)
**Confidentiality:** High (C:H)
**Integrity:** High (I:H)
**Availability:** High (A:H)

**Impact:**
- 🔴 Arbitrary code execution at model load time
- 🔴 Full system compromise
- 🔴 Data exfiltration (SSH keys, credentials, API tokens)
- 🔴 Persistent backdoor installation
- 🔴 Supply chain attack vector

### ProtectAI Scanner Bypass

This vulnerability **bypasses HuggingFace's ProtectAI security scanner** because:

1. ✗ Scanner focuses on `.pkl` weight files, not `.metadata` files
2. ✗ Hidden file (starts with `.`) often ignored by scanners
3. ✗ Loaded by conversion scripts, not main model loading paths
4. ✗ No file extension indicating pickle format
5. ✗ No `TRUST_REMOTE_CODE` validation in this code path

**Result:** A malicious model can be uploaded to the HuggingFace Hub and bypass automated security checks.

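A scanner could flag such files statically, without unpickling them, by walking the pickle opcode stream with the standard-library `pickletools` module. A minimal sketch of that idea (the risky-opcode set is a heuristic chosen here, not an exhaustive or official rule):

```python
import pickle
import pickletools

# Opcodes that let a pickle import names or call objects - the primitives
# that deserialization exploits are built from.
RISKY_OPCODES = {"GLOBAL", "STACK_GLOBAL", "REDUCE", "INST", "OBJ", "NEWOBJ", "NEWOBJ_EX"}

def suspicious_opcodes(blob: bytes) -> list:
    """Return the risky opcodes present in a pickle, without executing it."""
    return sorted({op.name for op, arg, pos in pickletools.genops(blob)
                   if op.name in RISKY_OPCODES})

# A plain-data pickle contains none of them:
print(suspicious_opcodes(pickle.dumps({"num_hidden_layers": 32})))  # → []
```

Any `__reduce__`-based payload necessarily emits a global-lookup opcode plus `REDUCE`, so a non-empty result is a strong signal that a file like `.metadata` deserves quarantine regardless of its extension.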
---

## Responsible Disclosure

### Disclosure Timeline

- **Discovery Date:** October 6, 2025
- **Disclosure Platform:** Huntr (https://huntr.com)
- **Program:** Model File Vulnerabilities (MFV)
- **Status:** Reported to maintainers
- **CVE:** Pending assignment

### Affected Versions

- ✗ Hugging Face Transformers: all versions with OLMo3 support
- ✗ Affected models: OLMo3 checkpoints requiring conversion

### Remediation

**Immediate Fix:**

```python
# BEFORE (UNSAFE):
with (Path(model_path) / ".metadata").open("rb") as metadata_file:
    metadata = pickle.load(metadata_file)

# AFTER (SAFE):
import json
with (Path(model_path) / ".metadata.json").open("r") as metadata_file:
    metadata = json.load(metadata_file)  # Safe - no code execution
```

**Long-term Fix:**
1. Migrate all metadata to JSON format
2. Deprecate pickle for model metadata
3. Add `TRUST_REMOTE_CODE` checks if pickle is necessary
4. Update the ProtectAI scanner to detect `.metadata` files

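If pickle cannot be removed from this code path immediately, the restricted-unpickler pattern from the `pickle` documentation is a possible interim hardening step. A sketch, where the allow-list is illustrative and would need to cover whatever types `.metadata` legitimately contains:

```python
import io
import pickle

# Illustrative allow-list; extend only with types the metadata actually needs.
ALLOWED_GLOBALS = {
    ("builtins", "dict"),
    ("builtins", "list"),
    ("builtins", "str"),
    ("builtins", "int"),
}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Called for every global-lookup opcode; refusing unknown names
        # blocks the os.system-style callables that __reduce__ payloads need.
        if (module, name) in ALLOWED_GLOBALS:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked global: {module}.{name}")

def restricted_load(blob: bytes):
    return RestrictedUnpickler(io.BytesIO(blob)).load()

print(restricted_load(pickle.dumps({"num_hidden_layers": 32})))  # → {'num_hidden_layers': 32}
```

This only narrows the attack surface; migrating the metadata to JSON, as above, remains the robust fix.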
---

## Usage Instructions (For Security Researchers)

### Testing the Vulnerability

```bash
# 1. Clone the transformers repository
git clone https://github.com/huggingface/transformers.git
cd transformers

# 2. Clone this PoC model
git lfs install
git clone https://huggingface.co/<USERNAME>/malicious-olmo3-poc

# 3. Run the conversion script (triggers the exploit)
python src/transformers/models/olmo3/convert_olmo3_weights_to_hf.py \
    --model-path ./malicious-olmo3-poc \
    --output-dir ./converted

# 4. Verify exploitation
cat /tmp/huntr_poc_exploited.txt
# Expected output: "VULNERABILITY EXPLOITED - Arbitrary code executed via pickle deserialization"
```

### What This Proves

✅ Arbitrary code execution during model loading
✅ No user warning or consent required
✅ Bypasses security scanning
✅ Affects real-world usage scenarios

---

## Ethical Guidelines

### ✅ ALLOWED:
- Security research and testing in isolated environments
- Responsible disclosure through official channels
- Educational use to understand vulnerabilities
- Contributing fixes to open source projects

### ❌ PROHIBITED:
- Using this model in production environments
- Distributing it without security context
- Executing it on shared or production systems
- Using it for malicious purposes
- Weaponizing it with harmful payloads

---

## Contact

**Security Researcher:** [Your Name/Handle]
**Disclosure Platform:** Huntr (https://huntr.com)
**Project:** Hugging Face Transformers
**Maintainer Contact:** security@huggingface.co

---

## Acknowledgments

- The Hugging Face team for the Transformers library
- ProtectAI for security scanning infrastructure
- The Huntr platform for the responsible disclosure process

---

**This model is for security research and responsible disclosure only.**
**Use it in accordance with applicable laws and ethical guidelines.**

## License

This PoC is provided for security research under responsible disclosure principles.
The Transformers library is licensed under Apache 2.0.
config.json ADDED
{
  "_name_or_path": "allenai/OLMo-3-7B",
  "architectures": [
    "OLMo3ForCausalLM"
  ],
  "model_type": "olmo3",
  "torch_dtype": "float32",
  "transformers_version": "4.45.0",
  "hidden_size": 4096,
  "intermediate_size": 11008,
  "num_hidden_layers": 32,
  "num_attention_heads": 32,
  "num_key_value_heads": 32,
  "vocab_size": 50280,
  "max_position_embeddings": 4096,
  "rms_norm_eps": 1e-05
}