ModelScan SavedModel SaveV2 Default-Serving Write PoC

Benign security proof-of-concept for a scanner-clean TensorFlow SavedModel that writes a TensorFlow checkpoint from its default serving signature.

Summary

The SavedModel in artifacts/savedmodel_savev2_default_write_variant/ exports a no-input serving_default signature. When invoked, that default serving path reaches tf.raw_ops.SaveV2 and writes a benign checkpoint marker:

TFSM_SAVEV2_DEFAULT_WRITE_MARKER_2026

ModelScan 0.8.8 scans saved_model.pb and fingerprint.pb and reports zero issues. The problem is not that TensorFlow executes graph ops; the reportable scanner weakness is that ModelScan misses a filesystem-write op reachable from the default inference signature, even though it already treats generic filesystem-writing TensorFlow ops as high risk.
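The construction can be sketched roughly as follows (an illustrative, hypothetical reconstruction; scripts/generate_savev2_default_write_variant.py is the authoritative generator, and the class and attribute names here are assumptions):

```python
import tensorflow as tf

class SaveV2Writer(tf.Module):
    # No-input serving function: everything SaveV2 needs is baked into
    # the graph as constants, so invoking serving_default suffices.
    @tf.function(input_signature=[])
    def serve(self):
        tf.raw_ops.SaveV2(
            prefix=tf.constant("savev2_runtime_ckpt"),
            tensor_names=tf.constant(["marker"]),
            shape_and_slices=tf.constant([""]),
            tensors=[tf.constant("TFSM_SAVEV2_DEFAULT_WRITE_MARKER_2026")],
        )
        return {"done": tf.constant(True)}

module = SaveV2Writer()
tf.saved_model.save(
    module,
    "artifacts/savedmodel_savev2_default_write_variant",
    signatures={"serving_default": module.serve},
)
```

Saving only traces the function into the graph; the SaveV2 op does not execute until the exported signature is invoked.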

Severity

High, CVSS 7.1.

Rationale: a scanner-clean SavedModel performs an inference-time local file write with the privileges of the loading process. This is not arbitrary code execution, and the written content is constrained to TensorFlow checkpoint output, but the behavior is reachable through the default no-input serving signature after a normal tf.saved_model.load().

Tested Versions

  • Python 3.12.3
  • tensorflow==2.21.0
  • modelscan==0.8.8
  • keras==3.14.1
  • numpy==2.4.4

Files

  • artifacts/savedmodel_savev2_default_write_variant/: primary SavedModel artifact.
  • verify_poc.py: loads the SavedModel, invokes serving_default, verifies the checkpoint marker, and confirms ModelScan reports zero issues.
  • scripts/generate_savev2_default_write_variant.py: artifact generator.
  • evidence/modelscan_savev2_default_write_variant.json: captured ModelScan output.
  • evidence/runtime_savev2_default_write_variant.json: captured runtime output.
  • evidence/savev2_default_write_variant_reachability.txt: exported signature and graph-op reachability summary.
  • evidence/savev2_default_write_variant_sha256sums.txt: artifact hashes.

Reproduce

python -m venv .venv
. .venv/bin/activate
pip install -r requirements.txt

CUDA_VISIBLE_DEVICES=-1 python verify_poc.py

Expected key output:

{
  "loaded_marker": "TFSM_SAVEV2_DEFAULT_WRITE_MARKER_2026",
  "marker_matched": true,
  "modelscan_total_issues": 0,
  "result": {
    "done": true
  }
}

The graph reachability summary includes:

serving_default inputs: 0
serving_default outputs: done
__inference_fixed_savev2_write_9: Const, Identity, SaveV2
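A summary of this shape can be produced by parsing saved_model.pb and listing the op types in each function of the graph's function library (a minimal sketch; the evidence file was produced by the repository's own tooling, and the trivial demo model below stands in for the PoC artifact, whose path you would substitute):

```python
import os
import tempfile
import tensorflow as tf
from tensorflow.core.protobuf import saved_model_pb2

# Build a trivial SavedModel just to have a saved_model.pb to inspect;
# for the PoC, point `path` at the artifact directory instead.
class Demo(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([], tf.float32)])
    def serve(self, x):
        return {"y": x + 1.0}

d = tempfile.mkdtemp()
m = Demo()
tf.saved_model.save(m, d, signatures={"serving_default": m.serve})

path = os.path.join(d, "saved_model.pb")
sm = saved_model_pb2.SavedModel()
with open(path, "rb") as f:
    sm.ParseFromString(f.read())

# Each library function gets one line: name plus its distinct op types.
for fdef in sm.meta_graphs[0].graph_def.library.function:
    ops = sorted({node.op for node in fdef.node_def})
    print(f"{fdef.signature.name}: {', '.join(ops)}")
```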

Impact

A scanner-clean SavedModel can carry an inference-reachable filesystem write in its default serving path. A registry, validation service, or model-serving workflow that trusts ModelScan output before loading and invoking SavedModels can miss this write behavior.

The PoC is benign: it writes only TensorFlow checkpoint files named savev2_runtime_ckpt.* in the local PoC directory.

Limitations

  • Not arbitrary code execution.
  • The marker write occurs when serving_default is invoked, not merely on tf.saved_model.load().
  • SaveV2 is also used in normal TensorFlow save/restore internals, so a useful scanner fix should account for reachability from exported serving signatures rather than flagging every internal save function blindly.
  • The output format is TensorFlow checkpoint data, not attacker-arbitrary file contents.

Primary Artifact Hashes

8f5765fe961ea8d59295db904a44e949188b6447953f828927738cd116a71972  artifacts/savedmodel_savev2_default_write_variant/saved_model.pb
2b05fdd92e3a7cb12287cb5e6d247c7ad1b57863bc6778079ce8fd17cc5d5086  artifacts/savedmodel_savev2_default_write_variant/fingerprint.pb
c963156fdc06f10667e742b6b59fb3f5edadb808ecf276bf2c6b8c482dae59cd  artifacts/savedmodel_savev2_default_write_variant/variables/variables.index
653403b5a48cec2e4b68a61eea7a6328ea8b58c5c393bcfaca06e146b11d939c  artifacts/savedmodel_savev2_default_write_variant/variables/variables.data-00000-of-00001

Suggested Mitigation

ModelScan should detect filesystem-writing TensorFlow ops that are reachable from exported SavedModel signatures, including SaveV2, and report them with severity based on reachability and whether the path/content is constant, input-controlled, or otherwise model-controlled.
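One possible shape for such a check is a walk of the function library seeded from the call nodes of the main graph (a sketch only, not ModelScan's actual implementation; the op list and the attribute-walking details are assumptions, and a stricter version would seed only from signature_def entry points):

```python
from tensorflow.core.protobuf import saved_model_pb2

# Filesystem-writing graph ops worth flagging; illustrative, not exhaustive.
WRITE_OPS = {"Save", "SaveV2", "SaveSlices", "WriteFile"}

def flag_reachable_write_ops(saved_model_pb_path):
    sm = saved_model_pb2.SavedModel()
    with open(saved_model_pb_path, "rb") as f:
        sm.ParseFromString(f.read())

    findings = []
    for mg in sm.meta_graphs:
        lib = {fn.signature.name: fn for fn in mg.graph_def.library.function}

        def called_functions(nodes):
            # Function references live in node attrs ("f" for call ops,
            # "then_branch"/"body"/... for control flow).
            for node in nodes:
                for attr in node.attr.values():
                    if attr.func.name:
                        yield attr.func.name
                    for fn in attr.list.func:
                        yield fn.name

        # Seed with functions the main graph calls; internal save/restore
        # helpers that no graph node references are never visited.
        seen = set()
        stack = list(called_functions(mg.graph_def.node))
        while stack:
            name = stack.pop()
            if name in seen or name not in lib:
                continue
            seen.add(name)
            nodes = lib[name].node_def
            stack.extend(called_functions(nodes))
            for node in nodes:
                if node.op in WRITE_OPS:
                    findings.append((name, node.op))
    return findings
```

Restricting the walk to functions actually referenced from the graph is what avoids blindly flagging the traced save/restore helpers mentioned under Limitations.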
