
PoC: Heap OOB Read via strlen() on Non-Null-Terminated Charsmap Data

Target: llama.cpp (ggml-org/llama.cpp)
File: src/llama-vocab.cpp:1129
CWE: CWE-125 (Out-of-bounds Read)

Vulnerability

The UGM tokenizer's normalize_prefix() function calls strlen() on charsmap replacement data without verifying a null terminator exists within the allocation. A crafted precompiled_charsmap fills the replacement region with non-null bytes, causing strlen() to read past the heap allocation boundary.

This is a distinct vulnerability from the undersized charsmap uint32_t dereference bug (line 809). This bug is at line 1129 and triggers during tokenization, not initialization.

Files

  • poc_charsmap_strlen.py — Generates the crafted GGUF file (661 bytes)
  • poc_charsmap_strlen.gguf — Pre-built PoC file
  • reproduce.sh — Full reproduction script (clone, build with ASAN, trigger)
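After running the generator, a quick sanity check that the output at least starts with a valid GGUF header can save a debugging round-trip. The helper below is a hypothetical convenience, not part of this repo; it only checks the documented GGUF magic and version fields:

```python
import struct

def looks_like_gguf(path):
    """Return (has_gguf_magic, version) for the file at `path`.

    GGUF files begin with the 4-byte magic b"GGUF" followed by a
    little-endian uint32 format version.
    """
    with open(path, "rb") as f:
        magic = f.read(4)
        (version,) = struct.unpack("<I", f.read(4))
    return magic == b"GGUF", version
```

Example: `looks_like_gguf("poc_charsmap_strlen.gguf")` should report the magic as present; if it does not, the generator output was corrupted before it ever reaches the tokenizer.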

Quick Reproduce

# Build llama.cpp with ASAN
git clone https://github.com/ggml-org/llama.cpp && cd llama.cpp
mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Debug \
    -DCMAKE_C_FLAGS="-fsanitize=address -fno-omit-frame-pointer" \
    -DCMAKE_CXX_FLAGS="-fsanitize=address -fno-omit-frame-pointer"
cmake --build . --target llama-tokenize -j$(nproc)

# Trigger
./bin/llama-tokenize -m /path/to/poc_charsmap_strlen.gguf -p $'\x0b'

Expected: AddressSanitizer: heap-buffer-overflow READ at llama-vocab.cpp:1129
