llama.cpp/examples/lookup

Demonstration of Prompt Lookup Decoding

https://github.com/apoorvumang/prompt-lookup-decoding

The key parameters for lookup decoding are `ngram_min`, `ngram_max`, and `n_draft`. The first two set the minimum and maximum size of the n-grams to search for in the prompt; when a match is found, `n_draft` specifies how many subsequent tokens to draft.

More info:

https://github.com/ggml-org/llama.cpp/pull/4484

https://github.com/ggml-org/llama.cpp/issues/4226