Instructions to use crumb/doc2desc_3b_gguf with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- llama-cpp-python
How to use crumb/doc2desc_3b_gguf with llama-cpp-python:
```python
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="crumb/doc2desc_3b_gguf",
    filename="doc2desc_bf16.gguf",
)

# `messages` must be a list of role/content dicts, not a bare string:
llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": "The tower is 324 metres (1,063 ft) tall, about the same height as an 81-storey building, and the tallest structure in Paris. Its base is square, measuring 125 metres (410 ft) on each side. During its construction, the Eiffel Tower surpassed the Washington Monument to become the tallest man-made structure in the world, a title it held for 41 years until the Chrysler Building in New York City was finished in 1930. It was the first structure to reach a height of 300 metres. Due to the addition of a broadcasting aerial at the top of the tower in 1957, it is now taller than the Chrysler Building by 5.2 metres (17 ft). Excluding transmitters, the Eiffel Tower is the second tallest free-standing structure in France after the Millau Viaduct.",
        }
    ]
)
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- llama.cpp
How to use crumb/doc2desc_3b_gguf with llama.cpp:
Install from brew
```sh
brew install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf crumb/doc2desc_3b_gguf:BF16

# Run inference directly in the terminal:
llama-cli -hf crumb/doc2desc_3b_gguf:BF16
```
Install from WinGet (Windows)
```sh
winget install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf crumb/doc2desc_3b_gguf:BF16

# Run inference directly in the terminal:
llama-cli -hf crumb/doc2desc_3b_gguf:BF16
```
Use pre-built binary
```sh
# Download a pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases

# Start a local OpenAI-compatible server with a web UI:
./llama-server -hf crumb/doc2desc_3b_gguf:BF16

# Run inference directly in the terminal:
./llama-cli -hf crumb/doc2desc_3b_gguf:BF16
```
Build from source code
```sh
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build
cmake --build build -j --target llama-server llama-cli

# Start a local OpenAI-compatible server with a web UI:
./build/bin/llama-server -hf crumb/doc2desc_3b_gguf:BF16

# Run inference directly in the terminal:
./build/bin/llama-cli -hf crumb/doc2desc_3b_gguf:BF16
```
Use Docker
```sh
docker model run hf.co/crumb/doc2desc_3b_gguf:BF16
```
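Once `llama-server` is running, any OpenAI-compatible client can talk to it. The sketch below uses only the standard library to build a chat-completion request for the local server; the port (8080, the `llama-server` default) and the model name in the payload are assumptions based on the commands above.

```python
import json
import urllib.request


def chat_request(base_url: str, content: str) -> urllib.request.Request:
    # Build an OpenAI-style chat completion request for the local server.
    payload = {
        "model": "crumb/doc2desc_3b_gguf:BF16",
        "messages": [{"role": "user", "content": content}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


# With the server running locally:
# req = chat_request("http://localhost:8080", "The tower is 324 metres tall...")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```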
- LM Studio
- Jan
- Ollama
How to use crumb/doc2desc_3b_gguf with Ollama:
ollama run hf.co/crumb/doc2desc_3b_gguf:BF16
- Unsloth Studio
How to use crumb/doc2desc_3b_gguf with Unsloth Studio:
Install Unsloth Studio (macOS, Linux, WSL)
```sh
curl -fsSL https://unsloth.ai/install.sh | sh

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for crumb/doc2desc_3b_gguf to start chatting
```
Install Unsloth Studio (Windows)
```sh
irm https://unsloth.ai/install.ps1 | iex

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for crumb/doc2desc_3b_gguf to start chatting
```
Using HuggingFace Spaces for Unsloth
```sh
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for crumb/doc2desc_3b_gguf to start chatting
```
- Pi
How to use crumb/doc2desc_3b_gguf with Pi:
Start the llama.cpp server
```sh
# Install llama.cpp:
brew install llama.cpp

# Start a local OpenAI-compatible server:
llama-server -hf crumb/doc2desc_3b_gguf:BF16
```
Configure the model in Pi
```sh
# Install Pi:
npm install -g @mariozechner/pi-coding-agent
```
Add the following to ~/.pi/agent/models.json:
```json
{
  "providers": {
    "llama-cpp": {
      "baseUrl": "http://localhost:8080/v1",
      "api": "openai-completions",
      "apiKey": "none",
      "models": [
        { "id": "crumb/doc2desc_3b_gguf:BF16" }
      ]
    }
  }
}
```
Run Pi
```sh
# Start Pi in your project directory:
pi
```
- Hermes Agent
How to use crumb/doc2desc_3b_gguf with Hermes Agent:
Start the llama.cpp server
```sh
# Install llama.cpp:
brew install llama.cpp

# Start a local OpenAI-compatible server:
llama-server -hf crumb/doc2desc_3b_gguf:BF16
```
Configure Hermes
```sh
# Install Hermes:
curl -fsSL https://hermes-agent.nousresearch.com/install.sh | bash
hermes setup

# Point Hermes at the local server:
hermes config set model.provider custom
hermes config set model.base_url http://127.0.0.1:8080/v1
hermes config set model.default crumb/doc2desc_3b_gguf:BF16
```
Run Hermes
```sh
hermes
```
- Docker Model Runner
How to use crumb/doc2desc_3b_gguf with Docker Model Runner:
```sh
docker model run hf.co/crumb/doc2desc_3b_gguf:BF16
```
- Lemonade
How to use crumb/doc2desc_3b_gguf with Lemonade:
Pull the model
```sh
# Download Lemonade from https://lemonade-server.ai/
lemonade pull crumb/doc2desc_3b_gguf:BF16
```
Run and chat with the model
```sh
lemonade run user.doc2desc_3b_gguf-BF16
```
List all available models
```sh
lemonade list
```
DOC2DESC 3B
This is Qwen/Qwen2.5-3B tuned with the following format, on a mix of handwritten descriptions and Deepseek-V3-generated descriptions (few-shot prompted with the handwritten ones) for texts from https://textfiles.com, to make sure it can label unsafe content. It is being used to generate large numbers of description/document pairs for training another model to do the reverse: automatically generating documents from which to create control vectors.
Context format
During training it saw this format:
[[DOCUMENT]]{document}[[/DOCUMENT]]
[[DESCRIPTION]]{description}[[/DESCRIPTION]]
| Position | Delimiter |
|---|---|
| before user | [[DOCUMENT]] |
| after user | [[/DOCUMENT]] |
| before assistant | [[DESCRIPTION]] |
| after assistant | [[/DESCRIPTION]] |
You may also want to add "[[" as a stop string. This is a light tune and isn't perfect.
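The delimiter format above can be applied with a plain completion call rather than the chat API. The helper below is a minimal sketch: `build_prompt` reproduces the training-time layout (assuming a newline between the two blocks, matching the two-line format shown above), and `describe` is a hypothetical wrapper that assumes an `llm` loaded from the GGUF file as in the llama-cpp-python example.

```python
def build_prompt(document: str) -> str:
    # Wrap the document in the training-time delimiters and open the
    # description block so the model continues with a description.
    return f"[[DOCUMENT]]{document}[[/DOCUMENT]]\n[[DESCRIPTION]]"


def describe(llm, document: str) -> str:
    # `llm` is a llama_cpp.Llama instance loaded from the GGUF above.
    # "[[" as a stop string cuts generation at the closing delimiter.
    out = llm.create_completion(
        prompt=build_prompt(document),
        max_tokens=128,
        temperature=0.8,
        stop=["[["],
    )
    return out["choices"][0]["text"].strip()


# Usage, once the model is loaded:
# print(describe(llm, "Beginners BBQ Class Taking Place in Missoula! ..."))
```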
Output qualities
The outputs read like informal summaries. For example, here are some outputs (at temperature 0.8) for the first element of the C4 dataset:
(text from C4):
Beginners BBQ Class Taking Place in Missoula!
Do you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills.
He will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information.
The cost to be in the class is $35 per person, and for spectators it is free. Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared.
(outputs):
ad to get better at making delicious BBQ by world class bbq champion from lonestar smoke rangers.
ad for BBQ class at lonestar smoke rangers by world class bbq champ tony balay; includes techniques, recipes
event ad: beginners BBQ Class Taking Place in Missoula! from world class bbs champion tony balay
Model tree for crumb/doc2desc_3b_gguf
Base model
Qwen/Qwen2.5-3B