Instructions to use cduk/gadsby with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- llama-cpp-python
How to use cduk/gadsby with llama-cpp-python:
```python
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="cduk/gadsby",
    filename="gadsby-0.6B-Q4_K.gguf",
)

llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "What is the capital of France?"}
    ]
)
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- llama.cpp
How to use cduk/gadsby with llama.cpp:
Install from brew
```shell
brew install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf cduk/gadsby

# Run inference directly in the terminal:
llama-cli -hf cduk/gadsby
```
Install from WinGet (Windows)
```shell
winget install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf cduk/gadsby

# Run inference directly in the terminal:
llama-cli -hf cduk/gadsby
```
Use pre-built binary
```shell
# Download pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases

# Start a local OpenAI-compatible server with a web UI:
./llama-server -hf cduk/gadsby

# Run inference directly in the terminal:
./llama-cli -hf cduk/gadsby
```
Build from source code
```shell
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build
cmake --build build -j --target llama-server llama-cli

# Start a local OpenAI-compatible server with a web UI:
./build/bin/llama-server -hf cduk/gadsby

# Run inference directly in the terminal:
./build/bin/llama-cli -hf cduk/gadsby
```
Use Docker
```shell
docker model run hf.co/cduk/gadsby
```
- LM Studio
- Jan
- vLLM
How to use cduk/gadsby with vLLM:
Install from pip and serve model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "cduk/gadsby"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "cduk/gadsby",
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'
```
Use Docker
```shell
docker model run hf.co/cduk/gadsby
```
- Ollama
How to use cduk/gadsby with Ollama:
```shell
ollama run hf.co/cduk/gadsby
```
- Unsloth Studio
How to use cduk/gadsby with Unsloth Studio:
Install Unsloth Studio (macOS, Linux, WSL)
```shell
curl -fsSL https://unsloth.ai/install.sh | sh

# Run Unsloth Studio:
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for cduk/gadsby to start chatting
```
Install Unsloth Studio (Windows)
```shell
irm https://unsloth.ai/install.ps1 | iex

# Run Unsloth Studio:
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for cduk/gadsby to start chatting
```
Using HuggingFace Spaces for Unsloth
```shell
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for cduk/gadsby to start chatting
```
- Pi
How to use cduk/gadsby with Pi:
Start the llama.cpp server
```shell
# Install llama.cpp:
brew install llama.cpp

# Start a local OpenAI-compatible server:
llama-server -hf cduk/gadsby
```
Configure the model in Pi
```shell
# Install Pi:
npm install -g @mariozechner/pi-coding-agent
```
Add to ~/.pi/agent/models.json:
```json
{
  "providers": {
    "llama-cpp": {
      "baseUrl": "http://localhost:8080/v1",
      "api": "openai-completions",
      "apiKey": "none",
      "models": [
        { "id": "cduk/gadsby" }
      ]
    }
  }
}
```
Run Pi
```shell
# Start Pi in your project directory:
pi
```
- Hermes Agent
How to use cduk/gadsby with Hermes Agent:
Start the llama.cpp server
```shell
# Install llama.cpp:
brew install llama.cpp

# Start a local OpenAI-compatible server:
llama-server -hf cduk/gadsby
```
Configure Hermes
```shell
# Install Hermes:
curl -fsSL https://hermes-agent.nousresearch.com/install.sh | bash
hermes setup

# Point Hermes at the local server:
hermes config set model.provider custom
hermes config set model.base_url http://127.0.0.1:8080/v1
hermes config set model.default cduk/gadsby
```
Run Hermes
```shell
hermes
```
- Docker Model Runner
How to use cduk/gadsby with Docker Model Runner:
```shell
docker model run hf.co/cduk/gadsby
```
- Lemonade
How to use cduk/gadsby with Lemonade:
Pull the model
```shell
# Download Lemonade from https://lemonade-server.ai/
lemonade pull cduk/gadsby
```
Run and chat with the model
```shell
lemonade run user.gadsby-{{QUANT_TAG}}
```
List all available models
```shell
lemonade list
```
Gadsby
These are versions of Qwen3-0.6B and Qwen3-8B whose output weights have been modified so that any token containing the letter 'e' has its probability set to zero.
In effect, these models never output the letter 'e'.
The models are offered as GGUFs quantized to Q4.
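The weight edit described above amounts to forcing the probability of every 'e'-containing token to zero at the output layer. A minimal sketch of the same effect applied at sampling time, as a logit mask (the tiny vocabulary and logits here are invented for illustration; the actual models bake this into the weights):

```python
import math

def mask_e_tokens(vocab, logits):
    # Assign a -inf logit to any token containing 'e' or 'E',
    # so softmax gives it exactly zero probability.
    return [(-math.inf if "e" in tok.lower() else lg)
            for tok, lg in zip(vocab, logits)]

def softmax(xs):
    # Standard max-shifted softmax; -inf entries come out as 0.
    m = max(x for x in xs if x != -math.inf)
    exps = [0.0 if x == -math.inf else math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

vocab = ["sheep", "wool", "lamb", "the", "animal"]
logits = [2.0, 1.0, 0.5, 3.0, 0.1]
probs = softmax(mask_e_tokens(vocab, logits))
# "sheep" and "the" end up with probability 0; the rest renormalize.
```

llama-cpp-python's `logits_processor` hook is one place a mask like this could be applied at inference time instead of editing weights.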
Hint: since the model can quickly get stuck in a low-probability hole, it helps to use some kind of beam search or back-tracking algorithm to select tokens.
Observations
- The model can be creative in working around the missing letter 'e': for example, instead of "sheep" it may say "adult lamb" or "dominant wool-producing animal".
- Sometimes it simply drops the 'e' and misspells a word.
- I did not filter out some Cyrillic letters that look like an 'e', and the model sometimes uses those (or even Chinese words).
- The smaller model can output refusals if it doesn't find a suitable token.
- Instead of greedy or probabilistic sampling, beam search with back-tracking is useful to avoid steering the generation into an 'e'-dominated dead end.
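The back-tracking idea above can be sketched as a depth-first search over the model's top-k candidate tokens: when a branch reaches a state with no acceptable continuation, the sampler backs up and tries the next candidate instead of getting stuck. A toy illustration (the `step_fn` continuation table is invented for the example; a real sampler would rank actual model logits):

```python
def backtrack_generate(step_fn, prefix, max_len, k=3):
    # Depth-first generation with back-tracking: try the top-k
    # candidates in order; if a branch dead-ends, back up.
    if len(prefix) == max_len:
        return prefix
    for tok in step_fn(prefix)[:k]:
        out = backtrack_generate(step_fn, prefix + [tok], max_len, k)
        if out is not None:
            return out
    return None  # dead end: the caller tries its next candidate

# Toy "model": ranked e-free continuations per prefix.
# Greedy decoding would commit to "a" first and get stuck.
CONT = {
    (): ["a", "an"],
    ("a",): [],                      # low-probability hole: no way out
    ("an",): ["animal"],
    ("an", "animal"): ["ran"],
}

def step_fn(prefix):
    return CONT.get(tuple(prefix), [])

print(backtrack_generate(step_fn, [], 3))
# → ['an', 'animal', 'ran']
```

Greedy sampling dies in the `("a",)` state; back-tracking recovers by revisiting the second-ranked candidate at the root.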
Example
PROMPT: "tell me about different farm animals, esp. those that are used for wool"
Farm animals play a vital part in human civilization, providing food, labor, and various products such as wool, milk, and transportation. Among farm animals, wool-producing animals hold a significant position in many agricultural and pastoral traditions. Wool is a natural, insulating, and highly sought commodity, and it is primarily grown from animals that naturally carry a thick coat of hair. In this discussion, I'll focus on wool-producing farm animals, with a particular focus on dominant wool-producing animals, and also touch on non-wool farm animals for a broad comparison.
1. Wool-Producing Farm Animals
A. Lambs (Wool from Lambs)
- Primary Wool-Producing Animal: Wool is primarily from lambs, which is a young dominant wool-producing animal.
- Common Wool-Producing Lamb Variants:
- Shropshir – Known for high-quality, long, and soft wool.
- Ramboult – Known for high wool production and good quality.
- Suffolk – Known for high wool production and good quality.