VibeScript - Code to DSL Converter

vibecoder-discern converts natural language and code into VibeScript, a compact symbolic DSL for expressing programming concepts.

What is VibeScript?

VibeScript compresses verbose code into symbolic notation:

| Code | VibeScript |
|---|---|
| `function add(a, b) { return a + b; }` | `Ω> add!(a, b)` |
| `const users = await db.query(...)` | `δ.m.p.query()` |
| `app.get('/api/users', ...)` | `θ.m.route(θ.e, ζ.x)` |
| `if (error) { throw new Error(...) }` | `~system~γ#error!` |
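The model is driven by a plain instruction prefix, "Convert this to vibescript: ...", as the Quick Start examples show. A minimal sketch of building such a prompt (the helper name is an illustrative assumption, not part of the model's API):

```python
# Sketch: wrap a code snippet in the instruction prefix used by the
# Quick Start examples. `to_vibescript_prompt` is an illustrative name.
def to_vibescript_prompt(code: str) -> str:
    """Build the 'Convert this to vibescript:' instruction prompt."""
    return f"Convert this to vibescript: {code}"

prompt = to_vibescript_prompt("function add(a, b) { return a + b; }")
print(prompt)
# → Convert this to vibescript: function add(a, b) { return a + b; }
```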

Model Variants

| Path | Format | Size | Use Case |
|---|---|---|---|
| `/lora-adapter/` | LoRA | ~13MB | Merge with your own Qwen3-1.7B |
| `/merged-model/` | HuggingFace | ~3.4GB | Ready to use with transformers |
| `/gguf/` | GGUF Q4_K_M | ~1.1GB | llama.cpp / Ollama |

Quick Start

llama.cpp (GGUF)

```bash
# Download
wget https://huggingface.co/calebboud/vibescript/resolve/main/gguf/vibecoder-discern-1.7B-Q4_K_M.gguf

# Run
llama-cli -m vibecoder-discern-1.7B-Q4_K_M.gguf \
  -p "Convert this to vibescript: function multiply(x, y) { return x * y; }" \
  -n 100 --temp 0.7
```

Transformers (Merged Model)

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("calebboud/vibescript", subfolder="merged-model")
tokenizer = AutoTokenizer.from_pretrained("calebboud/vibescript", subfolder="merged-model")

prompt = "Convert this to vibescript: console.log('Hello World')"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

LoRA Adapter (Merge Yourself)

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-1.7B")
model = PeftModel.from_pretrained(base, "calebboud/vibescript", subfolder="lora-adapter")
merged = model.merge_and_unload()
```

Training Details

  • Base Model: Qwen/Qwen3-1.7B
  • Method: LoRA (r=8, alpha=16)
  • Target Modules: q_proj, k_proj, v_proj, o_proj
  • Dataset: 885 code → VibeScript example pairs
  • Task: CAUSAL_LM
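The hyperparameters above correspond to a PEFT configuration along these lines. This is a sketch reconstructed from the listed settings; the actual training script is not published with this card.

```python
from peft import LoraConfig

# Sketch of a LoRA config matching the listed hyperparameters
# (r=8, alpha=16, attention projections, causal LM task).
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
```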

VibeScript Symbols

| Symbol | Meaning |
|---|---|
| `Ω>` | Function definition |
| `Σ` | Route/scaffold |
| `δ` | Database operations |
| `θ` | HTTP/API |
| `γ` | Error handling |
| `ζ` | Structure/scaffold |
| `α` | Analysis |
| `ε` | Dependencies |
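For post-processing model output, the legend above can be kept as a plain lookup table. A minimal sketch (the dict and function names are illustrative, not part of the model):

```python
# The symbol legend from the table above as a lookup table.
# `VIBESCRIPT_SYMBOLS` and `describe` are illustrative names.
VIBESCRIPT_SYMBOLS = {
    "Ω>": "Function definition",
    "Σ": "Route/scaffold",
    "δ": "Database operations",
    "θ": "HTTP/API",
    "γ": "Error handling",
    "ζ": "Structure/scaffold",
    "α": "Analysis",
    "ε": "Dependencies",
}

def describe(symbol: str) -> str:
    """Return the meaning of a VibeScript symbol, or a fallback string."""
    return VIBESCRIPT_SYMBOLS.get(symbol, "Unknown symbol")

print(describe("δ"))  # → Database operations
```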

Coming Soon

  • vibecoder-expand: VibeScript → Code (reverse direction)

License

Apache 2.0
