Error: No text generated for this prompt

#1
by deepakgarg08 - opened

When I try to run the official code below (I also tried the 3-bit version), I get this error: "Error: No text generated for this prompt". I tried different prompts as well, with the same result.

from mlx_lm import load, generate

model, tokenizer = load("mlx-community/SmolLM3-3B-Base-4bit")

prompt = "hello"

if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)