
Neurocoder

A from-scratch, narrow-domain coding SLM (small language model) for React + Tailwind generation and unified-diff code edits.

Includes trained model.safetensors weights.

Transformers Usage

from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Sharjeelbaig/neurocoder"
# trust_remote_code loads the custom tokenizer/model classes shipped in the repo
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Generate a landing page for marketing agency titled Velocity Landing"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=220,
    do_sample=False,          # greedy decoding for reproducible output
    repetition_penalty=1.22,  # discourage loops in long markup
    no_repeat_ngram_size=6,
    use_cache=True,
)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
# The decoded text echoes the prompt; keep only the assistant turn.
print(text.split("\nAssistant:", 1)[-1].strip())
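For the model's unified-diff editing mode, the card does not document the exact prompt format, so the helpers below are a hypothetical sketch of the post-processing side only: `extract_reply` mirrors the split on the `Assistant:` marker from the snippet above, and `apply_unified_diff` is an assumed pure-Python helper (not part of the model or Transformers API) that applies a simple single-file diff the model might emit.

```python
import re


def extract_reply(generated: str) -> str:
    # Strip the echoed prompt, keeping only the text after the
    # "\nAssistant:" turn marker, as in the usage example above.
    return generated.split("\nAssistant:", 1)[-1].strip()


def apply_unified_diff(source: str, diff: str) -> str:
    """Apply a minimal unified diff (single file, no fuzz) to `source`.

    A sketch: trusts hunk headers, supports only ' ', '+', '-' body
    lines, and raises on context mismatches.
    """
    lines = source.splitlines()
    out: list[str] = []
    cursor = 0  # index into `lines`
    hunk_re = re.compile(r"^@@ -(\d+)(?:,\d+)? \+\d+(?:,\d+)? @@")
    diff_lines = diff.splitlines()
    i = 0
    while i < len(diff_lines):
        m = hunk_re.match(diff_lines[i])
        if not m:
            i += 1  # skip file headers like '--- a/x' / '+++ b/x'
            continue
        start = int(m.group(1)) - 1  # hunk line numbers are 1-based
        out.extend(lines[cursor:start])  # copy unchanged prefix
        cursor = start
        i += 1
        while i < len(diff_lines) and not hunk_re.match(diff_lines[i]):
            tag, body = diff_lines[i][:1], diff_lines[i][1:]
            if tag == " ":
                if lines[cursor] != body:
                    raise ValueError(f"context mismatch at line {cursor + 1}")
                out.append(body)
                cursor += 1
            elif tag == "-":
                if lines[cursor] != body:
                    raise ValueError(f"deletion mismatch at line {cursor + 1}")
                cursor += 1
            elif tag == "+":
                out.append(body)
            else:
                break  # end of hunk body
            i += 1
    out.extend(lines[cursor:])  # copy unchanged suffix
    return "\n".join(out)
```

With these helpers, a diff-style reply could be applied as `apply_unified_diff(original_source, extract_reply(text))`; the actual edit-prompt wording would need to match whatever format the model was trained on.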
Downloads last month: 189

Model size: 18.7M params (F32, Safetensors)

Model tree for Sharjeelbaig/neurocoder

Adapters: 1 model