(Note: From brief testing, this Alt version generated noticeably better code.)
This is an alternate version of DeepMagic-Coder-7b, which can be found below.
This version uses a different config setup, with the actual base model of the two merged models set as the "base_model". Test both for yourself and see which is better at coding. Benchmarks coming soon.
The config can be found below:
```yaml
models:
  - model: deepseek-ai_deepseek-coder-6.7b-instruct
    parameters:
      weight: 1
  - model: ise-uiuc_Magicoder-S-DS-6.7B
    parameters:
      weight: 1
merge_method: task_arithmetic
base_model: deepseek-ai_deepseek-coder-6.7b-base
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
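As a rough intuition for what `task_arithmetic` does, here is a minimal sketch on plain Python lists standing in for weight tensors. This is an illustrative assumption about the method's math (each fine-tune contributes a "task vector", i.e. its delta from the base model, and the weighted deltas are added back onto the base), not mergekit's actual implementation; the model names in the comments map to the config above.

```python
def task_arithmetic(base, models, weights, normalize=True):
    """Merge per-parameter lists: base + sum(w_i * (m_i - base)).

    With normalize=True the combined delta is divided by the total
    weight (an assumption about how `normalize: true` behaves).
    """
    total = sum(weights) if normalize and sum(weights) != 0 else 1.0
    merged = []
    for i, b in enumerate(base):
        delta = sum(w * (m[i] - b) for m, w in zip(models, weights))
        merged.append(b + delta / total)
    return merged

base = [1.0, 2.0]       # stand-in for deepseek-coder-6.7b-base weights
instruct = [1.5, 2.5]   # stand-in for deepseek-coder-6.7b-instruct
magicoder = [0.5, 2.0]  # stand-in for Magicoder-S-DS-6.7B

print(task_arithmetic(base, [instruct, magicoder], [1, 1]))  # [1.0, 2.25]
```

Note how the two deltas on the first parameter (+0.5 and -0.5) cancel out, while the second parameter moves toward the instruct model: this averaging of task vectors is the core of the merge.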

Install from pip and serve model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "rombodawg/DeepMagic-Coder-7b-Alt"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "rombodawg/DeepMagic-Coder-7b-Alt",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
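The same OpenAI-compatible endpoint can be called from Python. This is a minimal sketch using only the standard library, assuming the server is running at vLLM's default address from the commands above; `build_request` and `complete` are hypothetical helper names, not part of vLLM.

```python
import json
import urllib.request

API_URL = "http://localhost:8000/v1/completions"  # default vLLM address

def build_request(prompt, model="rombodawg/DeepMagic-Coder-7b-Alt",
                  max_tokens=512, temperature=0.5):
    """Build the same OpenAI-style completion request as the curl example."""
    payload = {
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def complete(prompt):
    """Send the request and return the generated text (needs a running server)."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.load(resp)["choices"][0]["text"]
```

With the server up, `complete("Once upon a time,")` returns the model's continuation as a string.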