# ocm-coder

LoRA adapters fine-tuned on the Open Component Model (OCM) and OCI specification ecosystem.

Base model: `mlx-community/Qwen2.5-Coder-32B-Instruct-4bit`

## Usage
```python
from mlx_lm import load, generate

# Load the 4-bit base model and apply the LoRA adapters.
model, tokenizer = load(
    "mlx-community/Qwen2.5-Coder-32B-Instruct-4bit",
    adapter_path="piotrjanik/ocm-coder",
)

prompt = "Write an OCM component descriptor for a simple OCI image resource."
response = generate(model, tokenizer, prompt=prompt, max_tokens=512)
print(response)
```
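The adapters can also be exercised without writing any Python, via the `mlx_lm.generate` command-line entry point (a sketch; the example prompt is illustrative, and the flags assume a current mlx-lm release):

```shell
# Run one-off generation with the base model plus the LoRA adapters.
python -m mlx_lm.generate \
  --model mlx-community/Qwen2.5-Coder-32B-Instruct-4bit \
  --adapter-path piotrjanik/ocm-coder \
  --prompt "Explain the purpose of an OCM component version." \
  --max-tokens 256
```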
## Model tree for piotrjanik/ocm-coder

- Base model: `Qwen/Qwen2.5-32B`
- Fine-tuned: `Qwen/Qwen2.5-Coder-32B`
- Fine-tuned: `Qwen/Qwen2.5-Coder-32B-Instruct`
- Quantized: `mlx-community/Qwen2.5-Coder-32B-Instruct-4bit`
- Adapters: `piotrjanik/ocm-coder` (this model)