# Model Credits - Microcoder-1.5B

## Base Model
This fine-tuned model is built upon Qwen 2.5 Coder 1.5B Instruct, created and maintained by Alibaba Cloud.
### Original Model Information
- Model Name: Qwen 2.5 Coder 1.5B Instruct
- Creator: Alibaba Cloud
- Repository: Qwen Hugging Face
- License: Apache 2.0
The Qwen 2.5 Coder series represents a significant advancement in code generation models, optimized for programming tasks and instruction following.
## Model Redistribution
We acknowledge Unsloth for their role in redistributing and optimizing the base model, making it more accessible to the community.
- Organization: Unsloth
- Website: Unsloth.ai
## Fine-Tuned Model (Microcoder-1.5B)
- License: BSD-3-Clause
- Status: Fine-tuned version of the base model with specialized training and optimizations
## License Summary
| Component | License |
|---|---|
| Base Model (Qwen 2.5 Coder 1.5B) | Apache 2.0 |
| Fine-tuned Model (Microcoder-1.5B) | BSD-3-Clause |
## Dataset Credits
For detailed information about the datasets used in the fine-tuning process, please refer to DATASET_CREDITS.md.
## Attribution
When using Microcoder-1.5B, please provide appropriate attribution to:
- Alibaba Cloud - for the original Qwen 2.5 Coder model
- Unsloth - for model redistribution and optimization
- Microcoder Contributors - for the fine-tuning and improvements
## Citation
If you use this model in your research or projects, please consider citing:
```bibtex
@misc{microcoder2026,
  title={Microcoder-1.5B: A Fine-tuned Code Generation Model},
  author={pedrodev2026},
  year={2026},
  url={https://huggingface.co/pedrodev2026/microcoder-1.5b}
}
```
Please also cite the original Qwen model:
```bibtex
@article{hui2024qwen2,
  title={Qwen2.5-Coder Technical Report},
  author={Hui, Binyuan and Yang, Jian and Cui, Zeyu and Yang, Jiaxi and Liu, Dayiheng and Zhang, Lei and Liu, Tianyu and Zhang, Jiajun and Yu, Bowen and Dang, Kai and others},
  journal={arXiv preprint arXiv:2409.12186},
  year={2024}
}
```
- Last Updated: 2026
- Model Version: 1.5B