Qwen2.5-Coder Technical Report
Paper: arXiv:2409.12186
## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("toolevalxm/qwen2.5-7b-coder_spec")
tokenizer = AutoTokenizer.from_pretrained("toolevalxm/qwen2.5-7b-coder_spec")
```

## Base Model

This model is fine-tuned from Qwen/Qwen2.5-Coder-7B.

## Citation

If you use this model, please cite both the base Qwen2.5-Coder model and our Annoy project paper.

```bibtex
@article{hui2024qwen2,
  title={Qwen2.5-Coder Technical Report},
  author={Hui, Binyuan and Yang, Jian and Cui, Zeyu and Yang, Jiaxi and Liu, Dayiheng and Zhang, Lei and Liu, Tianyu and Zhang, Jiajun and Yu, Bowen and Dang, Kai and others},
  journal={arXiv preprint arXiv:2409.12186},
  year={2024}
}
```
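Since the base model belongs to the Qwen2.5-Coder family, this checkpoint likely inherits fill-in-the-middle (FIM) completion. A minimal sketch of the FIM prompt layout, assuming this checkpoint keeps the upstream Qwen2.5-Coder special tokens (`<|fim_prefix|>`, `<|fim_suffix|>`, `<|fim_middle|>`) — verify them against this model's tokenizer before relying on them:

```python
# Hedged sketch: build a fill-in-the-middle prompt string for a
# Qwen2.5-Coder-family model. The special-token names follow the upstream
# Qwen2.5-Coder convention and are an assumption for this fine-tune.
prefix = "def fib(n):\n    a, b = 0, 1\n    "
suffix = "\n    return a"

# The model is asked to generate the code that belongs between prefix and suffix.
fim_prompt = f"<|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>"
print(fim_prompt)
```

The string can then be passed to `tokenizer` and `model.generate` exactly as in the usage snippet above; the model's continuation after `<|fim_middle|>` is the infilled code.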
## License

The license for this model is listed as "other".