---
library_name: transformers
license: other
base_model: Qwen/Qwen2.5-Coder-32B-Instruct
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: MAS-GPT-32B
  results: []
---

# MAS-GPT-32B

This model generates query-specific LLM-based multi-agent systems. It is fine-tuned from [Qwen/Qwen2.5-Coder-32B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-32B-Instruct).

See our paper [MAS-GPT: Training LLMs to Build LLM-based Multi-Agent Systems](https://arxiv.org/pdf/2503.03686).

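## Usage

The card ships no inference snippet, so below is a minimal sketch using the standard `transformers` chat API. The repo id `MAS-GPT/MAS-GPT-32B` and the prompt wording are assumptions, not the official MAS-GPT input template; consult the paper's repository for the exact format the model was trained on.

```python
# Minimal sketch: ask MAS-GPT-32B to generate a multi-agent system for a query.
# NOTE: the repo id and prompt wording are assumptions, not official.
MODEL_ID = "MAS-GPT/MAS-GPT-32B"  # assumed Hugging Face repo id

def build_messages(query: str) -> list[dict]:
    """Wrap a user query in the chat format used by instruct models."""
    return [{
        "role": "user",
        "content": (
            "Design an executable LLM-based multi-agent system "
            f"to solve this query: {query}"
        ),
    }]

if __name__ == "__main__":
    # Heavyweight imports are deferred so build_messages stays importable
    # without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages("Prove that the square root of 2 is irrational."),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=2048)
    # Decode only the newly generated tokens (the produced MAS code).
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The model outputs a multi-agent system as code, so a coder base model and a long `max_new_tokens` budget are natural choices here.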
|
## Citation

```
@article{ye2025mas,
  title={MAS-GPT: Training LLMs to build LLM-based multi-agent systems},
  author={Ye, Rui and Tang, Shuo and Ge, Rui and Du, Yaxin and Yin, Zhenfei and Chen, Siheng and Shao, Jing},
  journal={arXiv preprint arXiv:2503.03686},
  year={2025}
}
```