---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- diffusion-models
- reinforcement-learning
- math-reasoning
- code-generation
- reasoning
---
# Revolutionizing Reinforcement Learning Framework for Diffusion Large Language Models Datasets

This repository contains datasets used in the paper [Revolutionizing Reinforcement Learning Framework for Diffusion Large Language Models](https://huggingface.co/papers/2509.06949). These datasets are crucial for building, training, and deploying Diffusion Large Language Models (DLMs) within the TraceRL framework, particularly for improving reasoning performance on complex math and coding tasks.

- **Paper:** [Revolutionizing Reinforcement Learning Framework for Diffusion Large Language Models](https://huggingface.co/papers/2509.06949)
- **Code (GitHub):** [https://github.com/Gen-Verse/dLLM-RL](https://github.com/Gen-Verse/dLLM-RL)
- **Project Page (Hugging Face Collection):** [https://huggingface.co/collections/Gen-Verse/trado-series-68beb6cd6a26c27cde9fe3af](https://huggingface.co/collections/Gen-Verse/trado-series-68beb6cd6a26c27cde9fe3af)
## Sample Usage (Data Download)

Navigate to the `./data` directory of the associated GitHub repository to download datasets for evaluation and training. That directory also contains detailed instructions on adapting your own dataset.

For example, to download the `MATH500` and `MATH_train` datasets:
```bash
cd data
python download_data.py --dataset MATH500
python download_data.py --dataset MATH_train
cd ..
```
## Citation

If you use these datasets in your research, please cite the associated paper:

```bibtex
@article{wang2025trado,
  title={Revolutionizing Reinforcement Learning Framework for Diffusion Large Language Models},
  author={Wang, Yinjie and Yang, Ling and Li, Bowen and Tian, Ye and Shen, Ke and Wang, Mengdi},
  journal={arXiv preprint arXiv:2509.06949},
  year={2025}
}
```