---
pipeline_tag: text-to-3d
---
# AssetFormer: Modular 3D Assets Generation with Autoregressive Transformer
AssetFormer is an autoregressive Transformer-based model that generates modular 3D assets from textual descriptions. By adapting module sequencing and decoding techniques inspired by language models, the framework improves the quality of generated 3D assets, which are composed of primitives that adhere to constrained design parameters.
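The autoregressive formulation can be illustrated with a minimal decoding loop: modules are emitted one token at a time, each conditioned on the tokens generated so far. The vocabulary, scoring function, and stopping logic below are hypothetical placeholders for illustration, not AssetFormer's actual interface.

```python
import random

# Hypothetical module vocabulary; AssetFormer's real token space differs.
VOCAB = ["<bos>", "wall", "roof", "door", "window", "<eos>"]

def fake_logits(prefix):
    """Stand-in for the Transformer forward pass: returns a score per token.
    Here we simply bias toward emitting <eos> after a few modules."""
    scores = {tok: random.random() for tok in VOCAB}
    scores["<bos>"] = float("-inf")       # never re-emit the start token
    if len(prefix) >= 5:
        scores["<eos>"] += 10.0           # encourage termination
    return scores

def sample_modules(max_len=8, seed=0):
    """Greedy autoregressive decoding over the module vocabulary."""
    random.seed(seed)
    seq = ["<bos>"]
    while len(seq) < max_len:
        scores = fake_logits(seq)
        nxt = max(scores, key=scores.get)  # greedy: pick the highest score
        seq.append(nxt)
        if nxt == "<eos>":
            break
    return seq

print(sample_modules())
```

In the real model, `fake_logits` would be the Transformer's next-token distribution, and sampling strategies from language modeling (temperature, top-k, etc.) can be applied in place of the greedy step.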
- Paper: AssetFormer: Modular 3D Assets Generation with Autoregressive Transformer
- Repository: https://github.com/Advocate99/AssetFormer
## Installation
To get started, clone the official repository and install the dependencies:
```shell
git clone https://github.com/Advocate99/AssetFormer.git
cd AssetFormer
conda create -n assetformer python=3.12
conda activate assetformer
pip install -r requirements.txt
```
## Preparation
1. Download the `flan-t5-xl` models and place them in the `./pretrained_models/t5-ckpt/` folder:

   ```shell
   huggingface-cli download google/flan-t5-xl --local-dir ./pretrained_models/t5-ckpt/flan-t5-xl
   ```

2. Download `inference_model.pt` from this Hugging Face repository and place it in the `./pretrained_models/` directory.
## Inference
Run the following command to sample 3D assets as JSON files:
```shell
python sample.py --gpt-ckpt ./pretrained_models/inference_model.pt
```
After sampling, you can use the Blender scripts provided in the official repository (`./blender_script/`) to render the 3D assets from the modular FBX files.
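As a quick sanity check before rendering, you can inspect a sampled JSON file and list the modules it references. The schema below (a list of module entries with an FBX name and a transform) is an assumption for illustration only; consult the repository for the actual output format produced by `sample.py`.

```python
import json

# Hypothetical asset JSON; the real schema emitted by sample.py may differ.
sample = """
{
  "modules": [
    {"fbx": "wall_01", "position": [0.0, 0.0, 0.0], "rotation": 0.0},
    {"fbx": "roof_02", "position": [0.0, 3.0, 0.0], "rotation": 90.0}
  ]
}
"""

def summarize_asset(text):
    """Parse an asset JSON string and describe each module's placement."""
    asset = json.loads(text)
    lines = []
    for m in asset["modules"]:
        x, y, z = m["position"]
        lines.append(f"{m['fbx']}: pos=({x}, {y}, {z}), rot={m['rotation']}")
    return lines

for line in summarize_asset(sample):
    print(line)
```

A check like this makes it easy to confirm that every referenced FBX module exists on disk before launching a Blender render job.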
## Citation
If you find this work useful, please kindly cite:
```bibtex
@article{zhu2026assetformer,
  title={AssetFormer: Modular 3D Assets Generation with Autoregressive Transformer},
  author={Zhu, Lingting and Qian, Shengju and Fan, Haidi and Dong, Jiayu and Jin, Zhenchao and Zhou, Siwei and Dong, Gen and Wang, Xin and Yu, Lequan},
  journal={arXiv preprint arXiv:2602.12100},
  year={2026}
}
```
## Acknowledgement
The codebase is built on top of LlamaGen.