---
license: mit
---
**Quantized GGUF version of SCAIL-Preview.**
**Original model link:** [https://huggingface.co/zai-org/SCAIL-Preview](https://huggingface.co/zai-org/SCAIL-Preview)
**Watch us on YouTube:** [@VantageWithAI](https://www.youtube.com/@vantagewithai)
# SCAIL: Towards Studio-Grade Character Animation via In-Context Learning of 3D-Consistent Pose Representations
<div align="center">
<a href='https://arxiv.org/abs/2512.05905'><img src='https://img.shields.io/badge/arXiv-2512.05905-red'></a>
<a href='https://teal024.github.io/SCAIL/'><img src='https://img.shields.io/badge/Project%20Page-green'></a>
</div>
This repository contains the model weights for **SCAIL (Studio-Grade Character Animation via In-Context Learning)**, a framework that enables high-fidelity character animation under diverse and challenging conditions, including large motion variations, stylized characters, and multi-character interactions.
## Project Page
Check out our model architecture design, video demo, and comparisons against other baselines at [this link](https://teal024.github.io/SCAIL/). More creative examples will be added to the gallery soon.
## TODOs
- [x] **Model Weights for Preview 14B SCAIL Model (512p)**
- [ ] **Model Weights for Official 1.3B/14B SCAIL Model (720p with history support)**
## Note
This repository contains only the model weights for the SCAIL model. For model inference, please refer to the [official repository](https://github.com/teal024/SCAIL-Official); for pose extraction, please refer to [SCAIL-Pose](https://github.com/teal024/SCAIL-Pose).
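Since inference code lives in the official repository, this card only ships weights. To sanity-check a downloaded file locally, the `gguf` Python package (`pip install gguf`) can read its metadata and tensor layout; a minimal sketch, assuming a hypothetical filename for the quantized file:

```python
from gguf import GGUFReader  # pip install gguf

# Hypothetical local filename; substitute the actual .gguf file from this repo.
reader = GGUFReader("SCAIL-Preview-Q4_K_M.gguf")

# Print a few tensors with their shapes and quantization formats
# (e.g. Q4_K, Q8_0) to confirm the file parsed correctly.
for tensor in reader.tensors[:10]:
    print(tensor.name, list(tensor.shape), tensor.tensor_type.name)
```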
## Citation
If you find this work useful in your research, please cite:
```bibtex
@article{yan2025scail,
  title={SCAIL: Towards Studio-Grade Character Animation via In-Context Learning of 3D-Consistent Pose Representations},
  author={Yan, Wenhao and Ye, Sheng and Yang, Zhuoyi and Teng, Jiayan and Dong, ZhenHui and Wen, Kairui and Gu, Xiaotao and Liu, Yong-Jin and Tang, Jie},
  journal={arXiv preprint arXiv:2512.05905},
  year={2025}
}
```