---
license: apache-2.0
library_name: transformers
pipeline_tag: image-text-to-text
---

# Model Card for AtomThinkPRM

AtomThinkPRM is fine-tuned from math-psa with atomic step execution and can be used for process supervision of multimodal reasoning chains. It is part of the **AtomThink** framework, which introduces "slow thinking" into multimodal large language models (MLLMs).

- **Paper:** [AtomThink: Multimodal Slow Thinking with Atomic Step Reasoning](https://huggingface.co/papers/2411.11930)
- **Repository:** [https://github.com/Kun-Xiang/AtomThink](https://github.com/Kun-Xiang/AtomThink)

## Description

AtomThink incorporates the notion of "slow thinking" into MLLMs, allowing models to adaptively use different levels of reasoning for questions of varying complexity. It proposes a novel paradigm of Self-structured Chain of Thought (SCoT), which consists of minimal semantic atomic steps.

AtomThinkPRM is designed for process supervision, enabling the evaluation of single-step reasoning quality within these multimodal chains.
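
The sketch below illustrates one way such a process reward model can be queried with `transformers`: each atomic step in a candidate chain is scored by the probability the model assigns to a "good step" token at a step-tag position. The repository id, the step tag (`ки`), the `+`/`-` scoring tokens, and the prompt layout are assumptions borrowed from the Math-Shepherd-style convention followed by math-psa in OpenR, and the sketch scores text-only steps; the exact input format expected by this checkpoint (including how image context is injected) is documented in the AtomThink repository, so treat this as a sketch rather than the official usage.

```python
# Minimal sketch: score each atomic reasoning step with the PRM.
# ASSUMPTIONS (not confirmed by this card): Math-Shepherd-style formatting,
# i.e. steps separated by the tag "ки" and step quality read out as P("+") vs P("-")
# at each tag position. Replace model_id with the actual Hub repo id or a local path.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "path/to/AtomThinkPRM"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
).eval()

STEP_TAG, GOOD, BAD = "ки", "+", "-"  # assumed step tag and scoring tokens
good_id = tokenizer.encode(GOOD, add_special_tokens=False)[-1]
bad_id = tokenizer.encode(BAD, add_special_tokens=False)[-1]
tag_id = tokenizer.encode(STEP_TAG, add_special_tokens=False)[-1]

question = "The right triangle in the image has legs of length 3 and 4. Find the hypotenuse."
steps = [
    "Step 1: Apply the Pythagorean theorem: c^2 = 3^2 + 4^2.",
    "Step 2: c^2 = 9 + 16 = 25, so c = 5.",
]

# Concatenate question and steps, marking the end of every atomic step with the tag.
text = question + " " + " ".join(f"{s} {STEP_TAG}" for s in steps)
inputs = tokenizer(text, return_tensors="pt").to(model.device)

with torch.no_grad():
    logits = model(**inputs).logits  # (1, seq_len, vocab_size)

# Read the good/bad logits at every tag position and normalize them to a step score.
tag_positions = (inputs.input_ids[0] == tag_id).nonzero(as_tuple=True)[0]
step_scores = logits[0, tag_positions][:, [good_id, bad_id]].softmax(dim=-1)[:, 0]

for step, score in zip(steps, step_scores.tolist()):
    print(f"{score:.3f}  {step}")
```

Per-step scores like these are typically aggregated (for example, by taking the minimum or the product over steps) to rank candidate chains during best-of-N or tree-search-style selection.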

## Citation

If you use this model in your research, please cite:

```bibtex
@article{xiang2024atomthink,
  title={AtomThink: A Slow Thinking Framework for Multimodal Mathematical Reasoning},
  author={Xiang, Kun and Liu, Zhili and Jiang, Zihao and Nie, Yunshuang and Huang, Runhui and Fan, Haoxiang and Li, Hanhui and Huang, Weiran and Zeng, Yihan and Han, Jianhua and others},
  journal={arXiv preprint arXiv:2411.11930},
  year={2024}
}

@article{wang2024openr,
  title={OpenR: An Open Source Framework for Advanced Reasoning with Large Language Models},
  author={Wang, Jun and Fang, Meng and Wan, Ziyu and Wen, Muning and Zhu, Jiachen and Liu, Anjie and Gong, Ziqin and Song, Yan and Chen, Lei and Ni, Lionel M and others},
  journal={arXiv preprint arXiv:2410.09671},
  year={2024}
}
```

## License

The checkpoint is released under the Apache 2.0 license. Please ensure proper attribution when using this checkpoint. |