---
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/OpenOneRec/OneRec-8B/blob/main/LICENSE
---
<div align="center">
<h1>OpenOneRec</h1>
<p align="center">
<strong>An Open Foundation Model and Benchmark to Accelerate Generative Recommendation</strong>
</p>
<p align="center">
<a href="https://huggingface.co/OpenOneRec">
<img alt="Hugging Face" src="https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-OneRec-ffc107?color=ffc107&logoColor=white" />
</a>
<a href="https://github.com/Kuaishou-OneRec/OpenOneRec">
<img alt="GitHub Code" src="https://img.shields.io/badge/GitHub-OpenOneRec-black?logo=github" />
</a>
<a href="https://arxiv.org/pdf/2512.24762">
<img alt="Paper" src="https://img.shields.io/badge/Paper-ArXiv-b31b1b?logo=arxiv" />
</a>
<a href="#license">
<img alt="License" src="https://img.shields.io/badge/License-Apache%202.0-green" />
</a>
</p>
</div>
<br>
## 📖 OneRec-Foundation-Pretrain Models
This repository provides the pre-trained weights of the OneRec-Foundation series, which has undergone Itemic-Text Alignment and Full-Parameter Co-Pretraining.
We release this checkpoint so users can perform customized post-training or alignment for their own downstream tasks and datasets, giving greater flexibility for specialized research.
For technical details on the pre-training architecture, please refer to our [Technical Report](https://arxiv.org/pdf/2512.24762).
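
A minimal loading sketch with the `transformers` library is shown below. It assumes the checkpoint is compatible with the standard auto classes; the repository id `OpenOneRec/OneRec-8B` is taken from the license link above, and flags such as `trust_remote_code` may be required depending on the model architecture.

```python
# Minimal loading sketch (assumption: the checkpoint works with the standard
# transformers auto classes; adjust the repo id and flags for your setup).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OpenOneRec/OneRec-8B"  # repo id taken from the license link above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # place weights on available accelerators
    # trust_remote_code=True,  # uncomment if the repo ships custom model code
)

# These weights are a starting point for post-training: plug the model into
# your preferred fine-tuning or alignment pipeline from here.
```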