---
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/OpenOneRec/OneRec-8B/blob/main/LICENSE
---
## 📖 OneRec-Foundation-Pretrain Models
This repository provides the pre-trained weights of the OneRec-Foundation series, which have undergone Itemic-Text Alignment and Full-Parameter Co-Pretraining.
We release this checkpoint so that users can perform customized post-training or alignment tailored to their specific downstream tasks and datasets, giving greater flexibility for specialized research.
For technical details on the pre-training architecture, please refer to our Technical Report.
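As a starting point for such post-training, the checkpoint can be loaded with the `transformers` library. This is a minimal sketch: the repository id is taken from the license link above, while the use of `AutoModelForCausalLM` and `trust_remote_code=True` are assumptions, since the model card does not state the concrete architecture class.

```python
# Minimal loading sketch for the pre-trained checkpoint.
# Assumptions: the repo id below, the AutoModelForCausalLM class, and
# trust_remote_code=True are not confirmed by this model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO_ID = "OpenOneRec/OneRec-8B"  # assumed from the license link above


def load_onerec(repo_id: str = REPO_ID):
    """Return the tokenizer and pre-trained model for further post-training."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)
    return tokenizer, model


# tokenizer, model = load_onerec()  # requires network access and the full 8B weights
```

From here, the returned `model` can be fine-tuned with standard `transformers` training utilities (e.g. `Trainer`) on task-specific data.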