---
base_model: openai-community/gpt2
datasets: HuggingFaceFW/fineweb-edu
library_name: transformers
model_name: gpt2-RMT-2
tags:
- generated_from_trainer
- open-r1
- trl
- sft
license: apache-2.0
---
# Model Card for gpt2-RMT-2
This model is a fine-tuned version of [openai-community/gpt2](https://huggingface.co/openai-community/gpt2) on the [HuggingFaceFW/fineweb-edu](https://huggingface.co/datasets/HuggingFaceFW/fineweb-edu) dataset.
It has been trained using [TRL](https://github.com/huggingface/trl).
To use this model, you first need to clone [KotShinZ/Recurrent-Memory-Transformer_PreTrained](https://github.com/KotShinZ/Recurrent-Memory-Transformer_PreTrained) (`git clone https://github.com/KotShinZ/Recurrent-Memory-Transformer_PreTrained.git`).
## Quick start
```python
from transformers import pipeline

question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"

# Load the fine-tuned checkpoint and generate a response on the GPU.
generator = pipeline("text-generation", model="KotshinZ/gpt2-RMT-2", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/shin2021001-osaka-city-university/huggingface/runs/5xyxy6y1)
This model was trained with SFT.
It uses a Recurrent Memory Transformer configuration with `memory_size = 10` and `n_backward = 2`.
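
The sketch below illustrates what these parameters typically mean in a Recurrent Memory Transformer: `memory_size` memory tokens are prepended to each segment and carried forward across segments, while `n_backward` presumably caps how many past segments gradients flow through during training. This is a conceptual sketch only; the function and variable names are illustrative and do not reflect the actual API of the linked repository.

```python
# Conceptual sketch (assumed behaviour, not the repository's actual API):
# a Recurrent Memory Transformer prepends `memory_size` memory tokens to each
# input segment and carries the updated memory on to the next segment.
import torch
from transformers import GPT2Model

memory_size = 10      # number of memory tokens, as reported for this model
segment_len = 1024    # GPT-2 context window
backbone = GPT2Model.from_pretrained("openai-community/gpt2")
hidden_size = backbone.config.hidden_size

# In the real model the memory would be a learned parameter; zeros here keep
# the sketch self-contained.
memory = torch.zeros(1, memory_size, hidden_size)

def process_segment(segment_embeds, memory):
    """Run one segment with memory tokens prepended and return the new memory."""
    inputs = torch.cat([memory, segment_embeds], dim=1)
    hidden = backbone(inputs_embeds=inputs).last_hidden_state
    new_memory = hidden[:, :memory_size]   # updated memory for the next segment
    outputs = hidden[:, memory_size:]      # hidden states for this segment
    return outputs, new_memory

# `n_backward = 2` presumably limits backpropagation through the memory to the
# two most recent segments, e.g. by detaching `memory` beyond that horizon.
```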
### Framework versions
- TRL: 0.15.2
- Transformers: 4.50.0.dev0
- Pytorch: 2.5.1
- Datasets: 3.3.2
- Tokenizers: 0.21.0
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
    title        = {{TRL: Transformer Reinforcement Learning}},
    author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
    year         = 2020,
    journal      = {GitHub repository},
    publisher    = {GitHub},
    howpublished = {\url{https://github.com/huggingface/trl}}
}
```