---
license: cc0-1.0
---
This model was trained entirely on historical data up to the cutoff date 31-12-2011. The training data comes from the WMT News Crawl dataset (https://data.statmt.org/news-crawl/en/) and Wikipedia. The exact training dataset for this model is available on Hugging Face at: "TiMa/TiMaGPT2-2011".
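A minimal usage sketch with the Hugging Face `transformers` library. Note that the repository id below mirrors the dataset path above and is an assumption; substitute the actual model repository id if it differs.

```python
# Minimal usage sketch with Hugging Face transformers.
# NOTE: the repo id "TiMa/TiMaGPT2-2011" mirrors the dataset path above
# and is an assumption; replace it with the actual model repository id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TiMa/TiMaGPT2-2011"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a continuation; the model has only seen text up to 31-12-2011.
inputs = tokenizer("In 2011, the", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because training stops at the cutoff date, generations should reflect only pre-2012 knowledge, which is the point of the Time Machine GPT series.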
Please refer to and cite the following paper when using this model in any downstream applications:

```bibtex
@inproceedings{drinkall-tima-2024,
    title = "Time Machine GPT",
    author = "Drinkall, Felix and
      Zohren, Stefan and
      Pierrehumbert, Janet",
    booktitle = "Findings of the Association for Computational Linguistics: NAACL 2024",
    month = jun,
    year = "2024",
    publisher = "Association for Computational Linguistics"
}
```