---
license: cc0-1.0
---

This model is trained entirely on historical data up to the cutoff date of 31-12-2012. The training data comes from the WMT News Crawl dataset (https://data.statmt.org/news-crawl/en/) and Wikipedia. The exact training dataset for this model is available on Hugging Face at "TiMa/TiMaGPT2-2012".
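
The snippet below is a minimal usage sketch with the `transformers` library. It assumes the model weights are hosted under the same repository ID as the training dataset ("TiMa/TiMaGPT2-2012") and that the standard GPT-2 auto classes apply; adjust the repository ID if the model lives under a different hub path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository ID is an assumption; change it if the model is hosted elsewhere.
repo_id = "TiMa/TiMaGPT2-2012"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Generate a short continuation with the time-bounded model.
inputs = tokenizer("In 2012, the news reported that", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```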

Please refer to and cite the following paper when using this model in any downstream applications:

```bibtex
@inproceedings{drinkall-tima-2024,
    title = "Time Machine GPT",
    author = "Drinkall, Felix and Zohren, Stefan and Pierrehumbert, Janet",
    booktitle = "Findings of the Association for Computational Linguistics: NAACL 2024",
    month = jun,
    year = "2024",
    publisher = "Association for Computational Linguistics"
}
```