---
language: en
tags:
- gpt2
---
# CRiPT Model Large (Critical Thinking Intermediarily Pretrained Transformer)

Large version of the trained model (`SYL01-2020-10-24-72K/gpt2-large-train03-72K`) presented in the paper "Critical Thinking for Language Models" (Betz, Voigt and Richardson 2020). See also:

* [blog entry](https://debatelab.github.io/journal/critical-thinking-language-models.html)
* [GitHub repo](https://github.com/debatelab/aacorpus)
* [paper](https://arxiv.org/pdf/2009.07185)
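
Since this is a GPT-2 checkpoint, it can be loaded with the `transformers` library. The snippet below is a minimal sketch, assuming the checkpoint is available on the Hugging Face Hub; `MODEL_ID` is a placeholder to be replaced with this model's actual repo id, and the prompt is only an illustrative example.

```python
# Minimal loading/generation sketch (assumption: checkpoint is hosted on the Hugging Face Hub).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "gpt2-large"  # placeholder; substitute the repo id of this CRiPT large checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Illustrative prompt only; see the paper and repo for the actual argument schemes used in training.
prompt = "If all philosophers are mortal and Socrates is a philosopher, then"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```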