---
license: mit
datasets:
  - wikitext
language:
  - en
metrics:
  - accuracy
library_name: transformers
tags:
  - general
---

GPTQ-quantized GPT-2 Large model.

Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. Unlike its predecessor GPT-1, which was pre-trained on BookCorpus (a dataset of over 7,000 unpublished fiction books from various genres), GPT-2 was trained on WebText, a dataset of 8 million web pages.
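
Below is a minimal usage sketch with the transformers library, assuming the repository id matches this model card's path (pavfi-at-m/gpt2-largeGPTQ) and that the checkpoint loads through the standard AutoModelForCausalLM API; GPTQ checkpoints typically also require a GPTQ backend such as optimum with auto-gptq installed.

```python
# Minimal usage sketch; not the author's verified loading procedure.
# The repo id below is inferred from this model card's path.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "pavfi-at-m/gpt2-largeGPTQ"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "GPT-2 is a language model that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```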