---
language: el
license: cc-by-sa-4.0
datasets:
- cc100
widget:
- text: "Αυτό είναι ένα"
- text: "Ανοιξα την"
- text: "Ευχαριστώ για το"
- text: "Έχει πολύ καιρό που δεν έχουμε"
---

# Greek GPT2 model (Uncased)

## Prerequisites

- `transformers==4.19.2`
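
You can check the installed version from Python (a minimal sanity check; nearby 4.x releases will most likely work as well):

```python
# Minimal check of the installed transformers version
# (the card pins 4.19.2; nearby 4.x releases will most likely work as well).
import transformers

print(transformers.__version__)  # expected: 4.19.2
```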

## Model architecture

This model has approximately half as many parameters as the GPT2 base model.
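
To verify this, you can inspect the configuration and count the parameters directly; the sketch below assumes the model id `ClassCat/gpt2-greek` from the Usage section:

```python
# Inspect the configuration and count parameters to see the reduced model size
# (assumes the model id "ClassCat/gpt2-greek" from the Usage section below).
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("ClassCat/gpt2-greek")
print(model.config)                                # hidden size, layers, attention heads, ...
print(sum(p.numel() for p in model.parameters()))  # total number of parameters
```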

## Tokenizer

The model uses a BPE tokenizer with a vocabulary size of 50,000.
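
A short sketch of loading the tokenizer and checking the vocabulary size (assuming the tokenizer is published under the same model id):

```python
# Load the BPE tokenizer and check its vocabulary size
# (assumes the tokenizer is published under the same model id).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ClassCat/gpt2-greek")
print(tokenizer.vocab_size)                  # expected: 50000
print(tokenizer.tokenize("Αυτό είναι ένα"))  # subword pieces produced by the BPE tokenizer
```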

## Training Data

The model was trained on the Greek portion of the CC-100 dataset.
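
For reference, the Greek portion of CC-100 can be loaded with the `datasets` library; this is an illustration only, not necessarily the exact preprocessing used for training:

```python
# Illustration only: load the Greek portion of CC-100 with the datasets library.
# This is not necessarily the exact preprocessing pipeline used for training.
from datasets import load_dataset

dataset = load_dataset("cc100", lang="el", split="train")
print(dataset[0]["text"])
```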

## Usage

```python
from transformers import pipeline

generator = pipeline('text-generation', model='ClassCat/gpt2-greek')
generator("Αυτό είναι ένα", max_length=50, num_return_sequences=5)
```