---
license: afl-3.0
language: uk
---

## GPT2 trained on Ukrainian news (work in progress)

### General info:

The model is not ready yet, but I'm working on it. It also has a relatively small context window, which limits how much text it can handle at once.
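
If you want to see the exact context length, it is stored in the model config. A minimal sketch, assuming the checkpoint uses the standard GPT-2 config fields:

```python
from transformers import AutoConfig

# Load only the config (no weights) to inspect the model's limits
config = AutoConfig.from_pretrained("kyryl0s/gpt2-uk-xxs")

# For GPT-2 style configs, n_positions is the maximum context length in tokens
print(config.n_positions)
```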
### Example of usage:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("kyryl0s/gpt2-uk-xxs")
model = AutoModelForCausalLM.from_pretrained("kyryl0s/gpt2-uk-xxs")

# Encode the prompt without adding special tokens
input_ids = tokenizer.encode("Путін — ", add_special_tokens=False, return_tensors='pt')

# Sample three continuations of up to 50 tokens each
outputs = model.generate(
    input_ids,
    do_sample=True,
    num_return_sequences=3,
    max_length=50
)

# Decode and print each generated sequence
for i, out in enumerate(outputs):
    print("{}: {}".format(i, tokenizer.decode(out)))
```
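
The same sampling can also be done through the `text-generation` pipeline. A minimal sketch using the same checkpoint and generation settings as above:

```python
from transformers import pipeline

# Wrap the checkpoint in a text-generation pipeline
generator = pipeline("text-generation", model="kyryl0s/gpt2-uk-xxs")

# Sample three continuations of up to 50 tokens, mirroring the example above
results = generator("Путін — ", do_sample=True, num_return_sequences=3, max_length=50)
for i, out in enumerate(results):
    print("{}: {}".format(i, out["generated_text"]))
```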