---
license: afl-3.0
language: uk
---
## GPT2 being trained on Ukrainian news.
### General info:
The model is not ready yet; I'm still working on it. It also has a relatively small context window, which makes its output rather limited for now.
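The card doesn't state the exact context window, but you can read it directly from the published config (for GPT-2 models it is stored in `n_positions`). A minimal sketch:
```python
from transformers import AutoConfig

# Load only the config (no weights) and read the maximum context length.
# GPT-2 configs store it as `n_positions`.
config = AutoConfig.from_pretrained("kyryl0s/gpt2-uk-xxs")
print("Context window:", config.n_positions)
```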
### Example of usage:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("kyryl0s/gpt2-uk-xxs")
model = AutoModelForCausalLM.from_pretrained("kyryl0s/gpt2-uk-xxs")

# Encode a Ukrainian prompt as a tensor of token ids
input_ids = tokenizer.encode("Путін — ", add_special_tokens=False, return_tensors='pt')

# Sample three continuations of up to 50 tokens each
outputs = model.generate(
    input_ids,
    do_sample=True,
    num_return_sequences=3,
    max_length=50,
)

# Decode and print each generated sequence
for i, out in enumerate(outputs):
    print("{}: {}".format(i, tokenizer.decode(out)))
```