---
language:
- ru
tags:
- PyTorch
- Transformers
thumbnail: "https://github.com/sberbank-ai/ru-gpts"
---
# rugpt3small\_based\_on\_gpt2
The model was trained by the [SberDevices](https://sberdevices.ru/) team with the Transformers library, using a sequence length of 1024, on 80B tokens for around 3 epochs. It was then fine-tuned with a 2048-token context.
Total training time was around one week on 32 GPUs.
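Below is a minimal usage sketch with 🤗 Transformers and PyTorch. The repo id `sberbank-ai/rugpt3small_based_on_gpt2` is an assumption inferred from the thumbnail URL above; adjust it if the model is hosted under a different id.

```python
# Minimal generation sketch for rugpt3small_based_on_gpt2.
# NOTE: the repo id below is assumed from the thumbnail URL, not confirmed by this card.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model_name = "sberbank-ai/rugpt3small_based_on_gpt2"
tokenizer = GPT2Tokenizer.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)

# Encode a short Russian prompt and sample a continuation.
text = "Александр Сергеевич Пушкин родился в "
input_ids = tokenizer.encode(text, return_tensors="pt")
with torch.no_grad():
    out = model.generate(
        input_ids,
        max_length=50,
        do_sample=True,
        top_k=50,
        top_p=0.95,
    )
print(tokenizer.decode(out[0], skip_special_tokens=True))
```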
# Authors
+ NLP core team RnD [Telegram channel](https://t.me/nlpcoreteam):
+ Dmitry Zmitrovich