rugpt3small_based_on_gpt2 safetensors variant
The model was trained by the SberDevices team using the `transformers` library, with a sequence length of 1024, on 80B tokens for around 3 epochs. It was then finetuned with a 2048-token context.
Total training took around one week on 32 GPUs.
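A minimal usage sketch with `transformers`, assuming the safetensors weights are published under the hypothetical repo id `Sashkanik13/rugpt3small_based_on_gpt2` (the repo id is not stated in this card):

```python
# Minimal generation sketch; the repo id below is an assumption, not confirmed by this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sashkanik13/rugpt3small_based_on_gpt2"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # picks up .safetensors weights when present

prompt = "Александр Сергеевич Пушкин родился в "
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        top_k=50,
        top_p=0.95,
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```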
Authors
- NLP core team RnD Telegram channel
- Dmitry Zmitrovich
- Safetensors variant by Sashkanik13