---
language:
  - ru
tags:
  - PyTorch
  - Transformers
thumbnail: https://github.com/sberbank-ai/ru-gpts
---

# rugpt3small_based_on_gpt2

The model was trained by the SberDevices team with the Transformers library, using a sequence length of 1024, on 80B tokens for around 3 epochs. It was then fine-tuned with a context length of 2048.

Total training time was around one week on 32 GPUs.

# Authors