ai-forever committed
Commit 8201db0
Parent: aa2b602

Update README.md

Files changed (1): README.md (+5 -1)
README.md CHANGED
@@ -11,4 +11,8 @@ thumbnail: "https://github.com/sberbank-ai/ru-gpts"
 The model was trained with sequence length 1024 using the transformers library by the [SberDevices](https://sberdevices.ru/) team on 80B tokens for 3 epochs. After that, the model was fine-tuned for 1 epoch with sequence length 2048.
 
 Total training time was around 14 days on 128 GPUs for the 1024 context and a few days on 16 GPUs for the 2048 context.
-Final perplexity on the test set is `13.6`.
+Final perplexity on the test set is `13.6`.
+
+# Authors
++ NLP core team RnD [Telegram channel](https://t.me/nlpcoreteam):
++ Dmitry Zmitrovich
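
For context, here is a minimal sketch of how a perplexity figure like the one quoted in the README could be reproduced with the transformers library it mentions. The checkpoint ID below is an assumption for illustration (this diff does not name the exact model), and a single short string stands in for the actual test set.

```python
# Minimal perplexity sketch. The checkpoint ID is an illustrative guess;
# the diff itself does not name the exact ru-gpts model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai-forever/rugpt3large_based_on_gpt2"  # assumed checkpoint ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

text = "Пример текста для оценки перплексии."  # stand-in for a real test set
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # With labels equal to input_ids, the model shifts them internally and
    # returns the mean cross-entropy over predicted tokens.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

# Perplexity is the exponential of the mean cross-entropy loss.
print(f"Perplexity: {torch.exp(loss).item():.2f}")
```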