burakaytan committed on
Commit
e14811b
1 Parent(s): a7b55dc

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -6,7 +6,7 @@ This is a Turkish RoBERTa base model pretrained on Turkish Wikipedia, Turkish OS
 
 The final training corpus has a size of 38 GB and 329.720.508 sentences.
 
-Thanks to Turkcell we could train the model on Intel(R) Xeon(R) Gold 6230R CPU @ 2.10GHz 256GB RAM 2 x GV100GL [Tesla V100 PCIe 32GB] GPU
+Thanks to Turkcell we could train the model on Intel(R) Xeon(R) Gold 6230R CPU @ 2.10GHz 256GB RAM 2 x GV100GL [Tesla V100 PCIe 32GB] GPU for 2.5M steps.
 
 # Usage
 Load transformers library with: