Sabine committed
Commit 85b19e1
1 Parent(s): 21a5828

Update README.md

Files changed (1): README.md +3 -3
README.md CHANGED
@@ -1,12 +1,12 @@
 # random-roberta-base
 
-We introduce random-roberta-base, which is a unpretrained version of RoBERTa model. The weight of random-roberta-base is randomly initiated and this can be particuarly useful when we aim to train a language model from scratch or benchmark the effect of pretraining.
+We introduce random-roberta-base, which is a unpretrained version of RoBERTa model. The weight of random-roberta-base is randomly initiated and this can be particularly useful when we aim to train a language model from scratch or benchmark the effect of pretraining.
 
-It's important to note that tokenizer of random-roberta-base is the same as roberta-base because it's not a trivial task to get a random tokenizer and it's less meanful compared to the random weight.
+It's important to note that tokenizer of random-roberta-base is the same as roberta-base because it's not a trivial task to get a random tokenizer and it's less meaningful compared to the random weight.
 
 A debatable advantage of pulling random-roberta-base from Huggingface is to avoid using random seed in order to obtain the same randomness at each time.
 
-The code to obtain a such random model:
+The code to obtain such a random model:
 
 ```python
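The README's own code block is truncated in this diff, so for context, a minimal sketch of what "a randomly initialized roberta-base" typically looks like with the `transformers` library is shown below. This is an assumption about the approach, not the repository's actual code: it builds a `RobertaForMaskedLM` from a bare config (random weights, no pretrained checkpoint) while reusing the pretrained `roberta-base` tokenizer, as the README text describes. The function name `get_random_roberta_base` is hypothetical.

```python
# Hypothetical sketch, not the repo's actual (truncated) code: a roberta-base-
# sized model with randomly initialized weights plus the pretrained tokenizer.
from transformers import RobertaConfig, RobertaForMaskedLM, RobertaTokenizer


def get_random_roberta_base():
    # roberta-base architecture; constructing the model from a config alone
    # initializes all weights randomly (no checkpoint is downloaded or loaded)
    config = RobertaConfig.from_pretrained("roberta-base")
    model = RobertaForMaskedLM(config)
    # the tokenizer is taken from roberta-base, as the README explains
    tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
    return model, tokenizer
```

Constructing the model from a `RobertaConfig` rather than `from_pretrained` is what makes the weights random, which is why pushing such a model to the Hub gives everyone the same "randomness" without fixing a seed.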