aajrami committed on
Commit
49568c4
1 Parent(s): 1020e7d

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -4,7 +4,7 @@ tags:
 license: cc-by-4.0
 ---
 ## bert-ascii-base
-is a BERT base Language Model pre-trained by predicting the summation of the **ascii** code values of the characters in a masked token as a pre-training objective. For more details about the pre-training objective and the pre-training hyperparameters, please refer to [How does the pre-training objective affect what large language models learn about linguistic properties?](https://arxiv.org/abs/2203.10415)
+is a BERT base Language Model pre-trained by predicting the summation of the **ASCII** code values of the characters in a masked token as a pre-training objective. For more details about the pre-training objective and the pre-training hyperparameters, please refer to [How does the pre-training objective affect what large language models learn about linguistic properties?](https://arxiv.org/abs/2203.10415)

 ## License
 CC BY 4.0
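
The changed line describes the pre-training target: the sum of the ASCII code values of the characters in a masked token. A minimal sketch of that target computation (the tokenization scheme and any target normalization actually used by bert-ascii-base are not specified here; see the linked paper for the real setup):

```python
def ascii_sum(token: str) -> int:
    """Sum of the ASCII code values of the characters in a token.

    Illustrative only: stands in for the regression target the README
    describes, not the model's actual training pipeline.
    """
    return sum(ord(ch) for ch in token)

print(ascii_sum("cat"))  # 99 + 97 + 116 = 312
```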