aajrami committed on
Commit
a830381
1 Parent(s): 51afe57

Update README.md

Files changed (1): README.md +1 -1
README.md CHANGED
@@ -4,7 +4,7 @@ tags:
 license: cc-by-4.0
 ---
 ## bert-ascii-base
-is a BERT base Language Model pre-trained by predicting the summation of the **ASCII** code values of the characters in a masked token as a pre-training objective. For more details about the pre-training objective and the pre-training hyperparameters, please refer to [How does the pre-training objective affect what large language models learn about linguistic properties?](https://aclanthology.org/2022.acl-short.16/)
+A BERT base Language Model pre-trained with the objective of predicting the sum of the **ASCII** code values of the characters in a masked token. For details on the pre-training objective and hyperparameters, please refer to [How does the pre-training objective affect what large language models learn about linguistic properties?](https://aclanthology.org/2022.acl-short.16/)
 
 ## License
 CC BY 4.0
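
The target described above (the sum of the ASCII code values of a masked token's characters) can be sketched in a few lines. This is a minimal illustration, not the model's actual training code; the function name `ascii_sum_target` is hypothetical.

```python
# Hypothetical sketch of the ASCII-sum pre-training target: for a
# masked token, the regression label is the sum of the character
# code values of its characters.
def ascii_sum_target(token: str) -> int:
    """Return the sum of the character code values of `token`."""
    return sum(ord(ch) for ch in token)

# Example: 'c' (99) + 'a' (97) + 't' (116) = 312
print(ascii_sum_target("cat"))  # 312
```

During pre-training, the model would be asked to predict this value for each masked token instead of (or alongside) the usual masked-token identity.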