Update README.md
README.md CHANGED

@@ -4,7 +4,7 @@ tags:
 license: cc-by-4.0
 ---
 ## bert-ascii-base
-is a BERT base Language Model pre-trained by predicting the summation of the **
+is a BERT base Language Model pre-trained by predicting the summation of the **ASCII** code values of the characters in a masked token as a pre-training objective. For more details about the pre-training objective and the pre-training hyperparameters, please refer to [How does the pre-training objective affect what large language models learn about linguistic properties?](https://arxiv.org/abs/2203.10415)
 
 ## License
 CC BY 4.0
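The updated description refers to the model's pre-training target: the sum of the ASCII code values of the characters in a masked token. As a minimal sketch (not the authors' released pre-training code), the target for a single token could be computed as below; the function name `ascii_sum` and the example token are illustrative, and details such as how subword tokens are handled are covered in the linked paper.

```python
def ascii_sum(token: str) -> int:
    """Return the summation of the ASCII code values of the token's characters."""
    return sum(ord(ch) for ch in token)

# Example: for the masked token "language",
# the target is 108 + 97 + 110 + 103 + 117 + 97 + 103 + 101 = 836.
print(ascii_sum("language"))  # 836
```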