---
tags:
- bert
license: cc-by-4.0
---

# bert-rand-medium

`bert-rand-medium` is a medium-sized BERT language model pre-trained with a random pre-training objective. For more details about the pre-training objective and the pre-training hyperparameters, please refer to the paper *How does the pre-training objective affect what large language models learn about linguistic properties?* (Alajrami and Aletras, ACL 2022).
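
The model can be loaded with the Hugging Face Transformers library. Below is a minimal usage sketch that extracts contextual representations; it assumes the model is published on the Hub under the repo id `aajrami/bert-rand-medium` (adjust the id if it differs):

```python
from transformers import AutoTokenizer, AutoModel

# Assumed Hub repo id; change if the model lives under a different name.
model_id = "aajrami/bert-rand-medium"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a sentence and run it through the model.
inputs = tokenizer("Pre-training objectives shape what BERT learns.", return_tensors="pt")
outputs = model(**inputs)

# Contextual token embeddings: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```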

## License

[CC BY 4.0](https://creativecommons.org/licenses/by/4.0/)

## Citation

If you use this model, please cite the following paper:

```bibtex
@inproceedings{alajrami2022does,
  title={How does the pre-training objective affect what large language models learn about linguistic properties?},
  author={Alajrami, Ahmed and Aletras, Nikolaos},
  booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)},
  pages={131--147},
  year={2022}
}
```