---
tags:
- bert
license: cc-by-4.0
---

# bert-fc-base

`bert-fc-base` is a BERT base language model pre-trained with a first-character prediction objective. For details of the objective and the pre-training hyperparameters, please refer to the paper "How does the pre-training objective affect what large language models learn about linguistic properties?" (Alajrami and Aletras, ACL 2022).
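To illustrate the idea behind the objective, here is a minimal, hypothetical sketch of how first-character targets could be built for masked positions: instead of predicting the full masked token as in standard masked language modeling, the model classifies only the first character of each masked token. The label buckets, masking rate, and helper names below are illustrative assumptions, not the exact scheme used to train this model.

```python
import random

# Illustrative label space: one class per lowercase letter,
# plus buckets for digits and everything else (an assumption,
# not the exact label set used for bert-fc-base).
VOCAB = "abcdefghijklmnopqrstuvwxyz"

def first_char_class(token):
    """Map a token to the class index of its first character."""
    c = token[0].lower()
    if c in VOCAB:
        return VOCAB.index(c)
    if c.isdigit():
        return len(VOCAB)       # digit bucket
    return len(VOCAB) + 1       # other bucket

def make_fc_targets(tokens, mask_prob=0.15, seed=0):
    """Mask tokens at random; masked positions get a first-character
    class label, unmasked positions get -100 (ignored by the loss)."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append("[MASK]")
            labels.append(first_char_class(tok))
        else:
            masked.append(tok)
            labels.append(-100)
    return masked, labels
```

The small label space (a few dozen classes rather than a full vocabulary) is what distinguishes this objective from BERT's original masked-token prediction.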

## License

CC BY 4.0

## Citation

If you use this model, please cite the following paper:

```bibtex
@inproceedings{alajrami2022does,
  title={How does the pre-training objective affect what large language models learn about linguistic properties?},
  author={Alajrami, Ahmed and Aletras, Nikolaos},
  booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)},
  pages={131--147},
  year={2022}
}
```