---
tags:
- bert
license: cc-by-4.0
---

## bert-fc-base

bert-fc-base is a BERT base language model pre-trained with a **first character** prediction objective. For more details about the pre-training objective and the pre-training hyperparameters, please refer to [How does the pre-training objective affect what large language models learn about linguistic properties?](https://arxiv.org/abs/2203.10415)

## License

CC BY 4.0

## Citation

If you use this model, please cite the following paper:

```
@article{alajrami2022does,
  title={How does the pre-training objective affect what large language models learn about linguistic properties?},
  author={Alajrami, Ahmed and Aletras, Nikolaos},
  journal={arXiv preprint arXiv:2203.10415},
  year={2022}
}
```
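
## Usage

A minimal loading sketch with the Hugging Face `transformers` library. The repository id used below is an assumption for illustration; replace it with the actual path of this checkpoint if it differs.

```python
# Minimal sketch: load the pre-trained encoder and extract hidden states.
# "aajrami/bert-fc-base" is an assumed repository id, not confirmed by this card.
from transformers import AutoTokenizer, AutoModel

model_id = "aajrami/bert-fc-base"  # assumption; adjust to the real repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Hello world!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```

Because the model was pre-trained with a first-character prediction objective rather than standard masked language modelling, it is typically used as an encoder for fine-tuning or probing rather than for fill-mask style prediction.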