---
language: "mn"
tags:
- mongolian
- cased
---

# BERT-BASE-MONGOLIAN-CASED
[Link to Official Mongolian-BERT repo](https://github.com/tugstugi/mongolian-bert)

## Model description
This repository contains pre-trained Mongolian [BERT](https://arxiv.org/abs/1810.04805) models trained by [tugstugi](https://github.com/tugstugi), [enod](https://github.com/enod) and [sharavsambuu](https://github.com/sharavsambuu). Special thanks to [nabar](https://github.com/nabar), who provided 5x TPUs.

This repository is based on the following open-source projects: [google-research/bert](https://github.com/google-research/bert/), [huggingface/pytorch-pretrained-BERT](https://github.com/huggingface/pytorch-pretrained-BERT) and [yoheikikuta/bert-japanese](https://github.com/yoheikikuta/bert-japanese).

#### How to use

```python
from transformers import pipeline, AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('tugstugi/bert-base-mongolian-cased')
model = BertForMaskedLM.from_pretrained('tugstugi/bert-base-mongolian-cased')

## declare task ##
pipe = pipeline(task="fill-mask", model=model, tokenizer=tokenizer)

## example ##
input_ = '[MASK] хотын Сонгино хайрхан дүүрэг.'
output_ = pipe(input_)
for i in range(len(output_)):
    print(output_[i])

## Output ##
# {'sequence': 'Улаанбаатар хотын Сонгино хайрхан дүүрэг.', 'score': 0.9701077342033386, 'token': 281, 'token_str': 'Улаанбаатар'}
# {'sequence': 'УБ хотын Сонгино хайрхан дүүрэг.', 'score': 0.02008666843175888, 'token': 7389, 'token_str': 'УБ'}
# {'sequence': 'Нийслэл хотын Сонгино хайрхан дүүрэг.', 'score': 0.006682577542960644, 'token': 4059, 'token_str': 'Нийслэл'}
# {'sequence': 'Улаанбаатар хотын Сонгино хайрхан дүүрэг.', 'score': 0.0008267111843451858, 'token': 2328, 'token_str': 'Улаанбаатар'}
# {'sequence': 'Улаанбаатарын хотын Сонгино хайрхан дүүрэг.', 'score': 0.0003509577363729477, 'token': 5593, 'token_str': 'Улаанбаатарын'}
```

## Training data
Mongolian Wikipedia and the 700 million word Mongolian news data set [[Pretraining Procedure](https://github.com/tugstugi/mongolian-bert#pre-training)]

### BibTeX entry and citation info

```bibtex
@misc{mongolian-bert,
  author = {Tuguldur, Erdene-Ochir and Gunchinish, Sharavsambuu and Bataa, Enkhbold},
  title = {BERT Pretrained Models on Mongolian Datasets},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/tugstugi/mongolian-bert/}}
}
```
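The fill-mask pipeline above returns a list of candidate dictionaries with `sequence`, `score`, `token`, and `token_str` keys, ordered by descending score. As a quick sketch of consuming that output (the `top_prediction` helper is hypothetical, not part of this repo, and the candidate list below is abbreviated from the example output):

```python
def top_prediction(candidates):
    """Return the token string of the highest-scoring fill-mask candidate."""
    return max(candidates, key=lambda c: c['score'])['token_str']

# Abbreviated sample of the pipeline output shown above.
candidates = [
    {'sequence': 'Улаанбаатар хотын Сонгино хайрхан дүүрэг.', 'score': 0.9701, 'token': 281, 'token_str': 'Улаанбаатар'},
    {'sequence': 'УБ хотын Сонгино хайрхан дүүрэг.', 'score': 0.0201, 'token': 7389, 'token_str': 'УБ'},
    {'sequence': 'Нийслэл хотын Сонгино хайрхан дүүрэг.', 'score': 0.0067, 'token': 4059, 'token_str': 'Нийслэл'},
]

print(top_prediction(candidates))  # -> Улаанбаатар
```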