---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- lm-head
- bert
- zh
license: gpl-3.0
---
# CKIP Oldhan BERT Base Chinese

Pretrained model on Old Han (ancient) Chinese using a masked language modeling (MLM) objective.
## Homepage
## Training Datasets
The copyright of the datasets belongs to the Institute of Linguistics, Academia Sinica.
## Contributors
- Chin-Tung Lin at CKIP
## Usage

### Using our model in your script
```python
from transformers import (
    AutoTokenizer,
    AutoModel,
)

tokenizer = AutoTokenizer.from_pretrained("ckiplab/oldhan-bert-base-chinese")
model = AutoModel.from_pretrained("ckiplab/oldhan-bert-base-chinese")
```
### Using our model for inference
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='ckiplab/oldhan-bert-base-chinese')
>>> unmasker("黎民[MASK]變時雍")
```
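The `fill-mask` pipeline returns a list of candidate fills for the `[MASK]` token, each a dict containing (among other fields) the predicted token (`token_str`) and its probability (`score`). As a minimal sketch of how you might consume that output (the `unmask_top` helper is our own illustration, not part of the model's API):

```python
# Sketch: rank the pipeline's top predictions for a masked character.
# Assumes `transformers` is installed and the model can be downloaded.
from transformers import pipeline

def unmask_top(text, k=5):
    """Return the k most probable fills for the [MASK] token in `text`."""
    unmasker = pipeline("fill-mask", model="ckiplab/oldhan-bert-base-chinese")
    # Each prediction dict includes `token_str` (the fill) and `score`.
    return [(p["token_str"], p["score"]) for p in unmasker(text, top_k=k)]

for token, score in unmask_top("黎民[MASK]變時雍", k=3):
    print(f"{token}\t{score:.4f}")
```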