# KoBERT-LM

- Further pretrained KoBERT model with a retrained masked-LM head
## How to use
To load the KoBERT tokenizer with `AutoTokenizer`, you must pass `trust_remote_code=True`.
```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/kobert-lm")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert-lm", trust_remote_code=True)
```
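Since the checkpoint was further pretrained for the masked-LM head, a natural use is filling a `[MASK]` token. The sketch below is not from the model card: it loads the model through `AutoModelForMaskedLM` (an assumption about the checkpoint's head) inside `demo()`, which needs network access to download the weights; the example sentence is also made up. The pure-Python helper `top_k_indices` shows how the top predictions are picked from one row of logits.

```python
# Hedged sketch: filling a masked token with kobert-lm's LM head.
# demo() downloads the checkpoint, so it requires network access and
# `pip install transformers torch`; the helper below is dependency-free.
import heapq


def top_k_indices(logits_row, k=5):
    """Return the indices of the k largest logits, highest first."""
    return heapq.nlargest(k, range(len(logits_row)), key=lambda i: logits_row[i])


def demo():
    import torch
    from transformers import AutoModelForMaskedLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("monologg/kobert-lm", trust_remote_code=True)
    model = AutoModelForMaskedLM.from_pretrained("monologg/kobert-lm")
    model.eval()

    text = "한국어 모델을 [MASK] 합니다."  # hypothetical example sentence
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0]

    # Locate the [MASK] position and print the 5 most likely tokens for it.
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[0].item()
    for idx in top_k_indices(logits[mask_pos].tolist(), k=5):
        print(tokenizer.convert_ids_to_tokens(idx))
```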