indonesian-roberta-base / flax_to_torch.py
from transformers import RobertaForMaskedLM, AutoTokenizer

# Load the Flax checkpoint from the current directory and convert it to PyTorch.
model = RobertaForMaskedLM.from_pretrained("./", from_flax=True)
# Save the converted weights as a PyTorch checkpoint in the same directory.
model.save_pretrained("./")

# Re-save the tokenizer alongside the PyTorch weights.
tokenizer = AutoTokenizer.from_pretrained("./")
tokenizer.save_pretrained("./")