roberta-base-exp-8 / generate_pt_model.py
# Convert the Flax checkpoint in the current directory to a PyTorch checkpoint.
from transformers import XLMRobertaForMaskedLM, XLMRobertaConfig

# Load the model configuration from the local directory.
config = XLMRobertaConfig.from_pretrained("./")

# Load the Flax weights and convert them to PyTorch format.
model = XLMRobertaForMaskedLM.from_pretrained("./", config=config, from_flax=True)

# Save the converted PyTorch weights alongside the existing config.
model.save_pretrained("./")