roberta-debug-32 / generate_pt_model.py
from transformers import XLMRobertaForMaskedLM, XLMRobertaConfig

# Load the config from the current directory and convert the Flax
# checkpoint stored there into a PyTorch model (requires flax installed).
config = XLMRobertaConfig.from_pretrained("./")
model = XLMRobertaForMaskedLM.from_pretrained("./", config=config, from_flax=True)

# Write the converted PyTorch weights back alongside the config.
model.save_pretrained("./")
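The script above needs the actual Flax checkpoint in the working directory. As a minimal, self-contained sketch of the same save/reload round trip, the snippet below uses a tiny randomly initialized model instead (the small config values are illustrative only, not taken from the original checkpoint, and the `from_flax` step is omitted since no Flax weights exist on disk):

```python
import tempfile

import torch
from transformers import XLMRobertaForMaskedLM, XLMRobertaConfig

# Tiny hypothetical config so the example runs without a real checkpoint.
config = XLMRobertaConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
)
model = XLMRobertaForMaskedLM(config)

# save_pretrained / from_pretrained round trip, as in the script above.
with tempfile.TemporaryDirectory() as d:
    model.save_pretrained(d)  # writes config.json + model weights
    reloaded = XLMRobertaForMaskedLM.from_pretrained(d)

# Masked-LM forward pass on dummy token ids.
input_ids = torch.tensor([[0, 5, 6, 2]])
with torch.no_grad():
    logits = reloaded(input_ids).logits
print(tuple(logits.shape))  # (batch, seq_len, vocab_size)
```

The logits tensor has one score per vocabulary entry at each position, which is what `XLMRobertaForMaskedLM` predicts masked tokens from.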