roberta_NCC_des_128_decayfrom200 / generate_pytorch_model.py
# This script overwrites any existing PyTorch model. It generates a new one with an LM head from the pretrained Flax model.
from transformers import RobertaForMaskedLM

# Load the Flax checkpoint from the current directory and convert the weights to PyTorch.
model = RobertaForMaskedLM.from_pretrained(".", from_flax=True)
# Save the converted weights in the current directory, alongside the Flax checkpoint.
model.save_pretrained(".")
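
# Optional sanity check (a sketch, not part of the original script): reload the
# converted checkpoint as a PyTorch model and run a forward pass. This assumes
# tokenizer files are also present in the current directory; the input sentence
# is a hypothetical example.
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(".")
pt_model = RobertaForMaskedLM.from_pretrained(".")  # now loads the saved PyTorch weights
inputs = tokenizer("Dette er en test.", return_tensors="pt")
with torch.no_grad():
    logits = pt_model(**inputs).logits
print(logits.shape)  # (batch_size, sequence_length, vocab_size)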