You may consider adding `ignore_mismatched_sizes=True` to the model's `from_pretrained` call. Note that the error means the checkpoint was saved from a model with hidden size 1024 (e.g. a bert-large variant) while your current config expects 768 (bert-base), so with this flag the mismatched weights are left randomly initialized rather than loaded — using a config that matches the checkpoint is the cleaner fix if you want the pretrained weights.
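To illustrate what is going on (without downloading a model), here is a minimal plain-PyTorch sketch that reproduces the same `size mismatch` RuntimeError and then mimics what `ignore_mismatched_sizes=True` roughly does: skip checkpoint tensors whose shapes don't match the current model. The module and key names here are toy stand-ins, not the actual `transformers` internals.

```python
import torch
import torch.nn as nn

# Toy stand-in: the checkpoint holds a (vocab, 1024) embedding, but the
# current model was built with hidden size 768 — same situation as the
# reported torch.Size([21128, 1024]) vs torch.Size([21128, 768]) error.
# A small vocab is used here for speed.
checkpoint = {"word_embeddings.weight": torch.zeros(100, 1024)}

class Embeddings(nn.Module):
    def __init__(self, hidden_size):
        super().__init__()
        self.word_embeddings = nn.Embedding(100, hidden_size)

model = Embeddings(hidden_size=768)

# Strict loading fails with the same kind of "size mismatch" RuntimeError.
try:
    model.load_state_dict(checkpoint)
except RuntimeError as e:
    print("size mismatch reproduced:", "size mismatch" in str(e))

# ignore_mismatched_sizes=True effectively drops the mismatched tensors
# and keeps the model's randomly initialized weights for them:
current = model.state_dict()
filtered = {
    k: v for k, v in checkpoint.items()
    if k in current and v.shape == current[k].shape
}
model.load_state_dict(filtered, strict=False)
print("tensors actually loaded from checkpoint:", len(filtered))
```

In the real library the equivalent call would be something like `BertModel.from_pretrained(path, ignore_mismatched_sizes=True)` — but as the sketch shows, any skipped weight stays untrained, so expect degraded results unless you fine-tune afterwards.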

#56
by EthanMiao - opened

ERROR 2023-07-17 11:38:27,049-1d: Error(s) in loading state_dict for BertModel:
size mismatch for embeddings.word_embeddings.weight: copying a param with shape torch.Size([21128, 1024]) from checkpoint, the shape in current model is torch.Size([21128, 768]).

Have you guys run into this problem?