Word embedding weights not loaded after the addition of safetensors

#4
by DevBerge - opened

I'm getting the warning below from code that loaded the model correctly before the PR that added safetensors 8 days ago.

Some weights of NorbertModel were not initialized from the model checkpoint at ltg/norbert3-base and are newly initialized: ['embedding.word_embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

Can you look into it? This is a major issue.
It should be reproducible with just the lines below:
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('ltg/norbert3-base')
model = AutoModel.from_pretrained('ltg/norbert3-base', trust_remote_code=True)
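As a minimal sketch of how to confirm whether a checkpoint loaded cleanly (rather than scanning stderr for the warning), `from_pretrained` can return a loading report via `output_loading_info=True`; any key listed under `missing_keys` was randomly initialized. The helper name `unloaded_weights` is my own, not from the thread:

```python
def unloaded_weights(loading_info):
    """Return checkpoint keys that were NOT found in the saved weights
    and were therefore newly (randomly) initialized."""
    return sorted(loading_info.get("missing_keys", []))


if __name__ == "__main__":
    # Imported lazily so the helper above works without transformers installed.
    from transformers import AutoModel

    model, loading_info = AutoModel.from_pretrained(
        "ltg/norbert3-base",
        trust_remote_code=True,
        output_loading_info=True,
    )
    missing = unloaded_weights(loading_info)
    if missing:
        # On the broken checkpoint this would include
        # 'embedding.word_embedding.weight'.
        print("Randomly initialized weights:", missing)
```

Depending on your `transformers` version, passing `use_safetensors=False` to `from_pretrained` may also work as a temporary workaround, forcing the original `.bin` checkpoint to be used instead of the safetensors one.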

Language Technology Group (University of Oslo) org

Hi, thanks for letting us know! I have deleted the safetensors checkpoint for now. Note to myself: don't blindly trust the official HuggingFace conversion code :)

davda54 changed discussion status to closed

Nps :)
