tie_word_embeddings=true?

#6
by salmitta - opened

Hi, for this specific model, tie_word_embeddings is set to true in the config.json, while the technical report says that no weight tying was used. I wanted to check whether there is a mistake in the config.

Qwen org

No, the config is correct. The 0.5B is a special case that uses weight tying.
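For anyone who wants to confirm this locally, here is a minimal sketch that reads the flag from the config and checks that the input embedding and LM head actually share the same tensor. The model id "Qwen/Qwen2.5-0.5B" is an assumption for illustration; substitute the repo this discussion belongs to.

```python
from transformers import AutoConfig, AutoModelForCausalLM

# Assumed model id for illustration; replace with the repo in question.
model_id = "Qwen/Qwen2.5-0.5B"

# The config flag this thread is about.
config = AutoConfig.from_pretrained(model_id)
print("tie_word_embeddings:", config.tie_word_embeddings)  # expected: True for the 0.5B model

# With tying enabled, the input embedding matrix and the output (LM head)
# weight should be the same underlying tensor.
model = AutoModelForCausalLM.from_pretrained(model_id)
embed_weight = model.get_input_embeddings().weight
head_weight = model.get_output_embeddings().weight
print("weights shared:", embed_weight.data_ptr() == head_weight.data_ptr())
```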

jklj077 changed discussion status to closed
