Rotary position embeddings not loaded

#39
by cwbc - opened

When I load the model weights into transformers.LlavaLlamaForCausalLM, it warns that the rotary position embeddings (rotary_emb.inv_freq) were not loaded from the checkpoint. Does this affect model performance?

Meta Llama org

No
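For context, `inv_freq` is not a learned weight: under the standard RoPE formulation it is a deterministic function of the head dimension and the base frequency, so the model can simply recompute it at initialization and nothing is lost when the checkpoint omits it. A minimal pure-Python sketch of that computation (function name is illustrative, not from transformers):

```python
def rope_inv_freq(head_dim: int, base: float = 10000.0) -> list[float]:
    """Recompute RoPE inverse frequencies from config alone.

    Standard formula: inv_freq[k] = base ** (-2k / head_dim) for the
    even dimensions 0, 2, 4, ... -- no checkpoint tensor is required.
    """
    return [base ** (-i / head_dim) for i in range(0, head_dim, 2)]

# One value per pair of dimensions, starting at 1.0 and decaying
# toward the lowest rotation frequency.
freqs = rope_inv_freq(8)
print(freqs)
```

Because the buffer is fully reproducible from the config, the "not loaded" warning is harmless.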
