Model weights are not loaded
#3
opened by MarvelousMouse
I'm trying to load the model using the suggested code:

from transformers import pipeline

messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe = pipeline("text-generation", model="neuralmagic/Meta-Llama-3.1-8B-Instruct-quantized.w8a16")
pipe(messages)

When I run it, I get a warning that essentially none of the weights were used when initializing LlamaForCausalLM. As a result, the generated response (see below) is a bit odd. I'm using transformers version 4.44.2.
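If I understand the w8a16 format correctly, each linear projection in the checkpoint stores a packed int8 tensor plus a scale and a shape instead of a plain .weight tensor, which would explain the names in the warning below. Here is a rough sketch of what I assume that means; the function and packing scheme are my own illustration, not the actual compressed-tensors code:

```python
import numpy as np

def quantize_w8a16(weight):
    """Illustrative symmetric int8 weight quantization (activations stay fp16).

    Produces the three tensors per projection that the warning names:
    weight_packed, weight_scale, weight_shape. The layout here is a
    guess for illustration only, not the real compressed-tensors format.
    """
    scale = np.abs(weight).max() / 127.0                                   # -> weight_scale
    packed = np.clip(np.round(weight / scale), -127, 127).astype(np.int8)  # -> weight_packed
    shape = np.array(weight.shape)                                         # -> weight_shape
    return packed, scale, shape

def dequantize(packed, scale, shape):
    # A real kernel would do this in fp16; float32 keeps the sketch simple.
    return (packed.astype(np.float32) * scale).reshape(tuple(shape))

w = np.random.randn(256, 256).astype(np.float32)
packed, scale, shape = quantize_w8a16(w)
w_hat = dequantize(packed, scale, shape)
# Round-trip error stays within half a quantization step
assert np.max(np.abs(w - w_hat)) <= scale / 2 + 1e-6
```

So presumably the stock LlamaForCausalLM has no idea how to consume these packed tensors, ignores them, and falls back to freshly initialized dense weights.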
2024-08-29 10:40:44.019913: I external/local_tsl/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used.
2024-08-29 10:40:44.025049: I external/local_tsl/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used.
2024-08-29 10:40:44.089174: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 AVX512F FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2024-08-29 10:40:45.296907: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Loading checkpoint shards: 100% 2/2 [00:01<00:00, 1.54it/s]
Some weights of the model checkpoint at neuralmagic/Meta-Llama-3.1-8B-Instruct-quantized.w8a16 were not used when initializing LlamaForCausalLM: ['model.layers.0.mlp.down_proj.weight_packed', 'model.layers.0.mlp.down_proj.weight_scale', 'model.layers.0.mlp.down_proj.weight_shape', 'model.layers.0.mlp.gate_proj.weight_packed', 'model.layers.0.mlp.gate_proj.weight_scale', 'model.layers.0.mlp.gate_proj.weight_shape', 'model.layers.0.mlp.up_proj.weight_packed', 'model.layers.0.mlp.up_proj.weight_scale', 'model.layers.0.mlp.up_proj.weight_shape', 'model.layers.0.self_attn.k_proj.weight_packed', 'model.layers.0.self_attn.k_proj.weight_scale', 'model.layers.0.self_attn.k_proj.weight_shape', 'model.layers.0.self_attn.o_proj.weight_packed', 'model.layers.0.self_attn.o_proj.weight_scale', 'model.layers.0.self_attn.o_proj.weight_shape', 'model.layers.0.self_attn.q_proj.weight_packed', 'model.layers.0.self_attn.q_proj.weight_scale', 'model.layers.0.self_attn.q_proj.weight_shape', 'model.layers.0.self_attn.v_proj.weight_packed', 'model.layers.0.self_attn.v_proj.weight_scale', 'model.layers.0.self_attn.v_proj.weight_shape', ... (the same weight_packed/weight_scale/weight_shape triplet for every q/k/v/o and gate/up/down projection in all 32 layers; list abridged here)]
- This IS expected if you are initializing LlamaForCausalLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing LlamaForCausalLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of LlamaForCausalLM were not initialized from the model checkpoint at neuralmagic/Meta-Llama-3.1-8B-Instruct-quantized.w8a16 and are newly initialized: ['model.layers.0.mlp.down_proj.weight', 'model.layers.0.mlp.gate_proj.weight', 'model.layers.0.mlp.up_proj.weight', 'model.layers.0.self_attn.k_proj.weight', 'model.layers.0.self_attn.o_proj.weight', 'model.layers.0.self_attn.q_proj.weight', 'model.layers.0.self_attn.v_proj.weight', 'model.layers.1.mlp.down_proj.weight', 'model.layers.1.mlp.gate_proj.weight', 'model.layers.1.mlp.up_proj.weight', 'model.layers.1.self_attn.k_proj.weight', 'model.layers.1.self_attn.o_proj.weight', 'model.layers.1.self_attn.q_proj.weight', 'model.layers.1.self_attn.v_proj.weight', 'model.layers.10.mlp.down_proj.weight', 'model.layers.10.mlp.gate_proj.weight', 'model.layers.10.mlp.up_proj.weight', 'model.layers.10.self_attn.k_proj.weight', 'model.layers.10.self_attn.o_proj.weight', 'model.layers.10.self_attn.q_proj.weight', 'model.layers.10.self_attn.v_proj.weight', 'model.layers.11.mlp.down_proj.weight', 'model.layers.11.mlp.gate_proj.weight', 'model.layers.11.mlp.up_proj.weight', 'model.layers.11.self_attn.k_proj.weight', 'model.layers.11.self_attn.o_proj.weight', 'model.layers.11.self_attn.q_proj.weight', 'model.layers.11.self_attn.v_proj.weight', 'model.layers.12.mlp.down_proj.weight', 'model.layers.12.mlp.gate_proj.weight', 'model.layers.12.mlp.up_proj.weight', 'model.layers.12.self_attn.k_proj.weight', 'model.layers.12.self_attn.o_proj.weight', 'model.layers.12.self_attn.q_proj.weight', 'model.layers.12.self_attn.v_proj.weight', 'model.layers.13.mlp.down_proj.weight', 'model.layers.13.mlp.gate_proj.weight', 'model.layers.13.mlp.up_proj.weight', 'model.layers.13.self_attn.k_proj.weight', 'model.layers.13.self_attn.o_proj.weight', 'model.layers.13.self_attn.q_proj.weight', 'model.layers.13.self_attn.v_proj.weight', 'model.layers.14.mlp.down_proj.weight', 'model.layers.14.mlp.gate_proj.weight', 
'model.layers.14.mlp.up_proj.weight', 'model.layers.14.self_attn.k_proj.weight', 'model.layers.14.self_attn.o_proj.weight', 'model.layers.14.self_attn.q_proj.weight', 'model.layers.14.self_attn.v_proj.weight', 'model.layers.15.mlp.down_proj.weight', 'model.layers.15.mlp.gate_proj.weight', 'model.layers.15.mlp.up_proj.weight', 'model.layers.15.self_attn.k_proj.weight', 'model.layers.15.self_attn.o_proj.weight', 'model.layers.15.self_attn.q_proj.weight', 'model.layers.15.self_attn.v_proj.weight', 'model.layers.16.mlp.down_proj.weight', 'model.layers.16.mlp.gate_proj.weight', 'model.layers.16.mlp.up_proj.weight', 'model.layers.16.self_attn.k_proj.weight', 'model.layers.16.self_attn.o_proj.weight', 'model.layers.16.self_attn.q_proj.weight', 'model.layers.16.self_attn.v_proj.weight', 'model.layers.17.mlp.down_proj.weight', 'model.layers.17.mlp.gate_proj.weight', 'model.layers.17.mlp.up_proj.weight', 'model.layers.17.self_attn.k_proj.weight', 'model.layers.17.self_attn.o_proj.weight', 'model.layers.17.self_attn.q_proj.weight', 'model.layers.17.self_attn.v_proj.weight', 'model.layers.18.mlp.down_proj.weight', 'model.layers.18.mlp.gate_proj.weight', 'model.layers.18.mlp.up_proj.weight', 'model.layers.18.self_attn.k_proj.weight', 'model.layers.18.self_attn.o_proj.weight', 'model.layers.18.self_attn.q_proj.weight', 'model.layers.18.self_attn.v_proj.weight', 'model.layers.19.mlp.down_proj.weight', 'model.layers.19.mlp.gate_proj.weight', 'model.layers.19.mlp.up_proj.weight', 'model.layers.19.self_attn.k_proj.weight', 'model.layers.19.self_attn.o_proj.weight', 'model.layers.19.self_attn.q_proj.weight', 'model.layers.19.self_attn.v_proj.weight', 'model.layers.2.mlp.down_proj.weight', 'model.layers.2.mlp.gate_proj.weight', 'model.layers.2.mlp.up_proj.weight', 'model.layers.2.self_attn.k_proj.weight', 'model.layers.2.self_attn.o_proj.weight', 'model.layers.2.self_attn.q_proj.weight', 'model.layers.2.self_attn.v_proj.weight', 'model.layers.20.mlp.down_proj.weight', 
'model.layers.20.mlp.gate_proj.weight', 'model.layers.20.mlp.up_proj.weight', 'model.layers.20.self_attn.k_proj.weight', 'model.layers.20.self_attn.o_proj.weight', 'model.layers.20.self_attn.q_proj.weight', 'model.layers.20.self_attn.v_proj.weight', 'model.layers.21.mlp.down_proj.weight', 'model.layers.21.mlp.gate_proj.weight', 'model.layers.21.mlp.up_proj.weight', 'model.layers.21.self_attn.k_proj.weight', 'model.layers.21.self_attn.o_proj.weight', 'model.layers.21.self_attn.q_proj.weight', 'model.layers.21.self_attn.v_proj.weight', 'model.layers.22.mlp.down_proj.weight', 'model.layers.22.mlp.gate_proj.weight', 'model.layers.22.mlp.up_proj.weight', 'model.layers.22.self_attn.k_proj.weight', 'model.layers.22.self_attn.o_proj.weight', 'model.layers.22.self_attn.q_proj.weight', 'model.layers.22.self_attn.v_proj.weight', 'model.layers.23.mlp.down_proj.weight', 'model.layers.23.mlp.gate_proj.weight', 'model.layers.23.mlp.up_proj.weight', 'model.layers.23.self_attn.k_proj.weight', 'model.layers.23.self_attn.o_proj.weight', 'model.layers.23.self_attn.q_proj.weight', 'model.layers.23.self_attn.v_proj.weight', 'model.layers.24.mlp.down_proj.weight', 'model.layers.24.mlp.gate_proj.weight', 'model.layers.24.mlp.up_proj.weight', 'model.layers.24.self_attn.k_proj.weight', 'model.layers.24.self_attn.o_proj.weight', 'model.layers.24.self_attn.q_proj.weight', 'model.layers.24.self_attn.v_proj.weight', 'model.layers.25.mlp.down_proj.weight', 'model.layers.25.mlp.gate_proj.weight', 'model.layers.25.mlp.up_proj.weight', 'model.layers.25.self_attn.k_proj.weight', 'model.layers.25.self_attn.o_proj.weight', 'model.layers.25.self_attn.q_proj.weight', 'model.layers.25.self_attn.v_proj.weight', 'model.layers.26.mlp.down_proj.weight', 'model.layers.26.mlp.gate_proj.weight', 'model.layers.26.mlp.up_proj.weight', 'model.layers.26.self_attn.k_proj.weight', 'model.layers.26.self_attn.o_proj.weight', 'model.layers.26.self_attn.q_proj.weight', 'model.layers.26.self_attn.v_proj.weight', 
'model.layers.27.mlp.down_proj.weight', 'model.layers.27.mlp.gate_proj.weight', 'model.layers.27.mlp.up_proj.weight', 'model.layers.27.self_attn.k_proj.weight', 'model.layers.27.self_attn.o_proj.weight', 'model.layers.27.self_attn.q_proj.weight', 'model.layers.27.self_attn.v_proj.weight', 'model.layers.28.mlp.down_proj.weight', 'model.layers.28.mlp.gate_proj.weight', 'model.layers.28.mlp.up_proj.weight', 'model.layers.28.self_attn.k_proj.weight', 'model.layers.28.self_attn.o_proj.weight', 'model.layers.28.self_attn.q_proj.weight', 'model.layers.28.self_attn.v_proj.weight', 'model.layers.29.mlp.down_proj.weight', 'model.layers.29.mlp.gate_proj.weight', 'model.layers.29.mlp.up_proj.weight', 'model.layers.29.self_attn.k_proj.weight', 'model.layers.29.self_attn.o_proj.weight', 'model.layers.29.self_attn.q_proj.weight', 'model.layers.29.self_attn.v_proj.weight', 'model.layers.3.mlp.down_proj.weight', 'model.layers.3.mlp.gate_proj.weight', 'model.layers.3.mlp.up_proj.weight', 'model.layers.3.self_attn.k_proj.weight', 'model.layers.3.self_attn.o_proj.weight', 'model.layers.3.self_attn.q_proj.weight', 'model.layers.3.self_attn.v_proj.weight', 'model.layers.30.mlp.down_proj.weight', 'model.layers.30.mlp.gate_proj.weight', 'model.layers.30.mlp.up_proj.weight', 'model.layers.30.self_attn.k_proj.weight', 'model.layers.30.self_attn.o_proj.weight', 'model.layers.30.self_attn.q_proj.weight', 'model.layers.30.self_attn.v_proj.weight', 'model.layers.31.mlp.down_proj.weight', 'model.layers.31.mlp.gate_proj.weight', 'model.layers.31.mlp.up_proj.weight', 'model.layers.31.self_attn.k_proj.weight', 'model.layers.31.self_attn.o_proj.weight', 'model.layers.31.self_attn.q_proj.weight', 'model.layers.31.self_attn.v_proj.weight', 'model.layers.4.mlp.down_proj.weight', 'model.layers.4.mlp.gate_proj.weight', 'model.layers.4.mlp.up_proj.weight', 'model.layers.4.self_attn.k_proj.weight', 'model.layers.4.self_attn.o_proj.weight', 'model.layers.4.self_attn.q_proj.weight', 
'model.layers.4.self_attn.v_proj.weight', 'model.layers.5.mlp.down_proj.weight', 'model.layers.5.mlp.gate_proj.weight', 'model.layers.5.mlp.up_proj.weight', 'model.layers.5.self_attn.k_proj.weight', 'model.layers.5.self_attn.o_proj.weight', 'model.layers.5.self_attn.q_proj.weight', 'model.layers.5.self_attn.v_proj.weight', 'model.layers.6.mlp.down_proj.weight', 'model.layers.6.mlp.gate_proj.weight', 'model.layers.6.mlp.up_proj.weight', 'model.layers.6.self_attn.k_proj.weight', 'model.layers.6.self_attn.o_proj.weight', 'model.layers.6.self_attn.q_proj.weight', 'model.layers.6.self_attn.v_proj.weight', 'model.layers.7.mlp.down_proj.weight', 'model.layers.7.mlp.gate_proj.weight', 'model.layers.7.mlp.up_proj.weight', 'model.layers.7.self_attn.k_proj.weight', 'model.layers.7.self_attn.o_proj.weight', 'model.layers.7.self_attn.q_proj.weight', 'model.layers.7.self_attn.v_proj.weight', 'model.layers.8.mlp.down_proj.weight', 'model.layers.8.mlp.gate_proj.weight', 'model.layers.8.mlp.up_proj.weight', 'model.layers.8.self_attn.k_proj.weight', 'model.layers.8.self_attn.o_proj.weight', 'model.layers.8.self_attn.q_proj.weight', 'model.layers.8.self_attn.v_proj.weight', 'model.layers.9.mlp.down_proj.weight', 'model.layers.9.mlp.gate_proj.weight', 'model.layers.9.mlp.up_proj.weight', 'model.layers.9.self_attn.k_proj.weight', 'model.layers.9.self_attn.o_proj.weight', 'model.layers.9.self_attn.q_proj.weight', 'model.layers.9.self_attn.v_proj.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
/local_disk0/.ephemeral_nfs/envs/pythonEnv-d6271344-6884-46e7-9edc-d3d2a32c6cc0/lib/python3.11/site-packages/transformers/generation/utils.py:1258: UserWarning: Using the model-agnostic default `max_length` (=20) to control the generation length. We recommend setting `max_new_tokens` to control the maximum length of the generation.
warnings.warn(
[{'generated_text': [{'role': 'user', 'content': 'Who are you?'},
{'role': 'assistant', 'content': 'zierzierrens Otihnонь'}]}]
The quantized models cannot yet be loaded directly into transformers (we are working on it with the HF team).
For now, you need to use vLLM or SparseAutoModel to load these models:
from transformers import AutoTokenizer
from llmcompressor.transformers import SparseAutoModelForCausalLM

# Select the quantized checkpoint and load it.
MODEL_ID = "neuralmagic/Meta-Llama-3.1-8B-Instruct-quantized.w8a16"
model = SparseAutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map="auto",
    torch_dtype="auto",
)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
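As background on the warning above: in a w8a16 checkpoint each linear layer stores an int8 weight_packed tensor plus a floating-point weight_scale (and a weight_shape) instead of a plain weight tensor, which is why a vanilla LlamaForCausalLM skips them and reinitializes its own weights. A toy NumPy sketch of this kind of weight-only quantization (illustrative only — the function names are made up here and this is not the exact compressed-tensors storage layout):

```python
import numpy as np

def quantize_w8(w):
    # Symmetric per-output-channel quantization: one scale per row,
    # chosen so the largest-magnitude entry maps to +/-127.
    scale = np.abs(w).max(axis=1, keepdims=True) / 127.0
    packed = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return packed, scale

def dequantize_w8(packed, scale):
    # Recover an approximation of the original fp32 weights.
    return packed.astype(np.float32) * scale

w = np.random.randn(4, 8).astype(np.float32)
packed, scale = quantize_w8(w)
w_hat = dequantize_w8(packed, scale)
# Rounding error per entry is at most half a quantization step.
assert np.max(np.abs(w - w_hat)) <= float(scale.max())
```

A model that only knows about a dense weight tensor has no way to combine packed and scale, so the loader treats them as unused keys — hence the garbage output above.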
Thanks @robertgshaw2, good to know!
Is this resolved now?
Yes, this is resolved now: you can load compressed-tensors models directly in transformers. See https://huggingface.co/docs/transformers/main/en/quantization/compressed_tensors
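With a transformers release that includes compressed-tensors support, the quantized checkpoint should load through the standard API; a minimal sketch (assuming a sufficiently recent transformers version — check the linked docs for the exact minimum):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "neuralmagic/Meta-Llama-3.1-8B-Instruct-quantized.w8a16"

# With compressed-tensors support, the packed weights and scales in the
# checkpoint are recognized instead of being dropped as unused keys.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map="auto",
    torch_dtype="auto",
)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
```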
mgoin
changed discussion status to
closed