"num_logits_to_keep" undefined in modeling_phi4mm.py() ?
Hello,
Sorry in advance: I'm a newbie and this is my first post on an HF forum. I couldn't find how to format the log below, so I just copy-pasted it.
When I run the example code, I get the log below.
It looks like the model.generate() call at line 45 of the example fails inside modeling_phi4mm.py because num_logits_to_keep isn't defined.
I don't know if I'm doing something wrong.
Any help appreciated.
Just in case it helps:
- I'm running on an RTX 3060 with CUDA 12.8
- I tried both the flash-attn and eager attention modes, to no avail.
Here's the full log:
```
--- IMAGE PROCESSING ---
Prompt
<|user|><|image_1|>What is shown in this image?<|end|><|assistant|>
Traceback (most recent call last):
  File "E:\cursorAI\phi4multimodal\app.py", line 45, in <module>
    generate_ids = model.generate(
                   ^^^^^^^^^^^^^^^
  File "E:\cursorAI\phi4multimodal\p4venv\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "E:\cursorAI\phi4multimodal\p4venv\Lib\site-packages\transformers\generation\utils.py", line 2223, in generate
    result = self._sample(
             ^^^^^^^^^^^^^
  File "E:\cursorAI\phi4multimodal\p4venv\Lib\site-packages\transformers\generation\utils.py", line 3211, in _sample
    outputs = self(**model_inputs, return_dict=True)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\cursorAI\phi4multimodal\p4venv\Lib\site-packages\torch\nn\modules\module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\cursorAI\phi4multimodal\p4venv\Lib\site-packages\torch\nn\modules\module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\cedri\.cache\huggingface\modules\transformers_modules\microsoft\Phi-4-multimodal-instruct\985802b4e1db71df6d366368508d5b30bd743c42\modeling_phi4mm.py", line 2137, in forward
    logits = self.lm_head(hidden_states[:, -num_logits_to_keep:, :])
             ^^^^^^^^^^^^^^^^^^^
TypeError: bad operand type for unary -: 'NoneType'
```
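The TypeError at the bottom is easy to reproduce in isolation. If num_logits_to_keep reaches the forward pass as None (which is what the traceback suggests is happening here), the unary minus in the slice fails. A minimal sketch, independent of the model:

```python
# Assumption: generate() ends up passing num_logits_to_keep=None to forward().
num_logits_to_keep = None
try:
    start = -num_logits_to_keep  # mirrors hidden_states[:, -num_logits_to_keep:, :]
except TypeError as exc:
    print(exc)  # bad operand type for unary -: 'NoneType'
```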
Thanks for any help
Cedric
OK, problem solved.
For anyone bumping into this, the solution was to downgrade transformers from 4.49.0 to 4.48.2.
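If you want to catch this before loading the model, a minimal version guard can be sketched along these lines (the affected and working versions are just the ones reported above; parse_version and is_affected are hypothetical helper names):

```python
def parse_version(v: str) -> tuple:
    # Compare only the numeric major.minor.patch part of a version string.
    return tuple(int(part) for part in v.split("+")[0].split(".")[:3])

def is_affected(installed: str) -> bool:
    # Per the report above: transformers 4.49.0 fails, 4.48.2 works.
    return parse_version(installed) >= (4, 49, 0)

print(is_affected("4.49.0"))  # True
print(is_affected("4.48.2"))  # False
```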
Cheers
Cedric

Below is my working requirements.txt:
```
accelerate==1.4.0
backoff==2.2.1
certifi==2025.1.31
cffi==1.17.1
charset-normalizer==3.4.1
colorama==0.4.6
einops==0.8.1
filelock==3.13.1
flash_attn==2.7.4.post1
fsspec==2024.6.1
huggingface-hub==0.29.2
idna==3.10
Jinja2==3.1.4
MarkupSafe==2.1.5
mpmath==1.3.0
networkx==3.3
numpy==2.1.2
packaging==24.2
peft==0.14.0
pillow==11.0.0
psutil==7.0.0
pycparser==2.22
PyYAML==6.0.2
regex==2024.11.6
requests==2.32.3
safetensors==0.5.3
scipy==1.15.2
setuptools==70.2.0
soundfile==0.13.1
sympy==1.13.1
tokenizers==0.21.0
torch==2.6.0+cu126
torchaudio==2.6.0+cu126
torchvision==0.21.0+cu126
tqdm==4.67.1
transformers==4.48.2
typing_extensions==4.12.2
urllib3==2.3.0
wheel==0.45.1
```
Another solution is to explicitly pass num_logits_to_keep=0 as a keyword argument to the model.generate() call.
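Why does num_logits_to_keep=0 work? In Python, -0 == 0, so the slice [-0:] keeps every position instead of raising; the model then computes logits for the whole sequence rather than only the last token. A minimal sketch with a plain list standing in for the sequence dimension of hidden_states:

```python
positions = [10, 11, 12, 13]  # stand-in for the sequence dimension of hidden_states

k = 0
print(positions[-k:])  # [10, 11, 12, 13]: -0 == 0, so the slice keeps everything

k = 1
print(positions[-k:])  # [13]: only the last position is kept
```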