KeyError: 'llama' in configuration_auto.py

#76
by himasrikode - opened

transformers==4.18.0
torch==1.10.2
python == 3.6.9

I'm getting a KeyError when loading the model. I've tried the fixes already suggested on different platforms, but none of them worked. Is there any way I can resolve this issue?

Meta Llama org

Hi @himasrikode
To use this model you ideally need the latest transformers / torch / Python installed. Can you try in a fresh Python env (>=3.8): pip install -U transformers torch
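
The KeyError comes from the old transformers (4.18.0) not having "llama" in its model-type registry, so the AutoConfig lookup fails. A minimal sketch of a pre-flight version check, assuming 4.28.0 as the first release with Llama support (the helper names and threshold here are illustrative, not part of the transformers API):

```python
# Sketch: gate model loading on the installed transformers version.
# MIN_LLAMA_VERSION = "4.28.0" is an assumption about when "llama"
# entered the transformers model registry; adjust if needed.

def version_tuple(v: str) -> tuple:
    """Parse a dotted version string like '4.18.0' into a comparable tuple."""
    parts = []
    for piece in v.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

MIN_LLAMA_VERSION = "4.28.0"  # assumed minimum for Llama support

def supports_llama(installed: str) -> bool:
    """True if the installed transformers version should know the 'llama' model type."""
    return version_tuple(installed) >= version_tuple(MIN_LLAMA_VERSION)

if __name__ == "__main__":
    # The environment from the question (4.18.0) predates Llama support,
    # which is why configuration_auto.py raises KeyError: 'llama'.
    print(supports_llama("4.18.0"))
    print(supports_llama("4.40.1"))
```

In practice you would read the real version from `transformers.__version__` and upgrade (pip install -U transformers torch) rather than work around the missing key.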
