Illegal instruction issue on Ubuntu 20 server

#4
by OceanMind - opened

from llama_cpp import Llama
print("Loading model...")
llm = Llama(model_path='./models/codellama-7b.Q2_K.gguf')
print("Model loaded.")
output = llm("Q: Name the planets in the solar system? A: ", max_tokens=32, stop=["Q:", "\n"], echo=True)
print("output", output)


SeaHamburg@box:~/llama$ pip3 show llama_cpp_python
Name: llama_cpp_python
Version: 0.2.7
Summary: Python bindings for the llama.cpp library
Home-page:
Author:
Author-email: Andrei Betlen <abetlen@gmail.com>
License: MIT
Location: /home/SeaHamburg/.local/lib/python3.11/site-packages
Requires: diskcache, numpy, typing-extensions
Required-by:
SeaHamburg@box:~/llama$ python3 app.py
Loading model...
Illegal instruction
SeaHamburg@box:~/llama$


I can't find anyone else with a similar issue online.
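Since the crash happens right after "Loading model..." is printed, my guess (unconfirmed) is that the llama.cpp code inside the package was compiled with CPU instructions this server's processor doesn't support, such as AVX2 or FMA, which is a common cause of an "Illegal instruction" crash. A minimal check, assuming a Linux host, is to read the feature flags from /proc/cpuinfo:

# Diagnostic sketch (assumption: the crash comes from llama.cpp being built
# with SIMD instruction sets this CPU lacks). Prints which common feature
# flags the kernel reports for this processor.
def cpu_flags():
    """Return the set of feature flags listed in /proc/cpuinfo (Linux only)."""
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for feature in ("sse3", "ssse3", "avx", "avx2", "fma", "f16c", "avx512f"):
    print(f"{feature:8} {'yes' if feature in flags else 'NO'}")

If avx2 or fma turn out to be missing, the advice I have seen is to reinstall llama-cpp-python from source with those instruction sets disabled through CMAKE_ARGS (for example -DLLAMA_AVX2=OFF), but I haven't been able to confirm that this is the cause here.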
