Not working on M1 Max using llama-cpp-python
#7 · opened by shroominic
Getting this error:

```
libc++abi: terminating due to uncaught exception of type std::out_of_range: unordered_map::at: key not found
```
Not a single token is generated; the model crashes immediately. All other models work fine with the same setup, so it must be something specific to this one.
`llama-cpp-python==0.2.20`