Won't load into text-generation-webui with transformers

#3
by Turrican10 - opened

I've had other models from TheBloke work fine; maybe it's because this one is a .bin GGML instead of the newer GGUF. But I get:
OSError: It looks like the config file at 'models\WizardLM-7B-uncensored.ggmlv3.q4_K_M.bin' is not a valid JSON file.

@Turrican10 Yes, GGML is outdated, and I don't believe it works anymore with llama.cpp, ctransformers, or llama-cpp-python.

However, the error you're seeing is because you're trying to load the file with transformers. Transformers supports neither GGUF nor GGML, which is why it doesn't work.
