How to load this model in LocalAI?
Hi,
I am not able to load this model into LocalAI.
I tried using the API with:
POST http://<local-ai>/models/apply
Body:
{
  "url": "huggingface://mradermacher/LocalAI-Llama3.1-8b-Function-Call-v0.3-i1-GGUF",
  "name": "LocalAI-Llama3.1-8b-Function-Call-v0.3-i1-GGUF"
}
without any luck.
I was also not able to get it working by editing the YAML file of another model (mudler/LocalAI-Llama3-8b-Function-Call-v0.2-GGUF/localai.yaml) and restarting LocalAI.
(That one was installed from the Hugging Face gallery, which was also a nightmare, because the UI does not load since the list is too long. I had to load it with postgres and search the HTML code.)
I have LocalAI running as a Docker container in Unraid.
Reading the documentation was not helpful at all. In my opinion, the documentation for LocalAI is horrible.
I hope you can help me.
Best regards
I know nothing about LocalAI, but if it supports GGUFs, you probably need to specify the URL of a specific quant, e.g. https://huggingface.co/mradermacher/LocalAI-Llama3.1-8b-Function-Call-v0.3-i1-GGUF/resolve/main/LocalAI-Llama3.1-8b-Function-Call-v0.3.i1-Q4_K_M.gguf
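If that is the issue, one thing worth trying is retrying the original /models/apply request with that direct quant URL in the "url" field instead of the repository URL. A minimal sketch, assuming the endpoint and JSON body shape from the original post are correct (the "name" value here is just an illustrative choice, and <local-ai> stands for your actual host and port):

```shell
# Hypothetical retry of the /models/apply call from the question above,
# pointing "url" at one specific .gguf quant file rather than the repo.
# Replace <local-ai> with your LocalAI host:port before running.
curl -X POST "http://<local-ai>/models/apply" \
  -H "Content-Type: application/json" \
  -d '{
        "url": "https://huggingface.co/mradermacher/LocalAI-Llama3.1-8b-Function-Call-v0.3-i1-GGUF/resolve/main/LocalAI-Llama3.1-8b-Function-Call-v0.3.i1-Q4_K_M.gguf",
        "name": "LocalAI-Llama3.1-8b-Function-Call-v0.3-i1-Q4_K_M"
      }'
```

No guarantee this is the exact fix, but repo-level URLs pointing at a collection of quant files (rather than one concrete .gguf) are a common reason loaders fail.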