Tags: Text Generation · Transformers · OpenVINO · English · llama · biology · medical · conversational · Inference Endpoints · text-generation-inference

Add LocalAI configuration

#1

This allows the model to be run directly with LocalAI by pointing to the URL of the model file, for example:

local-ai run huggingface://fakezeta/Llama3-Aloe-8B-Alpha-ov-int8/model.yaml
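Once LocalAI has loaded the model, it serves an OpenAI-compatible API (by default on port 8080). A minimal sketch of the chat-completion payload a client would send is below; the model name used here is an assumption and must match the name defined in the model.yaml added by this PR:

```python
import json

# Payload for LocalAI's OpenAI-compatible endpoint
# (POST http://localhost:8080/v1/chat/completions by default).
# "Llama3-Aloe-8B-Alpha-ov-int8" is an assumed model name, taken from
# the repository name; adjust it to the name set in model.yaml.
payload = {
    "model": "Llama3-Aloe-8B-Alpha-ov-int8",
    "messages": [
        {"role": "user", "content": "What are the symptoms of anemia?"}
    ],
    "temperature": 0.2,
}

print(json.dumps(payload, indent=2))
```

The same payload can be sent with any OpenAI-compatible client or a plain HTTP POST.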

fakezeta changed pull request status to merged

Great!

Thank you
