Supported tool: LocalAI

#2 by mudler - opened

Just wanted to let everyone know that https://github.com/go-skynet/LocalAI also supports llama.cpp and ggml-compatible models (such as MPT, StarCoder, Dolly, GPT-NeoX, ...) cc @TheBloke
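Since LocalAI exposes an OpenAI-compatible REST API, talking to a locally loaded ggml model is just a JSON POST. A minimal sketch, assuming LocalAI is running on its default port 8080 and a model named `ggml-gpt4all-j` is available (both the base URL and the model name are assumptions; substitute whatever you have loaded):

```python
import json
import urllib.request

# Assumption: LocalAI's default listen address. Adjust if you changed it.
BASE_URL = "http://localhost:8080"


def build_chat_request(prompt: str, model: str = "ggml-gpt4all-j") -> dict:
    """Build the JSON body for the OpenAI-style /v1/chat/completions endpoint."""
    return {
        "model": model,  # hypothetical model name -- use one you have loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def chat(prompt: str, model: str = "ggml-gpt4all-j") -> str:
    """POST the chat request to LocalAI and return the assistant's reply text."""
    body = json.dumps(build_chat_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Because the wire format mirrors OpenAI's, most tools that speak the OpenAI API (Flowise included) can point at a LocalAI instance by just swapping the base URL.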


Ooh, that is cool. Might it be usable with Flowise?

Seems it's getting there in the next Flowise release: https://github.com/FlowiseAI/Flowise/issues/229 !

Turns out it's already supported. I'm adding an example too: https://github.com/go-skynet/LocalAI/pull/438 !


Cool, I'll be reading up on that. I looked at it last week after hearing it supported ooba, but it didn't, as far as I could tell. Disappointing. But this would be a way to run the main AI piece locally, which is my goal.
