Open Source Models
How would I use it with Open Source models like GPT4All, OpenAssistant, ...?
(So far I've only found examples that require an OpenAI API key...)
Hey,
@RobertoDonPedro
Right now we support CTransformers, and we believe we've just fixed a bug with HuggingFaceHub models; we'll push the fix very soon.
These two let you load models locally or run inference remotely. We are working on adding more.
Can you provide a working sample?
Hi
@GabrielLogspace
I was wondering whether CTransformers also works with local LLMs?
If it does, could you maybe add a short description of how to use it with a local LLM?
It gives me the error "Required input llm for module ConversationChain not found" when I try to connect the CTransformers node's output to the llm input of the ConversationChain node. I used the default GPT4All model from their documentation, i.e. C:\Users\User\.cache\gpt4all\orca-mini-3b-gguf2-q4_0.gguf