Open Source Models

#20 opened by RobertoDonPedro

How would I use it with open-source models like GPT4All, OpenAssistant, ...?
(So far I've only found examples that require an OpenAI API key...)

Hey, @RobertoDonPedro
Right now we support CTransformers, and we may have just fixed a bug with HuggingFaceHub models; we'll push the fix very soon.

These two let you load models locally (CTransformers) or run inference remotely (HuggingFaceHub). We are working on adding more.
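
For reference, remote inference through LangChain's HuggingFaceHub wrapper looks roughly like this (a minimal sketch; the repo ID, token handling, and parameters are illustrative assumptions, not a LangFlow configuration):

```python
from langchain.llms import HuggingFaceHub

# Remote inference through the Hugging Face Inference API.
# The repo_id and parameters are illustrative assumptions,
# not a LangFlow-specific setup.
llm = HuggingFaceHub(
    repo_id="google/flan-t5-large",
    huggingfacehub_api_token="hf_...",  # or set HUGGINGFACEHUB_API_TOKEN
    model_kwargs={"temperature": 0.7, "max_length": 128},
)

print(llm("What is the capital of France?"))
```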

Can you provide a working sample?

Hi @GabrielLogspace
I was wondering if CTransformers also works for local LLMs?
Also, if it does, could you give a short description of how to use it with a local LLM?

Using CTransformers here in the Space probably won't work, so you'd have to run LangFlow on your local machine.

If you have the model downloaded, do something like this:

[screenshot: CTransformers node configured with a local model path]

and it should be loaded and ready for inference.
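
In code, that node setup corresponds roughly to LangChain's CTransformers wrapper (a minimal sketch; the path and config values are illustrative assumptions, not taken from the screenshot):

```python
from langchain.llms import CTransformers

# Load a GGML/GGUF model file from disk and run inference locally.
# The path and config values below are illustrative assumptions.
llm = CTransformers(
    model="/path/to/your/model.bin",  # local model file you downloaded
    model_type="llama",               # architecture family of the model
    config={"max_new_tokens": 256, "temperature": 0.7},
)

print(llm("Tell me a short joke."))
```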

It gives me the error "Required input llm for module ConversationChain not found" when I try to connect the CTransformers node's output to the LLM input of the ConversationChain node. I used the GPT4All default model from their documentation, i.e. C:\Users\User\.cache\gpt4all\orca-mini-3b-gguf2-q4_0.gguf
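
For reference, the equivalent wiring in plain LangChain would look roughly like this (a sketch; whether ctransformers can load this particular GGUF file depends on the ctransformers version installed):

```python
from langchain.chains import ConversationChain
from langchain.llms import CTransformers

# The same wiring outside LangFlow: the CTransformers output becomes
# the llm input of the ConversationChain. The path mirrors the GPT4All
# default cache location mentioned above.
llm = CTransformers(
    model=r"C:\Users\User\.cache\gpt4all\orca-mini-3b-gguf2-q4_0.gguf",
    model_type="llama",
)
chain = ConversationChain(llm=llm)

print(chain.predict(input="Hello!"))
```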
