
---
license: mit
title: Streamlit simple QA Inference App with Ollama, Nvidia Cloud and Groq
app_file: Home.py
sdk: streamlit
---

# Streamlit simple QA Inference App with Ollama, Nvidia Cloud and Groq

Blog post (in French): https://iaetbibliotheques.fr/2024/05/comment-executer-localement-un-llm-22

Deployed: no

Two different ways to develop the same chatbot application:

- `app_api_completion.py`: performs QA inference with LLMs using the native chat-completion API endpoints provided by Ollama, Nvidia, or Groq (see the first sketch below)
- `app_langchain_completion.py`: performs QA inference with LLMs using the dedicated LangChain wrappers for Ollama, Nvidia, or Groq (see the second sketch below)
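
For orientation, here is a minimal sketch of the native-endpoint approach. The URLs are the public Ollama and Groq chat-completion endpoints; the model names and the exact request shape used in `app_api_completion.py` are assumptions:

```python
import os

import requests

# Ollama's native chat endpoint (local server, no API key required).
def ask_ollama(prompt: str, model: str = "llama3") -> str:  # model name is an assumption
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={"model": model, "messages": [{"role": "user", "content": prompt}], "stream": False},
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

# Groq exposes an OpenAI-compatible chat-completions endpoint.
def ask_groq(prompt: str, model: str = "llama3-8b-8192") -> str:  # model name is an assumption
    resp = requests.post(
        "https://api.groq.com/openai/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```

And a sketch of the LangChain approach, where each provider gets a dedicated chat-model wrapper behind the same interface; the packages and model names are assumptions about what `requirements.txt` pins:

```python
from langchain_community.chat_models import ChatOllama
from langchain_groq import ChatGroq  # reads GROQ_API_KEY
from langchain_nvidia_ai_endpoints import ChatNVIDIA  # reads NVIDIA_API_KEY

# One wrapper per provider, all sharing the LangChain chat-model interface.
llms = {
    "ollama": ChatOllama(model="llama3"),  # local, no key needed
    "groq": ChatGroq(model="llama3-8b-8192"),
    "nvidia": ChatNVIDIA(model="meta/llama3-8b-instruct"),
}

answer = llms["ollama"].invoke("Why is the sky blue?")
print(answer.content)
```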

You can use one, two, or all three LLM hosting solutions, depending on your environment; the hosted providers require API keys (see the note after the setup commands):

```bash
git clone
pip install -r requirements.txt
streamlit run Home.py
```
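
If you use the hosted providers, set their API keys before launching the app. A plausible setup, assuming the app or the underlying SDKs read the standard `GROQ_API_KEY` and `NVIDIA_API_KEY` environment variables:

```bash
# Only needed for the hosted providers; the variable names are the SDK defaults.
export GROQ_API_KEY="gsk_..."
export NVIDIA_API_KEY="nvapi-..."
# Ollama runs locally and needs no key; make sure its server is up:
ollama serve
```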

The app will be available at http://localhost:8501.

*(screenshot of the app)*