{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {
    "colab_type": "text",
    "id": "view-in-github"
   },
   "source": [
    "<a href=\"https://colab.research.google.com/github/ndn1954/llmdocumentchatbot/blob/main/LLMdocumentchatbot_Colab.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "EnzlcRZycXnr"
   },
   "source": [
    "# LLMDocumentChatBot\n",
    "\n",
    "`Chat with your documents using Llama 2, Falcon, or OpenAI`\n",
    "\n",
    "- You can upload multiple documents at once to a single database.\n",
    "- Every time a new database is created, the previous one is deleted.\n",
    "- For maximum privacy, you can click \"Load LLAMA GGML Model\" to use a local Llama 2 model. By default, the llama-2_7B-Chat model is loaded.\n",
    "\n",
    "This notebook runs a program that lets you chat with your documents: the documents are indexed in a vector database, and a Large Language Model (LLM) answers questions over the retrieved passages.\n",
    "\n",
    "| Description | Link |\n",
    "| ----------- | ---- |\n",
    "| 📙 Colab Notebook | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/ndn1954/llmdocumentchatbot/blob/main/LLMdocumentchatbot_Colab.ipynb) |\n",
    "| 🎉 Repository | [![GitHub Repository](https://img.shields.io/badge/GitHub-Repository-black?style=flat-square&logo=github)](https://github.com/ndn1954/llmdocumentchatbot/) |\n",
    "| 🚀 Online Demo | [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/ndn1954/llmdocumentchatbot) |\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "id": "S5awiNy-A50W"
   },
   "outputs": [],
   "source": [
    "# Clone the project and install its Python dependencies\n",
    "!git clone https://github.com/ndn1954/llmdocumentchatbot.git\n",
    "%cd llmdocumentchatbot\n",
    "!pip install -r requirements.txt\n",
    "\n",
    "import torch\n",
    "import os\n",
    "\n",
    "print(\"Wait until the cell finishes executing\")\n",
    "if torch.cuda.is_available():\n",
    "    # Rebuild llama-cpp-python with cuBLAS support so inference can run on the GPU\n",
    "    print(\"CUDA is available on this system.\")\n",
    "    os.system('CMAKE_ARGS=\"-DLLAMA_CUBLAS=on\" FORCE_CMAKE=1 pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir --verbose')\n",
    "else:\n",
    "    # Fall back to the CPU-only build\n",
    "    print(\"CUDA is not available on this system.\")\n",
    "    os.system('pip install llama-cpp-python')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "jLfxiOyMEcGF"
   },
   "source": [
    "`RESTART THE RUNTIME` (Runtime > Restart runtime in the Colab menu) before executing the next cell, so that the reinstalled packages are picked up."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "id": "2F2VGAJtEbb3"
   },
   "outputs": [],
   "source": [
    "# Change into the project directory and launch the chatbot app\n",
    "%cd /content/llmdocumentchatbot\n",
    "!python app.py"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "3aEEcmchZIlf"
   },
   "source": [
    "Open the `public URL` when it appears in the cell output."
   ]
  }
 ],
 "metadata": {
  "accelerator": "GPU",
  "colab": {
   "authorship_tag": "ABX9TyMoq/QuUmy+xrGmEAesfDhp",
   "gpuType": "T4",
   "include_colab_link": true,
   "provenance": []
  },
  "kernelspec": {
   "display_name": "Python 3",
   "name": "python3"
  },
  "language_info": {
   "name": "python"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 0
}