Commit 8d98f56 by abishek-official (parent: 66c6252): Update README.md

README.md CHANGED (@@ -11,3 +11,52 @@ license: apache-2.0)
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# AI-Assistant-Chatbot🚀

Building your own AI is everyone's dream, right? It is surprisingly simple: take the Falcon-7B-Instruct LLM from Hugging Face, frame it as a chatbot using the LangChain library, and turn it into an application you can use.
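As a sketch of how these pieces fit together, the snippet below wires Falcon-7B-Instruct into a LangChain chain. This is an illustrative assumption, not the project's actual app.py: the class names follow the classic LangChain API (newer releases moved them into subpackages), and calling the model needs a Hugging Face token in the `HUGGINGFACEHUB_API_TOKEN` environment variable.

```python
import os

# The prompt sent to Falcon; {question} is filled in per user turn.
PROMPT = (
    "You are a helpful assistant. Answer the question below.\n\n"
    "Question: {question}\n"
    "Answer:"
)

def build_prompt(question: str) -> str:
    """Fill the chat prompt for one question."""
    return PROMPT.format(question=question)

def ask(question: str) -> str:
    # Hypothetical wiring, following the classic LangChain API;
    # exact import paths may differ in newer LangChain releases.
    from langchain import HuggingFaceHub, LLMChain, PromptTemplate

    llm = HuggingFaceHub(
        repo_id="tiiuae/falcon-7b-instruct",  # the LLM used in this project
        model_kwargs={"temperature": 0.6, "max_new_tokens": 256},
    )
    chain = LLMChain(
        prompt=PromptTemplate(template=PROMPT, input_variables=["question"]),
        llm=llm,
    )
    return chain.run(question)

if __name__ == "__main__":
    # Only call the Hub when a token is actually available.
    if os.environ.get("HUGGINGFACEHUB_API_TOKEN"):
        print(ask("What is LangChain?"))
```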
## Resources Used in the Project🌟

* **Falcon** as the LLM
* **LangChain** as the framework for the LLM
* **Falcon-7B-Instruct model** from Hugging Face
* **Gradio** for deploying the application with an interface
## Steps Involved🌟

* First, integrate the model and run it in a notebook or a local IDE without any errors
* Create an interface for the model using **Gradio**
* Create a repository in Hugging Face Spaces
* Add the app.py and requirements.txt files to the repository
* The model is now deployed in the Hugging Face Space and ready to use at any time
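The Gradio step above can be sketched as follows. This is a minimal assumption of what app.py might look like; `answer` is a placeholder standing in for a call into the LangChain chain, and running it requires `pip install gradio`.

```python
def answer(question: str) -> str:
    # Placeholder; in the real app this would call the LangChain chain.
    return f"You asked: {question}"

if __name__ == "__main__":
    # Hypothetical Gradio wiring: a simple text-in/text-out interface.
    import gradio as gr

    demo = gr.Interface(
        fn=answer,
        inputs=gr.Textbox(label="Question"),
        outputs=gr.Textbox(label="Answer"),
        title="AI-Assistant-Chatbot",
    )
    demo.launch()  # serves the app locally and on Spaces
```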
## System Requirements🌟

* You must have Python 3.10 or later installed, or use Google Colab for testing the model
* You must have a Hugging Face account
## File Description🌟

* **app.py** - Contains the entire source code used in the Hugging Face Space for deployment
* **requirements.txt** - Lists the packages that need to be installed in the environment
* **chatbot-falcon.ipynb** - A Colab notebook containing the full source, which you can run in Colab for testing
* **gradio.txt** - Context about the Gradio framework
* **langchain.txt** - Context about the LangChain framework
* **huggingface.txt** - Context about the Hugging Face platform
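For reference, a minimal requirements.txt for this stack could list just the three libraries named above. This is an assumption, not the repository's actual file; pin versions as needed for the releases you tested with.

```
langchain
huggingface_hub
gradio
```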
# Steps to Create a Repository in Hugging Face and Deploy the Model
## Creating a Hugging Face Repository

1. Go to the Hugging Face website - https://huggingface.co/
2. Click the **+New** icon below the Hugging Face logo
3. Then click on Spaces
4. Enter your repo name, the type of license being used, and the SDK used (in our case, Gradio)
5. Set the Space as public or private and click on Create Space
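If you prefer scripting the clicks above, the `huggingface_hub` client library (`pip install huggingface_hub`) can create the Space programmatically. This is a hedged sketch, not part of the project: the username and token below are placeholders you must replace.

```python
def space_repo_id(username: str, space_name: str) -> str:
    """Build the `namespace/name` id Hugging Face expects for a repo."""
    return f"{username}/{space_name}"

if __name__ == "__main__":
    # Requires a write token from https://huggingface.co/settings/tokens
    from huggingface_hub import create_repo

    create_repo(
        space_repo_id("your-username", "AI-Assistant-Chatbot"),  # placeholder
        repo_type="space",   # a Space, not a model or dataset repo
        space_sdk="gradio",  # the SDK chosen in step 4
        private=False,       # step 5: public
    )
```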
## Adding Files to the Repository

1. Inside the repository, click on Files, then click Add file at the upper right of the screen
2. Click Upload File and upload the app.py and requirements.txt files to the repo
3. The Space first shows **Building**; once the status changes to **Running**, click on the **App** tab and start using your application
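The upload steps also have a programmatic equivalent via `huggingface_hub`, again as a hedged sketch with a placeholder repo id rather than the project's documented method:

```python
def files_to_upload() -> list[str]:
    """The two files the Space needs in order to build and run."""
    return ["app.py", "requirements.txt"]

if __name__ == "__main__":
    # Requires `pip install huggingface_hub` and a logged-in token
    # (e.g. via `huggingface-cli login`).
    from huggingface_hub import upload_file

    for path in files_to_upload():
        upload_file(
            path_or_fileobj=path,
            path_in_repo=path,
            repo_id="your-username/AI-Assistant-Chatbot",  # placeholder
            repo_type="space",
        )
```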
# Disclaimer❗

The content here is open source, but the models are owned by organizations that attach terms and conditions to their use. Do not use this project for commercial purposes; it is meant for research and learning. Do not misuse any of the data or resources provided in the repository. Neither the repository nor the author is responsible for illegal use of these resources.

~ Content by Abishek