Ali Kadhim committed
Commit 7b8984d
1 Parent(s): 3f6c270

Update README.md

Files changed (1)
  1. README.md +1 -96
README.md CHANGED
@@ -1,97 +1,2 @@
  # Beyond-ChatGPT
- Chainlit App using Python streaming for Level 0 MLOps
-
- LLM Application with Chainlit, Docker, and Huggingface Spaces
- In this guide, we'll walk you through the steps to create a Large Language Model (LLM) application using Chainlit, containerize it with Docker, and deploy it on Huggingface Spaces.
-
- Prerequisites
- A GitHub account
- Docker installed on your local machine
- A Huggingface Spaces account
-
- ### Building our App
- Clone [this](https://github.com/AI-Maker-Space/Beyond-ChatGPT/tree/main) repo.
-
- ```bash
- git clone https://github.com/AI-Maker-Space/Beyond-ChatGPT.git
- ```
-
- Navigate inside this repo
- ```bash
- cd Beyond-ChatGPT
- ```
-
- Install the packages required for this Python environment from `requirements.txt`.
- ```bash
- pip install -r requirements.txt
- ```
-
- Open your `.env` file. Replace the `###` in your `.env` file with your OpenAI key and save the file.
- ```
- OPENAI_API_KEY=sk-###
- ```
-
- Let's try deploying it locally. Make sure you're in the Python environment where you installed Chainlit and OpenAI.
-
- Run the app using Chainlit
-
- ```bash
- chainlit run app.py -w
- ```
-
- Great work! Let's see if we can interact with our chatbot.
-
- Time to throw it into a Docker container and prepare it for shipping.
-
- Build the Docker image. We'll tag our image as `llm-app` using the `-t` parameter. The `.` at the end means we want all of the files in our current directory to be added to our image.
- ```bash
- docker build -t llm-app .
- ```
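For context, `docker build` reads a `Dockerfile` in the current directory. The repo ships its own; the sketch below is only an illustration of what a Chainlit Dockerfile for an app like this typically contains (base image, port, and entrypoint here are assumptions, not the repo's actual file).

```dockerfile
# Illustrative sketch only — use the Dockerfile included in the repo.
FROM python:3.11-slim

WORKDIR /app

# Copy and install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Huggingface Spaces expects the app to listen on port 7860
EXPOSE 7860

CMD ["chainlit", "run", "app.py", "--host", "0.0.0.0", "--port", "7860"]
```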
-
- Run and test the Docker image locally using the `run` command. The `-p` parameter maps the **host port** on the left of the `:` to the **container port** on the right.
- ```bash
- docker run -p 7860:7860 llm-app
- ```
-
- Visit http://localhost:7860 in your browser to see if the app runs correctly.
-
- Great! Time to ship!
-
- ### Deploy to Huggingface Spaces
-
- Make sure you're logged into the Huggingface CLI
-
- ```bash
- huggingface-cli login
- ```
-
- Follow the prompts to authenticate.
-
- Deploy to Huggingface Spaces
-
- Create a new Huggingface Space
-
- - Owner: Your username
- - Space Name: `llm-app`
- - License: `Openrail`
- - Select the Space SDK: `Docker`
- - Docker Template: `Blank`
- - Space Hardware: `CPU basic - 2 vCPU - 16 GB - Free`
- - Repo type: `Public`
-
- Deploying on Huggingface Spaces using a custom Docker image involves using their web interface. Go to Huggingface Spaces and create a new space, then set it up to use your Docker image from the Huggingface Container Registry.
-
- Access the Application
-
- Once deployed, access your app at:
-
- ```
- https://huggingface.co/spaces/your-username/llm-app
- ```
-
- Conclusion
- You've successfully created an LLM application with Chainlit, containerized it with Docker, and deployed it on Huggingface Spaces. Visit the link to interact with your deployed application.
 
  # Beyond-ChatGPT
+ Please refer to the [LLMOps Dev Environment](https://github.com/AI-Maker-Space/LLMOps-Dev-101/) for instructions.