Amrpyt committed
Commit f63dd16
1 Parent(s): 7d7066c

Upload 5 files

docker-compose/README.md ADDED
@@ -0,0 +1,87 @@
# Docker Compose Deployment Guide

This guide provides instructions on how to deploy the Docker Compose applications within this project.

## Prerequisites

Before proceeding, ensure you have Docker and Docker Compose installed on your system:

- Docker: [Get Docker](https://docs.docker.com/get-docker/)
- Docker Compose: [Install Docker Compose](https://docs.docker.com/compose/install/)

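To confirm both tools are available, you can check their versions (depending on your installation, Compose may be the `docker compose` plugin or the standalone `docker-compose` binary):

```sh
docker --version
docker compose version || docker-compose --version
```
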
## Deployment Instructions

Within this project, there are multiple Docker Compose applications, each deployed from its own `docker-compose.yaml` file. You can use `curl` or `wget` to download the file for a service directly from its raw URL in this repository and then start it with Docker Compose. Here's how to deploy each one:

### [BetterChatGPT](https://github.com/ztjhz/BetterChatGPT)

```sh
curl -L -o docker-compose.yaml https://raw.githubusercontent.com/PawanOsman/ChatGPT/main/docker-compose/bettergpt/docker-compose.yaml
docker-compose -f docker-compose.yaml up -d
```

Or if you're using `wget`:

```sh
wget https://raw.githubusercontent.com/PawanOsman/ChatGPT/main/docker-compose/bettergpt/docker-compose.yaml -O docker-compose.yaml
docker-compose -f docker-compose.yaml up -d
```
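
The bundled `bettergpt` compose file publishes the UI on port 5173 and the `pawanosman/chatgpt` backend on port 3040. If you want to confirm the backend is responding before opening the UI, a minimal check against the OpenAI-compatible endpoint it is configured with might look like this (a sketch that assumes the backend accepts a standard chat completions request; the model name and dummy key mirror the values used in the compose files):

```sh
# Minimal chat completions request against the local backend (values are illustrative).
curl http://localhost:3040/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer anything" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]}'
```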

### [ChatGPT Next Web](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web)

```sh
curl -L -o docker-compose.yaml https://raw.githubusercontent.com/PawanOsman/ChatGPT/main/docker-compose/chatgpt-next-web/docker-compose.yaml
docker-compose -f docker-compose.yaml up -d
```

Or with `wget`:

```sh
wget https://raw.githubusercontent.com/PawanOsman/ChatGPT/main/docker-compose/chatgpt-next-web/docker-compose.yaml -O docker-compose.yaml
docker-compose -f docker-compose.yaml up -d
```

### [Lobe Chat](https://github.com/lobehub/lobe-chat)

```sh
curl -L -o docker-compose.yaml https://raw.githubusercontent.com/PawanOsman/ChatGPT/main/docker-compose/lobe-chat/docker-compose.yaml
docker-compose -f docker-compose.yaml up -d
```

Or using `wget`:

```sh
wget https://raw.githubusercontent.com/PawanOsman/ChatGPT/main/docker-compose/lobe-chat/docker-compose.yaml -O docker-compose.yaml
docker-compose -f docker-compose.yaml up -d
```
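
The three compose files publish the web UIs on ports 5173 (BetterChatGPT), 3000 (ChatGPT Next Web), and 3210 (Lobe Chat). As a quick sketch, you can check from the host that a UI is answering (a 2xx or 3xx status code means it is up):

```sh
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:5173   # BetterChatGPT
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:3000   # ChatGPT Next Web
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:3210   # Lobe Chat
```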

## Managing the Applications

Once deployed, you can manage your applications with the following commands:

- To view the status of your services:
  ```sh
  docker-compose ps
  ```

- To stop the services:
  ```sh
  docker-compose down
  ```

- To view the logs of a service:
  ```sh
  docker-compose logs [service-name]
  ```

Replace `[service-name]` with the name of the service you want to check the logs for.
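
For example, all of this project's compose files name the backend service `chatgpt`, so you can follow its logs with:

```sh
docker-compose logs -f chatgpt
```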

## Additional Notes

- Ensure you are in the correct directory (the one containing the `docker-compose.yaml` file) before running the `docker-compose` commands.
- Use the `-d` flag to run containers in detached mode.
- To pull the latest images before starting containers, use `docker-compose pull` (see the example below).
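
For instance, a typical update cycle is to pull the newest images and then recreate the containers:

```sh
docker-compose pull
docker-compose up -d
```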

Thank you for using this project. Please report any issues or provide feedback to the project maintainers.
docker-compose/bettergpt/Dockerfile ADDED
@@ -0,0 +1,17 @@
# Use a Node.js base image
FROM node:19-alpine

# Set the working directory
WORKDIR /app

# Install git, which the Alpine-based Node image does not ship with
RUN apk add --no-cache git

# Clone the project repository
RUN git clone https://github.com/ztjhz/BetterChatGPT.git /app

# Install dependencies
RUN npm install

# Expose the port the app runs on
EXPOSE 5173

# Command to run the start script
# (if the Vite dev server is not reachable from outside the container,
# it may need to be started with the --host flag)
CMD ["npm", "run", "dev"]
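
The accompanying `docker-compose.yaml` pulls the prebuilt `pawanosman/bettergpt` image rather than building this Dockerfile, but if you prefer to build the image yourself, a minimal sketch (the `bettergpt-local` tag is arbitrary) would be:

```sh
docker build -t bettergpt-local ./docker-compose/bettergpt
docker run --rm -p 5173:5173 bettergpt-local
```
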
docker-compose/bettergpt/docker-compose.yaml ADDED
@@ -0,0 +1,21 @@
version: '3.8' # Specify Docker Compose file version

services:
  # Define the service for the web interface of ChatGPT
  bettergpt:
    image: pawanosman/bettergpt:latest # Use the specified Docker image for the web interface
    ports:
      - "5173:5173" # Map port 5173 on the host to port 5173 in the container
    environment: # Set environment variables for the container
      VITE_CUSTOM_API_ENDPOINT: "http://localhost:3040/v1/chat/completions"
      VITE_DEFAULT_API_ENDPOINT: "http://localhost:3040/v1/chat/completions"
      VITE_OPENAI_API_KEY: "anything"
    depends_on:
      - chatgpt # Ensure this service starts after the chatgpt service

  # Define the backend service for ChatGPT
  chatgpt:
    image: pawanosman/chatgpt:latest # Use the specified Docker image for the backend
    restart: always # Ensure the container restarts automatically if it stops
    ports:
      - "3040:3040" # Map port 3040 on the host to port 3040 in the container
docker-compose/chatgpt-next-web/docker-compose.yaml ADDED
@@ -0,0 +1,21 @@
version: '3.8' # Specify Docker Compose file version

services:
  # Define the service for the web interface of ChatGPT
  chatgpt-next-web:
    image: yidadaa/chatgpt-next-web # Use the specified Docker image for the web interface
    ports:
      - "3000:3000" # Map port 3000 on the host to port 3000 in the container
    environment: # Set environment variables for the container
      OPENAI_API_KEY: "anything" # Placeholder for the actual OpenAI API key
      BASE_URL: "http://chatgpt:3040" # URL for the backend service
      CUSTOM_MODELS: "-all,+gpt-3.5-turbo" # Enable only the gpt-3.5-turbo model, disable all others
    depends_on:
      - chatgpt # Ensure this service starts after the chatgpt service

  # Define the backend service for ChatGPT
  chatgpt:
    image: pawanosman/chatgpt:latest # Use the specified Docker image for the backend
    restart: always # Ensure the container restarts automatically if it stops
    ports:
      - "3040:3040" # Map port 3040 on the host to port 3040 in the container
docker-compose/lobe-chat/docker-compose.yaml ADDED
@@ -0,0 +1,21 @@
version: '3.8' # Specify Docker Compose file version

services:
  # Define the service for the web interface of ChatGPT
  lobe-chat:
    image: lobehub/lobe-chat:latest # Use the specified Docker image for the web interface
    restart: always
    ports:
      - "3210:3210" # Map port 3210 on the host to port 3210 in the container
    environment: # Set environment variables for the container
      OPENAI_API_KEY: "anything" # Placeholder for the actual OpenAI API key
      OPENAI_PROXY_URL: "http://chatgpt:3040/v1" # URL for the backend service
    depends_on:
      - chatgpt # Ensure this service starts after the chatgpt service

  # Define the backend service for ChatGPT
  chatgpt:
    image: pawanosman/chatgpt:latest # Use the specified Docker image for the backend
    restart: always # Ensure the container restarts automatically if it stops
    ports:
      - "3040:3040" # Map port 3040 on the host to port 3040 in the container