yangdx committed on
Commit 43d6ad8 · 1 Parent(s): 74add59

Update README.md for LightRAG Server

Files changed (1)
  1. lightrag/api/README.md +48 -0
lightrag/api/README.md CHANGED
@@ -40,6 +40,7 @@ For example, you have the possibility to use ollama for the embedding and openai
 #### For OpenAI Server
 - Requires valid OpenAI API credentials set in environment variables
 - OPENAI_API_KEY must be set
+- LLM_BINDING or LLM_MODEL must be set on the command line or in environment variables
 
 #### For Azure OpenAI Server
  Azure OpenAI API can be created using the following commands in Azure CLI (you need to install Azure CLI first from [https://docs.microsoft.com/en-us/cli/azure/install-azure-cli](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli)):
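For the OpenAI Server requirements above, a minimal shell setup might look like the following. This is an illustrative sketch: the variable names come from this README, while the key and model values are placeholders you must replace.

```shell
# Placeholder values only; substitute your own key and model.
export OPENAI_API_KEY="sk-..."   # keep real keys out of version control
export LLM_BINDING=openai
export LLM_MODEL=gpt-4o-mini     # example model name, not a recommendation
```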
 
@@ -60,6 +61,20 @@ az cognitiveservices account keys list --name $RESOURCE_NAME -g $RESOURCE_GROUP_
 ```
 The output of the last command will give you the endpoint and the key for the OpenAI API. You can use these values to set the environment variables in the `.env` file.
 
+### About Ollama API
+
+We provide an Ollama-compatible interface for LightRAG, aiming to emulate LightRAG as an Ollama chat model. This allows AI chat frontends that support Ollama, such as Open WebUI, to access LightRAG easily. After starting the lightrag-ollama service, you can add an Ollama-type connection in the Open WebUI admin panel. A model named lightrag:latest will then appear in Open WebUI's model management interface, and users can send queries to LightRAG through the chat interface.
+
+A prefix in the query string determines which LightRAG query mode is used to generate the response. The supported prefixes are:
+
+/local
+/global
+/hybrid
+/naive
+/mix
+
+For example, the chat message "/mix 唐僧有几个徒弟" ("How many disciples does Tang Seng have?") will trigger a mix mode query in LightRAG. A chat message without a prefix triggers a hybrid mode query by default.
+
 
 ## Configuration
 
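The prefix routing described above can be sketched as follows. This is an illustrative sketch of the dispatch logic, not LightRAG's actual implementation; `split_mode` is a hypothetical helper, and only the prefix names and the hybrid default come from the README.

```python
# Hypothetical sketch: map a chat message to (query_mode, query_text).
# The prefix names and the hybrid default come from the README above;
# the parsing logic itself is an assumption.
VALID_MODES = {"local", "global", "hybrid", "naive", "mix"}

def split_mode(message: str, default: str = "hybrid") -> tuple[str, str]:
    """Return (mode, query) for a chat message, honoring an optional /prefix."""
    if message.startswith("/"):
        prefix, _, rest = message.partition(" ")
        mode = prefix[1:]
        if mode in VALID_MODES:
            return mode, rest.strip()
    # No recognized prefix: fall back to the documented default (hybrid).
    return default, message.strip()

print(split_mode("/mix 唐僧有几个徒弟"))  # ('mix', '唐僧有几个徒弟')
print(split_mode("plain question"))       # ('hybrid', 'plain question')
```

An unrecognized prefix such as "/bogus" is left in the query text rather than silently dropped, so the user can see why the default mode was used.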
 
@@ -83,6 +98,9 @@ LLM_BINDING=ollama
 LLM_BINDING_HOST=http://localhost:11434
 LLM_MODEL=mistral-nemo:latest
 
+# Must be set when using an OpenAI LLM (LLM_MODEL must also be set, either here or via a command-line parameter)
+OPENAI_API_KEY=your_api_key
+
 # Embedding Configuration
 EMBEDDING_BINDING=ollama
 EMBEDDING_BINDING_HOST=http://localhost:11434
 
@@ -285,7 +303,37 @@ curl -X POST "http://localhost:9621/documents/batch" \
 -F "files=@/path/to/doc2.txt"
 ```
 
+### Ollama Emulation Endpoints
+
+#### GET /api/version
+
+Get Ollama version information.
+
+```bash
+curl http://localhost:9621/api/version
+```
+
+#### GET /api/tags
+
+List the available Ollama models.
+
+```bash
+curl http://localhost:9621/api/tags
+```
+
+#### POST /api/chat
+
+Handle chat completion requests.
+
+```bash
+curl -N -X POST http://localhost:9621/api/chat -H "Content-Type: application/json" -d \
+  '{"model":"lightrag:latest","messages":[{"role":"user","content":"猪八戒是谁"}],"stream":true}'
+```
+
+> For more information about the Ollama API, please visit the [Ollama API documentation](https://github.com/ollama/ollama/blob/main/docs/api.md).
+
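With `"stream":true`, the `/api/chat` reply arrives as one JSON object per line (Ollama-style NDJSON streaming, per the Ollama API documentation). A minimal client-side sketch for reassembling the streamed message; `collect_stream` is a hypothetical helper and the sample chunks below are illustrative, not real server output:

```python
import json

def collect_stream(lines):
    """Concatenate assistant message content from a stream of NDJSON chunks."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):  # final chunk in an Ollama-style stream
            break
    return "".join(parts)

# Illustrative sample chunks, shaped like Ollama streaming responses.
sample = [
    '{"model":"lightrag:latest","message":{"role":"assistant","content":"Zhu "},"done":false}',
    '{"model":"lightrag:latest","message":{"role":"assistant","content":"Bajie..."},"done":true}',
]
print(collect_stream(sample))  # Zhu Bajie...
```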
 #### DELETE /documents
+
 Clear all documents from the RAG system.
 
 ```bash