yangdx committed
Commit 6b647b3 · 1 Parent(s): a142be9

Update README with Ollama API and Open WebUI details

- Add section on query mode selection
- Separate Ollama API and Open WebUI details
- Clarify query prefix usage

Files changed (1)
  1. lightrag/api/README.md +8 -2
lightrag/api/README.md CHANGED
@@ -63,7 +63,9 @@ The output of the last command will give you the endpoint and the key for the Op
 
 ### About Ollama API
 
-We provide an Ollama-compatible interface for LightRAG, aiming to emulate LightRAG as an Ollama chat model. This allows AI chat frontends supporting Ollama, such as Open WebUI, to access LightRAG easily. After starting the lightrag-ollama service, you can add an Ollama-type connection in the Open WebUI admin panel. A model named lightrag:latest will then appear in Open WebUI's model management interface, and users can send queries to LightRAG through the chat interface.
+We provide an Ollama-compatible interface for LightRAG, aiming to emulate LightRAG as an Ollama chat model. This allows AI chat frontends supporting Ollama, such as Open WebUI, to access LightRAG easily.
+
+#### Choose Query mode in chat
 
 A query prefix in the query string determines which LightRAG query mode is used to generate the response for the query. The supported prefixes include:
 
@@ -73,7 +75,11 @@ A query prefix in the query string determines which LightRAG query mode is used
 /naive
 /mix
 
-For example, the chat message "/mix 唐僧有几个徒弟" ("How many disciples does Tang Seng have?") will trigger a mix mode query in LightRAG. A chat message without a query prefix triggers a hybrid mode query by default.
+For example, the chat message "/mix 唐僧有几个徒弟" ("How many disciples does Tang Seng have?") will trigger a mix mode query in LightRAG. A chat message without a query prefix triggers a hybrid mode query by default.
+
+#### Connect Open WebUI to LightRAG
+
+After starting the lightrag-server, you can add an Ollama-type connection in the Open WebUI admin panel. A model named lightrag:latest will then appear in Open WebUI's model management interface, and users can send queries to LightRAG through the chat interface.
 
 
 ## Configuration
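The prefix-to-mode routing described in the diff can be sketched with a small helper. This is a minimal, illustrative sketch, not LightRAG source code: it assumes only the prefixes visible in this hunk (`/naive`, `/mix`) plus the stated hybrid default, and the request dictionary follows the standard Ollama `/api/chat` message format, with `lightrag:latest` as the model name mentioned in the README.

```python
# Hypothetical sketch of query-mode prefix routing (not LightRAG's actual code).
# Only the prefixes visible in this diff hunk are listed; the full set may differ.
KNOWN_PREFIXES = {"/naive", "/mix"}

def split_query_mode(message: str, default: str = "hybrid") -> tuple[str, str]:
    """Split a chat message into (mode, query); hybrid is the stated default."""
    head, _, rest = message.partition(" ")
    if head in KNOWN_PREFIXES:
        return head.lstrip("/"), rest.strip()
    return default, message.strip()

mode, query = split_query_mode("/mix 唐僧有几个徒弟")

# A request body in the standard Ollama /api/chat shape; the base URL and port
# of the lightrag-server are deployment-specific assumptions.
payload = {
    "model": "lightrag:latest",  # model name exposed to Open WebUI per the README
    "messages": [{"role": "user", "content": "/mix 唐僧有几个徒弟"}],
    "stream": False,
}
# e.g. requests.post(f"{base_url}/api/chat", json=payload)

print(mode, query)
```

Note that the prefix stays inside the message `content` itself; the server, not the client, is expected to strip it and pick the query mode.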