davidfearne
committed on
Update app.py
app.py CHANGED
@@ -88,13 +88,20 @@ st.sidebar.caption(f"Session ID: {genuuid()}")
 
 
 # Main chat interface
-st.markdown("""Query
-
-
-** Converts to Concise Query: Reformulates the input into a succinct and effective query optimized for the retrieval system's semantic search capabilities.
-
-
-
+st.markdown("""## Query Translation in RAG Architecture
+
+Query translation in a Retrieval-Augmented Generation (RAG) architecture is the process where an LLM acts as a translator between the user's natural language input and the retrieval system.
+
+### Key Functions of Query Translation:
+1. **Adds Context**
+   The LLM enriches the user's input with relevant context (e.g., expanding vague questions or specifying details) to make it more precise.
+
+2. **Converts to Concise Query**
+   The LLM reformulates the input into a succinct and effective query optimized for the retrieval system's semantic search capabilities.
+
+### Purpose
+This ensures that the retrieval system receives a clear and focused query, increasing the relevance of the information it retrieves. The query translator acts as a bridge between human conversational language and the technical requirements of a semantic retrieval system.
+""")
 # User ID Input
 user_id = st.text_input("Experiment ID:", key="user_id")
 
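The markdown added in this commit describes the query-translation step only in prose. For reference, below is a minimal sketch of what that step could look like in Python, assuming an OpenAI-style chat client; the client, model name, system prompt, and `translate_query` helper are illustrative and not taken from this repo.

```python
# Hypothetical sketch of the query-translation step described in the commit.
# The OpenAI client, model name, and prompt wording are assumptions; the
# actual app may use a different LLM provider, model, or prompt.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

TRANSLATE_PROMPT = (
    "You are a query translator in a RAG pipeline. "
    "Rewrite the user's message as a single concise query optimized for "
    "semantic search: add any missing context, drop filler words, and "
    "return only the rewritten query."
)

def translate_query(user_input: str) -> str:
    """Reformulate conversational input into a concise retrieval query."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": TRANSLATE_PROMPT},
            {"role": "user", "content": user_input},
        ],
    )
    return response.choices[0].message.content.strip()

# Example: a vague chat message becomes a focused search query, e.g.
# translate_query("hey, how do I make the sidebar remember my session?")
# might return something like "persisting session state in a Streamlit sidebar".
```

The translated query, rather than the raw chat message, would then be sent to the retrieval system, which is the bridging role the commit's markdown describes.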