all the best fixes
Changed files:
- app.py (+2 −2)
- chainlit.md (+36 −1)
- solution_app.py (+2 −2)
app.py

@@ -90,9 +90,9 @@ User Query:
 {query}
 
 Context:
-{context}
+{context}
 
-<|start_header_id|>assistant<|end_header_id|>
+<|eot_id|><|start_header_id|>assistant<|end_header_id|>
 """
 
 ### 2. CREATE PROMPT TEMPLATE
chainlit.md

@@ -1 +1,36 @@
-
+### SF Sentinel: The Cutting-Edge AI Experience
+
+Welcome to **SF Sentinel**, your gateway to the future of intelligent information retrieval and augmented generation, inspired by the innovative spirit of San Francisco. Here’s why SF Sentinel is not just an app but a technological marvel:
+
+---
+
+#### **Powered by State-of-the-Art Models**
+
+1. **LLaMA 3: The Next Generation Language Model**
+   - **NousResearch/Meta-Llama-3-8B-Instruct**: At the heart of SF Sentinel is the LLaMA 3, a powerful language model designed to understand and generate human-like text. With 8 billion parameters, this model brings unparalleled accuracy and fluency to natural language processing, ensuring that every response is as insightful as a conversation with a San Francisco sage.
+
+2. **Arctic Embed: Precision Embeddings for Context-Aware Insights**
+   - **Snowflake/snowflake-arctic-embed-m**: Our embedding model, Arctic Embed, excels at capturing the essence of complex texts. By transforming textual data into high-dimensional vectors, it allows SF Sentinel to understand the nuanced relationships between different pieces of information, delivering precise and context-aware insights every time.
+
+---
+
+#### **Leveraging Hugging Face Inference Endpoints**
+
+- **Hugging Face Inference Endpoints**: The backbone of SF Sentinel's real-time processing capabilities, these endpoints enable seamless integration and deployment of cutting-edge models. By utilizing Hugging Face's robust infrastructure, we ensure that SF Sentinel can handle intensive computations with speed and reliability, providing instant responses to your queries.
+
+---
+
+#### **Frameworks That Empower**
+
+1. **LangChain: The Ultimate Chain of Intelligence**
+   - **LangChain**: This powerful framework orchestrates the seamless interaction between different AI components. LangChain enables SF Sentinel to combine the strengths of LLaMA 3 and Arctic Embed, ensuring that data flows smoothly and insights are generated efficiently.
+
+2. **FAISS: High-Speed Similarity Search**
+   - **Facebook AI Similarity Search (FAISS)**: A critical component for managing and querying large-scale vector data, FAISS ensures that SF Sentinel can perform rapid and accurate similarity searches. This means you get the most relevant information faster than ever before.
+
+3. **Chainlit: Interactive AI Conversations**
+   - **Chainlit**: Our conversational framework, Chainlit, transforms SF Sentinel into an interactive assistant. With Chainlit, you can engage in dynamic, back-and-forth conversations, making the experience not just informative but also engaging and intuitive.
+
+---
+
+Embrace the future. Experience **SF Sentinel**.
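The FAISS entry added above describes the retrieval step: text is embedded into high-dimensional vectors, and the most relevant documents are found by similarity search. A minimal sketch of that idea, using NumPy inner products in place of a real FAISS index and toy 4-dimensional vectors instead of snowflake-arctic-embed-m embeddings (all documents, vectors, and numbers here are illustrative, not from the repository):

```python
import numpy as np

# Toy "document embeddings" standing in for snowflake-arctic-embed-m output;
# a real index holds one high-dimensional vector per chunk of text.
docs = ["Golden Gate Bridge", "BART schedules", "Sourdough bakeries"]
doc_vecs = np.array([
    [0.9, 0.1, 0.0, 0.1],
    [0.1, 0.8, 0.1, 0.0],
    [0.0, 0.1, 0.9, 0.2],
])

def search(query_vec, k=2):
    # FAISS's IndexFlatIP does the same thing at scale: rank the stored
    # vectors by inner product with the query and return the top k.
    scores = doc_vecs @ query_vec
    top = np.argsort(-scores)[:k]
    return [(docs[i], float(scores[i])) for i in top]

# A query vector close to the first document retrieves it first.
print(search(np.array([1.0, 0.0, 0.0, 0.0])))
```

FAISS earns its keep when the corpus has millions of vectors; at that scale the brute-force matrix product above is replaced by optimized (and optionally approximate) index structures.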
solution_app.py

@@ -83,9 +83,9 @@ User Query:
 {query}
 
 Context:
-{context}
+{context}
 
-<|start_header_id|>assistant<|end_header_id|>
+<|eot_id|><|start_header_id|>assistant<|end_header_id|>
 """
 
 rag_prompt = PromptTemplate.from_template(RAG_PROMPT_TEMPLATE)
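The substantive change in both hunks is closing the user turn with `<|eot_id|>` before opening the assistant header, which is what the Llama 3 instruct chat format expects; without the end-of-turn token the model has no clean signal that the user message is finished. A sketch of the substitution the corrected template performs (only the lines shown in the hunks come from the commit; the helper function and example strings are illustrative, and LangChain's `PromptTemplate.from_template` fills `{query}` and `{context}` the same way `str.format` does):

```python
# Tail of the corrected RAG prompt, as in the "+" side of the hunks above.
RAG_PROMPT_TEMPLATE = """\
User Query:
{query}

Context:
{context}

<|eot_id|><|start_header_id|>assistant<|end_header_id|>
"""

def build_rag_prompt(query: str, context: str) -> str:
    # rag_prompt.format(query=..., context=...) in the app performs this
    # same placeholder substitution.
    return RAG_PROMPT_TEMPLATE.format(query=query, context=context)

prompt = build_rag_prompt(
    query="What models does SF Sentinel use?",
    context="LLaMA 3 for generation, arctic-embed-m for embeddings.",
)
# The user turn now ends with <|eot_id|> right before the assistant header.
assert "<|eot_id|><|start_header_id|>assistant<|end_header_id|>" in prompt
```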