BLJohnPrabhasith committed on
Commit ad9fa14
1 Parent(s): d3e3a75

Update README.md

Files changed (1)
  1. README.md +94 -0
README.md CHANGED
@@ -10,4 +10,98 @@ pinned: false
  license: apache-2.0
  ---
 
+ # PDFChatbot
+
+ PDF Chatbot is a Retrieval-Augmented Generation (RAG) based, context-aware system that answers queries about a document supplied by the user. Every chat turn, from encoding the query to generating the answer, can be traced with LangSmith.
+
+ The document must first be uploaded and indexed so the system can answer questions about it during the chat session.
+ ![Initial_Upload](https://github.com/user-attachments/assets/f80b5a07-4e63-45a2-8b92-84bd6273e99f)
+
+ **HuggingFace Embeddings:**
+
+ The application uses the HuggingFace BAAI/llm-embedder model to convert text data into embeddings, which are then used to facilitate efficient and accurate search within the PDF content.
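+
+ The snippet below is a minimal sketch of how this embedding step can be wired up with LangChain's HuggingFaceBgeEmbeddings wrapper (listed in the Tech Stack below); the device and normalization settings are illustrative assumptions, not the Space's exact configuration.
+
+ ```python
+ # Sketch: loading BAAI/llm-embedder through LangChain's HuggingFaceBgeEmbeddings.
+ from langchain_community.embeddings import HuggingFaceBgeEmbeddings
+
+ embeddings = HuggingFaceBgeEmbeddings(
+     model_name="BAAI/llm-embedder",
+     model_kwargs={"device": "cpu"},                 # assumption: CPU inference
+     encode_kwargs={"normalize_embeddings": True},   # assumption: cosine-friendly vectors
+ )
+
+ vector = embeddings.embed_query("What does this PDF say about pricing?")
+ print(len(vector))  # dimensionality of the embedding vector
+ ```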
+
+ **FAISS Vector Store:**
+
+ FAISS (Facebook AI Similarity Search) is employed to store and search text embeddings. This allows the application to quickly retrieve relevant sections of the PDF based on user queries.
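+
+ A self-contained sketch of the indexing and search step is shown below; the sample texts stand in for real PDF chunks and the `k` value is an assumption.
+
+ ```python
+ # Sketch: index a few texts in FAISS and retrieve the closest match.
+ from langchain_community.embeddings import HuggingFaceBgeEmbeddings
+ from langchain_community.vectorstores import FAISS
+
+ embeddings = HuggingFaceBgeEmbeddings(model_name="BAAI/llm-embedder")
+ store = FAISS.from_texts(
+     ["Refunds are processed within 14 days.", "Support is reachable by email."],
+     embeddings,
+ )
+ # Return the single chunk most similar to the question.
+ best = store.similarity_search("How long do refunds take?", k=1)[0]
+ print(best.page_content)
+ ```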
+
+ **Processing the PDF: Extracting and Embedding Text**
+
+ The core functionality starts with processing the uploaded PDF. Using PyPDFLoader from LangChain, the PDF content is extracted and split into manageable chunks. These chunks are then transformed into embeddings using the HuggingFace model, and the embeddings are stored in a FAISS vector store, allowing quick and efficient retrieval during user interactions.
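+
+ As a rough sketch of that ingestion pipeline (the file path, splitter choice, and chunk sizes are assumptions, not the Space's exact code):
+
+ ```python
+ # Sketch: load the PDF, split it into chunks, embed them, store them in FAISS.
+ from langchain_community.document_loaders import PyPDFLoader
+ from langchain_text_splitters import RecursiveCharacterTextSplitter
+ from langchain_community.embeddings import HuggingFaceBgeEmbeddings
+ from langchain_community.vectorstores import FAISS
+
+ docs = PyPDFLoader("example.pdf").load()        # placeholder path
+ chunks = RecursiveCharacterTextSplitter(
+     chunk_size=1000, chunk_overlap=100          # assumed chunking parameters
+ ).split_documents(docs)
+
+ embeddings = HuggingFaceBgeEmbeddings(model_name="BAAI/llm-embedder")
+ vector_store = FAISS.from_documents(chunks, embeddings)
+ retriever = vector_store.as_retriever(search_kwargs={"k": 4})
+ ```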
+
+ **Handling User Queries**
+
+ The application is designed to handle user queries seamlessly and efficiently. When a user enters a question, the system retrieves relevant document sections using FAISS, applies a custom prompt template, and generates a response using the Groq LLM (mixtral-8x7b-32768).
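+
+ A hedged sketch of that answer step follows; the prompt wording, chain wiring, and key placeholder are assumptions rather than the Space's exact implementation.
+
+ ```python
+ # Sketch: retrieve context, fill a prompt template, call the Groq model.
+ from langchain_groq import ChatGroq
+ from langchain_core.prompts import ChatPromptTemplate
+ from langchain_core.output_parsers import StrOutputParser
+
+ llm = ChatGroq(model_name="mixtral-8x7b-32768", api_key="YOUR_GROQ_API_KEY")
+ prompt = ChatPromptTemplate.from_template(
+     "Answer the question using only the context below.\n\n"
+     "Context:\n{context}\n\nQuestion: {question}"
+ )
+
+ def answer(question: str) -> str:
+     # `retriever` comes from the ingestion sketch above.
+     context = "\n\n".join(d.page_content for d in retriever.invoke(question))
+     chain = prompt | llm | StrOutputParser()
+     return chain.invoke({"context": context, "question": question})
+ ```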
+
+ **Storing the Chat History and Displaying It in an Interactive UI**
+ ![image](https://github.com/user-attachments/assets/8ef44055-06b1-4d2c-b347-6e6d37c9d27d)
+
+ The user interface is built using Streamlit, providing an easy-to-use platform for interacting with the PDF. Users can upload their PDFs, enter queries, and view previous interactions, all within the same interface. The chat history is stored in the session state, ensuring that the conversation flows smoothly and previous questions and answers remain easily accessible.
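+
+ A minimal sketch of that chat loop, with history kept in `st.session_state` (the widget labels and the `answer` helper are assumptions carried over from the sketches above):
+
+ ```python
+ # Sketch: Streamlit chat loop that persists the conversation in session state.
+ import streamlit as st
+
+ if "history" not in st.session_state:
+     st.session_state.history = []            # list of (question, answer) pairs
+
+ question = st.chat_input("Ask something about your PDF")
+ if question:
+     response = answer(question)              # `answer` from the previous sketch
+     st.session_state.history.append((question, response))
+
+ # Replay the stored conversation so earlier turns stay visible.
+ for q, a in st.session_state.history:
+     st.chat_message("user").write(q)
+     st.chat_message("assistant").write(a)
+ ```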
+
+ **Tracing Using LangSmith**
+ ![image](https://github.com/user-attachments/assets/2cb8cc76-dc63-4700-89eb-8a06d2071870)
+
+ **Detailed Tracing:**
+
+ LangSmith provides detailed tracing of the entire workflow, helping you understand how data flows through the system. This is especially useful when you're dealing with multiple components like document loaders, embedding models, and vector databases.
+
+ **Debugging:**
+
+ By tracing each step in the process, LangSmith makes it easier to identify where something might be going wrong. For instance, if the responses to user queries are not as expected, you can trace back through the chain to see if the issue lies in the retrieval process, the embedding model, or the prompt generation.
+
+ **Performance Monitoring:**
+
+ LangSmith allows you to monitor the performance of each component, giving insights into which parts of the process are taking the most time or resources. This is crucial for optimizing the application, especially as it scales.
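+
+ Tracing is typically switched on through environment variables rather than code changes; the sketch below uses the standard LangSmith variable names, with a placeholder key and project name.
+
+ ```python
+ # Sketch: enable LangSmith tracing for every subsequent LangChain call.
+ import os
+
+ os.environ["LANGCHAIN_TRACING_V2"] = "true"
+ os.environ["LANGCHAIN_API_KEY"] = "YOUR_LANGSMITH_API_KEY"   # placeholder key
+ os.environ["LANGCHAIN_PROJECT"] = "pdf-chatbot"              # placeholder project name
+
+ # Loader, retriever, prompt, and LLM calls are now traced in the LangSmith UI.
+ ```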
+
+ ## API Reference
+
+ #### Get all items
+
+ ```http
+ GET /api/items
+ ```
+
+ | Parameter | Type | Description |
+ | :-------- | :------- | :------------------------- |
+ | `api_key` | `GroqAPI` | **Required**. Your Groq API key |
+ | `api_key` | `LangsmithAPI` | Your LangSmith API key |
+
+ #### Required Items for Feeding the Data
+
+ | Parameter | Type | Description |
+ | :-------- | :------- | :-------------------------------- |
+ | `PDF` | `Document` | **Required**. Path of the PDF to upload |
+
+ - [@Github](https://www.github.com/JohnPrabhasith)
+ - [@HuggingFace](https://www.huggingface.co/BLJohnPrabhasith)
+ - [@Medium](https://medium.com/@johnprabhasith)
+
+ ## Tech Stack
+
+ **LLM**: Groq API (mixtral-8x7b-32768)
+
+ **Embedding**: HuggingFaceBgeEmbeddings (BAAI/llm-embedder)
+
+ **LangChain**
+
+ **LangSmith**
+
+ ## Environment Variables
+
+ To run this project, you will need to add the following environment variables to your `.env` file:
+
+ `GROQ_API`
+
+ `LANGSMITH_API`
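+
+ A minimal sketch of reading those two variables with python-dotenv (variable names as listed above):
+
+ ```python
+ # Sketch: load GROQ_API and LANGSMITH_API from a local .env file.
+ import os
+ from dotenv import load_dotenv
+
+ load_dotenv()                               # reads the .env file in the working directory
+ groq_key = os.getenv("GROQ_API")
+ langsmith_key = os.getenv("LANGSMITH_API")
+ ```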
+
  Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference