Muddasri committed
Commit 44406e5 · 1 Parent(s): 6c2979e

Made changes to the backend
MAIN_PYTHON_README.md ADDED
@@ -0,0 +1,194 @@
# main.py - CBT RAG Ablation Study Pipeline

## Overview

`main.py` is the main entry point for the Cognitive Behavioral Therapy (CBT) RAG ablation study. It orchestrates a comparative analysis of 6 chunking techniques, evaluating each with multiple LLM models.

## Pipeline Flow

The system executes the following steps:

### Step 1: Data Ingestion with 6 Chunking Techniques

```python
all_chunks, final_chunks, proc, index = ingest_data()
```

- Loads the CBT book from `EntireBookCleaned.txt`
- Processes the book with **6 different chunking techniques**:
  1. **fixed** - Fixed-size chunking (1000 chars, 100 overlap)
  2. **sentence** - Sentence-level chunking (NLTK)
  3. **paragraph** - Paragraph-level chunking (splits on `\n\n`)
  4. **semantic** - Semantic chunking (embedding similarity)
  5. **recursive** - Recursive chunking (hierarchical separators)
  6. **page** - Page-level chunking (splits on `--- Page` markers)
- Each chunk is tagged with `chunking_technique` metadata
- All chunks are uploaded to a **single Pinecone index** for comparison
- **Returns:** all chunks, the configured technique's chunks, the processor, and the index for reuse
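The **fixed** technique above is just a sliding character window. A minimal sketch under the stated parameters (the function name below is illustrative, not the actual API in the codebase):

```python
def fixed_size_chunks(text, chunk_size=1000, overlap=100):
    """Split text into fixed-size chunks where each chunk repeats the
    last `overlap` characters of the previous one (illustrative sketch;
    the real pipeline also attaches chunking_technique metadata)."""
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

chunks = fixed_size_chunks("x" * 2500, chunk_size=1000, overlap=100)
print(len(chunks), [len(c) for c in chunks])  # 3 [1000, 1000, 700]
```

Note the window advances by `chunk_size - overlap` characters, so consecutive chunks share a 100-character seam.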
### Step 2: Initialize Components

```python
rag_engine = RAGGenerator()
models = {name: MODEL_MAP[name](token=hf_token) for name in cfg.model_list}
evaluator = RAGEvaluator(
    judge_model=cfg.gen['judge_model'],
    embedding_model=proc.encoder,
    api_key=groq_key
)
```

- Initializes the RAG answer generator
- Loads all 5 LLM models for the tournament
- Sets up the evaluator with the judge model
### Step 3: Run RAG for Each Technique (Loop)

```python
for technique in CHUNKING_TECHNIQUES:
    results = run_rag_for_technique(
        technique_name=technique['name'],
        query=query,
        index=index,
        encoder=proc.encoder,
        models=models,
        evaluator=evaluator,
        rag_engine=rag_engine
    )
```

For each technique, the loop:

- **Filters** chunks by `chunking_technique` metadata
- **Retrieves** the top 5 chunks using a Pinecone query with that filter
- **Runs** the model tournament (all 5 models generate answers)
- **Evaluates** faithfulness and relevancy for each answer
- **Stores** results for comparison
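In the real pipeline this filter-then-rank step happens inside a single Pinecone query. As a rough in-memory sketch of the same idea (all names and data below are illustrative, not the project's actual code):

```python
import math

def cosine(a, b):
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, chunks, technique, top_k=5):
    """In-memory stand-in for a metadata-filtered vector query:
    keep only chunks tagged with the requested technique, then rank
    by cosine similarity and return the top_k."""
    candidates = [c for c in chunks if c["metadata"]["chunking_technique"] == technique]
    candidates.sort(key=lambda c: cosine(query_vec, c["values"]), reverse=True)
    return candidates[:top_k]

chunks = [
    {"id": "a", "values": [1.0, 0.0], "metadata": {"chunking_technique": "fixed"}},
    {"id": "b", "values": [0.9, 0.1], "metadata": {"chunking_technique": "semantic"}},
    {"id": "c", "values": [0.0, 1.0], "metadata": {"chunking_technique": "fixed"}},
]
print([c["id"] for c in retrieve([1.0, 0.0], chunks, "fixed", top_k=2)])  # ['a', 'c']
```

Chunk `b` is the second-closest vector overall, but the filter excludes it because it belongs to a different technique; that isolation is what makes the per-technique comparison fair.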
### Step 4: Generate Findings Document

```python
findings_file = generate_findings_document(all_results, query)
```

- Creates a comprehensive markdown report: `rag_ablation_findings.md`
- Includes:
  - Results table for each technique
  - Best model per technique
  - Sample answers
  - Comparative analysis and ranking
  - Recommendations

### Step 5: Summary Output

- Displays a quick summary table
- Shows average scores per technique
- Identifies the best-performing technique and model
- Points to the detailed findings document

## Configuration

The system reads all configuration from `config.yaml`:

```yaml
processing:
  embedding_model: "jinaai/jina-embeddings-v2-small-en"
  technique: "recursive"  # Default technique for retrieval
  chunk_size: 1000
  chunk_overlap: 100

retrieval:
  mode: "hybrid"
  rerank_strategy: "cross-encoder"
  use_mmr: true
  top_k: 10
  final_k: 5

generation:
  temperature: 0.1
  max_new_tokens: 512
  judge_model: "llama-3.1-8b-instant"
```

## Key Components

### Data Loader (`data_loader.py`)

- Loads and parses the CBT book from `EntireBookCleaned.txt`
- Splits content by page markers (`--- Page X ---`)
- Returns a DataFrame with columns: `id`, `title`, `url`, `full_text`
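The page-marker parsing can be sketched as follows (a minimal illustration of the splitting step; `split_pages` is a hypothetical name, not the actual `data_loader.py` function):

```python
import re

def split_pages(raw_text):
    """Split book text on '--- Page N ---' markers into per-page records."""
    # re.split with one capture group alternates:
    # [text-before-first-marker, page_num, page_body, page_num, page_body, ...]
    parts = re.split(r"--- Page (\d+) ---", raw_text)
    pages = []
    for i in range(1, len(parts) - 1, 2):
        pages.append({"page": int(parts[i]), "text": parts[i + 1].strip()})
    return pages

raw = "--- Page 1 ---\nCBT basics.\n--- Page 2 ---\nThought records."
print(split_pages(raw))
```

The actual loader additionally assembles these records into the DataFrame columns listed above.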
### Chunk Processor (`retriever/processor.py`)

- Implements 7 chunking strategies (fixed, recursive, character, paragraph, sentence, semantic, page)
- Generates embeddings using SentenceTransformers
- Returns chunks with metadata and vector embeddings

### Vector Database (`vector_db.py`)

- Manages Pinecone index creation and updates
- Handles batch upsert operations
- Provides index statistics and chunk loading

### Hybrid Retriever (`retriever/retriever.py`)

- Combines semantic search (vector embeddings) with keyword search (BM25)
- Supports reranking strategies (cross-encoder, reciprocal rank fusion)
- Implements Maximum Marginal Relevance (MMR) for diversity
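MMR greedily selects documents that are relevant to the query but not redundant with documents already selected. A toy sketch with precomputed similarities (this is an illustration of the idea, not the retriever's actual implementation):

```python
def mmr(query_sim, doc_sims, k=2, lam=0.7):
    """Maximum Marginal Relevance: pick k docs maximizing
    lam * relevance - (1 - lam) * redundancy at each step.

    query_sim[i]   -- similarity of doc i to the query
    doc_sims[i][j] -- similarity between docs i and j
    """
    selected, remaining = [], list(range(len(query_sim)))
    while remaining and len(selected) < k:
        def score(i):
            redundancy = max((doc_sims[i][j] for j in selected), default=0.0)
            return lam * query_sim[i] - (1 - lam) * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Doc 1 is nearly a duplicate of doc 0, so MMR skips it for the distinct doc 2.
query_sim = [0.9, 0.85, 0.5]
doc_sims = [
    [1.0, 0.95, 0.1],
    [0.95, 1.0, 0.1],
    [0.1, 0.1, 1.0],
]
print(mmr(query_sim, doc_sims, k=2, lam=0.7))  # [0, 2]
```

With `lam = 1.0` the redundancy term vanishes and MMR degenerates to plain similarity ranking; lowering `lam` trades relevance for diversity.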
### RAG Generator (`retriever/generator.py`)

- Generates answers using the configured LLM
- Formats context chunks into prompts
- Handles model-specific tokenization

### Evaluator (`retriever/evaluator.py`)

- Evaluates faithfulness (whether the answer is grounded in the context)
- Evaluates relevancy (whether the answer addresses the query)
- Uses a judge model for automated evaluation

## Usage

### Basic Usage

```bash
python main.py
```

### With Environment Variables

```bash
export HF_TOKEN="your_huggingface_token"
export PINECONE_API_KEY="your_pinecone_key"
export OPENROUTER_API_KEY="your_openrouter_key"
python main.py
```

## Output

The system produces:

1. **Ingestion Summary** - Chunks created per technique
2. **Book Statistics** - Pages, characters, average length
3. **Retrieval Results** - Number of context chunks retrieved
4. **Model Answers** - Generated answers from each model
5. **Evaluation Scores** - Faithfulness and relevancy scores
6. **Tournament Results** - Final comparison table
7. **Best Model** - Model with the highest combined score

## Ablation Study Support

Because `ingest_data()` runs first, all 6 chunking techniques are processed and stored in a single Pinecone index. This enables:

- **Comparative analysis** of different chunking strategies
- **Filtering by technique** using the `chunking_technique` metadata field
- **Performance evaluation** across different chunking methods

## Notes

- The system uses **Jina embeddings** (512 dimensions) for vector representation
- **Pinecone** is used as the vector database (serverless, AWS us-east-1)
- **OpenRouter** is used for judge-model inference
- The default retrieval technique is **recursive** (configurable in `config.yaml`)
- All 6 techniques are processed during ingestion to support the ablation study
config.yaml CHANGED
@@ -32,10 +32,10 @@ retrieval:
   final_k: 5
 
 generation:
-  temperature: 0.1
+  temperature: 0.
   max_new_tokens: 512
-  # The model used to Judge the others
-  judge_model: "llama-3.1-8b-instant"
+  # The model used to Judge the others (OpenRouter)
+  judge_model: "stepfun/step-3.5-flash:free"
 
   # List of contestants in the tournament
   models:
frontend/app/globals.css CHANGED
@@ -139,3 +139,142 @@
 .animate-gradient {
   animation: gradient 8s linear infinite;
 }
+
+@keyframes streamCursorBlink {
+  0%,
+  45% {
+    opacity: 1;
+  }
+  55%,
+  100% {
+    opacity: 0;
+  }
+}
+
+@keyframes streamTextReveal {
+  from {
+    opacity: 0.65;
+  }
+  to {
+    opacity: 1;
+  }
+}
+
+.streaming-text-reveal {
+  animation: streamTextReveal 180ms ease-out;
+}
+
+.stream-cursor {
+  display: inline-block;
+  width: 0.55ch;
+  margin-left: 0.1ch;
+  border-bottom: 0.12em solid currentColor;
+  animation: streamCursorBlink 1s steps(1, end) infinite;
+  vertical-align: baseline;
+}
+
+.assistant-streaming-bubble {
+  background-image: linear-gradient(180deg, rgba(16, 185, 129, 0.06), rgba(16, 185, 129, 0));
+}
+
+.markdown-body {
+  line-height: 1.6;
+}
+
+.markdown-body > * + * {
+  margin-top: 0.65rem;
+}
+
+.markdown-body h1,
+.markdown-body h2,
+.markdown-body h3,
+.markdown-body h4 {
+  font-weight: 700;
+  line-height: 1.35;
+}
+
+.markdown-body h1 {
+  font-size: 1.1rem;
+}
+
+.markdown-body h2 {
+  font-size: 1.02rem;
+}
+
+.markdown-body h3,
+.markdown-body h4 {
+  font-size: 0.95rem;
+}
+
+.markdown-body p {
+  margin: 0;
+}
+
+.markdown-body ul,
+.markdown-body ol {
+  margin-left: 1.1rem;
+}
+
+.markdown-body li + li {
+  margin-top: 0.2rem;
+}
+
+.markdown-body a {
+  text-decoration: underline;
+  text-underline-offset: 2px;
+}
+
+.markdown-body code {
+  border-radius: 0.35rem;
+  background: rgba(148, 163, 184, 0.2);
+  padding: 0.05rem 0.35rem;
+  font-family: ui-monospace, SFMono-Regular, Menlo, Monaco, Consolas, "Liberation Mono", "Courier New", monospace;
+  font-size: 0.82em;
+}
+
+.markdown-body pre {
+  overflow-x: auto;
+  border-radius: 0.75rem;
+  border: 1px solid rgba(148, 163, 184, 0.3);
+  background: rgba(2, 6, 23, 0.92);
+  color: #e2e8f0;
+  padding: 0.75rem;
+}
+
+.markdown-body pre code {
+  background: transparent;
+  padding: 0;
+  color: inherit;
+  font-size: 0.78rem;
+}
+
+.markdown-body table {
+  width: 100%;
+  border-collapse: collapse;
+  font-size: 0.86rem;
+}
+
+.markdown-body th,
+.markdown-body td {
+  border: 1px solid rgba(148, 163, 184, 0.35);
+  padding: 0.4rem 0.5rem;
+  text-align: left;
+}
+
+.markdown-body th {
+  background: rgba(148, 163, 184, 0.15);
+  font-weight: 600;
+}
+
+.markdown-body blockquote {
+  border-left: 3px solid rgba(148, 163, 184, 0.45);
+  margin: 0;
+  padding-left: 0.7rem;
+  color: inherit;
+  opacity: 0.9;
+}
+
+.markdown-body hr {
+  border: 0;
+  border-top: 1px solid rgba(148, 163, 184, 0.35);
+}
frontend/components/AIAssistantUI.jsx CHANGED
@@ -84,6 +84,14 @@ export default function AIAssistantUI() {
 
   const [isThinking, setIsThinking] = useState(false)
   const [thinkingConvId, setThinkingConvId] = useState(null)
+  const activeRequestRef = useRef(null)
+
+  useEffect(() => {
+    return () => {
+      activeRequestRef.current?.abort()
+      activeRequestRef.current = null
+    }
+  }, [])
 
   useEffect(() => {
     const onKey = (e) => {
@@ -193,8 +201,13 @@
 
   async function sendMessage(convId, content) {
     if (!content.trim()) return
+    const existingConversation = conversations.find((c) => c.id === convId)
+    const hasUserMessages = (existingConversation?.messages || []).some((m) => m.role === "user")
+    const shouldAutoTitle = existingConversation?.title === "New Chat" && !hasUserMessages
+
     const now = new Date().toISOString()
     const userMsg = { id: Math.random().toString(36).slice(2), role: "user", content, createdAt: now }
+    const assistantMsgId = Math.random().toString(36).slice(2)
 
     setConversations((prev) =>
       prev.map((c) => {
@@ -213,7 +226,124 @@
     setIsThinking(true)
     setThinkingConvId(convId)
 
+    if (shouldAutoTitle) {
+      const primaryUrl = "/api/proxy"
+      const fallbackUrl = process.env.NEXT_PUBLIC_API_URL || "http://127.0.0.1:8000"
+      const titlePayload = {
+        method: "POST",
+        headers: { "Content-Type": "application/json" },
+        body: JSON.stringify({ query: content }),
+      }
+
+      ;(async () => {
+        try {
+          let res
+          try {
+            res = await fetch(`${primaryUrl}/predict/title`, titlePayload)
+          } catch {}
+
+          if (!res || !res.ok) {
+            res = await fetch(`${fallbackUrl}/predict/title`, titlePayload)
+          }
+
+          if (!res.ok) return
+          const data = await res.json().catch(() => null)
+          const nextTitle = (data?.title || "").trim()
+          if (!nextTitle) return
+
+          setConversations((prev) =>
+            prev.map((c) => {
+              if (c.id !== convId) return c
+              if (c.title !== "New Chat") return c
+              return {
+                ...c,
+                title: nextTitle.slice(0, 80),
+                updatedAt: new Date().toISOString(),
+              }
+            }),
+          )
+        } catch {
+          // Keep default title if title endpoint is unavailable.
+        }
+      })()
+    }
+
     const currentConvId = convId
+    const controller = new AbortController()
+    activeRequestRef.current = controller
+
+    const upsertAssistantMessage = (updater, fallbackContent = "") => {
+      setConversations((prev) =>
+        prev.map((c) => {
+          if (c.id !== currentConvId) return c
+          const existingMessages = c.messages || []
+          const idx = existingMessages.findIndex((m) => m.id === assistantMsgId)
+          const baseMessage =
+            idx >= 0
+              ? existingMessages[idx]
+              : {
+                  id: assistantMsgId,
+                  role: "assistant",
+                  content: fallbackContent,
+                  createdAt: new Date().toISOString(),
+                  isStreaming: true,
+                }
+
+          const nextMessage = updater(baseMessage)
+          const nextMessages = [...existingMessages]
+          if (idx >= 0) {
+            nextMessages[idx] = nextMessage
+          } else {
+            nextMessages.push(nextMessage)
+          }
+
+          return {
+            ...c,
+            messages: nextMessages,
+            updatedAt: new Date().toISOString(),
+            messageCount: nextMessages.length,
+            preview: nextMessages[nextMessages.length - 1]?.content?.slice(0, 80) || c.preview,
+          }
+        }),
+      )
+    }
+
+    const appendChunk = (chunk) => {
+      if (!chunk) return
+      upsertAssistantMessage(
+        (m) => ({
+          ...m,
+          content: (m.content || "") + chunk,
+          isStreaming: true,
+        }),
+        "",
+      )
+    }
+
+    const finalizeAssistant = (finalText) => {
+      upsertAssistantMessage(
+        (m) => {
+          const fallbackContent = m.content || "Sorry, I encountered an error."
+          return {
+            ...m,
+            content: finalText != null ? finalText : fallbackContent,
+            isStreaming: false,
+          }
+        },
+        finalText || "Sorry, I encountered an error.",
+      )
+    }
+
+    const setAssistantError = (message) => {
+      upsertAssistantMessage(
+        (m) => ({
+          ...m,
+          content: message,
+          isStreaming: false,
+        }),
+        message,
+      )
+    }
 
     // Prefer same-origin proxy to avoid browser CORS/network issues in development.
     const primaryUrl = "/api/proxy"
@@ -226,18 +356,19 @@
         query: content,
         model: selectedModel,
       }),
+      signal: controller.signal,
     }
 
     try {
       let res
 
       try {
-        res = await fetch(`${primaryUrl}/predict`, fetchOptions)
+        res = await fetch(`${primaryUrl}/predict/stream`, fetchOptions)
       } catch {}
 
       if (!res || !res.ok) {
         // Retry direct backend URL if proxy is not reachable.
-        res = await fetch(`${fallbackUrl}/predict`, fetchOptions)
+        res = await fetch(`${fallbackUrl}/predict/stream`, fetchOptions)
       }
 
       if (!res.ok) {
@@ -245,49 +376,83 @@
         throw new Error(`Prediction failed (${res.status}) ${details}`.trim())
       }
 
-      const data = await res.json()
-
-      setConversations((prev) =>
-        prev.map((c) => {
-          if (c.id !== currentConvId) return c
-          const asstMsg = {
-            id: Math.random().toString(36).slice(2),
-            role: "assistant",
-            content: data.answer || "Sorry, I encountered an error.",
-            createdAt: new Date().toISOString(),
-          }
-          const msgs = [...(c.messages || []), asstMsg]
-          return {
-            ...c,
-            messages: msgs,
-            updatedAt: new Date().toISOString(),
-            messageCount: msgs.length,
-            preview: (asstMsg.content || "").slice(0, 80),
-          }
-        }),
-      )
-    } catch (err) {
-      console.error("predict request failed:", err)
-      setConversations((prev) =>
-        prev.map((c) => {
-          if (c.id !== currentConvId) return c
-          const errorMsg = {
-            id: Math.random().toString(36).slice(2),
-            role: "assistant",
-            content: "Sorry, I could not reach the backend. Start FastAPI and verify frontend .env.local URLs, then restart Next.js dev server.",
-            createdAt: new Date().toISOString(),
-          }
-          const msgs = [...(c.messages || []), errorMsg]
-          return {
-            ...c,
-            messages: msgs,
-            updatedAt: new Date().toISOString(),
-            messageCount: msgs.length,
-            preview: errorMsg.content.slice(0, 80),
-          }
-        }),
-      )
+      if (!res.body) {
+        throw new Error("Streaming is not available in this browser response")
+      }
+
+      const reader = res.body.getReader()
+      const decoder = new TextDecoder()
+      let buffer = ""
+      let firstTokenReceived = false
+      let finalAnswer = null
+
+      while (true) {
+        const { value, done } = await reader.read()
+        if (done) break
+
+        buffer += decoder.decode(value, { stream: true })
+        const lines = buffer.split("\n")
+        buffer = lines.pop() || ""
+
+        for (const line of lines) {
+          const trimmed = line.trim()
+          if (!trimmed) continue
+
+          let evt
+          try {
+            evt = JSON.parse(trimmed)
+          } catch {
+            continue
+          }
+
+          if (evt.type === "token") {
+            if (!firstTokenReceived) {
+              firstTokenReceived = true
+              setIsThinking(false)
+            }
+            appendChunk(evt.token || "")
+          }
+
+          if (evt.type === "done") {
+            finalAnswer = typeof evt.answer === "string" ? evt.answer : null
+          }
+
+          if (evt.type === "error") {
+            throw new Error(evt.message || "Streaming failed")
+          }
+        }
+      }
+
+      const remainder = buffer.trim()
+      if (remainder) {
+        try {
+          const evt = JSON.parse(remainder)
+          if (evt.type === "done") {
+            finalAnswer = typeof evt.answer === "string" ? evt.answer : null
+          }
+          if (evt.type === "token") {
+            appendChunk(evt.token || "")
+          }
+          if (evt.type === "error") {
+            throw new Error(evt.message || "Streaming failed")
+          }
+        } catch {
+          // Ignore malformed tail chunk.
+        }
+      }
+
+      finalizeAssistant(finalAnswer)
+    } catch (err) {
+      console.error("predict request failed:", err)
+      if (err?.name === "AbortError") {
+        setAssistantError("Generation paused.")
+      } else {
+        setAssistantError("Sorry, I could not reach the backend. Start FastAPI and verify frontend .env.local URLs, then restart Next.js dev server.")
+      }
     } finally {
+      if (activeRequestRef.current === controller) {
+        activeRequestRef.current = null
+      }
       setIsThinking(false)
       setThinkingConvId(null)
     }
@@ -318,6 +483,8 @@
   }
 
   function pauseThinking() {
+    activeRequestRef.current?.abort()
+    activeRequestRef.current = null
     setIsThinking(false)
     setThinkingConvId(null)
   }
frontend/components/ChatPane.jsx CHANGED
@@ -3,6 +3,7 @@
 import { useState, forwardRef, useImperativeHandle, useRef } from "react"
 import { RefreshCw, Check, X, Square } from "lucide-react"
 import Message from "./Message"
+import MarkdownMessage from "./MarkdownMessage"
 import Composer from "./Composer"
 import { cls, timeAgo } from "./utils"
 
@@ -171,8 +172,14 @@ const ChatPane = forwardRef(function ChatPane(
       </div>
     </div>
   ) : (
-    <Message role={m.role}>
-      <div className="whitespace-pre-wrap">{m.content}</div>
+    <Message role={m.role} streaming={Boolean(m.isStreaming)}>
+      {m.role === "assistant" ? (
+        <div className={cls(m.isStreaming && "streaming-text-reveal")}>
+          <MarkdownMessage content={m.content} isStreaming={Boolean(m.isStreaming)} />
+        </div>
+      ) : (
+        <div className="whitespace-pre-wrap">{m.content}</div>
+      )}
     </Message>
   )}
 </div>
frontend/components/MarkdownMessage.jsx ADDED
@@ -0,0 +1,27 @@
"use client"

import ReactMarkdown from "react-markdown"
import remarkGfm from "remark-gfm"
import rehypeSanitize from "rehype-sanitize"

function withTemporaryFenceClosure(markdown) {
  const text = markdown || ""
  const fenceMatches = text.match(/```/g)
  const fenceCount = fenceMatches ? fenceMatches.length : 0
  if (fenceCount % 2 === 1) {
    return `${text}\n\n\`\`\`\n`
  }
  return text
}

export default function MarkdownMessage({ content, isStreaming = false }) {
  const source = isStreaming ? withTemporaryFenceClosure(content) : content || ""

  return (
    <div className="markdown-body">
      <ReactMarkdown remarkPlugins={[remarkGfm]} rehypePlugins={[rehypeSanitize]}>
        {source}
      </ReactMarkdown>
    </div>
  )
}
frontend/components/Message.jsx CHANGED
@@ -1,6 +1,6 @@
 import { cls } from "./utils"
 
-export default function Message({ role, children }) {
+export default function Message({ role, children, streaming = false }) {
   const isUser = role === "user"
   return (
     <div className={cls("flex gap-3", isUser ? "justify-end" : "justify-start")}>
@@ -17,6 +17,7 @@ export default function Message({ role, children }) {
           isUser
             ? "bg-zinc-900 text-white dark:bg-white dark:text-zinc-900"
             : "bg-white text-zinc-900 dark:bg-zinc-900 dark:text-zinc-100 border border-zinc-200 dark:border-zinc-800",
+          !isUser && streaming && "assistant-streaming-bubble",
         )}
       >
         {children}
frontend/package-lock.json CHANGED
@@ -21,7 +21,10 @@
     "react-dom": "^19.2.1",
     "react-dropzone": "^14.3.8",
     "react-icons": "^5.5.0",
     "react-use-measure": "^2.1.7",
     "tailwind-merge": "^3.3.1"
   },
   "devDependencies": {
@@ -3643,13 +3646,39 @@
       "tslib": "^2.4.0"
     }
   },
   "node_modules/@types/estree": {
     "version": "1.0.8",
     "resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.8.tgz",
     "integrity": "sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w==",
-    "dev": true,
     "license": "MIT"
   },
   "node_modules/@types/json-schema": {
     "version": "7.0.15",
     "resolved": "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.15.tgz",
@@ -3664,6 +3693,21 @@
     "dev": true,
     "license": "MIT"
   },
   "node_modules/@types/node": {
     "version": "20.19.24",
     "resolved": "https://registry.npmjs.org/@types/node/-/node-20.19.24.tgz",
@@ -3678,7 +3722,6 @@
     "version": "19.2.2",
     "resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.2.tgz",
     "integrity": "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA==",
-    "devOptional": true,
     "license": "MIT",
     "dependencies": {
       "csstype": "^3.0.2"
@@ -3694,6 +3737,12 @@
       "@types/react": "^19.2.0"
     }
   },
   "node_modules/@typescript-eslint/eslint-plugin": {
     "version": "8.46.3",
     "resolved": "https://registry.npmjs.org/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.46.3.tgz",
@@ -3995,6 +4044,12 @@
       "url": "https://opencollective.com/typescript-eslint"
     }
   },
   "node_modules/@unrs/resolver-binding-android-arm-eabi": {
     "version": "1.11.1",
     "resolved": "https://registry.npmjs.org/@unrs/resolver-binding-android-arm-eabi/-/resolver-binding-android-arm-eabi-1.11.1.tgz",
@@ -4609,6 +4664,16 @@
       "@babel/core": "^7.4.0 || ^8.0.0-0 <8.0.0"
     }
   },
   "node_modules/balanced-match": {
     "version": "1.0.2",
     "resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
@@ -4779,6 +4844,16 @@
     ],
     "license": "CC-BY-4.0"
   },
   "node_modules/chalk": {
     "version": "4.1.2",
     "resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
@@ -4796,6 +4871,46 @@
       "url": "https://github.com/chalk/chalk?sponsor=1"
     }
   },
   "node_modules/class-variance-authority": {
     "version": "0.7.1",
     "resolved": "https://registry.npmjs.org/class-variance-authority/-/class-variance-authority-0.7.1.tgz",
@@ -4843,6 +4958,16 @@
     "dev": true,
     "license": "MIT"
   },
   "node_modules/commander": {
     "version": "7.2.0",
     "resolved": "https://registry.npmjs.org/commander/-/commander-7.2.0.tgz",
@@ -4997,7 +5122,6 @@
     "version": "3.1.3",
     "resolved": "https://registry.npmjs.org/csstype/-/csstype-3.1.3.tgz",
     "integrity": "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw==",
-    "devOptional": true,
     "license": "MIT"
   },
   "node_modules/damerau-levenshtein": {
@@ -5078,6 +5202,19 @@
       }
     }
   },
   "node_modules/deep-is": {
     "version": "0.1.4",
     "resolved": "https://registry.npmjs.org/deep-is/-/deep-is-0.1.4.tgz",
@@ -5130,6 +5267,15 @@
       "url": "https://github.com/sponsors/ljharb"
     }
   },
   "node_modules/detect-libc": {
     "version": "2.1.2",
     "resolved": "https://registry.npmjs.org/detect-libc/-/detect-libc-2.1.2.tgz",
@@ -5146,6 +5292,19 @@
     "integrity": "sha512-ypdmJU/TbBby2Dxibuv7ZLW3Bs1QEmM7nHjEANfohJLvE0XVujisn1qPJcZxg+qDucsr+bP6fLD1rPS3AhJ7EQ==",
     "license": "MIT"
   },
   "node_modules/doctrine": {
     "version": "2.1.0",
     "resolved": "https://registry.npmjs.org/doctrine/-/doctrine-2.1.0.tgz",
@@ -5900,6 +6059,16 @@
       "node": ">=4.0"
     }
   },
   "node_modules/esutils": {
     "version": "2.0.3",
     "resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz",
@@ -5909,6 +6078,12 @@
       "node": ">=0.10.0"
     }
   },
   "node_modules/fast-deep-equal": {
     "version": "3.1.3",
     "resolved": "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz",
@@ -6390,6 +6565,61 @@
       "node": ">= 0.4"
     }
   },
   "node_modules/hermes-estree": {
     "version": "0.25.1",
     "resolved": "https://registry.npmjs.org/hermes-estree/-/hermes-estree-0.25.1.tgz",
@@ -6407,6 +6637,16 @@
       "hermes-estree": "0.25.1"
     }
   },
   "node_modules/ignore": {
     "version": "5.3.2",
     "resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.2.tgz",
@@ -6443,6 +6683,12 @@
       "node": ">=0.8.19"
     }
   },
   "node_modules/internal-slot": {
     "version": "1.1.0",
     "resolved": "https://registry.npmjs.org/internal-slot/-/internal-slot-1.1.0.tgz",
@@ -6458,6 +6704,30 @@
       "node": ">= 0.4"
     }
   },
   "node_modules/is-array-buffer": {
     "version": "3.0.5",
     "resolved": "https://registry.npmjs.org/is-array-buffer/-/is-array-buffer-3.0.5.tgz",
@@ -6621,6 +6891,16 @@
       "url": "https://github.com/sponsors/ljharb"
     }
   },
   "node_modules/is-extglob": {
     "version": "2.1.1",
     "resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz",
@@ -6680,6 +6960,16 @@
       "node": ">=0.10.0"
     }
   },
   "node_modules/is-map": {
     "version": "2.0.3",
     "resolved": "https://registry.npmjs.org/is-map/-/is-map-2.0.3.tgz",
@@ -6733,6 +7023,18 @@
       "url": "https://github.com/sponsors/ljharb"
     }
   },
   "node_modules/is-regex": {
     "version": "1.2.1",
     "resolved": "https://registry.npmjs.org/is-regex/-/is-regex-1.2.1.tgz",
@@ -7345,6 +7647,16 @@
     "dev": true,
     "license": "MIT"
   },
   "node_modules/loose-envify": {
     "version": "1.4.0",
     "resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz",
@@ -7394,6 +7706,16 @@
       "@jridgewell/sourcemap-codec": "^1.5.5"
     }
   },
   "node_modules/math-intrinsics": {
     "version": "1.1.0",
     "resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
@@ -7404,76 +7726,921 @@
       "node": ">= 0.4"
     }
   },
-  "node_modules/mdn-data": {
-    "version": "2.0.30",
-    "resolved": "https://registry.npmjs.org/mdn-data/-/mdn-data-2.0.30.tgz",
-    "integrity": "sha512-GaqWWShW4kv/G9IEucWScBx9G1/vsFZZJUO+tD26M8J8z3Kw5RDQjaoZe03YAClgeS/SWPOcb4nkFBTEi5DUEA==",
-    "license": "CC0-1.0"
   },
-  "node_modules/merge2": {
-    "version": "1.4.1",
-    "resolved": "https://registry.npmjs.org/merge2/-/merge2-1.4.1.tgz",
-    "integrity": "sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg==",
-    "dev": true,
     "license": "MIT",
     "engines": {
-      "node": ">= 8"
     }
   },
-  "node_modules/micromatch": {
-    "version": "4.0.8",
-    "resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.8.tgz",
-    "integrity": "sha512-PXwfBhYu0hBCPw8Dn0E+WDYb7af3dSLVWKi3HGv84IdF4TyFoC0ysxFd0Goxw7nSv4T/PzEJQxsYsEiFCKo2BA==",
-    "dev": true,
     "license": "MIT",
     "dependencies": {
-      "braces": "^3.0.3",
-      "picomatch": "^2.3.1"
     },
-    "engines": {
-      "node": ">=8.6"
     }
   },
-  "node_modules/minimatch": {
-    "version": "3.1.2",
-    "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
-    "integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
7441
- "dev": true,
7442
- "license": "ISC",
7443
  "dependencies": {
7444
- "brace-expansion": "^1.1.7"
 
 
 
 
 
 
7445
  },
7446
- "engines": {
7447
- "node": "*"
 
7448
  }
7449
  },
7450
- "node_modules/minimist": {
7451
- "version": "1.2.8",
7452
- "resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.8.tgz",
7453
- "integrity": "sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==",
7454
- "dev": true,
7455
  "license": "MIT",
 
 
 
 
 
 
 
7456
  "funding": {
7457
- "url": "https://github.com/sponsors/ljharb"
 
7458
  }
7459
  },
7460
- "node_modules/motion": {
7461
- "version": "12.23.24",
7462
- "resolved": "https://registry.npmjs.org/motion/-/motion-12.23.24.tgz",
7463
- "integrity": "sha512-Rc5E7oe2YZ72N//S3QXGzbnXgqNrTESv8KKxABR20q2FLch9gHLo0JLyYo2hZ238bZ9Gx6cWhj9VO0IgwbMjCw==",
7464
  "license": "MIT",
7465
  "dependencies": {
7466
- "framer-motion": "^12.23.24",
7467
- "tslib": "^2.4.0"
 
 
 
7468
  },
7469
- "peerDependencies": {
7470
- "@emotion/is-prop-valid": "*",
7471
- "react": "^18.0.0 || ^19.0.0",
7472
- "react-dom": "^18.0.0 || ^19.0.0"
 
 
 
 
 
 
 
 
 
 
7473
  },
7474
- "peerDependenciesMeta": {
7475
- "@emotion/is-prop-valid": {
7476
- "optional": true
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
7477
  },
7478
  "react": {
7479
  "optional": true
@@ -7861,6 +9028,31 @@
7861
  "node": ">=6"
7862
  }
7863
  },
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
7864
  "node_modules/parse-json": {
7865
  "version": "5.2.0",
7866
  "resolved": "https://registry.npmjs.org/parse-json/-/parse-json-5.2.0.tgz",
@@ -7993,6 +9185,16 @@
7993
  "react-is": "^16.13.1"
7994
  }
7995
  },
 
 
 
 
 
 
 
 
 
 
7996
  "node_modules/punycode": {
7997
  "version": "2.3.1",
7998
  "resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz",
@@ -8077,6 +9279,33 @@
8077
  "integrity": "sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ==",
8078
  "license": "MIT"
8079
  },
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
8080
  "node_modules/react-remove-scroll": {
8081
  "version": "2.7.1",
8082
  "resolved": "https://registry.npmjs.org/react-remove-scroll/-/react-remove-scroll-2.7.1.tgz",
@@ -8258,6 +9487,86 @@
8258
  "regjsparser": "bin/parser"
8259
  }
8260
  },
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
8261
  "node_modules/resolve": {
8262
  "version": "1.22.11",
8263
  "resolved": "https://registry.npmjs.org/resolve/-/resolve-1.22.11.tgz",
@@ -8627,6 +9936,16 @@
8627
  "node": ">=0.10.0"
8628
  }
8629
  },
 
 
 
 
 
 
 
 
 
 
8630
  "node_modules/stable-hash": {
8631
  "version": "0.0.5",
8632
  "resolved": "https://registry.npmjs.org/stable-hash/-/stable-hash-0.0.5.tgz",
@@ -8761,6 +10080,20 @@
8761
  "url": "https://github.com/sponsors/ljharb"
8762
  }
8763
  },
 
 
 
 
 
 
 
 
 
 
 
 
 
 
8764
  "node_modules/strip-bom": {
8765
  "version": "3.0.0",
8766
  "resolved": "https://registry.npmjs.org/strip-bom/-/strip-bom-3.0.0.tgz",
@@ -8784,6 +10117,24 @@
8784
  "url": "https://github.com/sponsors/sindresorhus"
8785
  }
8786
  },
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
8787
  "node_modules/styled-jsx": {
8788
  "version": "5.1.6",
8789
  "resolved": "https://registry.npmjs.org/styled-jsx/-/styled-jsx-5.1.6.tgz",
@@ -8955,6 +10306,26 @@
8955
  "node": ">=8.0"
8956
  }
8957
  },
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
8958
  "node_modules/ts-api-utils": {
8959
  "version": "2.1.0",
8960
  "resolved": "https://registry.npmjs.org/ts-api-utils/-/ts-api-utils-2.1.0.tgz",
@@ -9195,6 +10566,93 @@
9195
  "node": ">=4"
9196
  }
9197
  },
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
9198
  "node_modules/unrs-resolver": {
9199
  "version": "1.11.1",
9200
  "resolved": "https://registry.npmjs.org/unrs-resolver/-/unrs-resolver-1.11.1.tgz",
@@ -9313,6 +10771,34 @@
9313
  }
9314
  }
9315
  },
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
9316
  "node_modules/which": {
9317
  "version": "2.0.2",
9318
  "resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
@@ -9469,6 +10955,16 @@
9469
  "peerDependencies": {
9470
  "zod": "^3.25.0 || ^4.0.0"
9471
  }
 
 
 
 
 
 
 
 
 
 
9472
  }
9473
  }
9474
  }
 
  "react-dom": "^19.2.1",
  "react-dropzone": "^14.3.8",
  "react-icons": "^5.5.0",
+ "react-markdown": "^10.1.0",
  "react-use-measure": "^2.1.7",
+ "rehype-sanitize": "^6.0.0",
+ "remark-gfm": "^4.0.1",
  "tailwind-merge": "^3.3.1"
  },
  "devDependencies": {

  "tslib": "^2.4.0"
  }
  },
+ "node_modules/@types/debug": {
+ "version": "4.1.13",
+ "resolved": "https://registry.npmjs.org/@types/debug/-/debug-4.1.13.tgz",
+ "integrity": "sha512-KSVgmQmzMwPlmtljOomayoR89W4FynCAi3E8PPs7vmDVPe84hT+vGPKkJfThkmXs0x0jAaa9U8uW8bbfyS2fWw==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/ms": "*"
+ }
+ },
  "node_modules/@types/estree": {
  "version": "1.0.8",
  "resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.8.tgz",
  "integrity": "sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w==",
  "license": "MIT"
  },
+ "node_modules/@types/estree-jsx": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/@types/estree-jsx/-/estree-jsx-1.0.5.tgz",
+ "integrity": "sha512-52CcUVNFyfb1A2ALocQw/Dd1BQFNmSdkuC3BkZ6iqhdMfQz7JWOFRuJFloOzjk+6WijU56m9oKXFAXc7o3Towg==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/estree": "*"
+ }
+ },
+ "node_modules/@types/hast": {
+ "version": "3.0.4",
+ "resolved": "https://registry.npmjs.org/@types/hast/-/hast-3.0.4.tgz",
+ "integrity": "sha512-WPs+bbQw5aCj+x6laNGWLH3wviHtoCv/P3+otBhbOhJgG8qtpdAMlTCxLtsTWA7LH1Oh/bFCHsBn0TPS5m30EQ==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/unist": "*"
+ }
+ },
  "node_modules/@types/json-schema": {
  "version": "7.0.15",
  "resolved": "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.15.tgz",

  "dev": true,
  "license": "MIT"
  },
+ "node_modules/@types/mdast": {
+ "version": "4.0.4",
+ "resolved": "https://registry.npmjs.org/@types/mdast/-/mdast-4.0.4.tgz",
+ "integrity": "sha512-kGaNbPh1k7AFzgpud/gMdvIm5xuECykRR+JnWKQno9TAXVa6WIVCGTPvYGekIDL4uwCZQSYbUxNBSb1aUo79oA==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/unist": "*"
+ }
+ },
+ "node_modules/@types/ms": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/@types/ms/-/ms-2.1.0.tgz",
+ "integrity": "sha512-GsCCIZDE/p3i96vtEqx+7dBUGXrc7zeSK3wwPHIaRThS+9OhWIXRqzs4d6k1SVU8g91DrNRWxWUGhp5KXQb2VA==",
+ "license": "MIT"
+ },
  "node_modules/@types/node": {
  "version": "20.19.24",
  "resolved": "https://registry.npmjs.org/@types/node/-/node-20.19.24.tgz",

  "version": "19.2.2",
  "resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.2.tgz",
  "integrity": "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA==",
  "license": "MIT",
  "dependencies": {
  "csstype": "^3.0.2"

  "@types/react": "^19.2.0"
  }
  },
+ "node_modules/@types/unist": {
+ "version": "3.0.3",
+ "resolved": "https://registry.npmjs.org/@types/unist/-/unist-3.0.3.tgz",
+ "integrity": "sha512-ko/gIFJRv177XgZsZcBwnqJN5x/Gien8qNOn0D5bQU/zAzVf9Zt3BlcUiLqhV9y4ARk0GbT3tnUiPNgnTXzc/Q==",
+ "license": "MIT"
+ },
  "node_modules/@typescript-eslint/eslint-plugin": {
  "version": "8.46.3",
  "resolved": "https://registry.npmjs.org/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.46.3.tgz",

  "url": "https://opencollective.com/typescript-eslint"
  }
  },
+ "node_modules/@ungap/structured-clone": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/@ungap/structured-clone/-/structured-clone-1.3.0.tgz",
+ "integrity": "sha512-WmoN8qaIAo7WTYWbAZuG8PYEhn5fkz7dZrqTBZ7dtt//lL2Gwms1IcnQ5yHqjDfX8Ft5j4YzDM23f87zBfDe9g==",
+ "license": "ISC"
+ },
  "node_modules/@unrs/resolver-binding-android-arm-eabi": {
  "version": "1.11.1",
  "resolved": "https://registry.npmjs.org/@unrs/resolver-binding-android-arm-eabi/-/resolver-binding-android-arm-eabi-1.11.1.tgz",

  "@babel/core": "^7.4.0 || ^8.0.0-0 <8.0.0"
  }
  },
+ "node_modules/bail": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/bail/-/bail-2.0.2.tgz",
+ "integrity": "sha512-0xO6mYd7JB2YesxDKplafRpsiOzPt9V02ddPCLbY1xYGPOX24NTyN50qnUxgCPcSoYMhKpAuBTjQoRZCAkUDRw==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/balanced-match": {
  "version": "1.0.2",
  "resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",

  ],
  "license": "CC-BY-4.0"
  },
+ "node_modules/ccount": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/ccount/-/ccount-2.0.1.tgz",
+ "integrity": "sha512-eyrF0jiFpY+3drT6383f1qhkbGsLSifNAjA61IUjZjmLCWjItY6LB9ft9YhoDgwfmclB2zhu51Lc7+95b8NRAg==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/chalk": {
  "version": "4.1.2",
  "resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",

  "url": "https://github.com/chalk/chalk?sponsor=1"
  }
  },
+ "node_modules/character-entities": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/character-entities/-/character-entities-2.0.2.tgz",
+ "integrity": "sha512-shx7oQ0Awen/BRIdkjkvz54PnEEI/EjwXDSIZp86/KKdbafHh1Df/RYGBhn4hbe2+uKC9FnT5UCEdyPz3ai9hQ==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/character-entities-html4": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/character-entities-html4/-/character-entities-html4-2.1.0.tgz",
+ "integrity": "sha512-1v7fgQRj6hnSwFpq1Eu0ynr/CDEw0rXo2B61qXrLNdHZmPKgb7fqS1a2JwF0rISo9q77jDI8VMEHoApn8qDoZA==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/character-entities-legacy": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/character-entities-legacy/-/character-entities-legacy-3.0.0.tgz",
+ "integrity": "sha512-RpPp0asT/6ufRm//AJVwpViZbGM/MkjQFxJccQRHmISF/22NBtsHqAWmL+/pmkPWoIUJdWyeVleTl1wydHATVQ==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/character-reference-invalid": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/character-reference-invalid/-/character-reference-invalid-2.0.1.tgz",
+ "integrity": "sha512-iBZ4F4wRbyORVsu0jPV7gXkOsGYjGHPmAyv+HiHG8gi5PtC9KI2j1+v8/tlibRvjoWX027ypmG/n0HtO5t7unw==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/class-variance-authority": {
  "version": "0.7.1",
  "resolved": "https://registry.npmjs.org/class-variance-authority/-/class-variance-authority-0.7.1.tgz",

  "dev": true,
  "license": "MIT"
  },
+ "node_modules/comma-separated-tokens": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/comma-separated-tokens/-/comma-separated-tokens-2.0.3.tgz",
+ "integrity": "sha512-Fu4hJdvzeylCfQPp9SGWidpzrMs7tTrlu6Vb8XGaRGck8QSNZJJp538Wrb60Lax4fPwR64ViY468OIUTbRlGZg==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/commander": {
  "version": "7.2.0",
  "resolved": "https://registry.npmjs.org/commander/-/commander-7.2.0.tgz",

  "version": "3.1.3",
  "resolved": "https://registry.npmjs.org/csstype/-/csstype-3.1.3.tgz",
  "integrity": "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw==",
  "license": "MIT"
  },
  "node_modules/damerau-levenshtein":

  }
  }
  },
+ "node_modules/decode-named-character-reference": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/decode-named-character-reference/-/decode-named-character-reference-1.3.0.tgz",
+ "integrity": "sha512-GtpQYB283KrPp6nRw50q3U9/VfOutZOe103qlN7BPP6Ad27xYnOIWv4lPzo8HCAL+mMZofJ9KEy30fq6MfaK6Q==",
+ "license": "MIT",
+ "dependencies": {
+ "character-entities": "^2.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/deep-is": {
  "version": "0.1.4",
  "resolved": "https://registry.npmjs.org/deep-is/-/deep-is-0.1.4.tgz",

  "url": "https://github.com/sponsors/ljharb"
  }
  },
+ "node_modules/dequal": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/dequal/-/dequal-2.0.3.tgz",
+ "integrity": "sha512-0je+qPKHEMohvfRTCEo3CrPG6cAzAYgmzKyxRiYSSDkS6eGJdyVJm7WaYA5ECaAD9wLB2T4EEeymA5aFVcYXCA==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=6"
+ }
+ },
  "node_modules/detect-libc": {
  "version": "2.1.2",
  "resolved": "https://registry.npmjs.org/detect-libc/-/detect-libc-2.1.2.tgz",

  "integrity": "sha512-ypdmJU/TbBby2Dxibuv7ZLW3Bs1QEmM7nHjEANfohJLvE0XVujisn1qPJcZxg+qDucsr+bP6fLD1rPS3AhJ7EQ==",
  "license": "MIT"
  },
+ "node_modules/devlop": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/devlop/-/devlop-1.1.0.tgz",
+ "integrity": "sha512-RWmIqhcFf1lRYBvNmr7qTNuyCt/7/ns2jbpp1+PalgE/rDQcBT0fioSMUpJ93irlUhC5hrg4cYqe6U+0ImW0rA==",
+ "license": "MIT",
+ "dependencies": {
+ "dequal": "^2.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/doctrine": {
  "version": "2.1.0",
  "resolved": "https://registry.npmjs.org/doctrine/-/doctrine-2.1.0.tgz",

  "node": ">=4.0"
  }
  },
+ "node_modules/estree-util-is-identifier-name": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/estree-util-is-identifier-name/-/estree-util-is-identifier-name-3.0.0.tgz",
+ "integrity": "sha512-hFtqIDZTIUZ9BXLb8y4pYGyk6+wekIivNVTcmvk8NoOh+VeRn5y6cEHzbURrWbfp1fIqdVipilzj+lfaadNZmg==",
+ "license": "MIT",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
  "node_modules/esutils": {
  "version": "2.0.3",
  "resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz",

  "node": ">=0.10.0"
  }
  },
+ "node_modules/extend": {
+ "version": "3.0.2",
+ "resolved": "https://registry.npmjs.org/extend/-/extend-3.0.2.tgz",
+ "integrity": "sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g==",
+ "license": "MIT"
+ },
  "node_modules/fast-deep-equal": {
  "version": "3.1.3",
  "resolved": "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz",

  "node": ">= 0.4"
  }
  },
+ "node_modules/hast-util-sanitize": {
+ "version": "5.0.2",
+ "resolved": "https://registry.npmjs.org/hast-util-sanitize/-/hast-util-sanitize-5.0.2.tgz",
+ "integrity": "sha512-3yTWghByc50aGS7JlGhk61SPenfE/p1oaFeNwkOOyrscaOkMGrcW9+Cy/QAIOBpZxP1yqDIzFMR0+Np0i0+usg==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/hast": "^3.0.0",
+ "@ungap/structured-clone": "^1.0.0",
+ "unist-util-position": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/hast-util-to-jsx-runtime": {
+ "version": "2.3.6",
+ "resolved": "https://registry.npmjs.org/hast-util-to-jsx-runtime/-/hast-util-to-jsx-runtime-2.3.6.tgz",
+ "integrity": "sha512-zl6s8LwNyo1P9uw+XJGvZtdFF1GdAkOg8ujOw+4Pyb76874fLps4ueHXDhXWdk6YHQ6OgUtinliG7RsYvCbbBg==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/estree": "^1.0.0",
+ "@types/hast": "^3.0.0",
+ "@types/unist": "^3.0.0",
+ "comma-separated-tokens": "^2.0.0",
+ "devlop": "^1.0.0",
+ "estree-util-is-identifier-name": "^3.0.0",
+ "hast-util-whitespace": "^3.0.0",
+ "mdast-util-mdx-expression": "^2.0.0",
+ "mdast-util-mdx-jsx": "^3.0.0",
+ "mdast-util-mdxjs-esm": "^2.0.0",
+ "property-information": "^7.0.0",
+ "space-separated-tokens": "^2.0.0",
+ "style-to-js": "^1.0.0",
+ "unist-util-position": "^5.0.0",
+ "vfile-message": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/hast-util-whitespace": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/hast-util-whitespace/-/hast-util-whitespace-3.0.0.tgz",
+ "integrity": "sha512-88JUN06ipLwsnv+dVn+OIYOvAuvBMy/Qoi6O7mQHxdPXpjy+Cd6xRkWwux7DKO+4sYILtLBRIKgsdpS2gQc7qw==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/hast": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
  "node_modules/hermes-estree": {
  "version": "0.25.1",
  "resolved": "https://registry.npmjs.org/hermes-estree/-/hermes-estree-0.25.1.tgz",

  "hermes-estree": "0.25.1"
  }
  },
+ "node_modules/html-url-attributes": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/html-url-attributes/-/html-url-attributes-3.0.1.tgz",
+ "integrity": "sha512-ol6UPyBWqsrO6EJySPz2O7ZSr856WDrEzM5zMqp+FJJLGMW35cLYmmZnl0vztAZxRUoNZJFTCohfjuIJ8I4QBQ==",
+ "license": "MIT",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
  "node_modules/ignore": {
  "version": "5.3.2",
  "resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.2.tgz",

  "node": ">=0.8.19"
  }
  },
+ "node_modules/inline-style-parser": {
+ "version": "0.2.7",
+ "resolved": "https://registry.npmjs.org/inline-style-parser/-/inline-style-parser-0.2.7.tgz",
+ "integrity": "sha512-Nb2ctOyNR8DqQoR0OwRG95uNWIC0C1lCgf5Naz5H6Ji72KZ8OcFZLz2P5sNgwlyoJ8Yif11oMuYs5pBQa86csA==",
+ "license": "MIT"
+ },
  "node_modules/internal-slot": {
  "version": "1.1.0",
  "resolved": "https://registry.npmjs.org/internal-slot/-/internal-slot-1.1.0.tgz",

  "node": ">= 0.4"
  }
  },
+ "node_modules/is-alphabetical": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/is-alphabetical/-/is-alphabetical-2.0.1.tgz",
+ "integrity": "sha512-FWyyY60MeTNyeSRpkM2Iry0G9hpr7/9kD40mD/cGQEuilcZYS4okz8SN2Q6rLCJ8gbCt6fN+rC+6tMGS99LaxQ==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/is-alphanumerical": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/is-alphanumerical/-/is-alphanumerical-2.0.1.tgz",
+ "integrity": "sha512-hmbYhX/9MUMF5uh7tOXyK/n0ZvWpad5caBA17GsC6vyuCqaWliRG5K1qS9inmUhEMaOBIW7/whAnSwveW/LtZw==",
+ "license": "MIT",
+ "dependencies": {
+ "is-alphabetical": "^2.0.0",
+ "is-decimal": "^2.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/is-array-buffer": {
  "version": "3.0.5",
  "resolved": "https://registry.npmjs.org/is-array-buffer/-/is-array-buffer-3.0.5.tgz",

  "url": "https://github.com/sponsors/ljharb"
  }
  },
+ "node_modules/is-decimal": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/is-decimal/-/is-decimal-2.0.1.tgz",
+ "integrity": "sha512-AAB9hiomQs5DXWcRB1rqsxGUstbRroFOPPVAomNk/3XHR5JyEZChOyTWe2oayKnsSsr/kcGqF+z6yuH6HHpN0A==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/is-extglob": {
  "version": "2.1.1",
  "resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz",

  "node": ">=0.10.0"
  }
  },
+ "node_modules/is-hexadecimal": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/is-hexadecimal/-/is-hexadecimal-2.0.1.tgz",
+ "integrity": "sha512-DgZQp241c8oO6cA1SbTEWiXeoxV42vlcJxgH+B3hi1AiqqKruZR3ZGF8In3fj4+/y/7rHvlOZLZtgJ/4ttYGZg==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/is-map": {
  "version": "2.0.3",
  "resolved": "https://registry.npmjs.org/is-map/-/is-map-2.0.3.tgz",

  "url": "https://github.com/sponsors/ljharb"
  }
  },
+ "node_modules/is-plain-obj": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/is-plain-obj/-/is-plain-obj-4.1.0.tgz",
+ "integrity": "sha512-+Pgi+vMuUNkJyExiMBt5IlFoMyKnr5zhJ4Uspz58WOhBF5QoIZkFyNHIbBAtHwzVAgk5RtndVNsDRN61/mmDqg==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
  "node_modules/is-regex": {
  "version": "1.2.1",
  "resolved": "https://registry.npmjs.org/is-regex/-/is-regex-1.2.1.tgz",

  "dev": true,
  "license": "MIT"
  },
+ "node_modules/longest-streak": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/longest-streak/-/longest-streak-3.1.0.tgz",
+ "integrity": "sha512-9Ri+o0JYgehTaVBBDoMqIl8GXtbWg711O3srftcHhZ0dqnETqLaoIK0x17fUw9rFSlK/0NlsKe0Ahhyl5pXE2g==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/loose-envify": {
  "version": "1.4.0",
  "resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz",

  "@jridgewell/sourcemap-codec": "^1.5.5"
  }
  },
+ "node_modules/markdown-table": {
+ "version": "3.0.4",
+ "resolved": "https://registry.npmjs.org/markdown-table/-/markdown-table-3.0.4.tgz",
+ "integrity": "sha512-wiYz4+JrLyb/DqW2hkFJxP7Vd7JuTDm77fvbM8VfEQdmSMqcImWeeRbHwZjBjIFki/VaMK2BhFi7oUUZeM5bqw==",
+ "license": "MIT",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
  "node_modules/math-intrinsics": {
  "version": "1.1.0",
  "resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",

  "node": ">= 0.4"
  }
  },
+ "node_modules/mdast-util-find-and-replace": {
+ "version": "3.0.2",
+ "resolved": "https://registry.npmjs.org/mdast-util-find-and-replace/-/mdast-util-find-and-replace-3.0.2.tgz",
+ "integrity": "sha512-Tmd1Vg/m3Xz43afeNxDIhWRtFZgM2VLyaf4vSTYwudTyeuTneoL3qtWMA5jeLyz/O1vDJmmV4QuScFCA2tBPwg==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/mdast": "^4.0.0",
+ "escape-string-regexp": "^5.0.0",
+ "unist-util-is": "^6.0.0",
+ "unist-util-visit-parents": "^6.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
  },
+ "node_modules/mdast-util-find-and-replace/node_modules/escape-string-regexp": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-5.0.0.tgz",
+ "integrity": "sha512-/veY75JbMK4j1yjvuUxuVsiS/hr/4iHs9FTT6cgTexxdE0Ly/glccBAkloH/DofkjRbZU3bnoj38mOmhkZ0lHw==",
  "license": "MIT",
  "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
  }
  },
+ "node_modules/mdast-util-from-markdown": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/mdast-util-from-markdown/-/mdast-util-from-markdown-2.0.3.tgz",
+ "integrity": "sha512-W4mAWTvSlKvf8L6J+VN9yLSqQ9AOAAvHuoDAmPkz4dHf553m5gVj2ejadHJhoJmcmxEnOv6Pa8XJhpxE93kb8Q==",
  "license": "MIT",
  "dependencies": {
+ "@types/mdast": "^4.0.0",
+ "@types/unist": "^3.0.0",
+ "decode-named-character-reference": "^1.0.0",
+ "devlop": "^1.0.0",
+ "mdast-util-to-string": "^4.0.0",
+ "micromark": "^4.0.0",
+ "micromark-util-decode-numeric-character-reference": "^2.0.0",
+ "micromark-util-decode-string": "^2.0.0",
+ "micromark-util-normalize-identifier": "^2.0.0",
+ "micromark-util-symbol": "^2.0.0",
+ "micromark-util-types": "^2.0.0",
+ "unist-util-stringify-position": "^4.0.0"
  },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
  }
  },
+ "node_modules/mdast-util-gfm": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm/-/mdast-util-gfm-3.1.0.tgz",
+ "integrity": "sha512-0ulfdQOM3ysHhCJ1p06l0b0VKlhU0wuQs3thxZQagjcjPrlFRqY215uZGHHJan9GEAXd9MbfPjFJz+qMkVR6zQ==",
+ "license": "MIT",
  "dependencies": {
+ "mdast-util-from-markdown": "^2.0.0",
+ "mdast-util-gfm-autolink-literal": "^2.0.0",
+ "mdast-util-gfm-footnote": "^2.0.0",
+ "mdast-util-gfm-strikethrough": "^2.0.0",
+ "mdast-util-gfm-table": "^2.0.0",
+ "mdast-util-gfm-task-list-item": "^2.0.0",
+ "mdast-util-to-markdown": "^2.0.0"
  },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
  }
  },
+ "node_modules/mdast-util-gfm-autolink-literal": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm-autolink-literal/-/mdast-util-gfm-autolink-literal-2.0.1.tgz",
+ "integrity": "sha512-5HVP2MKaP6L+G6YaxPNjuL0BPrq9orG3TsrZ9YXbA3vDw/ACI4MEsnoDpn6ZNm7GnZgtAcONJyPhOP8tNJQavQ==",
  "license": "MIT",
+ "dependencies": {
+ "@types/mdast": "^4.0.0",
+ "ccount": "^2.0.0",
+ "devlop": "^1.0.0",
+ "mdast-util-find-and-replace": "^3.0.0",
+ "micromark-util-character": "^2.0.0"
+ },
  "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
  }
  },
+ "node_modules/mdast-util-gfm-footnote": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm-footnote/-/mdast-util-gfm-footnote-2.1.0.tgz",
+ "integrity": "sha512-sqpDWlsHn7Ac9GNZQMeUzPQSMzR6Wv0WKRNvQRg0KqHh02fpTz69Qc1QSseNX29bhz1ROIyNyxExfawVKTm1GQ==",
  "license": "MIT",
  "dependencies": {
+ "@types/mdast": "^4.0.0",
+ "devlop": "^1.1.0",
+ "mdast-util-from-markdown": "^2.0.0",
+ "mdast-util-to-markdown": "^2.0.0",
+ "micromark-util-normalize-identifier": "^2.0.0"
  },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-gfm-strikethrough": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm-strikethrough/-/mdast-util-gfm-strikethrough-2.0.0.tgz",
+ "integrity": "sha512-mKKb915TF+OC5ptj5bJ7WFRPdYtuHv0yTRxK2tJvi+BDqbkiG7h7u/9SI89nRAYcmap2xHQL9D+QG/6wSrTtXg==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/mdast": "^4.0.0",
+ "mdast-util-from-markdown": "^2.0.0",
+ "mdast-util-to-markdown": "^2.0.0"
  },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-gfm-table": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm-table/-/mdast-util-gfm-table-2.0.0.tgz",
+ "integrity": "sha512-78UEvebzz/rJIxLvE7ZtDd/vIQ0RHv+3Mh5DR96p7cS7HsBhYIICDBCu8csTNWNO6tBWfqXPWekRuj2FNOGOZg==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/mdast": "^4.0.0",
+ "devlop": "^1.0.0",
+ "markdown-table": "^3.0.0",
+ "mdast-util-from-markdown": "^2.0.0",
+ "mdast-util-to-markdown": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-gfm-task-list-item": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm-task-list-item/-/mdast-util-gfm-task-list-item-2.0.0.tgz",
+ "integrity": "sha512-IrtvNvjxC1o06taBAVJznEnkiHxLFTzgonUdy8hzFVeDun0uTjxxrRGVaNFqkU1wJR3RBPEfsxmU6jDWPofrTQ==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/mdast": "^4.0.0",
+ "devlop": "^1.0.0",
+ "mdast-util-from-markdown": "^2.0.0",
+ "mdast-util-to-markdown": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-expression": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/mdast-util-mdx-expression/-/mdast-util-mdx-expression-2.0.1.tgz",
+ "integrity": "sha512-J6f+9hUp+ldTZqKRSg7Vw5V6MqjATc+3E4gf3CFNcuZNWD8XdyI6zQ8GqH7f8169MM6P7hMBRDVGnn7oHB9kXQ==",
7886
+ "license": "MIT",
7887
+ "dependencies": {
7888
+ "@types/estree-jsx": "^1.0.0",
7889
+ "@types/hast": "^3.0.0",
7890
+ "@types/mdast": "^4.0.0",
7891
+ "devlop": "^1.0.0",
7892
+ "mdast-util-from-markdown": "^2.0.0",
7893
+ "mdast-util-to-markdown": "^2.0.0"
7894
+ },
7895
+ "funding": {
7896
+ "type": "opencollective",
7897
+ "url": "https://opencollective.com/unified"
7898
+ }
7899
+ },
7900
+ "node_modules/mdast-util-mdx-jsx": {
7901
+ "version": "3.2.0",
7902
+ "resolved": "https://registry.npmjs.org/mdast-util-mdx-jsx/-/mdast-util-mdx-jsx-3.2.0.tgz",
7903
+ "integrity": "sha512-lj/z8v0r6ZtsN/cGNNtemmmfoLAFZnjMbNyLzBafjzikOM+glrjNHPlf6lQDOTccj9n5b0PPihEBbhneMyGs1Q==",
7904
+ "license": "MIT",
7905
+ "dependencies": {
7906
+ "@types/estree-jsx": "^1.0.0",
7907
+ "@types/hast": "^3.0.0",
7908
+ "@types/mdast": "^4.0.0",
7909
+ "@types/unist": "^3.0.0",
7910
+ "ccount": "^2.0.0",
7911
+ "devlop": "^1.1.0",
7912
+ "mdast-util-from-markdown": "^2.0.0",
7913
+ "mdast-util-to-markdown": "^2.0.0",
7914
+ "parse-entities": "^4.0.0",
7915
+ "stringify-entities": "^4.0.0",
7916
+ "unist-util-stringify-position": "^4.0.0",
7917
+ "vfile-message": "^4.0.0"
7918
+ },
7919
+ "funding": {
7920
+ "type": "opencollective",
7921
+ "url": "https://opencollective.com/unified"
7922
+ }
7923
+ },
7924
+ "node_modules/mdast-util-mdxjs-esm": {
7925
+ "version": "2.0.1",
7926
+ "resolved": "https://registry.npmjs.org/mdast-util-mdxjs-esm/-/mdast-util-mdxjs-esm-2.0.1.tgz",
7927
+ "integrity": "sha512-EcmOpxsZ96CvlP03NghtH1EsLtr0n9Tm4lPUJUBccV9RwUOneqSycg19n5HGzCf+10LozMRSObtVr3ee1WoHtg==",
7928
+ "license": "MIT",
7929
+ "dependencies": {
7930
+ "@types/estree-jsx": "^1.0.0",
7931
+ "@types/hast": "^3.0.0",
7932
+ "@types/mdast": "^4.0.0",
7933
+ "devlop": "^1.0.0",
7934
+ "mdast-util-from-markdown": "^2.0.0",
7935
+ "mdast-util-to-markdown": "^2.0.0"
7936
+ },
7937
+ "funding": {
7938
+ "type": "opencollective",
7939
+ "url": "https://opencollective.com/unified"
7940
+ }
7941
+ },
7942
+ "node_modules/mdast-util-phrasing": {
7943
+ "version": "4.1.0",
7944
+ "resolved": "https://registry.npmjs.org/mdast-util-phrasing/-/mdast-util-phrasing-4.1.0.tgz",
7945
+ "integrity": "sha512-TqICwyvJJpBwvGAMZjj4J2n0X8QWp21b9l0o7eXyVJ25YNWYbJDVIyD1bZXE6WtV6RmKJVYmQAKWa0zWOABz2w==",
7946
+ "license": "MIT",
7947
+ "dependencies": {
7948
+ "@types/mdast": "^4.0.0",
7949
+ "unist-util-is": "^6.0.0"
7950
+ },
7951
+ "funding": {
7952
+ "type": "opencollective",
7953
+ "url": "https://opencollective.com/unified"
7954
+ }
7955
+ },
7956
+ "node_modules/mdast-util-to-hast": {
7957
+ "version": "13.2.1",
7958
+ "resolved": "https://registry.npmjs.org/mdast-util-to-hast/-/mdast-util-to-hast-13.2.1.tgz",
7959
+ "integrity": "sha512-cctsq2wp5vTsLIcaymblUriiTcZd0CwWtCbLvrOzYCDZoWyMNV8sZ7krj09FSnsiJi3WVsHLM4k6Dq/yaPyCXA==",
7960
+ "license": "MIT",
7961
+ "dependencies": {
7962
+ "@types/hast": "^3.0.0",
7963
+ "@types/mdast": "^4.0.0",
7964
+ "@ungap/structured-clone": "^1.0.0",
7965
+ "devlop": "^1.0.0",
7966
+ "micromark-util-sanitize-uri": "^2.0.0",
7967
+ "trim-lines": "^3.0.0",
7968
+ "unist-util-position": "^5.0.0",
7969
+ "unist-util-visit": "^5.0.0",
7970
+ "vfile": "^6.0.0"
7971
+ },
7972
+ "funding": {
7973
+ "type": "opencollective",
7974
+ "url": "https://opencollective.com/unified"
7975
+ }
7976
+ },
7977
+ "node_modules/mdast-util-to-markdown": {
7978
+ "version": "2.1.2",
7979
+ "resolved": "https://registry.npmjs.org/mdast-util-to-markdown/-/mdast-util-to-markdown-2.1.2.tgz",
7980
+ "integrity": "sha512-xj68wMTvGXVOKonmog6LwyJKrYXZPvlwabaryTjLh9LuvovB/KAH+kvi8Gjj+7rJjsFi23nkUxRQv1KqSroMqA==",
7981
+ "license": "MIT",
7982
+ "dependencies": {
7983
+ "@types/mdast": "^4.0.0",
7984
+ "@types/unist": "^3.0.0",
7985
+ "longest-streak": "^3.0.0",
7986
+ "mdast-util-phrasing": "^4.0.0",
7987
+ "mdast-util-to-string": "^4.0.0",
7988
+ "micromark-util-classify-character": "^2.0.0",
7989
+ "micromark-util-decode-string": "^2.0.0",
7990
+ "unist-util-visit": "^5.0.0",
7991
+ "zwitch": "^2.0.0"
7992
+ },
7993
+ "funding": {
7994
+ "type": "opencollective",
7995
+ "url": "https://opencollective.com/unified"
7996
+ }
7997
+ },
7998
+ "node_modules/mdast-util-to-string": {
7999
+ "version": "4.0.0",
8000
+ "resolved": "https://registry.npmjs.org/mdast-util-to-string/-/mdast-util-to-string-4.0.0.tgz",
8001
+ "integrity": "sha512-0H44vDimn51F0YwvxSJSm0eCDOJTRlmN0R1yBh4HLj9wiV1Dn0QoXGbvFAWj2hSItVTlCmBF1hqKlIyUBVFLPg==",
8002
+ "license": "MIT",
8003
+ "dependencies": {
8004
+ "@types/mdast": "^4.0.0"
8005
+ },
8006
+ "funding": {
8007
+ "type": "opencollective",
8008
+ "url": "https://opencollective.com/unified"
8009
+ }
8010
+ },
8011
+ "node_modules/mdn-data": {
8012
+ "version": "2.0.30",
8013
+ "resolved": "https://registry.npmjs.org/mdn-data/-/mdn-data-2.0.30.tgz",
8014
+ "integrity": "sha512-GaqWWShW4kv/G9IEucWScBx9G1/vsFZZJUO+tD26M8J8z3Kw5RDQjaoZe03YAClgeS/SWPOcb4nkFBTEi5DUEA==",
8015
+ "license": "CC0-1.0"
8016
+ },
8017
+ "node_modules/merge2": {
8018
+ "version": "1.4.1",
8019
+ "resolved": "https://registry.npmjs.org/merge2/-/merge2-1.4.1.tgz",
8020
+ "integrity": "sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg==",
8021
+ "dev": true,
8022
+ "license": "MIT",
8023
+ "engines": {
8024
+ "node": ">= 8"
8025
+ }
8026
+ },
8027
+ "node_modules/micromark": {
8028
+ "version": "4.0.2",
8029
+ "resolved": "https://registry.npmjs.org/micromark/-/micromark-4.0.2.tgz",
8030
+ "integrity": "sha512-zpe98Q6kvavpCr1NPVSCMebCKfD7CA2NqZ+rykeNhONIJBpc1tFKt9hucLGwha3jNTNI8lHpctWJWoimVF4PfA==",
8031
+ "funding": [
8032
+ {
8033
+ "type": "GitHub Sponsors",
8034
+ "url": "https://github.com/sponsors/unifiedjs"
8035
+ },
8036
+ {
8037
+ "type": "OpenCollective",
8038
+ "url": "https://opencollective.com/unified"
8039
+ }
8040
+ ],
8041
+ "license": "MIT",
8042
+ "dependencies": {
8043
+ "@types/debug": "^4.0.0",
8044
+ "debug": "^4.0.0",
8045
+ "decode-named-character-reference": "^1.0.0",
8046
+ "devlop": "^1.0.0",
8047
+ "micromark-core-commonmark": "^2.0.0",
8048
+ "micromark-factory-space": "^2.0.0",
8049
+ "micromark-util-character": "^2.0.0",
8050
+ "micromark-util-chunked": "^2.0.0",
8051
+ "micromark-util-combine-extensions": "^2.0.0",
8052
+ "micromark-util-decode-numeric-character-reference": "^2.0.0",
8053
+ "micromark-util-encode": "^2.0.0",
8054
+ "micromark-util-normalize-identifier": "^2.0.0",
8055
+ "micromark-util-resolve-all": "^2.0.0",
8056
+ "micromark-util-sanitize-uri": "^2.0.0",
8057
+ "micromark-util-subtokenize": "^2.0.0",
8058
+ "micromark-util-symbol": "^2.0.0",
8059
+ "micromark-util-types": "^2.0.0"
8060
+ }
8061
+ },
8062
+ "node_modules/micromark-core-commonmark": {
8063
+ "version": "2.0.3",
8064
+ "resolved": "https://registry.npmjs.org/micromark-core-commonmark/-/micromark-core-commonmark-2.0.3.tgz",
8065
+ "integrity": "sha512-RDBrHEMSxVFLg6xvnXmb1Ayr2WzLAWjeSATAoxwKYJV94TeNavgoIdA0a9ytzDSVzBy2YKFK+emCPOEibLeCrg==",
8066
+ "funding": [
8067
+ {
8068
+ "type": "GitHub Sponsors",
8069
+ "url": "https://github.com/sponsors/unifiedjs"
8070
+ },
8071
+ {
8072
+ "type": "OpenCollective",
8073
+ "url": "https://opencollective.com/unified"
8074
+ }
8075
+ ],
8076
+ "license": "MIT",
8077
+ "dependencies": {
8078
+ "decode-named-character-reference": "^1.0.0",
8079
+ "devlop": "^1.0.0",
8080
+ "micromark-factory-destination": "^2.0.0",
8081
+ "micromark-factory-label": "^2.0.0",
8082
+ "micromark-factory-space": "^2.0.0",
8083
+ "micromark-factory-title": "^2.0.0",
8084
+ "micromark-factory-whitespace": "^2.0.0",
8085
+ "micromark-util-character": "^2.0.0",
8086
+ "micromark-util-chunked": "^2.0.0",
8087
+ "micromark-util-classify-character": "^2.0.0",
8088
+ "micromark-util-html-tag-name": "^2.0.0",
8089
+ "micromark-util-normalize-identifier": "^2.0.0",
8090
+ "micromark-util-resolve-all": "^2.0.0",
8091
+ "micromark-util-subtokenize": "^2.0.0",
8092
+ "micromark-util-symbol": "^2.0.0",
8093
+ "micromark-util-types": "^2.0.0"
8094
+ }
8095
+ },
8096
+ "node_modules/micromark-extension-gfm": {
8097
+ "version": "3.0.0",
8098
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm/-/micromark-extension-gfm-3.0.0.tgz",
8099
+ "integrity": "sha512-vsKArQsicm7t0z2GugkCKtZehqUm31oeGBV/KVSorWSy8ZlNAv7ytjFhvaryUiCUJYqs+NoE6AFhpQvBTM6Q4w==",
8100
+ "license": "MIT",
8101
+ "dependencies": {
8102
+ "micromark-extension-gfm-autolink-literal": "^2.0.0",
8103
+ "micromark-extension-gfm-footnote": "^2.0.0",
8104
+ "micromark-extension-gfm-strikethrough": "^2.0.0",
8105
+ "micromark-extension-gfm-table": "^2.0.0",
8106
+ "micromark-extension-gfm-tagfilter": "^2.0.0",
8107
+ "micromark-extension-gfm-task-list-item": "^2.0.0",
8108
+ "micromark-util-combine-extensions": "^2.0.0",
8109
+ "micromark-util-types": "^2.0.0"
8110
+ },
8111
+ "funding": {
8112
+ "type": "opencollective",
8113
+ "url": "https://opencollective.com/unified"
8114
+ }
8115
+ },
8116
+ "node_modules/micromark-extension-gfm-autolink-literal": {
8117
+ "version": "2.1.0",
8118
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-autolink-literal/-/micromark-extension-gfm-autolink-literal-2.1.0.tgz",
8119
+ "integrity": "sha512-oOg7knzhicgQ3t4QCjCWgTmfNhvQbDDnJeVu9v81r7NltNCVmhPy1fJRX27pISafdjL+SVc4d3l48Gb6pbRypw==",
8120
+ "license": "MIT",
8121
+ "dependencies": {
8122
+ "micromark-util-character": "^2.0.0",
8123
+ "micromark-util-sanitize-uri": "^2.0.0",
8124
+ "micromark-util-symbol": "^2.0.0",
8125
+ "micromark-util-types": "^2.0.0"
8126
+ },
8127
+ "funding": {
8128
+ "type": "opencollective",
8129
+ "url": "https://opencollective.com/unified"
8130
+ }
8131
+ },
8132
+ "node_modules/micromark-extension-gfm-footnote": {
8133
+ "version": "2.1.0",
8134
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-footnote/-/micromark-extension-gfm-footnote-2.1.0.tgz",
8135
+ "integrity": "sha512-/yPhxI1ntnDNsiHtzLKYnE3vf9JZ6cAisqVDauhp4CEHxlb4uoOTxOCJ+9s51bIB8U1N1FJ1RXOKTIlD5B/gqw==",
8136
+ "license": "MIT",
8137
+ "dependencies": {
8138
+ "devlop": "^1.0.0",
8139
+ "micromark-core-commonmark": "^2.0.0",
8140
+ "micromark-factory-space": "^2.0.0",
8141
+ "micromark-util-character": "^2.0.0",
8142
+ "micromark-util-normalize-identifier": "^2.0.0",
8143
+ "micromark-util-sanitize-uri": "^2.0.0",
8144
+ "micromark-util-symbol": "^2.0.0",
8145
+ "micromark-util-types": "^2.0.0"
8146
+ },
8147
+ "funding": {
8148
+ "type": "opencollective",
8149
+ "url": "https://opencollective.com/unified"
8150
+ }
8151
+ },
8152
+ "node_modules/micromark-extension-gfm-strikethrough": {
8153
+ "version": "2.1.0",
8154
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-strikethrough/-/micromark-extension-gfm-strikethrough-2.1.0.tgz",
8155
+ "integrity": "sha512-ADVjpOOkjz1hhkZLlBiYA9cR2Anf8F4HqZUO6e5eDcPQd0Txw5fxLzzxnEkSkfnD0wziSGiv7sYhk/ktvbf1uw==",
8156
+ "license": "MIT",
8157
+ "dependencies": {
8158
+ "devlop": "^1.0.0",
8159
+ "micromark-util-chunked": "^2.0.0",
8160
+ "micromark-util-classify-character": "^2.0.0",
8161
+ "micromark-util-resolve-all": "^2.0.0",
8162
+ "micromark-util-symbol": "^2.0.0",
8163
+ "micromark-util-types": "^2.0.0"
8164
+ },
8165
+ "funding": {
8166
+ "type": "opencollective",
8167
+ "url": "https://opencollective.com/unified"
8168
+ }
8169
+ },
8170
+ "node_modules/micromark-extension-gfm-table": {
8171
+ "version": "2.1.1",
8172
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-table/-/micromark-extension-gfm-table-2.1.1.tgz",
8173
+ "integrity": "sha512-t2OU/dXXioARrC6yWfJ4hqB7rct14e8f7m0cbI5hUmDyyIlwv5vEtooptH8INkbLzOatzKuVbQmAYcbWoyz6Dg==",
8174
+ "license": "MIT",
8175
+ "dependencies": {
8176
+ "devlop": "^1.0.0",
8177
+ "micromark-factory-space": "^2.0.0",
8178
+ "micromark-util-character": "^2.0.0",
8179
+ "micromark-util-symbol": "^2.0.0",
8180
+ "micromark-util-types": "^2.0.0"
8181
+ },
8182
+ "funding": {
8183
+ "type": "opencollective",
8184
+ "url": "https://opencollective.com/unified"
8185
+ }
8186
+ },
8187
+ "node_modules/micromark-extension-gfm-tagfilter": {
8188
+ "version": "2.0.0",
8189
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-tagfilter/-/micromark-extension-gfm-tagfilter-2.0.0.tgz",
8190
+ "integrity": "sha512-xHlTOmuCSotIA8TW1mDIM6X2O1SiX5P9IuDtqGonFhEK0qgRI4yeC6vMxEV2dgyr2TiD+2PQ10o+cOhdVAcwfg==",
8191
+ "license": "MIT",
8192
+ "dependencies": {
8193
+ "micromark-util-types": "^2.0.0"
8194
+ },
8195
+ "funding": {
8196
+ "type": "opencollective",
8197
+ "url": "https://opencollective.com/unified"
8198
+ }
8199
+ },
8200
+ "node_modules/micromark-extension-gfm-task-list-item": {
8201
+ "version": "2.1.0",
8202
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-task-list-item/-/micromark-extension-gfm-task-list-item-2.1.0.tgz",
8203
+ "integrity": "sha512-qIBZhqxqI6fjLDYFTBIa4eivDMnP+OZqsNwmQ3xNLE4Cxwc+zfQEfbs6tzAo2Hjq+bh6q5F+Z8/cksrLFYWQQw==",
8204
+ "license": "MIT",
8205
+ "dependencies": {
8206
+ "devlop": "^1.0.0",
8207
+ "micromark-factory-space": "^2.0.0",
8208
+ "micromark-util-character": "^2.0.0",
8209
+ "micromark-util-symbol": "^2.0.0",
8210
+ "micromark-util-types": "^2.0.0"
8211
+ },
8212
+ "funding": {
8213
+ "type": "opencollective",
8214
+ "url": "https://opencollective.com/unified"
8215
+ }
8216
+ },
8217
+ "node_modules/micromark-factory-destination": {
8218
+ "version": "2.0.1",
8219
+ "resolved": "https://registry.npmjs.org/micromark-factory-destination/-/micromark-factory-destination-2.0.1.tgz",
8220
+ "integrity": "sha512-Xe6rDdJlkmbFRExpTOmRj9N3MaWmbAgdpSrBQvCFqhezUn4AHqJHbaEnfbVYYiexVSs//tqOdY/DxhjdCiJnIA==",
8221
+ "funding": [
8222
+ {
8223
+ "type": "GitHub Sponsors",
8224
+ "url": "https://github.com/sponsors/unifiedjs"
8225
+ },
8226
+ {
8227
+ "type": "OpenCollective",
8228
+ "url": "https://opencollective.com/unified"
8229
+ }
8230
+ ],
8231
+ "license": "MIT",
8232
+ "dependencies": {
8233
+ "micromark-util-character": "^2.0.0",
8234
+ "micromark-util-symbol": "^2.0.0",
8235
+ "micromark-util-types": "^2.0.0"
8236
+ }
8237
+ },
8238
+ "node_modules/micromark-factory-label": {
8239
+ "version": "2.0.1",
8240
+ "resolved": "https://registry.npmjs.org/micromark-factory-label/-/micromark-factory-label-2.0.1.tgz",
8241
+ "integrity": "sha512-VFMekyQExqIW7xIChcXn4ok29YE3rnuyveW3wZQWWqF4Nv9Wk5rgJ99KzPvHjkmPXF93FXIbBp6YdW3t71/7Vg==",
8242
+ "funding": [
8243
+ {
8244
+ "type": "GitHub Sponsors",
8245
+ "url": "https://github.com/sponsors/unifiedjs"
8246
+ },
8247
+ {
8248
+ "type": "OpenCollective",
8249
+ "url": "https://opencollective.com/unified"
8250
+ }
8251
+ ],
8252
+ "license": "MIT",
8253
+ "dependencies": {
8254
+ "devlop": "^1.0.0",
8255
+ "micromark-util-character": "^2.0.0",
8256
+ "micromark-util-symbol": "^2.0.0",
8257
+ "micromark-util-types": "^2.0.0"
8258
+ }
8259
+ },
8260
+ "node_modules/micromark-factory-space": {
8261
+ "version": "2.0.1",
8262
+ "resolved": "https://registry.npmjs.org/micromark-factory-space/-/micromark-factory-space-2.0.1.tgz",
8263
+ "integrity": "sha512-zRkxjtBxxLd2Sc0d+fbnEunsTj46SWXgXciZmHq0kDYGnck/ZSGj9/wULTV95uoeYiK5hRXP2mJ98Uo4cq/LQg==",
8264
+ "funding": [
8265
+ {
8266
+ "type": "GitHub Sponsors",
8267
+ "url": "https://github.com/sponsors/unifiedjs"
8268
+ },
8269
+ {
8270
+ "type": "OpenCollective",
8271
+ "url": "https://opencollective.com/unified"
8272
+ }
8273
+ ],
8274
+ "license": "MIT",
8275
+ "dependencies": {
8276
+ "micromark-util-character": "^2.0.0",
8277
+ "micromark-util-types": "^2.0.0"
8278
+ }
8279
+ },
8280
+ "node_modules/micromark-factory-title": {
8281
+ "version": "2.0.1",
8282
+ "resolved": "https://registry.npmjs.org/micromark-factory-title/-/micromark-factory-title-2.0.1.tgz",
8283
+ "integrity": "sha512-5bZ+3CjhAd9eChYTHsjy6TGxpOFSKgKKJPJxr293jTbfry2KDoWkhBb6TcPVB4NmzaPhMs1Frm9AZH7OD4Cjzw==",
8284
+ "funding": [
8285
+ {
8286
+ "type": "GitHub Sponsors",
8287
+ "url": "https://github.com/sponsors/unifiedjs"
8288
+ },
8289
+ {
8290
+ "type": "OpenCollective",
8291
+ "url": "https://opencollective.com/unified"
8292
+ }
8293
+ ],
8294
+ "license": "MIT",
8295
+ "dependencies": {
8296
+ "micromark-factory-space": "^2.0.0",
8297
+ "micromark-util-character": "^2.0.0",
8298
+ "micromark-util-symbol": "^2.0.0",
8299
+ "micromark-util-types": "^2.0.0"
8300
+ }
8301
+ },
8302
+ "node_modules/micromark-factory-whitespace": {
8303
+ "version": "2.0.1",
8304
+ "resolved": "https://registry.npmjs.org/micromark-factory-whitespace/-/micromark-factory-whitespace-2.0.1.tgz",
8305
+ "integrity": "sha512-Ob0nuZ3PKt/n0hORHyvoD9uZhr+Za8sFoP+OnMcnWK5lngSzALgQYKMr9RJVOWLqQYuyn6ulqGWSXdwf6F80lQ==",
8306
+ "funding": [
8307
+ {
8308
+ "type": "GitHub Sponsors",
8309
+ "url": "https://github.com/sponsors/unifiedjs"
8310
+ },
8311
+ {
8312
+ "type": "OpenCollective",
8313
+ "url": "https://opencollective.com/unified"
8314
+ }
8315
+ ],
8316
+ "license": "MIT",
8317
+ "dependencies": {
8318
+ "micromark-factory-space": "^2.0.0",
8319
+ "micromark-util-character": "^2.0.0",
8320
+ "micromark-util-symbol": "^2.0.0",
8321
+ "micromark-util-types": "^2.0.0"
8322
+ }
8323
+ },
8324
+ "node_modules/micromark-util-character": {
8325
+ "version": "2.1.1",
8326
+ "resolved": "https://registry.npmjs.org/micromark-util-character/-/micromark-util-character-2.1.1.tgz",
8327
+ "integrity": "sha512-wv8tdUTJ3thSFFFJKtpYKOYiGP2+v96Hvk4Tu8KpCAsTMs6yi+nVmGh1syvSCsaxz45J6Jbw+9DD6g97+NV67Q==",
8328
+ "funding": [
8329
+ {
8330
+ "type": "GitHub Sponsors",
8331
+ "url": "https://github.com/sponsors/unifiedjs"
8332
+ },
8333
+ {
8334
+ "type": "OpenCollective",
8335
+ "url": "https://opencollective.com/unified"
8336
+ }
8337
+ ],
8338
+ "license": "MIT",
8339
+ "dependencies": {
8340
+ "micromark-util-symbol": "^2.0.0",
8341
+ "micromark-util-types": "^2.0.0"
8342
+ }
8343
+ },
8344
+ "node_modules/micromark-util-chunked": {
8345
+ "version": "2.0.1",
8346
+ "resolved": "https://registry.npmjs.org/micromark-util-chunked/-/micromark-util-chunked-2.0.1.tgz",
8347
+ "integrity": "sha512-QUNFEOPELfmvv+4xiNg2sRYeS/P84pTW0TCgP5zc9FpXetHY0ab7SxKyAQCNCc1eK0459uoLI1y5oO5Vc1dbhA==",
8348
+ "funding": [
8349
+ {
8350
+ "type": "GitHub Sponsors",
8351
+ "url": "https://github.com/sponsors/unifiedjs"
8352
+ },
8353
+ {
8354
+ "type": "OpenCollective",
8355
+ "url": "https://opencollective.com/unified"
8356
+ }
8357
+ ],
8358
+ "license": "MIT",
8359
+ "dependencies": {
8360
+ "micromark-util-symbol": "^2.0.0"
8361
+ }
8362
+ },
8363
+ "node_modules/micromark-util-classify-character": {
8364
+ "version": "2.0.1",
8365
+ "resolved": "https://registry.npmjs.org/micromark-util-classify-character/-/micromark-util-classify-character-2.0.1.tgz",
8366
+ "integrity": "sha512-K0kHzM6afW/MbeWYWLjoHQv1sgg2Q9EccHEDzSkxiP/EaagNzCm7T/WMKZ3rjMbvIpvBiZgwR3dKMygtA4mG1Q==",
8367
+ "funding": [
8368
+ {
8369
+ "type": "GitHub Sponsors",
8370
+ "url": "https://github.com/sponsors/unifiedjs"
8371
+ },
8372
+ {
8373
+ "type": "OpenCollective",
8374
+ "url": "https://opencollective.com/unified"
8375
+ }
8376
+ ],
8377
+ "license": "MIT",
8378
+ "dependencies": {
8379
+ "micromark-util-character": "^2.0.0",
8380
+ "micromark-util-symbol": "^2.0.0",
8381
+ "micromark-util-types": "^2.0.0"
8382
+ }
8383
+ },
8384
+ "node_modules/micromark-util-combine-extensions": {
8385
+ "version": "2.0.1",
8386
+ "resolved": "https://registry.npmjs.org/micromark-util-combine-extensions/-/micromark-util-combine-extensions-2.0.1.tgz",
8387
+ "integrity": "sha512-OnAnH8Ujmy59JcyZw8JSbK9cGpdVY44NKgSM7E9Eh7DiLS2E9RNQf0dONaGDzEG9yjEl5hcqeIsj4hfRkLH/Bg==",
8388
+ "funding": [
8389
+ {
8390
+ "type": "GitHub Sponsors",
8391
+ "url": "https://github.com/sponsors/unifiedjs"
8392
+ },
8393
+ {
8394
+ "type": "OpenCollective",
8395
+ "url": "https://opencollective.com/unified"
8396
+ }
8397
+ ],
8398
+ "license": "MIT",
8399
+ "dependencies": {
8400
+ "micromark-util-chunked": "^2.0.0",
8401
+ "micromark-util-types": "^2.0.0"
8402
+ }
8403
+ },
8404
+ "node_modules/micromark-util-decode-numeric-character-reference": {
8405
+ "version": "2.0.2",
8406
+ "resolved": "https://registry.npmjs.org/micromark-util-decode-numeric-character-reference/-/micromark-util-decode-numeric-character-reference-2.0.2.tgz",
8407
+ "integrity": "sha512-ccUbYk6CwVdkmCQMyr64dXz42EfHGkPQlBj5p7YVGzq8I7CtjXZJrubAYezf7Rp+bjPseiROqe7G6foFd+lEuw==",
8408
+ "funding": [
8409
+ {
8410
+ "type": "GitHub Sponsors",
8411
+ "url": "https://github.com/sponsors/unifiedjs"
8412
+ },
8413
+ {
8414
+ "type": "OpenCollective",
8415
+ "url": "https://opencollective.com/unified"
8416
+ }
8417
+ ],
8418
+ "license": "MIT",
8419
+ "dependencies": {
8420
+ "micromark-util-symbol": "^2.0.0"
8421
+ }
8422
+ },
8423
+ "node_modules/micromark-util-decode-string": {
8424
+ "version": "2.0.1",
8425
+ "resolved": "https://registry.npmjs.org/micromark-util-decode-string/-/micromark-util-decode-string-2.0.1.tgz",
8426
+ "integrity": "sha512-nDV/77Fj6eH1ynwscYTOsbK7rR//Uj0bZXBwJZRfaLEJ1iGBR6kIfNmlNqaqJf649EP0F3NWNdeJi03elllNUQ==",
8427
+ "funding": [
8428
+ {
8429
+ "type": "GitHub Sponsors",
8430
+ "url": "https://github.com/sponsors/unifiedjs"
8431
+ },
8432
+ {
8433
+ "type": "OpenCollective",
8434
+ "url": "https://opencollective.com/unified"
8435
+ }
8436
+ ],
8437
+ "license": "MIT",
8438
+ "dependencies": {
8439
+ "decode-named-character-reference": "^1.0.0",
8440
+ "micromark-util-character": "^2.0.0",
8441
+ "micromark-util-decode-numeric-character-reference": "^2.0.0",
8442
+ "micromark-util-symbol": "^2.0.0"
8443
+ }
8444
+ },
8445
+ "node_modules/micromark-util-encode": {
8446
+ "version": "2.0.1",
8447
+ "resolved": "https://registry.npmjs.org/micromark-util-encode/-/micromark-util-encode-2.0.1.tgz",
8448
+ "integrity": "sha512-c3cVx2y4KqUnwopcO9b/SCdo2O67LwJJ/UyqGfbigahfegL9myoEFoDYZgkT7f36T0bLrM9hZTAaAyH+PCAXjw==",
8449
+ "funding": [
8450
+ {
8451
+ "type": "GitHub Sponsors",
8452
+ "url": "https://github.com/sponsors/unifiedjs"
8453
+ },
8454
+ {
8455
+ "type": "OpenCollective",
8456
+ "url": "https://opencollective.com/unified"
8457
+ }
8458
+ ],
8459
+ "license": "MIT"
8460
+ },
8461
+ "node_modules/micromark-util-html-tag-name": {
8462
+ "version": "2.0.1",
8463
+ "resolved": "https://registry.npmjs.org/micromark-util-html-tag-name/-/micromark-util-html-tag-name-2.0.1.tgz",
8464
+ "integrity": "sha512-2cNEiYDhCWKI+Gs9T0Tiysk136SnR13hhO8yW6BGNyhOC4qYFnwF1nKfD3HFAIXA5c45RrIG1ub11GiXeYd1xA==",
8465
+ "funding": [
8466
+ {
8467
+ "type": "GitHub Sponsors",
8468
+ "url": "https://github.com/sponsors/unifiedjs"
8469
+ },
8470
+ {
8471
+ "type": "OpenCollective",
8472
+ "url": "https://opencollective.com/unified"
8473
+ }
8474
+ ],
8475
+ "license": "MIT"
8476
+ },
8477
+ "node_modules/micromark-util-normalize-identifier": {
8478
+ "version": "2.0.1",
8479
+ "resolved": "https://registry.npmjs.org/micromark-util-normalize-identifier/-/micromark-util-normalize-identifier-2.0.1.tgz",
8480
+ "integrity": "sha512-sxPqmo70LyARJs0w2UclACPUUEqltCkJ6PhKdMIDuJ3gSf/Q+/GIe3WKl0Ijb/GyH9lOpUkRAO2wp0GVkLvS9Q==",
8481
+ "funding": [
8482
+ {
8483
+ "type": "GitHub Sponsors",
8484
+ "url": "https://github.com/sponsors/unifiedjs"
8485
+ },
8486
+ {
8487
+ "type": "OpenCollective",
8488
+ "url": "https://opencollective.com/unified"
8489
+ }
8490
+ ],
8491
+ "license": "MIT",
8492
+ "dependencies": {
8493
+ "micromark-util-symbol": "^2.0.0"
8494
+ }
8495
+ },
8496
+ "node_modules/micromark-util-resolve-all": {
8497
+ "version": "2.0.1",
8498
+ "resolved": "https://registry.npmjs.org/micromark-util-resolve-all/-/micromark-util-resolve-all-2.0.1.tgz",
8499
+ "integrity": "sha512-VdQyxFWFT2/FGJgwQnJYbe1jjQoNTS4RjglmSjTUlpUMa95Htx9NHeYW4rGDJzbjvCsl9eLjMQwGeElsqmzcHg==",
8500
+ "funding": [
8501
+ {
8502
+ "type": "GitHub Sponsors",
8503
+ "url": "https://github.com/sponsors/unifiedjs"
8504
+ },
8505
+ {
8506
+ "type": "OpenCollective",
8507
+ "url": "https://opencollective.com/unified"
8508
+ }
8509
+ ],
8510
+ "license": "MIT",
8511
+ "dependencies": {
8512
+ "micromark-util-types": "^2.0.0"
8513
+ }
8514
+ },
8515
+ "node_modules/micromark-util-sanitize-uri": {
8516
+ "version": "2.0.1",
8517
+ "resolved": "https://registry.npmjs.org/micromark-util-sanitize-uri/-/micromark-util-sanitize-uri-2.0.1.tgz",
8518
+ "integrity": "sha512-9N9IomZ/YuGGZZmQec1MbgxtlgougxTodVwDzzEouPKo3qFWvymFHWcnDi2vzV1ff6kas9ucW+o3yzJK9YB1AQ==",
8519
+ "funding": [
8520
+ {
8521
+ "type": "GitHub Sponsors",
8522
+ "url": "https://github.com/sponsors/unifiedjs"
8523
+ },
8524
+ {
8525
+ "type": "OpenCollective",
8526
+ "url": "https://opencollective.com/unified"
8527
+ }
8528
+ ],
8529
+ "license": "MIT",
8530
+ "dependencies": {
8531
+ "micromark-util-character": "^2.0.0",
8532
+ "micromark-util-encode": "^2.0.0",
8533
+ "micromark-util-symbol": "^2.0.0"
8534
+ }
8535
+ },
8536
+ "node_modules/micromark-util-subtokenize": {
8537
+ "version": "2.1.0",
8538
+ "resolved": "https://registry.npmjs.org/micromark-util-subtokenize/-/micromark-util-subtokenize-2.1.0.tgz",
8539
+ "integrity": "sha512-XQLu552iSctvnEcgXw6+Sx75GflAPNED1qx7eBJ+wydBb2KCbRZe+NwvIEEMM83uml1+2WSXpBAcp9IUCgCYWA==",
8540
+ "funding": [
8541
+ {
8542
+ "type": "GitHub Sponsors",
8543
+ "url": "https://github.com/sponsors/unifiedjs"
8544
+ },
8545
+ {
8546
+ "type": "OpenCollective",
8547
+ "url": "https://opencollective.com/unified"
8548
+ }
8549
+ ],
8550
+ "license": "MIT",
8551
+ "dependencies": {
8552
+ "devlop": "^1.0.0",
8553
+ "micromark-util-chunked": "^2.0.0",
8554
+ "micromark-util-symbol": "^2.0.0",
8555
+ "micromark-util-types": "^2.0.0"
8556
+ }
8557
+ },
8558
+ "node_modules/micromark-util-symbol": {
8559
+ "version": "2.0.1",
8560
+ "resolved": "https://registry.npmjs.org/micromark-util-symbol/-/micromark-util-symbol-2.0.1.tgz",
8561
+ "integrity": "sha512-vs5t8Apaud9N28kgCrRUdEed4UJ+wWNvicHLPxCa9ENlYuAY31M0ETy5y1vA33YoNPDFTghEbnh6efaE8h4x0Q==",
8562
+ "funding": [
8563
+ {
8564
+ "type": "GitHub Sponsors",
8565
+ "url": "https://github.com/sponsors/unifiedjs"
8566
+ },
8567
+ {
8568
+ "type": "OpenCollective",
8569
+ "url": "https://opencollective.com/unified"
8570
+ }
8571
+ ],
8572
+ "license": "MIT"
8573
+ },
8574
+ "node_modules/micromark-util-types": {
8575
+ "version": "2.0.2",
8576
+ "resolved": "https://registry.npmjs.org/micromark-util-types/-/micromark-util-types-2.0.2.tgz",
8577
+ "integrity": "sha512-Yw0ECSpJoViF1qTU4DC6NwtC4aWGt1EkzaQB8KPPyCRR8z9TWeV0HbEFGTO+ZY1wB22zmxnJqhPyTpOVCpeHTA==",
8578
+ "funding": [
8579
+ {
8580
+ "type": "GitHub Sponsors",
8581
+ "url": "https://github.com/sponsors/unifiedjs"
8582
+ },
8583
+ {
8584
+ "type": "OpenCollective",
8585
+ "url": "https://opencollective.com/unified"
8586
+ }
8587
+ ],
8588
+ "license": "MIT"
8589
+ },
8590
+ "node_modules/micromatch": {
8591
+ "version": "4.0.8",
8592
+ "resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.8.tgz",
8593
+ "integrity": "sha512-PXwfBhYu0hBCPw8Dn0E+WDYb7af3dSLVWKi3HGv84IdF4TyFoC0ysxFd0Goxw7nSv4T/PzEJQxsYsEiFCKo2BA==",
8594
+ "dev": true,
8595
+ "license": "MIT",
8596
+ "dependencies": {
8597
+ "braces": "^3.0.3",
8598
+ "picomatch": "^2.3.1"
8599
+ },
8600
+ "engines": {
8601
+ "node": ">=8.6"
8602
+ }
8603
+ },
8604
+ "node_modules/minimatch": {
8605
+ "version": "3.1.2",
8606
+ "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
8607
+ "integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
8608
+ "dev": true,
8609
+ "license": "ISC",
8610
+ "dependencies": {
8611
+ "brace-expansion": "^1.1.7"
8612
+ },
8613
+ "engines": {
8614
+ "node": "*"
8615
+ }
8616
+ },
8617
+ "node_modules/minimist": {
8618
+ "version": "1.2.8",
8619
+ "resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.8.tgz",
8620
+ "integrity": "sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==",
8621
+ "dev": true,
8622
+ "license": "MIT",
8623
+ "funding": {
8624
+ "url": "https://github.com/sponsors/ljharb"
8625
+ }
8626
+ },
8627
+    "node_modules/motion": {
+      "version": "12.23.24",
+      "resolved": "https://registry.npmjs.org/motion/-/motion-12.23.24.tgz",
+      "integrity": "sha512-Rc5E7oe2YZ72N//S3QXGzbnXgqNrTESv8KKxABR20q2FLch9gHLo0JLyYo2hZ238bZ9Gx6cWhj9VO0IgwbMjCw==",
+      "license": "MIT",
+      "dependencies": {
+        "framer-motion": "^12.23.24",
+        "tslib": "^2.4.0"
+      },
+      "peerDependencies": {
+        "@emotion/is-prop-valid": "*",
+        "react": "^18.0.0 || ^19.0.0",
+        "react-dom": "^18.0.0 || ^19.0.0"
+      },
+      "peerDependenciesMeta": {
+        "@emotion/is-prop-valid": {
+          "optional": true
         },
         "react": {
           "optional": true

         "node": ">=6"
       }
     },
+    "node_modules/parse-entities": {
+      "version": "4.0.2",
+      "resolved": "https://registry.npmjs.org/parse-entities/-/parse-entities-4.0.2.tgz",
+      "integrity": "sha512-GG2AQYWoLgL877gQIKeRPGO1xF9+eG1ujIb5soS5gPvLQ1y2o8FL90w2QWNdf9I361Mpp7726c+lj3U0qK1uGw==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/unist": "^2.0.0",
+        "character-entities-legacy": "^3.0.0",
+        "character-reference-invalid": "^2.0.0",
+        "decode-named-character-reference": "^1.0.0",
+        "is-alphanumerical": "^2.0.0",
+        "is-decimal": "^2.0.0",
+        "is-hexadecimal": "^2.0.0"
+      },
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/wooorm"
+      }
+    },
+    "node_modules/parse-entities/node_modules/@types/unist": {
+      "version": "2.0.11",
+      "resolved": "https://registry.npmjs.org/@types/unist/-/unist-2.0.11.tgz",
+      "integrity": "sha512-CmBKiL6NNo/OqgmMn95Fk9Whlp2mtvIv+KNpQKN2F4SjvrEesubTRWGYSg+BnWZOnlCaSTU1sMpsBOzgbYhnsA==",
+      "license": "MIT"
+    },
     "node_modules/parse-json": {
       "version": "5.2.0",
       "resolved": "https://registry.npmjs.org/parse-json/-/parse-json-5.2.0.tgz",

         "react-is": "^16.13.1"
       }
     },
+    "node_modules/property-information": {
+      "version": "7.1.0",
+      "resolved": "https://registry.npmjs.org/property-information/-/property-information-7.1.0.tgz",
+      "integrity": "sha512-TwEZ+X+yCJmYfL7TPUOcvBZ4QfoT5YenQiJuX//0th53DE6w0xxLEtfK3iyryQFddXuvkIk51EEgrJQ0WJkOmQ==",
+      "license": "MIT",
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/wooorm"
+      }
+    },
     "node_modules/punycode": {
       "version": "2.3.1",
       "resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz",

       "integrity": "sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ==",
       "license": "MIT"
     },
+    "node_modules/react-markdown": {
+      "version": "10.1.0",
+      "resolved": "https://registry.npmjs.org/react-markdown/-/react-markdown-10.1.0.tgz",
+      "integrity": "sha512-qKxVopLT/TyA6BX3Ue5NwabOsAzm0Q7kAPwq6L+wWDwisYs7R8vZ0nRXqq6rkueboxpkjvLGU9fWifiX/ZZFxQ==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/hast": "^3.0.0",
+        "@types/mdast": "^4.0.0",
+        "devlop": "^1.0.0",
+        "hast-util-to-jsx-runtime": "^2.0.0",
+        "html-url-attributes": "^3.0.0",
+        "mdast-util-to-hast": "^13.0.0",
+        "remark-parse": "^11.0.0",
+        "remark-rehype": "^11.0.0",
+        "unified": "^11.0.0",
+        "unist-util-visit": "^5.0.0",
+        "vfile": "^6.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      },
+      "peerDependencies": {
+        "@types/react": ">=18",
+        "react": ">=18"
+      }
+    },
     "node_modules/react-remove-scroll": {
       "version": "2.7.1",
       "resolved": "https://registry.npmjs.org/react-remove-scroll/-/react-remove-scroll-2.7.1.tgz",

         "regjsparser": "bin/parser"
       }
     },
+    "node_modules/rehype-sanitize": {
+      "version": "6.0.0",
+      "resolved": "https://registry.npmjs.org/rehype-sanitize/-/rehype-sanitize-6.0.0.tgz",
+      "integrity": "sha512-CsnhKNsyI8Tub6L4sm5ZFsme4puGfc6pYylvXo1AeqaGbjOYyzNv3qZPwvs0oMJ39eryyeOdmxwUIo94IpEhqg==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/hast": "^3.0.0",
+        "hast-util-sanitize": "^5.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/remark-gfm": {
+      "version": "4.0.1",
+      "resolved": "https://registry.npmjs.org/remark-gfm/-/remark-gfm-4.0.1.tgz",
+      "integrity": "sha512-1quofZ2RQ9EWdeN34S79+KExV1764+wCUGop5CPL1WGdD0ocPpu91lzPGbwWMECpEpd42kJGQwzRfyov9j4yNg==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/mdast": "^4.0.0",
+        "mdast-util-gfm": "^3.0.0",
+        "micromark-extension-gfm": "^3.0.0",
+        "remark-parse": "^11.0.0",
+        "remark-stringify": "^11.0.0",
+        "unified": "^11.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/remark-parse": {
+      "version": "11.0.0",
+      "resolved": "https://registry.npmjs.org/remark-parse/-/remark-parse-11.0.0.tgz",
+      "integrity": "sha512-FCxlKLNGknS5ba/1lmpYijMUzX2esxW5xQqjWxw2eHFfS2MSdaHVINFmhjo+qN1WhZhNimq0dZATN9pH0IDrpA==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/mdast": "^4.0.0",
+        "mdast-util-from-markdown": "^2.0.0",
+        "micromark-util-types": "^2.0.0",
+        "unified": "^11.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/remark-rehype": {
+      "version": "11.1.2",
+      "resolved": "https://registry.npmjs.org/remark-rehype/-/remark-rehype-11.1.2.tgz",
+      "integrity": "sha512-Dh7l57ianaEoIpzbp0PC9UKAdCSVklD8E5Rpw7ETfbTl3FqcOOgq5q2LVDhgGCkaBv7p24JXikPdvhhmHvKMsw==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/hast": "^3.0.0",
+        "@types/mdast": "^4.0.0",
+        "mdast-util-to-hast": "^13.0.0",
+        "unified": "^11.0.0",
+        "vfile": "^6.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/remark-stringify": {
+      "version": "11.0.0",
+      "resolved": "https://registry.npmjs.org/remark-stringify/-/remark-stringify-11.0.0.tgz",
+      "integrity": "sha512-1OSmLd3awB/t8qdoEOMazZkNsfVTeY4fTsgzcQFdXNq8ToTN4ZGwrMnlda4K6smTFKD+GRV6O48i6Z4iKgPPpw==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/mdast": "^4.0.0",
+        "mdast-util-to-markdown": "^2.0.0",
+        "unified": "^11.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
     "node_modules/resolve": {
       "version": "1.22.11",
       "resolved": "https://registry.npmjs.org/resolve/-/resolve-1.22.11.tgz",

         "node": ">=0.10.0"
       }
     },
+    "node_modules/space-separated-tokens": {
+      "version": "2.0.2",
+      "resolved": "https://registry.npmjs.org/space-separated-tokens/-/space-separated-tokens-2.0.2.tgz",
+      "integrity": "sha512-PEGlAwrG8yXGXRjW32fGbg66JAlOAwbObuqVoJpv/mRgoWDQfgH1wDPvtzWyUSNAXBGSk8h755YDbbcEy3SH2Q==",
+      "license": "MIT",
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/wooorm"
+      }
+    },
     "node_modules/stable-hash": {
       "version": "0.0.5",
       "resolved": "https://registry.npmjs.org/stable-hash/-/stable-hash-0.0.5.tgz",

         "url": "https://github.com/sponsors/ljharb"
       }
     },
+    "node_modules/stringify-entities": {
+      "version": "4.0.4",
+      "resolved": "https://registry.npmjs.org/stringify-entities/-/stringify-entities-4.0.4.tgz",
+      "integrity": "sha512-IwfBptatlO+QCJUo19AqvrPNqlVMpW9YEL2LIVY+Rpv2qsjCGxaDLNRgeGsQWJhfItebuJhsGSLjaBbNSQ+ieg==",
+      "license": "MIT",
+      "dependencies": {
+        "character-entities-html4": "^2.0.0",
+        "character-entities-legacy": "^3.0.0"
+      },
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/wooorm"
+      }
+    },
     "node_modules/strip-bom": {
       "version": "3.0.0",
       "resolved": "https://registry.npmjs.org/strip-bom/-/strip-bom-3.0.0.tgz",

         "url": "https://github.com/sponsors/sindresorhus"
       }
     },
+    "node_modules/style-to-js": {
+      "version": "1.1.21",
+      "resolved": "https://registry.npmjs.org/style-to-js/-/style-to-js-1.1.21.tgz",
+      "integrity": "sha512-RjQetxJrrUJLQPHbLku6U/ocGtzyjbJMP9lCNK7Ag0CNh690nSH8woqWH9u16nMjYBAok+i7JO1NP2pOy8IsPQ==",
+      "license": "MIT",
+      "dependencies": {
+        "style-to-object": "1.0.14"
+      }
+    },
+    "node_modules/style-to-object": {
+      "version": "1.0.14",
+      "resolved": "https://registry.npmjs.org/style-to-object/-/style-to-object-1.0.14.tgz",
+      "integrity": "sha512-LIN7rULI0jBscWQYaSswptyderlarFkjQ+t79nzty8tcIAceVomEVlLzH5VP4Cmsv6MtKhs7qaAiwlcp+Mgaxw==",
+      "license": "MIT",
+      "dependencies": {
+        "inline-style-parser": "0.2.7"
+      }
+    },
     "node_modules/styled-jsx": {
       "version": "5.1.6",
       "resolved": "https://registry.npmjs.org/styled-jsx/-/styled-jsx-5.1.6.tgz",

         "node": ">=8.0"
       }
     },
+    "node_modules/trim-lines": {
+      "version": "3.0.1",
+      "resolved": "https://registry.npmjs.org/trim-lines/-/trim-lines-3.0.1.tgz",
+      "integrity": "sha512-kRj8B+YHZCc9kQYdWfJB2/oUl9rA99qbowYYBtr4ui4mZyAQ2JpvVBd/6U2YloATfqBhBTSMhTpgBHtU0Mf3Rg==",
+      "license": "MIT",
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/wooorm"
+      }
+    },
+    "node_modules/trough": {
+      "version": "2.2.0",
+      "resolved": "https://registry.npmjs.org/trough/-/trough-2.2.0.tgz",
+      "integrity": "sha512-tmMpK00BjZiUyVyvrBK7knerNgmgvcV/KLVyuma/SC+TQN167GrMRciANTz09+k3zW8L8t60jWO1GpfkZdjTaw==",
+      "license": "MIT",
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/wooorm"
+      }
+    },
     "node_modules/ts-api-utils": {
       "version": "2.1.0",
       "resolved": "https://registry.npmjs.org/ts-api-utils/-/ts-api-utils-2.1.0.tgz",

         "node": ">=4"
       }
     },
+    "node_modules/unified": {
+      "version": "11.0.5",
+      "resolved": "https://registry.npmjs.org/unified/-/unified-11.0.5.tgz",
+      "integrity": "sha512-xKvGhPWw3k84Qjh8bI3ZeJjqnyadK+GEFtazSfZv/rKeTkTjOJho6mFqh2SM96iIcZokxiOpg78GazTSg8+KHA==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/unist": "^3.0.0",
+        "bail": "^2.0.0",
+        "devlop": "^1.0.0",
+        "extend": "^3.0.0",
+        "is-plain-obj": "^4.0.0",
+        "trough": "^2.0.0",
+        "vfile": "^6.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/unist-util-is": {
+      "version": "6.0.1",
+      "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-6.0.1.tgz",
+      "integrity": "sha512-LsiILbtBETkDz8I9p1dQ0uyRUWuaQzd/cuEeS1hoRSyW5E5XGmTzlwY1OrNzzakGowI9Dr/I8HVaw4hTtnxy8g==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/unist": "^3.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/unist-util-position": {
+      "version": "5.0.0",
+      "resolved": "https://registry.npmjs.org/unist-util-position/-/unist-util-position-5.0.0.tgz",
+      "integrity": "sha512-fucsC7HjXvkB5R3kTCO7kUjRdrS0BJt3M/FPxmHMBOm8JQi2BsHAHFsy27E0EolP8rp0NzXsJ+jNPyDWvOJZPA==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/unist": "^3.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/unist-util-stringify-position": {
+      "version": "4.0.0",
+      "resolved": "https://registry.npmjs.org/unist-util-stringify-position/-/unist-util-stringify-position-4.0.0.tgz",
+      "integrity": "sha512-0ASV06AAoKCDkS2+xw5RXJywruurpbC4JZSm7nr7MOt1ojAzvyyaO+UxZf18j8FCF6kmzCZKcAgN/yu2gm2XgQ==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/unist": "^3.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/unist-util-visit": {
+      "version": "5.1.0",
+      "resolved": "https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-5.1.0.tgz",
+      "integrity": "sha512-m+vIdyeCOpdr/QeQCu2EzxX/ohgS8KbnPDgFni4dQsfSCtpz8UqDyY5GjRru8PDKuYn7Fq19j1CQ+nJSsGKOzg==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/unist": "^3.0.0",
+        "unist-util-is": "^6.0.0",
+        "unist-util-visit-parents": "^6.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/unist-util-visit-parents": {
+      "version": "6.0.2",
+      "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-6.0.2.tgz",
+      "integrity": "sha512-goh1s1TBrqSqukSc8wrjwWhL0hiJxgA8m4kFxGlQ+8FYQ3C/m11FcTs4YYem7V664AhHVvgoQLk890Ssdsr2IQ==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/unist": "^3.0.0",
+        "unist-util-is": "^6.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
     "node_modules/unrs-resolver": {
       "version": "1.11.1",
       "resolved": "https://registry.npmjs.org/unrs-resolver/-/unrs-resolver-1.11.1.tgz",

         }
       }
     },
+    "node_modules/vfile": {
+      "version": "6.0.3",
+      "resolved": "https://registry.npmjs.org/vfile/-/vfile-6.0.3.tgz",
+      "integrity": "sha512-KzIbH/9tXat2u30jf+smMwFCsno4wHVdNmzFyL+T/L3UGqqk6JKfVqOFOZEpZSHADH1k40ab6NUIXZq422ov3Q==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/unist": "^3.0.0",
+        "vfile-message": "^4.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
+    "node_modules/vfile-message": {
+      "version": "4.0.3",
+      "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-4.0.3.tgz",
+      "integrity": "sha512-QTHzsGd1EhbZs4AsQ20JX1rC3cOlt/IWJruk893DfLRr57lcnOeMaWG4K0JrRta4mIJZKth2Au3mM3u03/JWKw==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/unist": "^3.0.0",
+        "unist-util-stringify-position": "^4.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/unified"
+      }
+    },
     "node_modules/which": {
       "version": "2.0.2",
       "resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",

       "peerDependencies": {
         "zod": "^3.25.0 || ^4.0.0"
       }
+    },
+    "node_modules/zwitch": {
+      "version": "2.0.4",
+      "resolved": "https://registry.npmjs.org/zwitch/-/zwitch-2.0.4.tgz",
+      "integrity": "sha512-bXE4cR/kVZhKZX/RjPEflHaKVhUVl85noU3v6b8apfQEc1x4A+zBxjZ4lN8LqGd6WZ3dl98pY4o717VFmoPp+A==",
+      "license": "MIT",
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/wooorm"
+      }
     }
   }
 }
frontend/package.json CHANGED
@@ -22,7 +22,10 @@
     "react-dom": "^19.2.1",
     "react-dropzone": "^14.3.8",
     "react-icons": "^5.5.0",
+    "react-markdown": "^10.1.0",
     "react-use-measure": "^2.1.7",
+    "rehype-sanitize": "^6.0.0",
+    "remark-gfm": "^4.0.1",
     "tailwind-merge": "^3.3.1"
   },
   "devDependencies": {
ingest.py CHANGED
@@ -100,7 +100,11 @@ def ingest_single_technique(
 
 
 def ingest_data():
-    """Load CBT book, chunk it 6 ways, and upload ALL to a SINGLE Pinecone index."""
+    """Load CBT book, chunk it 6 ways, and upload ALL to a SINGLE Pinecone index.
+
+    Returns:
+        Tuple of (all_chunks, configured_technique_chunks, processor, index) for reuse in the retrieval pipeline.
+    """
     load_dotenv()
 
     pinecone_key = os.getenv("PINECONE_API_KEY")
@@ -137,6 +141,7 @@ def ingest_data():
     print(f"{'='*80}")
 
     all_chunks = []
+    configured_technique_chunks = []
     results = {}
 
     for i, technique in enumerate(CHUNKING_TECHNIQUES, 1):
@@ -149,6 +154,11 @@ def ingest_data():
             total_techniques=len(CHUNKING_TECHNIQUES),
         )
         all_chunks.extend(chunks)
+
+        # Save chunks for the configured technique (for retrieval pipeline)
+        if technique["name"] == cfg.processing['technique']:
+            configured_technique_chunks = chunks
+
         results[technique["name"]] = {
             "status": "success",
             "chunks": len(chunks),
@@ -215,6 +225,9 @@ def ingest_data():
 
     print("\nYou can now start the API server with:")
    print("  python -m uvicorn api:app --host 0.0.0.0 --port 8000")
+
+    # Return chunks, processor, and index for reuse in the retrieval pipeline
+    return all_chunks, configured_technique_chunks, proc, index
 
 
 if __name__ == "__main__":
main.py CHANGED
@@ -1,4 +1,7 @@
 import os
 from dotenv import load_dotenv
 from config_loader import cfg
 
@@ -8,6 +11,7 @@ from retriever.generator import RAGGenerator
 from retriever.processor import ChunkProcessor
 from retriever.evaluator import RAGEvaluator
 from data_loader import load_cbt_book, get_book_stats
 
 # Import model fleet
 from models.llama_3_8b import Llama3_8B
@@ -27,172 +31,440 @@ MODEL_MAP = {
 load_dotenv()
 
 
 def main():
-    """Main function to run the RAG tournament on CBT book."""
     hf_token = os.getenv("HF_TOKEN")
     pinecone_key = os.getenv("PINECONE_API_KEY")
-    groq_key = os.getenv("GROQ_API_KEY")
 
     # Verify environment variables
     if not hf_token:
         raise RuntimeError("HF_TOKEN not found in environment variables")
     if not pinecone_key:
         raise RuntimeError("PINECONE_API_KEY not found in environment variables")
-    if not groq_key:
-        raise RuntimeError("GROQ_API_KEY not found in environment variables")
 
-    # Example query for testing
     query = "What is cognitive behavior therapy and how does it work?"
 
     print("=" * 80)
-    print("CBT RAG SYSTEM - LOADING DATA")
-    print("=" * 80)
-
-    # 1. Data Ingestion - Load CBT Book
-    raw_data = load_cbt_book("EntireBookCleaned.txt")
-    stats = get_book_stats(raw_data)
-    print(f"Book Statistics: {stats}")
-
-    print("\n" + "=" * 80)
-    print("CHUNKING AND EMBEDDING")
-    print("=" * 80)
-
-    # 2. Chunking & Embedding
-    proc = ChunkProcessor(model_name=cfg.processing['embedding_model'])
-    final_chunks = proc.process(
-        raw_data,
-        technique=cfg.processing['technique'],
-        chunk_size=cfg.processing['chunk_size'],
-        chunk_overlap=cfg.processing['chunk_overlap'],
-        max_docs=cfg.project.get('doc_limit'),  # None means load all
-        verbose=True
-    )
-
-    print(f"\nTotal chunks created: {len(final_chunks)}")
-
-    print("\n" + "=" * 80)
-    print("VECTOR DATABASE SETUP")
-    print("=" * 80)
-
-    # 3. Vector DB - Create/Update Pinecone Index
-    index = get_pinecone_index(
-        pinecone_key,
-        cfg.db['base_index_name'],
-        technique=cfg.processing['technique'],
-        dimension=cfg.db['dimension']
-    )
-    refresh_pinecone_index(index, final_chunks, batch_size=cfg.db['batch_size'])
-
-    print("\n" + "=" * 80)
-    print("RETRIEVAL SETUP")
     print("=" * 80)
 
-    # 4. Retrieval Setup
-    retriever = HybridRetriever(final_chunks, proc.encoder)
-
     print("\n" + "=" * 80)
-    print(f"TESTING QUERY: {query}")
     print("=" * 80)
 
-    # Test retrieval
-    context_chunks = retriever.search(
-        query, index,
-        mode=cfg.retrieval['mode'],
-        rerank_strategy=cfg.retrieval['rerank_strategy'],
-        use_mmr=cfg.retrieval['use_mmr'],
-        top_k=cfg.retrieval['top_k'],
-        final_k=cfg.retrieval['final_k']
-    )
-
-    print(f"\nRetrieved {len(context_chunks)} context chunks")
-
     print("\n" + "=" * 80)
-    print("MODEL TOURNAMENT")
     print("=" * 80)
 
-    # 5. Initialize Models
     rag_engine = RAGGenerator()
     models = {name: MODEL_MAP[name](token=hf_token) for name in cfg.model_list}
 
-    # 6. Setup Evaluator with Judge
     evaluator = RAGEvaluator(
         judge_model=cfg.gen['judge_model'],
         embedding_model=proc.encoder,
-        api_key=groq_key
     )
 
-    tournament_results = {}
 
-    # 7. Tournament Loop
-    for name, model_inst in models.items():
-        print(f"\n{'='*60}")
-        print(f"Processing {name}")
-        print('='*60)
         try:
-            # Generation
-            answer = rag_engine.get_answer(
-                model_inst, query, context_chunks,
-                temperature=cfg.gen['temperature']
            )
 
-            print(f"\nAnswer from {name}:")
-            print(answer[:500] + "..." if len(answer) > 500 else answer)
-
-            # Faithfulness Evaluation
-            faith = evaluator.evaluate_faithfulness(answer, context_chunks)
-            # Relevancy Evaluation
-            rel = evaluator.evaluate_relevancy(query, answer)
-
-            tournament_results[name] = {
-                "answer": answer,
-                "Faithfulness": faith['score'],
-                "Relevancy": rel['score'],
-                "Claims": faith['details']
-            }
-
-            print(f"\n{name} Results:")
-            print(f"  Faithfulness: {faith['score']:.1f}%")
-            print(f"  Relevancy: {rel['score']:.3f}")
 
-        except Exception as e:
-            print(f"Error evaluating {name}: {e}")
-            tournament_results[name] = {
-                "answer": "",
-                "Faithfulness": 0,
-                "Relevancy": 0,
-                "Claims": [],
-                "error": str(e)
-            }
 
-    # 8. Final Results Summary
     print("\n" + "=" * 80)
-    print("TOURNAMENT RESULTS SUMMARY")
     print("=" * 80)
 
     print(f"\nQuery: {query}")
-    print(f"\nRetrieved Context Chunks: {len(context_chunks)}")
     print("\n" + "-" * 60)
-    print(f"{'Model':<20} {'Faithfulness':>15} {'Relevancy':>15}")
     print("-" * 60)
 
-    for name, results in tournament_results.items():
-        faith = results.get('Faithfulness', 0)
-        rel = results.get('Relevancy', 0)
-        print(f"{name:<20} {faith:>14.1f}% {rel:>15.3f}")
 
     print("-" * 60)
 
-    # Find best model
-    if tournament_results:
-        best_model = max(
-            tournament_results.items(),
-            key=lambda x: x[1].get('Faithfulness', 0) + x[1].get('Relevancy', 0)
-        )
-        print(f"\nBest Overall Model: {best_model[0]}")
-        print(f"  Faithfulness: {best_model[1]['Faithfulness']:.1f}%")
-        print(f"  Relevancy: {best_model[1]['Relevancy']:.3f}")
 
-    return tournament_results
 
 
 if __name__ == "__main__":

 import os
+import json
+import time
+from datetime import datetime
 from dotenv import load_dotenv
 from config_loader import cfg
 
 from retriever.processor import ChunkProcessor
 from retriever.evaluator import RAGEvaluator
 from data_loader import load_cbt_book, get_book_stats
+from ingest import ingest_data, CHUNKING_TECHNIQUES
 
 # Import model fleet
 from models.llama_3_8b import Llama3_8B
 
 load_dotenv()
 
 
+def run_rag_for_technique(technique_name, query, index, encoder, models, evaluator, rag_engine):
+    """Run RAG pipeline for a specific chunking technique."""
+
+    print(f"\n{'='*80}")
+    print(f"TECHNIQUE: {technique_name.upper()}")
+    print(f"{'='*80}")
+
+    # Filter chunks by technique metadata
+    query_vector = encoder.encode(query).tolist()
+
+    # Query with metadata filter for this technique - get more candidates for reranking
+    res = index.query(
+        vector=query_vector,
+        top_k=100,  # Get 100 candidates for reranking
+        include_metadata=True,
+        filter={"technique": {"$eq": technique_name}}
+    )
+
+    # Extract context chunks
+    all_candidates = [match['metadata']['text'] for match in res['matches']]
+
+    print(f"\nRetrieved {len(all_candidates)} candidate chunks for technique '{technique_name}'")
+
+    if not all_candidates:
+        print(f"WARNING: No chunks found for technique '{technique_name}'")
+        return {}
+
+    # Apply cross-encoder reranking to get top 5
+    # Don't create a temporary retriever with empty list (causes division by zero in BM25)
+    from sentence_transformers import CrossEncoder
+    rerank_model = CrossEncoder('cross-encoder/ms-marco-MiniLM-L-6-v2')
+    pairs = [[query, chunk] for chunk in all_candidates]
+    scores = rerank_model.predict(pairs)
+    ranked = sorted(zip(all_candidates, scores), key=lambda x: x[1], reverse=True)
+    context_chunks = [chunk for chunk, _ in ranked[:5]]
+
+    print(f"After reranking: {len(context_chunks)} chunks (top 5)")
+
+    # Print the final RAG context being passed to models
+    print(f"\n{'='*80}")
+    print(f"📚 FINAL RAG CONTEXT FOR TECHNIQUE '{technique_name.upper()}'")
+    print(f"{'='*80}")
+    for i, chunk in enumerate(context_chunks, 1):
+        print(f"\n[Chunk {i}] ({len(chunk)} chars):")
+        print(f"{'─'*60}")
+        print(chunk)
+        print(f"{'─'*60}")
+    print(f"\n{'='*80}")
+
+    # Run model tournament for this technique
+    tournament_results = {}
+
+    for name, model_inst in models.items():
+        print(f"\n{'-'*60}")
+        print(f"Model: {name}")
+        print(f"{'-'*60}")
+        try:
+            # Generation
+            answer = rag_engine.get_answer(
+                model_inst, query, context_chunks,
+                temperature=cfg.gen['temperature']
+            )
+
+            print(f"\n{'─'*60}")
+            print(f"📝 FULL ANSWER from {name}:")
+            print(f"{'─'*60}")
+            print(answer)
+            print(f"{'─'*60}")
+
+            # Faithfulness Evaluation (strict=False reduces API calls from ~22 to ~3 per eval)
+            faith = evaluator.evaluate_faithfulness(answer, context_chunks, strict=False)
+            # Relevancy Evaluation
+            rel = evaluator.evaluate_relevancy(query, answer)
+
+            tournament_results[name] = {
+                "answer": answer,
+                "Faithfulness": faith['score'],
+                "Relevancy": rel['score'],
+                "Claims": faith['details'],
+                "context_chunks": context_chunks
+            }
+
+            print(f"\n📊 EVALUATION SCORES:")
+            print(f"  Faithfulness: {faith['score']:.1f}%")
+            print(f"  Relevancy: {rel['score']:.3f}")
+            print(f"  Combined: {faith['score'] + rel['score']:.3f}")
+
+        except Exception as e:
+            print(f"  Error evaluating {name}: {e}")
+            tournament_results[name] = {
+                "answer": "",
+                "Faithfulness": 0,
+                "Relevancy": 0,
+                "Claims": [],
+                "error": str(e),
+                "context_chunks": context_chunks
+            }
+
+    return tournament_results
+
+
+def generate_findings_document(all_results, query, output_file="rag_ablation_findings.md"):
+    """Generate detailed markdown document with findings from all techniques."""
+
+    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
+
+    content = f"""# RAG Ablation Study Findings
+
+**Generated:** {timestamp}
+
+## Overview
+
+This document presents findings from a comparative analysis of 6 different chunking techniques
+applied to a Cognitive Behavioral Therapy (CBT) book. Each technique was evaluated using
+multiple LLM models with RAG (Retrieval-Augmented Generation) pipeline.
+
+**Query:** {query}
+
+## Chunking Techniques Evaluated
+
+1. **Fixed** - Fixed-size chunking (1000 chars, 100 overlap)
+2. **Sentence** - Sentence-level chunking (NLTK)
+3. **Paragraph** - Paragraph-level chunking (\\n\\n boundaries)
+4. **Semantic** - Semantic chunking (embedding similarity)
+5. **Recursive** - Recursive chunking (hierarchical separators)
+6. **Page** - Page-level chunking (--- Page markers)
+
+## Results by Technique
+
+"""
+
+    # Add results for each technique
+    for technique_name, model_results in all_results.items():
+        content += f"### {technique_name.upper()} Chunking\n\n"
+
+        if not model_results:
+            content += "*No results available for this technique.*\n\n"
+            continue
+
+        # Create results table
+        content += "| Model | Faithfulness | Relevancy | Combined Score |\n"
+        content += "|-------|--------------|-----------|----------------|\n"
+
+        for model_name, results in model_results.items():
+            faith = results.get('Faithfulness', 0)
+            rel = results.get('Relevancy', 0)
+            combined = faith + rel
+            content += f"| {model_name} | {faith:.1f}% | {rel:.3f} | {combined:.3f} |\n"
+
+        # Find best model for this technique
+        if model_results:
+            best_model = max(
+                model_results.items(),
+                key=lambda x: x[1].get('Faithfulness', 0) + x[1].get('Relevancy', 0)
+            )
+            best_name = best_model[0]
+            best_faith = best_model[1].get('Faithfulness', 0)
+            best_rel = best_model[1].get('Relevancy', 0)
+
+            content += f"\n**Best Model:** {best_name} (Faithfulness: {best_faith:.1f}%, Relevancy: {best_rel:.3f})\n\n"
+
+        # Add detailed RAG results for each model
+        content += "#### Detailed RAG Results\n\n"
+
+        for model_name, results in model_results.items():
+            answer = results.get('answer', '')
+            context_chunks = results.get('context_chunks', [])
+            faith = results.get('Faithfulness', 0)
+            rel = results.get('Relevancy', 0)
+
+            content += f"**{model_name}** (Faithfulness: {faith:.1f}%, Relevancy: {rel:.3f})\n\n"
+
+            # Add answer
+            content += "📝 **Full Answer:**\n\n"
+            content += f"```\n{answer}\n```\n\n"
+
+            # Add context chunks used
+            content += "📚 **Context Chunks Used:**\n\n"
+            for i, chunk in enumerate(context_chunks, 1):
+                content += f"**Chunk {i}:**\n"
+                content += f"```\n{chunk}\n```\n\n"
+
+        content += "---\n\n"
+
+    # Add comparative analysis
+    content += """## Comparative Analysis
+
+### Overall Performance Ranking
+
+| Rank | Technique | Avg Faithfulness | Avg Relevancy | Avg Combined |
+|------|-----------|------------------|---------------|--------------|
+"""
+
+    # Calculate averages for each technique
+    technique_averages = {}
+    for technique_name, model_results in all_results.items():
+        if model_results:
+            avg_faith = sum(r.get('Faithfulness', 0) for r in model_results.values()) / len(model_results)
+            avg_rel = sum(r.get('Relevancy', 0) for r in model_results.values()) / len(model_results)
+            avg_combined = avg_faith + avg_rel
+            technique_averages[technique_name] = {
+                'faith': avg_faith,
+                'rel': avg_rel,
+                'combined': avg_combined
+            }
+
+    # Sort by combined score
+    sorted_techniques = sorted(
+        technique_averages.items(),
+        key=lambda x: x[1]['combined'],
+        reverse=True
+    )
+
+    for rank, (technique_name, averages) in enumerate(sorted_techniques, 1):
+        content += f"| {rank} | {technique_name} | {averages['faith']:.1f}% | {averages['rel']:.3f} | {averages['combined']:.3f} |\n"
+
+    content += """
+### Key Findings
+
+"""
+
+    if sorted_techniques:
+        best_technique = sorted_techniques[0][0]
+        worst_technique = sorted_techniques[-1][0]
+
+        content += f"""
+1. **Best Performing Technique:** {best_technique}
+   - Achieved highest combined score across all models
+   - Recommended for production RAG applications
+
+2. **Worst Performing Technique:** {worst_technique}
+   - Lower combined scores across models
+   - May need optimization or different configuration
+
+3. **Model Consistency:**
+   - Analyzed which models perform consistently across techniques
+   - Identified technique-specific model preferences
+
+"""
+
+    content += """## Recommendations
+
+Based on the ablation study results:
+
+1. **Primary Recommendation:** Use the best-performing chunking technique for your specific use case
+2. **Hybrid Approach:** Consider combining techniques for different types of queries
+3. **Model Selection:** Choose models that perform well across multiple techniques
+4. **Parameter Tuning:** Optimize chunk sizes and overlaps based on document characteristics
+
+## Technical Details
+
+- **Embedding Model:** Jina embeddings (512 dimensions)
+- **Vector Database:** Pinecone (serverless, AWS us-east-1)
+- **Judge Model:** Openrouter Free models
+- **Retrieval:** Top 5 chunks per technique
+- **Evaluation Metrics:** Faithfulness (context grounding), Relevancy (query addressing)
+
+---
+
+*This report was automatically generated by the RAG Ablation Study Pipeline.*
+"""
+
+    # Write to file
+    with open(output_file, 'w', encoding='utf-8') as f:
+        f.write(content)
+
+    print(f"\nFindings document saved to: {output_file}")
+    return output_file
+
+
 def main():
+    """Main function to run RAG ablation study across all 6 chunking techniques."""
     hf_token = os.getenv("HF_TOKEN")
     pinecone_key = os.getenv("PINECONE_API_KEY")
+    openrouter_key = os.getenv("OPENROUTER_API_KEY")
 
     # Verify environment variables
     if not hf_token:
         raise RuntimeError("HF_TOKEN not found in environment variables")
     if not pinecone_key:
         raise RuntimeError("PINECONE_API_KEY not found in environment variables")
+    if not openrouter_key:
+        raise RuntimeError("OPENROUTER_API_KEY not found in environment variables")
 
+    # Test query
     query = "What is cognitive behavior therapy and how does it work?"
 
     print("=" * 80)
+    print("RAG ABLATION STUDY - 6 CHUNKING TECHNIQUES")
     print("=" * 80)
 
+    # Step 1: Check if data already exists, skip ingestion if so
     print("\n" + "=" * 80)
+    print("STEP 1: CHECKING/INGESTING DATA WITH ALL 6 TECHNIQUES")
328
  print("=" * 80)
329
 
330
+ # Check if index already has data
331
+ pinecone_key = os.getenv("PINECONE_API_KEY")
332
+ from vector_db import get_index_by_name
333
+ index_name = f"{cfg.db['base_index_name']}-{cfg.processing['technique']}"
334
+
335
+ print(f"\nChecking for existing index: {index_name}")
336
+
337
+ try:
338
+ # Try to connect to existing index
339
+ print("Connecting to Pinecone...")
340
+ existing_index = get_index_by_name(pinecone_key, index_name)
341
+ print("Getting index stats...")
342
+ stats = existing_index.describe_index_stats()
343
+ existing_count = stats.get('total_vector_count', 0)
344
+
345
+ if existing_count > 0:
346
+ print(f"\n✓ Found existing index with {existing_count} vectors")
347
+ print("Skipping ingestion - using existing data")
348
+
349
+ # Initialize processor (this loads the embedding model)
350
+ print("Loading embedding model for retrieval...")
351
+ from retriever.processor import ChunkProcessor
352
+ proc = ChunkProcessor(model_name=cfg.processing['embedding_model'], verbose=False)
353
+ index = existing_index
354
+ all_chunks = [] # Empty since we're using existing data
355
+ final_chunks = []
356
+ print("✓ Processor initialized")
357
+ else:
358
+ print("\nIndex exists but is empty. Running full ingestion...")
359
+ all_chunks, final_chunks, proc, index = ingest_data()
360
+ except Exception as e:
361
+ print(f"\nIndex check failed: {e}")
362
+ print("Running full ingestion...")
363
+ all_chunks, final_chunks, proc, index = ingest_data()
364
+
365
+ print(f"\nTechniques to evaluate: {[tech['name'] for tech in CHUNKING_TECHNIQUES]}")
366
+
367
+ # Step 2: Initialize components
368
  print("\n" + "=" * 80)
369
+ print("STEP 2: INITIALIZING COMPONENTS")
370
  print("=" * 80)
371
 
372
+ # Initialize models
373
+ print("\nInitializing models...")
374
  rag_engine = RAGGenerator()
375
  models = {name: MODEL_MAP[name](token=hf_token) for name in cfg.model_list}
376
 
377
+ # Initialize evaluator
378
+ print("Initializing evaluator...")
379
+ openrouter_key = os.getenv("OPENROUTER_API_KEY")
380
+ if not openrouter_key:
381
+ raise RuntimeError("OPENROUTER_API_KEY not found in environment variables")
382
+
383
  evaluator = RAGEvaluator(
384
  judge_model=cfg.gen['judge_model'],
385
  embedding_model=proc.encoder,
386
+ api_key=openrouter_key
387
  )
388
 
389
+ # Step 3: Run RAG for each technique
390
+ print("\n" + "=" * 80)
391
+ print("STEP 3: RUNNING RAG FOR ALL 6 TECHNIQUES")
392
+ print("=" * 80)
393
 
394
+ all_results = {}
395
+
396
+ for i, technique in enumerate(CHUNKING_TECHNIQUES, 1):
397
+ technique_name = technique['name']
398
+
399
+ print(f"\n[{i}/{len(CHUNKING_TECHNIQUES)}] Processing technique: {technique_name}")
400
+
401
  try:
402
+ results = run_rag_for_technique(
403
+ technique_name=technique_name,
404
+ query=query,
405
+ index=index,
406
+ encoder=proc.encoder,
407
+ models=models,
408
+ evaluator=evaluator,
409
+ rag_engine=rag_engine
410
  )
411
+
412
+ all_results[technique_name] = results
413
+
414
+ print(f"\n✓ Completed technique: {technique_name}")
415
+
416
+ except Exception as e:
417
+ import traceback
418
+ print(f"\n✗ Error processing technique {technique_name}: {e}")
419
+ print(f"Full traceback:")
420
+ traceback.print_exc()
421
+ all_results[technique_name] = {}
422
 
423
+ # Step 4: Generate findings document
424
+ print("\n" + "=" * 80)
425
+ print("STEP 4: GENERATING FINDINGS DOCUMENT")
426
+ print("=" * 80)
 
 
 
 
 
 
 
 
 
 
 
 
 
 
427
 
428
+ findings_file = generate_findings_document(all_results, query)
 
 
 
 
 
 
 
 
429
 
430
+ # Step 5: Final summary
431
  print("\n" + "=" * 80)
432
+ print("ABLATION STUDY COMPLETE - SUMMARY")
433
  print("=" * 80)
434
 
435
  print(f"\nQuery: {query}")
436
+ print(f"Techniques evaluated: {len(CHUNKING_TECHNIQUES)}")
437
+ print(f"Models tested: {len(cfg.model_list)}")
438
+ print(f"\nFindings document: {findings_file}")
439
+
440
+ # Print quick summary
441
  print("\n" + "-" * 60)
442
+ print(f"{'Technique':<15} {'Avg Faith':>12} {'Avg Rel':>12} {'Best Model':<20}")
443
  print("-" * 60)
444
 
445
+ for technique_name, model_results in all_results.items():
446
+ if model_results:
447
+ avg_faith = sum(r.get('Faithfulness', 0) for r in model_results.values()) / len(model_results)
448
+ avg_rel = sum(r.get('Relevancy', 0) for r in model_results.values()) / len(model_results)
449
+
450
+ # Find best model
451
+ best_model = max(
452
+ model_results.items(),
453
+ key=lambda x: x[1].get('Faithfulness', 0) + x[1].get('Relevancy', 0)
454
+ )
455
+ best_name = best_model[0]
456
+
457
+ print(f"{technique_name:<15} {avg_faith:>11.1f}% {avg_rel:>12.3f} {best_name:<20}")
458
+ else:
459
+ print(f"{technique_name:<15} {'N/A':>12} {'N/A':>12} {'N/A':<20}")
460
 
461
  print("-" * 60)
462
 
463
+ print("\n✓ Ablation study complete!")
464
+ print(f"✓ Results saved to: {findings_file}")
465
+ print("\nYou can now analyze the findings document to compare chunking techniques.")
 
 
 
 
 
 
466
 
467
+ return all_results
468
 
469
 
470
  if __name__ == "__main__":
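The averaging-and-ranking logic added to `generate_findings_document` is easy to sanity-check in isolation. Below is a minimal sketch under the result shapes visible in the diff; the `rank_techniques` helper and the sample numbers are hypothetical, not part of the commit:

```python
def rank_techniques(all_results):
    """Average Faithfulness/Relevancy per technique and sort by combined score (descending)."""
    technique_averages = {}
    for technique_name, model_results in all_results.items():
        if model_results:
            avg_faith = sum(r.get('Faithfulness', 0) for r in model_results.values()) / len(model_results)
            avg_rel = sum(r.get('Relevancy', 0) for r in model_results.values()) / len(model_results)
            technique_averages[technique_name] = {
                'faith': avg_faith,
                'rel': avg_rel,
                'combined': avg_faith + avg_rel,
            }
    return sorted(technique_averages.items(), key=lambda x: x[1]['combined'], reverse=True)

# Hypothetical results in the shape produced by run_rag_for_technique
results = {
    'fixed':    {'llama': {'Faithfulness': 80.0, 'Relevancy': 0.7}},
    'semantic': {'llama': {'Faithfulness': 90.0, 'Relevancy': 0.8}},
}
ranked = rank_techniques(results)
print([name for name, _ in ranked])  # → ['semantic', 'fixed']
```

Note that summing a percentage (Faithfulness) with a 0–1 similarity (Relevancy) means the combined score is dominated by faithfulness; the findings table reports both columns separately for this reason.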
models/deepseek_v3.py CHANGED
@@ -5,7 +5,7 @@ class DeepSeek_V3:
         self.client = InferenceClient(token=token)
         self.model_id = "deepseek-ai/DeepSeek-V3"
 
-    def generate(self, prompt, max_tokens=500, temperature=0.1):
+    def generate(self, prompt, max_tokens=1500, temperature=0.1):
         response = ""
         try:
             for message in self.client.chat_completion(
models/llama_3_8b.py CHANGED
@@ -5,7 +5,7 @@ class Llama3_8B:
         self.client = InferenceClient(token=token)
         self.model_id = "meta-llama/Meta-Llama-3-8B-Instruct"
 
-    def generate(self, prompt, max_tokens=500, temperature=0.1):
+    def generate(self, prompt, max_tokens=1500, temperature=0.1):
         response = ""
         for message in self.client.chat_completion(
             model=self.model_id,
models/mistral_7b.py CHANGED
@@ -5,7 +5,7 @@ class Mistral_7b:
         self.client = InferenceClient(api_key=token)
         self.model_id = "mistralai/Mistral-7B-Instruct-v0.2:featherless-ai"
 
-    def generate(self, prompt, max_tokens=500, temperature=0.1):
+    def generate(self, prompt, max_tokens=1500, temperature=0.1):
         response = ""
         try:
             stream = self.client.chat.completions.create(
models/qwen_2_5.py CHANGED
@@ -5,7 +5,7 @@ class Qwen2_5:
         self.client = InferenceClient(token=token)
         self.model_id = "Qwen/Qwen2.5-72B-Instruct"
 
-    def generate(self, prompt, max_tokens=500, temperature=0.1):
+    def generate(self, prompt, max_tokens=1500, temperature=0.1):
         response = ""
         for message in self.client.chat_completion(
             model=self.model_id,
models/tiny_aya.py CHANGED
@@ -5,7 +5,7 @@ class TinyAya:
         self.client = InferenceClient(token=token)
         self.model_id = "CohereLabs/tiny-aya-global"
 
-    def generate(self, prompt, max_tokens=500, temperature=0.1):
+    def generate(self, prompt, max_tokens=1500, temperature=0.1):
 
         response = ""
         try:
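All five wrappers raise `max_tokens` from 500 to 1500 and share the same streaming loop: concatenate partial deltas from the chat-completion stream into one string. A stubbed sketch of that accumulation follows; the delta dict shape here is an assumption for illustration, not the exact `InferenceClient` payload:

```python
def accumulate_chat_stream(stream):
    """Shared pattern from the model wrappers: concatenate streamed deltas into one string."""
    response = ""
    for message in stream:
        # Each streamed message carries a partial content delta (empty at end of stream)
        delta = message["choices"][0]["delta"].get("content")
        if delta:
            response += delta
    return response

# Stubbed stream standing in for a chat_completion(stream=True) response
fake_stream = [
    {"choices": [{"delta": {"content": "CBT "}}]},
    {"choices": [{"delta": {"content": "reframes thoughts."}}]},
    {"choices": [{"delta": {}}]},  # terminal chunk with no content
]
print(accumulate_chat_stream(fake_stream))  # → CBT reframes thoughts.
```

The `max_tokens` bump matters because evaluation parses the full answer; at 500 tokens, longer CBT explanations risk being truncated mid-claim.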
rag_ablation_findings.md ADDED
The diff for this file is too large to render. See raw diff
 
requirements.txt CHANGED
@@ -93,3 +93,5 @@ xxhash==3.6.0
 yarl==1.23.0
 zstandard==0.25.0
 groq==0.13.0
+jiter==0.13.0
+openai==2.30.0
retriever/evaluator.py CHANGED
@@ -1,37 +1,68 @@
 import re
 import numpy as np
 from sklearn.metrics.pairwise import cosine_similarity
-from groq import Groq
+from openai import OpenAI
+from concurrent.futures import ThreadPoolExecutor, as_completed
 
 
 # ------------------------------------------------------------------
-# Groq Judge Wrapper
+# OpenRouter Judge Wrapper
 # ------------------------------------------------------------------
 
 class GroqJudge:
-    def __init__(self, api_key: str, model: str = "llama-3.1-8b-instant"):
+    def __init__(self, api_key: str, model: str = "qwen/qwen3.6-plus-preview:free"):
         """
-        Wraps Groq's chat completions to match the .generate(prompt) interface
+        Wraps OpenRouter's chat completions to match the .generate(prompt) interface
         expected by RAGEvaluator.
 
         Args:
-            api_key: Your Groq API key (https://console.groq.com)
-            model: Groq model to use. Free tier options:
-                - "llama-3.1-8b-instant" (fastest)
-                - "llama-3.3-70b-versatile" (more capable, slower)
-                - "gemma2-9b-it"
+            api_key: Your OpenRouter API key (https://openrouter.ai)
+            model: OpenRouter model to use (primary model with fallback support)
         """
-        self.client = Groq(api_key=api_key)
+        self.client = OpenAI(
+            base_url="https://openrouter.ai/api/v1",
+            api_key=api_key,
+        )
         self.model = model
+
+        # Fallback models in order of preference (OpenRouter free models)
+        self.fallback_models = [
+            "qwen/qwen3.6-plus-preview:free",
+            "stepfun/step-3.5-flash:free",
+            "nvidia/nemotron-3-super-120b-a12b:free",
+            "z-ai/glm-4.5-air:free",
+            "nvidia/nemotron-3-nano-30b-a3b:free",
+            "arcee-ai/trinity-mini:free"
+        ]
 
     def generate(self, prompt: str) -> str:
-        response = self.client.chat.completions.create(
-            model=self.model,
-            messages=[{"role": "user", "content": prompt}],
-            temperature=0.0,  # deterministic for evaluation
-            max_tokens=1024,
-        )
-        return response.choices[0].message.content.strip()
+        """Generate a response, falling back through alternate models on failure."""
+        last_error = None
+
+        # Try the primary model first, then the fallbacks
+        models_to_try = [self.model] + [m for m in self.fallback_models if m != self.model]
+
+        for model_name in models_to_try:
+            try:
+                response = self.client.chat.completions.create(
+                    model=model_name,
+                    messages=[{"role": "user", "content": prompt}],
+                )
+                content = response.choices[0].message.content
+                if content is None:
+                    raise ValueError(f"Model {model_name} returned None content")
+                return content.strip()
+            except Exception as e:
+                last_error = e
+                # If rate limited or the model is unavailable, try the next model
+                if "429" in str(e) or "rate_limit" in str(e).lower() or "model" in str(e).lower():
+                    continue
+                # For other errors, raise immediately
+                raise
+
+        # If all models fail, raise the last error
+        raise last_error
 
 
 # ------------------------------------------------------------------
@@ -41,10 +72,10 @@ class GroqJudge:
 class RAGEvaluator:
     def __init__(self, judge_model: str, embedding_model, api_key: str, verbose=True):
         """
-        judge_model: Model name string passed to GroqJudge, must match cfg.gen['judge_model']
-            e.g. "llama-3.1-8b-instant", "llama-3.3-70b-versatile", "gemma2-9b-it"
+        judge_model: Model name string passed to the OpenRouter judge, must match cfg.gen['judge_model']
+            e.g. "stepfun/step-3.5-flash:free", "nvidia/nemotron-3-super-120b-a12b:free"
         embedding_model: The proc.encoder (SentenceTransformer) for similarity checks
-        api_key: Groq API key (https://console.groq.com)
+        api_key: OpenRouter API key (https://openrouter.ai)
         verbose: If True, prints progress via internal helpers
         """
         self.judge = GroqJudge(api_key=api_key, model=judge_model)
@@ -91,8 +122,14 @@ class RAGEvaluator:
         # --- Step B: Verification ---
         if strict:
             # Per-chunk: claim must be explicitly supported by at least one chunk
-            verdicts = {i: self._verify_claim_against_chunks(claim, context_list)
-                        for i, claim in enumerate(claims)}
+            # Parallelize across claims as well
+            def verify_claim_wrapper(args):
+                i, claim = args
+                return i, self._verify_claim_against_chunks(claim, context_list)
+
+            with ThreadPoolExecutor(max_workers=min(len(claims), 5)) as executor:
+                futures = [executor.submit(verify_claim_wrapper, (i, claim)) for i, claim in enumerate(claims)]
+                verdicts = dict(future.result() for future in as_completed(futures))
         else:
             # Batch: all chunks joined, strict burden-of-proof prompt
             combined_context = "\n".join(context_list)
@@ -140,7 +177,7 @@ class RAGEvaluator:
 
     def _verify_claim_against_chunks(self, claim: str, context_list: list[str]) -> bool:
         """Verify a single claim against each chunk individually. Returns True if any chunk supports it."""
-        for chunk in context_list:
+        def verify_single_chunk(chunk):
             prompt = (
                 f"Context:\n{chunk}\n\n"
                 f"Claim: {claim}\n\n"
@@ -148,8 +185,14 @@ class RAGEvaluator:
                 f"Do not infer or assume. Respond with YES or NO only."
             )
             result = self.judge.generate(prompt)
-            if "YES" in result.upper():
-                return True
+            return "YES" in result.upper()
+
+        # Use ThreadPoolExecutor for parallel verification
+        with ThreadPoolExecutor(max_workers=min(len(context_list), 5)) as executor:
+            futures = [executor.submit(verify_single_chunk, chunk) for chunk in context_list]
+            for future in as_completed(futures):
+                if future.result():
+                    return True
         return False
 
     # ------------------------------------------------------------------
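The new judge's fallback loop is the core of this change: try the primary model, move to the next one on rate-limit-style errors, fail fast on anything else. A network-free sketch of the same control flow, with hypothetical helper and model names:

```python
def generate_with_fallback(models_to_try, call):
    """Try each model in order; skip to the next on rate-limit-style errors, raise otherwise."""
    last_error = None
    for model_name in models_to_try:
        try:
            return call(model_name)
        except Exception as e:
            last_error = e
            if "429" in str(e) or "rate_limit" in str(e).lower():
                continue  # transient: try the next model
            raise  # anything else is treated as fatal
    raise last_error  # every model was rate limited

def flaky_call(model_name):
    # Stub standing in for the OpenRouter request: first model is rate limited
    if model_name == "primary:free":
        raise RuntimeError("429 Too Many Requests")
    return f"answer from {model_name}"

print(generate_with_fallback(["primary:free", "backup:free"], flaky_call))
# → answer from backup:free
```

The committed version additionally skips ahead when `"model"` appears in the error text, a looser heuristic that also catches "model not found"-style failures at the cost of occasionally swallowing unrelated errors.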
retriever/retriever.py CHANGED
@@ -1,20 +1,40 @@
 import numpy as np
 import time
+import re
 from rank_bm25 import BM25Okapi
-from sentence_transformers import CrossEncoder
 from sklearn.metrics.pairwise import cosine_similarity
 from typing import Optional, List
 
+# Try to import FlashRank for CPU optimization, fall back to sentence-transformers
+try:
+    from flashrank import Ranker, RerankRequest
+    FLASHRANK_AVAILABLE = True
+except ImportError:
+    from sentence_transformers import CrossEncoder
+    FLASHRANK_AVAILABLE = False
+
 class HybridRetriever:
-    def __init__(self, final_chunks, embed_model, rerank_model_name='cross-encoder/ms-marco-MiniLM-L-6-v2', verbose: bool = True):
+    def __init__(self, final_chunks, embed_model, rerank_model_name='ms-marco-MiniLM-L-6-v2', verbose: bool = True):
         self.final_chunks = final_chunks
         self.embed_model = embed_model
-        self.rerank_model = CrossEncoder(rerank_model_name)
         self.verbose = verbose
 
+        # Use FlashRank if available (faster on CPU), otherwise fall back to sentence-transformers
+        if FLASHRANK_AVAILABLE:
+            self.rerank_model = Ranker(model_name=rerank_model_name)
+            self.use_flashrank = True
+        else:
+            self.rerank_model = CrossEncoder(f'cross-encoder/{rerank_model_name}')
+            self.use_flashrank = False
+
-        self.tokenized_corpus = [chunk['metadata']['text'].lower().split() for chunk in final_chunks]
+        # Better tokenization for BM25 (strips punctuation)
+        self.tokenized_corpus = [self._tokenize(chunk['metadata']['text']) for chunk in final_chunks]
         self.bm25 = BM25Okapi(self.tokenized_corpus)
 
+    def _tokenize(self, text: str) -> List[str]:
+        """Tokenize text with a regex that strips punctuation."""
+        return re.findall(r'\w+', text.lower())
+
     # ------------------------------------------------------------------
     # Retrieval
     # ------------------------------------------------------------------
@@ -26,7 +46,7 @@ class HybridRetriever:
         return query_vector, chunks
 
     def _bm25_search(self, query, top_k) -> List[str]:
-        tokenized_query = query.lower().split()
+        tokenized_query = self._tokenize(query)
         scores = self.bm25.get_scores(tokenized_query)
         top_indices = np.argsort(scores)[::-1][:top_k]
         return [self.final_chunks[i]['metadata']['text'] for i in top_indices]
@@ -48,42 +68,126 @@ class HybridRetriever:
     # ------------------------------------------------------------------
 
     def _cross_encoder_rerank(self, query, chunks, final_k) -> List[str]:
-        pairs = [[query, chunk] for chunk in chunks]
-        scores = self.rerank_model.predict(pairs)
-        ranked = sorted(zip(chunks, scores), key=lambda x: x[1], reverse=True)
-        return [chunk for chunk, _ in ranked[:final_k]]
+        if self.use_flashrank:
+            # Use FlashRank for CPU-optimized reranking
+            passages = [{"id": i, "text": chunk} for i, chunk in enumerate(chunks)]
+            rerank_request = RerankRequest(query=query, passages=passages)
+            results = self.rerank_model.rerank(rerank_request)
+            ranked_chunks = [res['text'] for res in results]
+            return ranked_chunks[:final_k]
+        else:
+            # Fall back to the sentence-transformers CrossEncoder
+            pairs = [[query, chunk] for chunk in chunks]
+            scores = self.rerank_model.predict(pairs)
+            ranked = sorted(zip(chunks, scores), key=lambda x: x[1], reverse=True)
+            return [chunk for chunk, _ in ranked[:final_k]]
 
     # ------------------------------------------------------------------
     # MMR (applied after reranking as a diversity filter)
     # ------------------------------------------------------------------
 
     def _maximal_marginal_relevance(self, query_vector, chunks, lambda_param=0.5, top_k=3) -> List[str]:
+        """
+        Maximal Marginal Relevance (MMR) for diversity filtering.
+
+        DIVISION-BY-ZERO DEBUGGING:
+        - cosine_similarity can divide by zero if any vector is all zeros
+        - Several safeguards below fall back to simple top-k selection
+        """
+        print(f"    [MMR DEBUG] Starting MMR with {len(chunks)} chunks, top_k={top_k}")
+
         if not chunks:
+            print("    [MMR DEBUG] No chunks, returning empty list")
             return []
 
-        chunk_embeddings = self.embed_model.encode(chunks)
+        # STEP 1: Encode chunks to get embeddings
+        print(f"    [MMR DEBUG] Encoding {len(chunks)} chunks...")
+        try:
+            chunk_embeddings = self.embed_model.encode(chunks)
+            print(f"    [MMR DEBUG] Chunk embeddings shape: {chunk_embeddings.shape}")
+        except Exception as e:
+            print(f"    [MMR DEBUG] ERROR encoding chunks: {e}")
+            return chunks[:top_k]
+
+        # STEP 2: Reshape the query vector
         query_embedding = query_vector.reshape(1, -1)
-        relevance_scores = cosine_similarity(query_embedding, chunk_embeddings)[0]
+        print(f"    [MMR DEBUG] Query embedding shape: {query_embedding.shape}")
+
+        # STEP 3: Check for zero vectors (potential division-by-zero source)
+        query_norm = np.linalg.norm(query_embedding)
+        chunk_norms = np.linalg.norm(chunk_embeddings, axis=1)
+        print(f"    [MMR DEBUG] Query norm: {query_norm}")
+        print(f"    [MMR DEBUG] Chunk norms min: {chunk_norms.min()}, max: {chunk_norms.max()}")
+
+        if query_norm < 1e-10 or np.any(chunk_norms < 1e-10):
+            print("    [MMR DEBUG] WARNING: Zero or near-zero vectors detected!")
+            print("    [MMR DEBUG] Falling back to simple selection without MMR")
+            return chunks[:top_k]
+
+        # STEP 4: Compute relevance scores (potential division-by-zero source)
+        try:
+            relevance_scores = cosine_similarity(query_embedding, chunk_embeddings)[0]
+            print(f"    [MMR DEBUG] Relevance scores min: {relevance_scores.min()}, max: {relevance_scores.max()}")
+        except Exception as e:
+            print(f"    [MMR DEBUG] ERROR computing relevance scores: {e}")
+            return chunks[:top_k]
 
+        # STEP 5: Initialize selection
         selected, unselected = [], list(range(len(chunks)))
 
         first = int(np.argmax(relevance_scores))
         selected.append(first)
         unselected.remove(first)
+        print(f"    [MMR DEBUG] Selected first chunk: index {first}")
 
+        # STEP 6: Iteratively select chunks using MMR
+        iteration = 0
         while len(selected) < min(top_k, len(chunks)):
-            mmr_scores = [
-                (i, lambda_param * relevance_scores[i] - (1 - lambda_param) * max(
-                    cosine_similarity(chunk_embeddings[i].reshape(1, -1),
-                                      chunk_embeddings[s].reshape(1, -1))[0][0]
-                    for s in selected
-                ))
-                for i in unselected
-            ]
-            best = max(mmr_scores, key=lambda x: x[1])[0]
-            selected.append(best)
-            unselected.remove(best)
+            iteration += 1
+            print(f"    [MMR DEBUG] Iteration {iteration}: selected={len(selected)}, unselected={len(unselected)}")
+
+            # Calculate MMR scores
+            mmr_scores = []
+            for i in unselected:
+                # Max similarity to the already-selected items
+                max_sim = -1
+                for s in selected:
+                    try:
+                        sim = cosine_similarity(
+                            chunk_embeddings[i].reshape(1, -1),
+                            chunk_embeddings[s].reshape(1, -1)
+                        )[0][0]
+                        max_sim = max(max_sim, sim)
+                    except Exception as e:
+                        print(f"    [MMR DEBUG] ERROR computing similarity between chunk {i} and {s}: {e}")
+                        max_sim = max(max_sim, 0)  # if similarity fails, treat as dissimilar
+
+                mmr_score = lambda_param * relevance_scores[i] - (1 - lambda_param) * max_sim
+                mmr_scores.append((i, mmr_score))
+
+            # Select the chunk with the highest MMR score
+            if mmr_scores:
+                best, best_score = max(mmr_scores, key=lambda x: x[1])
+                selected.append(best)
+                unselected.remove(best)
+                print(f"    [MMR DEBUG] Selected chunk {best} with MMR score {best_score:.4f}")
+            else:
+                print("    [MMR DEBUG] No MMR scores computed, breaking")
+                break
 
+        print(f"    [MMR DEBUG] MMR complete. Selected {len(selected)} chunks")
         return [chunks[i] for i in selected]
 
     # ------------------------------------------------------------------
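The MMR diversity filter can be verified on toy vectors without any retriever state. Here is a pure-NumPy sketch of the same greedy trade-off; it is a simplified re-implementation for illustration, not the class method:

```python
import numpy as np

def mmr(query_vec, chunk_vecs, lambda_param=0.5, top_k=2):
    """Greedy MMR: trade off query relevance against similarity to already-picked chunks."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    relevance = [cos(query_vec, c) for c in chunk_vecs]
    selected = [int(np.argmax(relevance))]          # most relevant chunk first
    unselected = [i for i in range(len(chunk_vecs)) if i not in selected]

    while len(selected) < min(top_k, len(chunk_vecs)):
        def score(i):
            # Penalize similarity to anything already selected
            max_sim = max(cos(chunk_vecs[i], chunk_vecs[s]) for s in selected)
            return lambda_param * relevance[i] - (1 - lambda_param) * max_sim
        best = max(unselected, key=score)
        selected.append(best)
        unselected.remove(best)
    return selected

q = np.array([1.0, 0.0])
chunks = np.array([[1.0, 0.0], [0.99, 0.1], [0.0, 1.0]])  # two near-duplicates + one orthogonal
print(mmr(q, chunks, lambda_param=0.3, top_k=2))  # → [0, 2]
```

With a low `lambda_param`, the orthogonal chunk beats the near-duplicate despite having zero relevance, which is exactly the redundancy-suppression behavior the pipeline relies on after reranking.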
test_backend.py CHANGED
@@ -117,12 +117,12 @@ def test_evaluator():
 
     from retriever.evaluator import GroqJudge
 
-    groq_key = os.getenv("GROQ_API_KEY")
-    if not groq_key:
-        print("⚠ GROQ_API_KEY not set, skipping evaluator test")
+    openrouter_key = os.getenv("OPENROUTER_API_KEY")
+    if not openrouter_key:
+        print("⚠ OPENROUTER_API_KEY not set, skipping evaluator test")
         return None
 
-    judge = GroqJudge(api_key=groq_key, model="llama-3.1-8b-instant")
+    judge = GroqJudge(api_key=openrouter_key, model="stepfun/step-3.5-flash:free")
 
     # Simple test prompt
     test_prompt = "What is 2 + 2? Answer with just the number."
vector_db.py CHANGED
@@ -98,7 +98,8 @@ def prepare_vectors_for_upsert(final_chunks):
             'title': meta.get('title', ""),
             'url': meta.get('url', ""),
             'chunk_index': meta.get('chunk_index', 0),
-            'technique': meta.get('technique', "unknown")
+            'technique': meta.get('technique', "unknown"),
+            'chunking_technique': meta.get('chunking_technique', "unknown")
             }
         })
     return vectors
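The added `chunking_technique` metadata field is what lets a single Pinecone index serve all six techniques, filtered at query time. A sketch of the upsert record shape; the `prepare_vector` helper is hypothetical, but the field names follow the diff:

```python
def prepare_vector(chunk_id, embedding, meta):
    """Build a Pinecone-style upsert record carrying the chunking technique in its metadata."""
    return {
        'id': chunk_id,
        'values': embedding,
        'metadata': {
            'title': meta.get('title', ""),
            'url': meta.get('url', ""),
            'chunk_index': meta.get('chunk_index', 0),
            'technique': meta.get('technique', "unknown"),
            'chunking_technique': meta.get('chunking_technique', "unknown"),
        },
    }

vec = prepare_vector("c1", [0.1, 0.2], {'chunking_technique': 'semantic'})
print(vec['metadata']['chunking_technique'])  # → semantic
```

At query time, a Pinecone metadata filter such as `{"chunking_technique": {"$eq": "semantic"}}` then restricts retrieval to one technique's chunks within the shared index.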