Here's how it looks in practice:
```typescript
import { OpenAI } from "langchain/llms/openai";
import { loadQAStuffChain } from "langchain/chains";
import { Document } from "langchain/document";

// This first example uses the `StuffDocumentsChain`.
const llmA = new OpenAI({});
const chainA = loadQAStuffChain(llmA);
const docs = [
  new Document({ pageContent: "Harrison went to Harvard." }),
  new Document({ pageContent: "Ankush went to Princeton." }),
];
const resA = await chainA.call({
  input_documents: docs,
  question: "Where did Harrison go to college?",
});
console.log({ resA });
// { resA: { text: ' Harrison went to Harvard.' } }
```
API Reference:
- OpenAI from langchain/llms/openai
- loadQAStuffChain from langchain/chains
- Document from langchain/document
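For contrast with the Refine and Map reduce chains below, the "stuff" strategy simply concatenates every document into one prompt and makes a single model call. A minimal sketch of that idea, using a hypothetical `callModel`-style stand-in rather than a real LLM (this is not LangChain's actual implementation):

```typescript
// Hypothetical stand-in for a real LLM call.
type LLM = (prompt: string) => string;

// "Stuff" strategy: pack all documents into a single prompt.
function stuffDocuments(llm: LLM, docs: string[], question: string): string {
  const context = docs.join("\n\n");
  const prompt = `Use the context below to answer the question.\n\n${context}\n\nQuestion: ${question}`;
  return llm(prompt); // exactly one model call, regardless of document count
}

// Example with a fake model that just reports how large the prompt got.
const fakeLLM: LLM = (prompt) => `answered using ${prompt.length} chars of prompt`;
const answer = stuffDocuments(
  fakeLLM,
  ["Harrison went to Harvard.", "Ankush went to Princeton."],
  "Where did Harrison go to college?"
);
console.log(answer);
```

The single-call design is what makes stuffing cheap, and also what limits it: the whole `context` string must fit in the model's context window.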
Refine
The refine documents chain constructs a response by looping over the input documents and iteratively updating its answer. For each document, it passes all non-document inputs, the current document, and the latest intermediate answer to an LLM chain to get a new answer.

Since the Refine chain only passes a single document to the LLM at a time, it is well-suited for tasks that require analyzing more documents than can fit in the model's context.
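Conceptually, the loop described above looks like the following sketch. This is a simplified mock of the chain's control flow, not LangChain's internals; `llm` is a hypothetical stand-in for a real model call:

```typescript
type LLM = (prompt: string) => string;

// Refine strategy: one call per document, threading the answer through.
function refineDocuments(llm: LLM, docs: string[], question: string): string {
  // The first document seeds the initial answer.
  let answer = llm(`Context: ${docs[0]}\nQuestion: ${question}`);
  // Each later document refines the previous intermediate answer.
  for (const doc of docs.slice(1)) {
    answer = llm(
      `Question: ${question}\nExisting answer: ${answer}\nNew context: ${doc}\nRefine the answer if needed.`
    );
  }
  return answer; // total LLM calls = docs.length
}

// Count calls with a fake model to show the one-call-per-document cost.
let calls = 0;
const countingLLM: LLM = () => {
  calls += 1;
  return `answer v${calls}`;
};
refineDocuments(countingLLM, ["doc1", "doc2", "doc3"], "q?");
console.log(calls); // → 3 (one call per document)
```

Because only one document plus the running answer is in each prompt, the per-call context stays small even when the document set is large.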
The obvious tradeoff is that this chain will make far more LLM calls than, for example, the Stuff documents chain.
There are also certain tasks which are difficult to accomplish iteratively. For example, the Refine chain can perform poorly when documents frequently cross-reference one another, or when a task requires detailed information from many documents.

Here's how it looks in practice:

```typescript
import { loadQARefineChain } from "langchain/chains";
import { OpenAI } from "langchain/llms/openai";
import { TextLoader } from "langchain/document_loaders/fs/text";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";

// Create the models and chain
const embeddings = new OpenAIEmbeddings();
const model = new OpenAI({ temperature: 0 });
const chain = loadQARefineChain(model);

// Load the documents and create the vector store
const loader = new TextLoader("./state_of_the_union.txt");
const docs = await loader.loadAndSplit();
const store = await MemoryVectorStore.fromDocuments(docs, embeddings);

// Select the relevant documents
const question = "What did the president say about Justice Breyer";
const relevantDocs = await store.similaritySearch(question);

// Call the chain
const res = await chain.call({
  input_documents: relevantDocs,
  question,
});
console.log(res);
/*
{
  output_text: '\n' +
    '\n' +
    "The president said that Justice Stephen Breyer has dedicated his life to serve this country and thanked him for his service. He also mentioned that Judge Ketanji Brown Jackson will continue Justice Breyer's legacy of excellence, and that the constitutional right affirmed in Roe v. Wade—standing precedent for half a century—is under attack as never before. He emphasized the importance of protecting access to health care, preserving a woman's right to choose, and advancing maternal health care in America. He also expressed his support for the LGBTQ+ community, and his commitment to protecting their rights, including offering a Unity Agenda for the Nation to beat the opioid epidemic, increase funding for prevention, treatment, harm reduction, and recovery, and strengthen the Violence Against Women Act."
}
*/
```

API Reference:
- loadQARefineChain from langchain/chains
- OpenAI from langchain/llms/openai
- TextLoader from langchain/document_loaders/fs/text
- MemoryVectorStore from langchain/vectorstores/memory
- OpenAIEmbeddings from langchain/embeddings/openai

Prompt customization

You may want to tweak the behavior of a step by changing the prompt. Here's an example of how to do that:

```typescript
import { loadQARefineChain } from "langchain/chains";
import { OpenAI } from "langchain/llms/openai";
import { TextLoader } from "langchain/document_loaders/fs/text";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { PromptTemplate } from "langchain/prompts";

export const questionPromptTemplateString = `Context information is below.
---------------------
{context}
---------------------
Given the context information and no prior knowledge, answer the question: {question}`;

const questionPrompt = new PromptTemplate({
  inputVariables: ["context", "question"],
  template: questionPromptTemplateString,
});

const refinePromptTemplateString = `The original question is as follows: {question}
We have provided an existing answer: {existing_answer}
We have the opportunity to refine the existing answer
(only if needed) with some more context below.
------------
{context}
------------
Given the new context, refine the original answer to better answer the question.
You must provide a response, either original answer or refined answer.`;

const refinePrompt = new PromptTemplate({
  inputVariables: ["question", "existing_answer", "context"],
  template: refinePromptTemplateString,
});

// Create the models and chain
const embeddings = new OpenAIEmbeddings();
const model = new OpenAI({ temperature: 0 });
const chain = loadQARefineChain(model, {
  questionPrompt,
  refinePrompt,
});

// Load the documents and create the vector store
const loader = new TextLoader("./state_of_the_union.txt");
const docs = await loader.loadAndSplit();
const store = await MemoryVectorStore.fromDocuments(docs, embeddings);

// Select the relevant documents
const question = "What did the president say about Justice Breyer";
const relevantDocs = await store.similaritySearch(question);

// Call the chain
const res = await chain.call({
  input_documents: relevantDocs,
  question,
});
console.log(res);
// Logs the same refined answer as the previous example.
```

API Reference:
- loadQARefineChain from langchain/chains
- OpenAI from langchain/llms/openai
- TextLoader from langchain/document_loaders/fs/text
- MemoryVectorStore from langchain/vectorstores/memory
- OpenAIEmbeddings from langchain/embeddings/openai
- PromptTemplate from langchain/prompts
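A PromptTemplate substitutes each of its inputVariables into the matching {placeholder} slots of the template string. The substitution itself is simple; here is a minimal illustration of the idea (a toy function, not the real PromptTemplate class):

```typescript
// Minimal placeholder substitution, illustrating what a prompt template does.
function formatTemplate(template: string, values: Record<string, string>): string {
  return template.replace(/\{(\w+)\}/g, (match, name) =>
    name in values ? values[name] : match
  );
}

const questionTemplate =
  "Context information is below.\n{context}\nAnswer the question: {question}";
console.log(
  formatTemplate(questionTemplate, {
    context: "Harrison went to Harvard.",
    question: "Where did Harrison go to college?",
  })
);
```

The real class additionally validates that every declared input variable appears in the template, which catches typos like `{questoin}` before any model call is made.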
Map reduce
Page Title: Map reduce | 🦜️🔗 Langchain
Paragraphs:
The map reduce documents chain first applies an LLM chain to each document individually (the Map step), treating the chain output as a new document. It then passes all the new documents to a separate combine documents chain to get a single output (the Reduce step). It can optionally first compress, or collapse, the mapped documents to make sure that they fit in the combine documents chain (which will often pass them to an LLM).
This compression step is performed recursively if necessary.

Here's how it looks in practice:

import { OpenAI } from "langchain/llms/openai";
import { loadQAMapReduceChain } from "langchain/chains";
import { Document } from "langchain/document";

// Optionally limit the number of concurrent requests to the language model.
const model = new OpenAI({ temperature: 0, maxConcurrency: 10 });
const chain = loadQAMapReduceChain(model);
const docs = [
  new Document({ pageContent: "harrison went to harvard" }),
  new Document({ pageContent: "ankush went to princeton" }),
];
const res = await chain.call({
  input_documents: docs,
  question: "Where did harrison go to college",
});
console.log({ res });

API Reference: OpenAI from langchain/llms/openai, loadQAMapReduceChain from langchain/chains, Document from langchain/document
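The Map and Reduce steps described above can be sketched as two plain function applications: map the LLM over the documents independently, then make one final combining call. A mock `llm` stands in for the real chain calls below (this is an illustration of the control flow, not LangChain's implementation, and it omits the optional collapse step):

```typescript
type LLM = (prompt: string) => string;

function mapReduce(docs: string[], question: string, llm: LLM): string {
  // Map step: apply the LLM to each document individually.
  // Unlike refine, these calls are independent and could run concurrently.
  const mapped = docs.map((doc) => llm(`Context: ${doc}\nQuestion: ${question}`));
  // Reduce step: combine all mapped outputs in a single final call.
  return llm(`Summaries:\n${mapped.join("\n")}\nQuestion: ${question}`);
}

// Mock LLM: answers only when its prompt mentions "harvard".
const mockLlm: LLM = (prompt) =>
  prompt.includes("harvard") ? "harrison went to harvard" : "no answer";

const answer = mapReduce(
  ["harrison went to harvard", "ankush went to princeton"],
  "Where did harrison go to college",
  mockLlm
);
console.log(answer); // "harrison went to harvard"
```

Because the map calls are independent, this chain parallelizes well (hence the `maxConcurrency` option in the example above), at the cost of one LLM call per document plus at least one reduce call.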
Page Title: Popular | 🦜️🔗 Langchain
Paragraphs: |
📄️ API chains: APIChain enables using LLMs to interact with APIs to retrieve relevant information. Construct the chain by providing a question relevant to the provided API documentation.
📄️ Retrieval QA: This example showcases question answering over an index.
📄️ Conversational Retrieval QA: The ConversationalRetrievalQA chain builds on RetrievalQAChain to provide a chat history component.
📄️ SQL: This example demonstrates the use of the SQLDatabaseChain for answering questions over a SQL database.
📄️ Structured Output with OpenAI functions: Must be used with an OpenAI functions model.
📄️ Summarization: A summarization chain can be used to summarize multiple documents. One way is to input multiple smaller documents, after they have been divided into chunks, and operate over them with a MapReduceDocumentsChain. You can also choose instead for the chain that does summarization to be a StuffDocumentsChain, or a RefineDocumentsChain.
API chains
Page Title: API chains | 🦜️🔗 Langchain
Paragraphs: |
APIChain enables using LLMs to interact with APIs to retrieve relevant information. Construct the chain by providing a question relevant to the provided API documentation.

If your API requires authentication or other headers, you can pass the chain a headers property in the config object.

import { OpenAI } from "langchain/llms/openai";
import { APIChain } from "langchain/chains";

const OPEN_METEO_DOCS = `BASE URL: https://api.open-meteo.com/

API Documentation
The API endpoint /v1/forecast accepts a geographical coordinate, a list of weather variables and responds with a JSON hourly weather forecast for 7 days. Time always starts at 0:00 today and contains 168 hours. All URL parameters are listed below:

Parameter | Format | Required | Default | Description
latitude, longitude | Floating point | Yes | | Geographical WGS84 coordinate of the location
hourly | String array | No | | A list of weather variables which should be returned. Values can be comma separated, or multiple &hourly= parameters in the URL can be used.
daily | String array | No | | A list of daily weather variable aggregations which should be returned. Values can be comma separated, or multiple &daily= parameters in the URL can be used. If daily weather variables are specified, parameter timezone is required.
current_weather | Bool | No | false | Include current weather conditions in the JSON output.
temperature_unit | String | No | celsius | If fahrenheit is set, all temperature values are converted to Fahrenheit.
windspeed_unit | String | No | kmh | Other wind speed units: ms, mph and kn
precipitation_unit | String | No | mm | Other precipitation amount units: inch
timeformat | String | No | iso8601 | If format unixtime is selected, all time values are returned in UNIX epoch time in seconds. Please note that all timestamps are in GMT+0! For daily values with unix timestamps, please apply utc_offset_seconds again to get the correct date.
timezone | String | No | GMT | If timezone is set, all timestamps are returned as local-time and data is returned starting at 00:00 local-time. Any time zone name from the time zone database is supported. If auto is set as a time zone, the coordinates will be automatically resolved to the local time zone.
past_days | Integer (0-2) | No | 0 | If past_days is set, yesterday or the day before yesterday data are also returned.
start_date, end_date | String (yyyy-mm-dd) | No | | The time interval to get weather data. A day must be specified as an ISO8601 date (e.g. 2022-06-30).
models | String array | No | auto | Manually select one or more weather models. Per default, the best suitable weather models will be combined.

Variable | Valid time | Unit | Description
temperature_2m | Instant | °C (°F) | Air temperature at 2 meters above ground
snowfall | Preceding hour sum | cm (inch) | Snowfall amount of the preceding hour in centimeters. For the water equivalent in millimeter, divide by 7. E.g. 7 cm snow = 10 mm precipitation water equivalent
rain | Preceding hour sum | mm (inch) | Rain from large scale weather systems of the preceding hour in millimeters
showers | Preceding hour sum | mm (inch) | Showers from convective precipitation in millimeters from the preceding hour
weathercode | Instant | WMO code | Weather condition as a numeric code. Follow WMO weather interpretation codes. See table below for details.
snow_depth | Instant | meters | Snow depth on the ground
freezinglevel_height | Instant | meters | Altitude above sea level of the 0°C level
visibility | Instant | meters | Viewing distance in meters. Influenced by low clouds, humidity and aerosols. Maximum visibility is approximately 24 km.`;

export async function run() {
  const model = new OpenAI({ modelName: "text-davinci-003" });
  const chain = APIChain.fromLLMAndAPIDocs(model, OPEN_METEO_DOCS, {
    headers: {
      // These headers will be used for API requests made by the chain.
    },
  });
  const res = await chain.call({
    question:
      "What is the weather like right now in Munich, Germany in degrees Fahrenheit?",
  });
  console.log({ res });
}

API Reference: OpenAI from langchain/llms/openai, APIChain from langchain/chains
Get startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAPI chainsRetrieval QAConversational Retrieval QASQLStructured Output with OpenAI functionsSummarizationAdditionalMemoryAgentsCallbacksModulesGuidesEcosystemAdditional resourcesCommunity navigatorAPI referenceModulesChainsPopularAPI chainsAPI chainsAPIChain enables using LLMs to interact with APIs to retrieve relevant information. Construct the chain by providing a question relevant to the provided API documentation.If your API requires authentication or other headers, you can pass the chain a headers property in the config object.import { OpenAI } from "langchain/llms/openai";import { APIChain } from "langchain/chains";const OPEN_METEO_DOCS = `BASE URL: https://api.open-meteo.com/API DocumentationThe API endpoint /v1/forecast accepts a geographical coordinate, a list of weather variables and responds with a JSON hourly weather forecast for 7 days. Time always starts at 0:00 today and contains 168 hours. All URL parameters are listed below:Parameter Format Required Default Descriptionlatitude, longitude Floating point Yes Geographical WGS84 coordinate of the locationhourly String array No A list of weather variables which should be returned. Values can be comma separated, or multiple &hourly= parameter in the URL can be used.daily String array No A list of daily weather variable aggregations which should be returned. |
4e9727215e95-2438 | Values can be comma separated, or multiple &daily= parameter in the URL can be used. If daily weather variables are specified, parameter timezone is required.current_weather Bool No false Include current weather conditions in the JSON output.temperature_unit String No celsius If fahrenheit is set, all temperature values are converted to Fahrenheit.windspeed_unit String No kmh Other wind speed speed units: ms, mph and knprecipitation_unit String No mm Other precipitation amount units: inchtimeformat String No iso8601 If format unixtime is selected, all time values are returned in UNIX epoch time in seconds. Please note that all timestamp are in GMT+0! For daily values with unix timestamps, please apply utc_offset_seconds again to get the correct date.timezone String No GMT If timezone is set, all timestamps are returned as local-time and data is returned starting at 00:00 local-time. Any time zone name from the time zone database is supported. If auto is set as a time zone, the coordinates will be automatically resolved to the local time zone.past_days Integer (0-2) No 0 If past_days is set, yesterday or the day before yesterday data are also returned.start_dateend_date String (yyyy-mm-dd) No The time interval to get weather data. A day must be specified as an ISO8601 date (e.g. 2022-06-30).models String array No auto Manually select one or more weather models. |
4e9727215e95-2439 | Per default, the best suitable weather models will be combined.Variable Valid time Unit Descriptiontemperature_2m Instant °C (°F) Air temperature at 2 meters above groundsnowfall Preceding hour sum cm (inch) Snowfall amount of the preceding hour in centimeters. For the water equivalent in millimeter, divide by 7. E.g. 7 cm snow = 10 mm precipitation water equivalentrain Preceding hour sum mm (inch) Rain from large scale weather systems of the preceding hour in millimetershowers Preceding hour sum mm (inch) Showers from convective precipitation in millimeters from the preceding hourweathercode Instant WMO code Weather condition as a numeric code. Follow WMO weather interpretation codes. See table below for details.snow_depth Instant meters Snow depth on the groundfreezinglevel_height Instant meters Altitude above sea level of the 0°C levelvisibility Instant meters Viewing distance in meters. Influenced by low clouds, humidity and aerosols. Maximum visibility is approximately 24 km.`;export async function run() { const model = new OpenAI({ modelName: "text-davinci-003" }); const chain = APIChain.fromLLMAndAPIDocs(model, OPEN_METEO_DOCS, { headers: { // These headers will be used for API requests made by the chain. }, }); const res = await chain.call({ question: "What is the weather like right now in Munich, Germany in degrees Farenheit?
", }); console.log({ res });}API Reference:OpenAI from langchain/llms/openaiAPIChain from langchain/chainsPreviousPopularNextRetrieval QA |
4e9727215e95-2440 | ModulesChainsPopularAPI chainsAPI chainsAPIChain enables using LLMs to interact with APIs to retrieve relevant information. Construct the chain by providing a question relevant to the provided API documentation.If your API requires authentication or other headers, you can pass the chain a headers property in the config object.import { OpenAI } from "langchain/llms/openai";import { APIChain } from "langchain/chains";const OPEN_METEO_DOCS = `BASE URL: https://api.open-meteo.com/API DocumentationThe API endpoint /v1/forecast accepts a geographical coordinate, a list of weather variables and responds with a JSON hourly weather forecast for 7 days. Time always starts at 0:00 today and contains 168 hours. All URL parameters are listed below:Parameter Format Required Default Descriptionlatitude, longitude Floating point Yes Geographical WGS84 coordinate of the locationhourly String array No A list of weather variables which should be returned. Values can be comma separated, or multiple &hourly= parameter in the URL can be used.daily String array No A list of daily weather variable aggregations which should be returned. Values can be comma separated, or multiple &daily= parameter in the URL can be used. |
4e9727215e95-2441 | If daily weather variables are specified, parameter timezone is required.current_weather Bool No false Include current weather conditions in the JSON output.temperature_unit String No celsius If fahrenheit is set, all temperature values are converted to Fahrenheit.windspeed_unit String No kmh Other wind speed speed units: ms, mph and knprecipitation_unit String No mm Other precipitation amount units: inchtimeformat String No iso8601 If format unixtime is selected, all time values are returned in UNIX epoch time in seconds. Please note that all timestamp are in GMT+0! For daily values with unix timestamps, please apply utc_offset_seconds again to get the correct date.timezone String No GMT If timezone is set, all timestamps are returned as local-time and data is returned starting at 00:00 local-time. Any time zone name from the time zone database is supported. If auto is set as a time zone, the coordinates will be automatically resolved to the local time zone.past_days Integer (0-2) No 0 If past_days is set, yesterday or the day before yesterday data are also returned.start_dateend_date String (yyyy-mm-dd) No The time interval to get weather data. A day must be specified as an ISO8601 date (e.g. 2022-06-30).models String array No auto Manually select one or more weather models. |
4e9727215e95-2442 | Per default, the best suitable weather models will be combined.Variable Valid time Unit Descriptiontemperature_2m Instant °C (°F) Air temperature at 2 meters above groundsnowfall Preceding hour sum cm (inch) Snowfall amount of the preceding hour in centimeters. For the water equivalent in millimeter, divide by 7. E.g. 7 cm snow = 10 mm precipitation water equivalentrain Preceding hour sum mm (inch) Rain from large scale weather systems of the preceding hour in millimetershowers Preceding hour sum mm (inch) Showers from convective precipitation in millimeters from the preceding hourweathercode Instant WMO code Weather condition as a numeric code. Follow WMO weather interpretation codes. See table below for details.snow_depth Instant meters Snow depth on the groundfreezinglevel_height Instant meters Altitude above sea level of the 0°C levelvisibility Instant meters Viewing distance in meters. Influenced by low clouds, humidity and aerosols. Maximum visibility is approximately 24 km.`;export async function run() { const model = new OpenAI({ modelName: "text-davinci-003" }); const chain = APIChain.fromLLMAndAPIDocs(model, OPEN_METEO_DOCS, { headers: { // These headers will be used for API requests made by the chain. }, }); const res = await chain.call({ question: "What is the weather like right now in Munich, Germany in degrees Farenheit?
", }); console.log({ res });}API Reference:OpenAI from langchain/llms/openaiAPIChain from langchain/chainsPreviousPopularNextRetrieval QA |
4e9727215e95-2443 | API chainsAPIChain enables using LLMs to interact with APIs to retrieve relevant information. Construct the chain by providing a question relevant to the provided API documentation.If your API requires authentication or other headers, you can pass the chain a headers property in the config object.import { OpenAI } from "langchain/llms/openai";import { APIChain } from "langchain/chains";const OPEN_METEO_DOCS = `BASE URL: https://api.open-meteo.com/API DocumentationThe API endpoint /v1/forecast accepts a geographical coordinate, a list of weather variables and responds with a JSON hourly weather forecast for 7 days. Time always starts at 0:00 today and contains 168 hours. All URL parameters are listed below:Parameter Format Required Default Descriptionlatitude, longitude Floating point Yes Geographical WGS84 coordinate of the locationhourly String array No A list of weather variables which should be returned. Values can be comma separated, or multiple &hourly= parameter in the URL can be used.daily String array No A list of daily weather variable aggregations which should be returned. Values can be comma separated, or multiple &daily= parameter in the URL can be used. |
4e9727215e95-2444 | If daily weather variables are specified, parameter timezone is required.current_weather Bool No false Include current weather conditions in the JSON output.temperature_unit String No celsius If fahrenheit is set, all temperature values are converted to Fahrenheit.windspeed_unit String No kmh Other wind speed speed units: ms, mph and knprecipitation_unit String No mm Other precipitation amount units: inchtimeformat String No iso8601 If format unixtime is selected, all time values are returned in UNIX epoch time in seconds. Please note that all timestamp are in GMT+0! For daily values with unix timestamps, please apply utc_offset_seconds again to get the correct date.timezone String No GMT If timezone is set, all timestamps are returned as local-time and data is returned starting at 00:00 local-time. Any time zone name from the time zone database is supported. If auto is set as a time zone, the coordinates will be automatically resolved to the local time zone.past_days Integer (0-2) No 0 If past_days is set, yesterday or the day before yesterday data are also returned.start_dateend_date String (yyyy-mm-dd) No The time interval to get weather data. A day must be specified as an ISO8601 date (e.g. 2022-06-30).models String array No auto Manually select one or more weather models. |
4e9727215e95-2445 | Per default, the best suitable weather models will be combined.Variable Valid time Unit Descriptiontemperature_2m Instant °C (°F) Air temperature at 2 meters above groundsnowfall Preceding hour sum cm (inch) Snowfall amount of the preceding hour in centimeters. For the water equivalent in millimeter, divide by 7. E.g. 7 cm snow = 10 mm precipitation water equivalentrain Preceding hour sum mm (inch) Rain from large scale weather systems of the preceding hour in millimetershowers Preceding hour sum mm (inch) Showers from convective precipitation in millimeters from the preceding hourweathercode Instant WMO code Weather condition as a numeric code. Follow WMO weather interpretation codes. See table below for details.snow_depth Instant meters Snow depth on the groundfreezinglevel_height Instant meters Altitude above sea level of the 0°C levelvisibility Instant meters Viewing distance in meters. Influenced by low clouds, humidity and aerosols. Maximum visibility is approximately 24 km.`;export async function run() { const model = new OpenAI({ modelName: "text-davinci-003" }); const chain = APIChain.fromLLMAndAPIDocs(model, OPEN_METEO_DOCS, { headers: { // These headers will be used for API requests made by the chain. }, }); const res = await chain.call({ question: "What is the weather like right now in Munich, Germany in degrees Farenheit? ", }); console.log({ res });}API Reference:OpenAI from langchain/llms/openaiAPIChain from langchain/chains
If your API requires authentication or other headers, you can pass the chain a headers property in the config object. |
4e9727215e95-2446 | import { OpenAI } from "langchain/llms/openai";import { APIChain } from "langchain/chains";const OPEN_METEO_DOCS = `BASE URL: https://api.open-meteo.com/API DocumentationThe API endpoint /v1/forecast accepts a geographical coordinate, a list of weather variables and responds with a JSON hourly weather forecast for 7 days. Time always starts at 0:00 today and contains 168 hours. All URL parameters are listed below:Parameter Format Required Default Descriptionlatitude, longitude Floating point Yes Geographical WGS84 coordinate of the locationhourly String array No A list of weather variables which should be returned. Values can be comma separated, or multiple &hourly= parameters in the URL can be used.daily String array No A list of daily weather variable aggregations which should be returned. Values can be comma separated, or multiple &daily= parameters in the URL can be used. If daily weather variables are specified, parameter timezone is required.current_weather Bool No false Include current weather conditions in the JSON output.temperature_unit String No celsius If fahrenheit is set, all temperature values are converted to Fahrenheit.windspeed_unit String No kmh Other wind speed units: ms, mph and knprecipitation_unit String No mm Other precipitation amount units: inchtimeformat String No iso8601 If format unixtime is selected, all time values are returned in UNIX epoch time in seconds. Please note that all timestamps are in GMT+0!
4e9727215e95-2447 | For daily values with unix timestamps, please apply utc_offset_seconds again to get the correct date.timezone String No GMT If timezone is set, all timestamps are returned as local-time and data is returned starting at 00:00 local-time. Any time zone name from the time zone database is supported. If auto is set as a time zone, the coordinates will be automatically resolved to the local time zone.past_days Integer (0-2) No 0 If past_days is set, yesterday or the day before yesterday data are also returned.start_date, end_date String (yyyy-mm-dd) No The time interval to get weather data. A day must be specified as an ISO8601 date (e.g. 2022-06-30).models String array No auto Manually select one or more weather models. Per default, the best suitable weather models will be combined.Variable Valid time Unit Descriptiontemperature_2m Instant °C (°F) Air temperature at 2 meters above groundsnowfall Preceding hour sum cm (inch) Snowfall amount of the preceding hour in centimeters. For the water equivalent in millimeter, divide by 7. E.g. 7 cm snow = 10 mm precipitation water equivalentrain Preceding hour sum mm (inch) Rain from large scale weather systems of the preceding hour in millimetershowers Preceding hour sum mm (inch) Showers from convective precipitation in millimeters from the preceding hourweathercode Instant WMO code Weather condition as a numeric code. Follow WMO weather interpretation codes.
4e9727215e95-2448 | See table below for details.snow_depth Instant meters Snow depth on the groundfreezinglevel_height Instant meters Altitude above sea level of the 0°C levelvisibility Instant meters Viewing distance in meters. Influenced by low clouds, humidity and aerosols. Maximum visibility is approximately 24 km.`;export async function run() { const model = new OpenAI({ modelName: "text-davinci-003" }); const chain = APIChain.fromLLMAndAPIDocs(model, OPEN_METEO_DOCS, { headers: { // These headers will be used for API requests made by the chain. }, }); const res = await chain.call({ question: "What is the weather like right now in Munich, Germany in degrees Fahrenheit?", }); console.log({ res });}
API Reference:OpenAI from langchain/llms/openaiAPIChain from langchain/chains
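Under the hood, the chain prompts the LLM with the API docs to produce a request URL, fetches that URL, and prompts the LLM again to answer from the response. The following is a minimal plain-TypeScript sketch (not LangChain code) of the kind of request the chain would derive for the Munich question; the `buildForecastUrl` helper and the Munich coordinates are illustrative, while the parameter names follow the Open-Meteo docs above.

```typescript
// Sketch only: constructs the Open-Meteo request an APIChain would derive
// from the documentation string above. Parameter names come from the docs;
// the helper name and coordinates are illustrative assumptions.
function buildForecastUrl(params: Record<string, string>): string {
  // URLSearchParams handles encoding and joins key=value pairs with "&".
  const query = new URLSearchParams(params).toString();
  return `https://api.open-meteo.com/v1/forecast?${query}`;
}

const url = buildForecastUrl({
  latitude: "48.14",             // Munich (WGS84)
  longitude: "11.58",
  current_weather: "true",       // include current conditions in the JSON
  temperature_unit: "fahrenheit",
});
console.log(url);
// https://api.open-meteo.com/v1/forecast?latitude=48.14&longitude=11.58&current_weather=true&temperature_unit=fahrenheit
```

The chain would then GET this URL (applying any configured headers) and pass the JSON body back to the LLM to phrase the final answer.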
Retrieval QA
Page Title: Retrieval QA | 🦜️🔗 Langchain
Paragraphs: |
4e9727215e95-2449 | This example showcases question answering over an index. The RetrievalQAChain is a chain that combines a Retriever and a QA chain (described above). It is used to retrieve documents from a Retriever and then use a QA chain to answer a question based on the retrieved documents. Usage: In the below example, we are using a VectorStore as the Retriever.
4e9727215e95-2450 | By default, the StuffDocumentsChain is used as the QA chain.import { OpenAI } from "langchain/llms/openai";import { RetrievalQAChain } from "langchain/chains";import { HNSWLib } from "langchain/vectorstores/hnswlib";import { OpenAIEmbeddings } from "langchain/embeddings/openai";import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";import * as fs from "fs";// Initialize the LLM to use to answer the question.const model = new OpenAI({});const text = fs.readFileSync("state_of_the_union.txt", "utf8");const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });const docs = await textSplitter.createDocuments([text]);// Create a vector store from the documents.const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());// Initialize a retriever wrapper around the vector storeconst vectorStoreRetriever = vectorStore.asRetriever();// Create a chain that uses the OpenAI LLM and HNSWLib vector store.const chain = RetrievalQAChain.fromLLM(model, vectorStoreRetriever);const res = await chain.call({ query: "What did the president say about Justice Breyer? ",});console.log({ res });/*{ res: { text: 'The president said that Justice Breyer was an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court and thanked him for his service.' |
4e9727215e95-2451 | }}*/API Reference:OpenAI from langchain/llms/openaiRetrievalQAChain from langchain/chainsHNSWLib from langchain/vectorstores/hnswlibOpenAIEmbeddings from langchain/embeddings/openaiRecursiveCharacterTextSplitter from langchain/text_splitterCustom QA chainIn the below example, we are using a VectorStore as the Retriever and a MapReduceDocumentsChain as the QA chain.import { OpenAI } from "langchain/llms/openai";import { RetrievalQAChain, loadQAMapReduceChain } from "langchain/chains";import { HNSWLib } from "langchain/vectorstores/hnswlib";import { OpenAIEmbeddings } from "langchain/embeddings/openai";import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";import * as fs from "fs";// Initialize the LLM to use to answer the question.const model = new OpenAI({});const text = fs.readFileSync("state_of_the_union.txt", "utf8");const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });const docs = await textSplitter.createDocuments([text]);// Create a vector store from the documents.const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());// Create a chain that uses a map reduce chain and HNSWLib vector store.const chain = new RetrievalQAChain({ combineDocumentsChain: loadQAMapReduceChain(model), retriever: vectorStore.asRetriever(),});const res = await chain.call({ query: "What did the president say about Justice Breyer? |
4e9727215e95-2452 | ",});console.log({ res });/*{ res: { text: " The president said that Justice Breyer has dedicated his life to serve his country, and thanked him for his service. He also said that Judge Ketanji Brown Jackson will continue Justice Breyer's legacy of excellence, emphasizing the importance of protecting the rights of citizens, especially women, LGBTQ+ Americans, and access to healthcare. He also expressed his commitment to supporting the younger transgender Americans in America and ensuring they are able to reach their full potential, offering a Unity Agenda for the Nation to beat the opioid epidemic and increase funding for prevention, treatment, harm reduction, and recovery." }}*/API Reference:OpenAI from langchain/llms/openaiRetrievalQAChain from langchain/chainsloadQAMapReduceChain from langchain/chainsHNSWLib from langchain/vectorstores/hnswlibOpenAIEmbeddings from langchain/embeddings/openaiRecursiveCharacterTextSplitter from langchain/text_splitterCustom promptsYou can pass in custom prompts to do question answering. |
4e9727215e95-2453 | These prompts are the same prompts as you can pass into the base question answering chains.import { OpenAI } from "langchain/llms/openai";import { RetrievalQAChain, loadQAStuffChain } from "langchain/chains";import { HNSWLib } from "langchain/vectorstores/hnswlib";import { OpenAIEmbeddings } from "langchain/embeddings/openai";import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";import { PromptTemplate } from "langchain/prompts";import * as fs from "fs";const promptTemplate = `Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer. {context}Question: {question}Answer in Italian:`;const prompt = PromptTemplate.fromTemplate(promptTemplate);// Initialize the LLM to use to answer the question.const model = new OpenAI({});const text = fs.readFileSync("state_of_the_union.txt", "utf8");const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });const docs = await textSplitter.createDocuments([text]);// Create a vector store from the documents.const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());// Create a chain that uses a stuff chain and HNSWLib vector store.const chain = new RetrievalQAChain({ combineDocumentsChain: loadQAStuffChain(model, { prompt }), retriever: vectorStore.asRetriever(),});const res = await chain.call({ query: "What did the president say about Justice Breyer?
",});console.log({ res });/*{ res: { text: ' Il presidente ha elogiato Justice Breyer per il suo servizio e lo ha ringraziato.' |
4e9727215e95-2454 | }}*/API Reference:OpenAI from langchain/llms/openaiRetrievalQAChain from langchain/chainsloadQAStuffChain from langchain/chainsHNSWLib from langchain/vectorstores/hnswlibOpenAIEmbeddings from langchain/embeddings/openaiRecursiveCharacterTextSplitter from langchain/text_splitterPromptTemplate from langchain/promptsReturn Source DocumentsAdditionally, we can return the source documents used to answer the question by specifying an optional parameter when constructing the chain.import { OpenAI } from "langchain/llms/openai";import { RetrievalQAChain } from "langchain/chains";import { HNSWLib } from "langchain/vectorstores/hnswlib";import { OpenAIEmbeddings } from "langchain/embeddings/openai";import { RecursiveCharacterTextSplitter } from |
4e9727215e95-2455 | "langchain/text_splitter";import * as fs from "fs";// Initialize the LLM to use to answer the question.const model = new OpenAI({});const text = fs.readFileSync("data/state_of_the_union_2022.txt", "utf8");const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });const docs = await textSplitter.createDocuments([text]);// Create a vector store from the documents.const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());// Create a chain that uses the default StuffDocumentsChain and HNSWLib vector store.const chain = RetrievalQAChain.fromLLM(model, vectorStore.asRetriever(), { returnSourceDocuments: true, // Can also be passed into the constructor});const res = await chain.call({ query: "What did the president say about Justice Breyer?",});console.log(JSON.stringify(res, null, 2));/*{ "text": " The president thanked Justice Breyer for his service and asked him to stand so he could be seen. ", "sourceDocuments": [ { "pageContent": "Justice Breyer, thank you for your service. Thank you, thank you, thank you. I mean it. Get up. Stand — let me see you. Thank you.\n\nAnd we all know — no matter what your ideology, we all know one of the most serious constitutional responsibilities a President has is nominating someone to serve on the United States Supreme Court.\n\nAs I did four days ago, I’ve nominated a Circuit Court of Appeals — Ketanji Brown Jackson.
4e9727215e95-2456 | One of our nation’s top legal minds who will continue in just Brey- — Justice Breyer’s legacy of excellence. A former top litigator in private practice, a former federal public defender from a family of public-school educators and police officers — she’s a consensus builder.\n\nSince she’s been nominated, she’s received a broad range of support, including the Fraternal Order of Police and former judges appointed by Democrats and Republicans. ", "metadata": { "loc": { "lines": { "from": 481, "to": 487 } } } }, { "pageContent": "Since she’s been nominated, she’s received a broad range of support, including the Fraternal Order of Police and former judges appointed by Democrats and Republicans.\n\nJudge Ketanji Brown Jackson\nPresident Biden's Unity AgendaLearn More\nSince she’s been nominated, she’s received a broad range of support, including the Fraternal Order of Police and former judges appointed by Democrats and Republicans.\n\nFolks, if we are to advance liberty and justice, we need to secure our border and fix the immigration system.\n\nAnd as you might guess, I think we can do both. |
4e9727215e95-2457 | At our border, we’ve installed new technology, like cutting-edge scanners, to better detect drug smuggling.\n\nWe’ve set up joint patrols with Mexico and Guatemala to catch more human traffickers.\n\nWe’re putting in place dedicated immigration judges in significant larger number so families fleeing persecution and violence can have their cases — cases heard faster — and those who aren’t legitimately here can be sent back. ", "metadata": { "loc": { "lines": { "from": 487, "to": 499 } } } }, { "pageContent": "These laws don’t infringe on the Second Amendment; they save lives.\n\nGun Violence\n\n\nThe most fundamental right in America is the right to vote and have it counted. And look, it’s under assault.\n\nIn state after state, new laws have been passed not only to suppress the vote — we’ve been there before — but to subvert the entire election. We can’t let this happen.\n\nTonight, I call on the Senate to pass — pass the Freedom to Vote Act. Pass the John Lewis Act — Voting Rights Act. And while you’re at it, pass the DISCLOSE Act so Americans know who is funding our elections.\n\nLook, tonight, I’d — I’d like to honor someone who has dedicated his life to serve this country: Justice Breyer — an Army veteran, Constitutional scholar, retiring Justice of the United States Supreme Court.\n\nJustice Breyer, thank you for your service. Thank you, thank you, thank you. I mean it. Get up. |
4e9727215e95-2458 | Stand — let me see you. Thank you. ", "metadata": { "loc": { "lines": { "from": 468, "to": 481 } } } }, { "pageContent": "If you want to go forward not backwards, we must protect access to healthcare; preserve a woman’s right to choose — and continue to advance maternal healthcare for all Americans.\n\nRoe v. Wade\n\n\nAnd folks, for our LGBTQ+ Americans, let’s finally get the bipartisan Equality Act to my desk. The onslaught of state laws targeting transgender Americans and their families — it’s simply wrong.\n\nAs I said last year, especially to our younger transgender Americans, I’ll always have your back as your President so you can be yourself and reach your God-given potential.\n\nBipartisan Equality Act\n\n\nFolks as I’ve just demonstrated, while it often appears we do not agree and that — we — we do agree on a lot more things than we acknowledge. ", "metadata": { "loc": { "lines": { "from": 511, "to": 523 } } } } ]}*/API Reference:OpenAI from langchain/llms/openaiRetrievalQAChain from langchain/chainsHNSWLib from langchain/vectorstores/hnswlibOpenAIEmbeddings from langchain/embeddings/openaiRecursiveCharacterTextSplitter from langchain/text_splitter
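The retrieve-then-read flow above can be sketched without any LLM or vector store: rank documents against the query, then "stuff" the top matches into a single prompt. This is a toy sketch using word overlap in place of embedding similarity; `retrieve` and `stuffPrompt` are illustrative names, not LangChain APIs.

```typescript
// Toy sketch of RetrievalQAChain with the default StuffDocumentsChain:
// rank documents for a query, then concatenate the top k into one prompt.
// Word-overlap scoring stands in for vector similarity.
interface Doc { pageContent: string }

function retrieve(docs: Doc[], query: string, k: number): Doc[] {
  const words = new Set(query.toLowerCase().split(/\W+/).filter(Boolean));
  return docs
    .map((d) => ({
      d,
      // Count how many words of the document appear in the query.
      score: d.pageContent.toLowerCase().split(/\W+/).filter((w) => words.has(w)).length,
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((x) => x.d);
}

function stuffPrompt(docs: Doc[], question: string): string {
  // "Stuffing" = placing all retrieved documents verbatim into the context.
  const context = docs.map((d) => d.pageContent).join("\n\n");
  return `Use the following pieces of context to answer the question at the end.\n\n${context}\n\nQuestion: ${question}`;
}

const docs: Doc[] = [
  { pageContent: "Justice Breyer was thanked for his service." },
  { pageContent: "New scanners were installed at the border." },
];
const question = "What was said about Justice Breyer?";
console.log(stuffPrompt(retrieve(docs, question, 1), question));
```

In the real chain, the stuffed prompt is sent to the LLM; swapping in `loadQAMapReduceChain` replaces this single stuffed call with one LLM call per document plus a final combine step.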
4e9727215e95-2478 | Stand — let me see you. Thank you. ", "metadata": { "loc": { "lines": { "from": 468, "to": 481 } } } }, { "pageContent": "If you want to go forward not backwards, we must protect access to healthcare; preserve a woman’s right to choose — and continue to advance maternal healthcare for all Americans.\n\nRoe v. Wade\n\n\nAnd folks, for our LGBTQ+ Americans, let’s finally get the bipartisan Equality Act to my desk. The onslaught of state laws targeting transgender Americans and their families — it’s simply wrong.\n\nAs I said last year, especially to our younger transgender Americans, I’ll always have your back as your President so you can be yourself and reach your God-given potential.\n\nBipartisan Equality Act\n\n\nFolks as I’ve just demonstrated, while it often appears we do not agree and that — we — we do agree on a lot more things than we acknowledge. ", "metadata": { "loc": { "lines": { "from": 511, "to": 523 } } } } ]}*/API Reference:OpenAI from langchain/llms/openaiRetrievalQAChain from langchain/chainsHNSWLib from langchain/vectorstores/hnswlibOpenAIEmbeddings from langchain/embeddings/openaiRecursiveCharacterTextSplitter from langchain/text_splitterPreviousAPI chainsNextConversational Retrieval QA |
Retrieval QA

This example showcases question answering over an index. The RetrievalQAChain is a chain that combines a Retriever and a QA chain (described above). It is used to retrieve documents from a Retriever and then use a QA chain to answer a question based on the retrieved documents.

Usage

In the below example, we are using a VectorStore as the Retriever. By default, the StuffDocumentsChain is used as the QA chain.

import { OpenAI } from "langchain/llms/openai";
import { RetrievalQAChain } from "langchain/chains";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import * as fs from "fs";

// Initialize the LLM to use to answer the question.
const model = new OpenAI({});
const text = fs.readFileSync("state_of_the_union.txt", "utf8");
const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
const docs = await textSplitter.createDocuments([text]);

// Create a vector store from the documents.
const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());

// Initialize a retriever wrapper around the vector store.
const vectorStoreRetriever = vectorStore.asRetriever();

// Create a chain that uses the OpenAI LLM and HNSWLib vector store.
const chain = RetrievalQAChain.fromLLM(model, vectorStoreRetriever);
const res = await chain.call({
  query: "What did the president say about Justice Breyer?",
});
console.log({ res });
/*
{
  res: {
    text: 'The president said that Justice Breyer was an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court and thanked him for his service.'
  }
}
*/

API Reference: OpenAI from langchain/llms/openai, RetrievalQAChain from langchain/chains, HNSWLib from langchain/vectorstores/hnswlib, OpenAIEmbeddings from langchain/embeddings/openai, RecursiveCharacterTextSplitter from langchain/text_splitter

Custom QA chain

In the below example, we are using a VectorStore as the Retriever and a MapReduceDocumentsChain as the QA chain.

import { OpenAI } from "langchain/llms/openai";
import { RetrievalQAChain, loadQAMapReduceChain } from "langchain/chains";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import * as fs from "fs";

// Initialize the LLM to use to answer the question.
const model = new OpenAI({});
const text = fs.readFileSync("state_of_the_union.txt", "utf8");
const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
const docs = await textSplitter.createDocuments([text]);

// Create a vector store from the documents.
const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());

// Create a chain that uses a map reduce chain and HNSWLib vector store.
const chain = new RetrievalQAChain({
  combineDocumentsChain: loadQAMapReduceChain(model),
  retriever: vectorStore.asRetriever(),
});
const res = await chain.call({
  query: "What did the president say about Justice Breyer?",
});
console.log({ res });
/*
{
  res: {
    text: " The president said that Justice Breyer has dedicated his life to serve his country, and thanked him for his service. He also said that Judge Ketanji Brown Jackson will continue Justice Breyer's legacy of excellence, emphasizing the importance of protecting the rights of citizens, especially women, LGBTQ+ Americans, and access to healthcare. He also expressed his commitment to supporting the younger transgender Americans in America and ensuring they are able to reach their full potential, offering a Unity Agenda for the Nation to beat the opioid epidemic and increase funding for prevention, treatment, harm reduction, and recovery."
  }
}
*/

API Reference: OpenAI from langchain/llms/openai, RetrievalQAChain from langchain/chains, loadQAMapReduceChain from langchain/chains, HNSWLib from langchain/vectorstores/hnswlib, OpenAIEmbeddings from langchain/embeddings/openai, RecursiveCharacterTextSplitter from langchain/text_splitter

Custom prompts

You can pass in custom prompts to do question answering.
These prompts are the same prompts as you can pass into the base question answering chains.

import { OpenAI } from "langchain/llms/openai";
import { RetrievalQAChain, loadQAStuffChain } from "langchain/chains";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { PromptTemplate } from "langchain/prompts";
import * as fs from "fs";

const promptTemplate = `Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.

{context}

Question: {question}
Answer in Italian:`;
const prompt = PromptTemplate.fromTemplate(promptTemplate);

// Initialize the LLM to use to answer the question.
const model = new OpenAI({});
const text = fs.readFileSync("state_of_the_union.txt", "utf8");
const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
const docs = await textSplitter.createDocuments([text]);

// Create a vector store from the documents.
const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());

// Create a chain that uses a stuff chain and HNSWLib vector store.
const chain = new RetrievalQAChain({
  combineDocumentsChain: loadQAStuffChain(model, { prompt }),
  retriever: vectorStore.asRetriever(),
});
const res = await chain.call({
  query: "What did the president say about Justice Breyer?",
});
console.log({ res });
/*
{
  res: {
    text: ' Il presidente ha elogiato Justice Breyer per il suo servizio e lo ha ringraziato.'
  }
}
*/

API Reference: OpenAI from langchain/llms/openai, RetrievalQAChain from langchain/chains, loadQAStuffChain from langchain/chains, HNSWLib from langchain/vectorstores/hnswlib, OpenAIEmbeddings from langchain/embeddings/openai, RecursiveCharacterTextSplitter from langchain/text_splitter, PromptTemplate from langchain/prompts

Return Source Documents

Additionally, we can return the source documents used to answer the question by specifying an optional parameter when constructing the chain.

import { OpenAI } from "langchain/llms/openai";
import { RetrievalQAChain } from "langchain/chains";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import * as fs from "fs";

// Initialize the LLM to use to answer the question.
const model = new OpenAI({});
const text = fs.readFileSync("data/state_of_the_union_2022.txt", "utf8");
const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
const docs = await textSplitter.createDocuments([text]);

// Create a vector store from the documents.
const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());

// Create a chain that uses the default StuffDocumentsChain and the HNSWLib
// vector store, and returns the retrieved source documents with the answer.
const chain = RetrievalQAChain.fromLLM(model, vectorStore.asRetriever(), {
  returnSourceDocuments: true, // Can also be passed into the constructor
});
const res = await chain.call({
  query: "What did the president say about Justice Breyer?",
});
console.log(JSON.stringify(res, null, 2));
/*
{
  "text": " The president thanked Justice Breyer for his service and asked him to stand so he could be seen.",
  "sourceDocuments": [
    { "pageContent": "Justice Breyer, thank you for your service. Thank you, thank you, thank you. I mean it. Get up. Stand — let me see you. Thank you.\n\nAnd we all know — no matter what your ideology, we all know one of the most serious constitutional responsibilities a President has is nominating someone to serve on the United States Supreme Court.\n\nAs I did four days ago, I’ve nominated a Circuit Court of Appeals — Ketanji Brown Jackson.
One of our nation’s top legal minds who will continue in just Brey- — Justice Breyer’s legacy of excellence. A former top litigator in private practice, a former federal public defender from a family of public-school educators and police officers — she’s a consensus builder.\n\nSince she’s been nominated, she’s received a broad range of support, including the Fraternal Order of Police and former judges appointed by Democrats and Republicans.", "metadata": { "loc": { "lines": { "from": 481, "to": 487 } } } }, { "pageContent": "Since she’s been nominated, she’s received a broad range of support, including the Fraternal Order of Police and former judges appointed by Democrats and Republicans.\n\nJudge Ketanji Brown Jackson\nPresident Biden's Unity AgendaLearn More\nSince she’s been nominated, she’s received a broad range of support, including the Fraternal Order of Police and former judges appointed by Democrats and Republicans.\n\nFolks, if we are to advance liberty and justice, we need to secure our border and fix the immigration system.\n\nAnd as you might guess, I think we can do both.
At our border, we’ve installed new technology, like cutting-edge scanners, to better detect drug smuggling.\n\nWe’ve set up joint patrols with Mexico and Guatemala to catch more human traffickers.\n\nWe’re putting in place dedicated immigration judges in significant larger number so families fleeing persecution and violence can have their cases — cases heard faster — and those who aren’t legitimately here can be sent back.", "metadata": { "loc": { "lines": { "from": 487, "to": 499 } } } }, { "pageContent": "These laws don’t infringe on the Second Amendment; they save lives.\n\nGun Violence\n\n\nThe most fundamental right in America is the right to vote and have it counted. And look, it’s under assault.\n\nIn state after state, new laws have been passed not only to suppress the vote — we’ve been there before — but to subvert the entire election. We can’t let this happen.\n\nTonight, I call on the Senate to pass — pass the Freedom to Vote Act. Pass the John Lewis Act — Voting Rights Act. And while you’re at it, pass the DISCLOSE Act so Americans know who is funding our elections.\n\nLook, tonight, I’d — I’d like to honor someone who has dedicated his life to serve this country: Justice Breyer — an Army veteran, Constitutional scholar, retiring Justice of the United States Supreme Court.\n\nJustice Breyer, thank you for your service. Thank you, thank you, thank you. I mean it. Get up.
Stand — let me see you. Thank you.", "metadata": { "loc": { "lines": { "from": 468, "to": 481 } } } }, { "pageContent": "If you want to go forward not backwards, we must protect access to healthcare; preserve a woman’s right to choose — and continue to advance maternal healthcare for all Americans.\n\nRoe v. Wade\n\n\nAnd folks, for our LGBTQ+ Americans, let’s finally get the bipartisan Equality Act to my desk. The onslaught of state laws targeting transgender Americans and their families — it’s simply wrong.\n\nAs I said last year, especially to our younger transgender Americans, I’ll always have your back as your President so you can be yourself and reach your God-given potential.\n\nBipartisan Equality Act\n\n\nFolks as I’ve just demonstrated, while it often appears we do not agree and that — we — we do agree on a lot more things than we acknowledge.", "metadata": { "loc": { "lines": { "from": 511, "to": 523 } } } }
  ]
}
*/

API Reference: OpenAI from langchain/llms/openai, RetrievalQAChain from langchain/chains, HNSWLib from langchain/vectorstores/hnswlib, OpenAIEmbeddings from langchain/embeddings/openai, RecursiveCharacterTextSplitter from langchain/text_splitter
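The returned sourceDocuments can then be post-processed however you like. As an illustration, here is a small self-contained sketch that turns them into citation strings using the metadata.loc.lines shape shown in the output above. formatCitations and the SourceDoc type are hypothetical helpers for this example, not LangChain APIs:

```typescript
// Build human-readable citations from returned source documents.
// Assumes the `metadata.loc.lines` shape produced by the text splitter above.
type SourceDoc = {
  pageContent: string;
  metadata: { loc?: { lines?: { from: number; to: number } } };
};

function formatCitations(sourceDocuments: SourceDoc[]): string[] {
  return sourceDocuments.map((doc, i) => {
    const lines = doc.metadata.loc?.lines;
    const where = lines ? `lines ${lines.from}-${lines.to}` : "unknown location";
    // Keep a short, whitespace-normalized preview of the source text.
    const preview = doc.pageContent.slice(0, 60).replace(/\s+/g, " ");
    return `[${i + 1}] (${where}) ${preview}...`;
  });
}
```

Calling formatCitations(res.sourceDocuments) on the result above would yield one numbered citation per retrieved chunk, each tagged with the source line range.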
Conversational Retrieval QA
Page Title: Conversational Retrieval QA | 🦜️🔗 Langchain
Paragraphs: |
The ConversationalRetrievalQA chain builds on RetrievalQAChain to provide a chat history component.

It first combines the chat history (either explicitly passed in or retrieved from the provided memory) and the question into a standalone question, then looks up relevant documents from the retriever, and finally passes those documents and the question to a question answering chain to return a response.

To create one, you will need a retriever.
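Conceptually, the three steps just described can be sketched as follows. This is a simplified illustration with hypothetical stand-in functions (condenseQuestion, retrieveDocs, answerWithDocs), not LangChain's actual internals:

```typescript
// Simplified sketch of the ConversationalRetrievalQA control flow.
type Doc = { pageContent: string };

async function conversationalRetrievalQA(
  question: string,
  chatHistory: string[],
  condenseQuestion: (q: string, history: string[]) => Promise<string>,
  retrieveDocs: (q: string) => Promise<Doc[]>,
  answerWithDocs: (q: string, docs: Doc[]) => Promise<string>
): Promise<string> {
  // 1. Combine the chat history and the new question into a standalone question.
  const standalone = chatHistory.length
    ? await condenseQuestion(question, chatHistory)
    : question;
  // 2. Look up relevant documents from the retriever.
  const docs = await retrieveDocs(standalone);
  // 3. Pass the documents and question to a QA chain for the final response.
  return answerWithDocs(standalone, docs);
}
```

Note that when there is no chat history yet, the condensing step can be skipped entirely, which is why follow-up questions like "Was that nice?" only make sense once earlier turns exist.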
In the below example, we will create one from a vector store, which can be created from embeddings.

import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationalRetrievalQAChain } from "langchain/chains";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { BufferMemory } from "langchain/memory";
import * as fs from "fs";

export const run = async () => {
  /* Initialize the LLM to use to answer the question */
  const model = new ChatOpenAI({});
  /* Load in the file we want to do question answering over */
  const text = fs.readFileSync("state_of_the_union.txt", "utf8");
  /* Split the text into chunks */
  const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
  const docs = await textSplitter.createDocuments([text]);
  /* Create the vectorstore */
  const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());
  /* Create the chain */
  const chain = ConversationalRetrievalQAChain.fromLLM(
    model,
    vectorStore.asRetriever(),
    {
      memory: new BufferMemory({
        memoryKey: "chat_history", // Must be set to "chat_history"
      }),
    }
  );
  /* Ask it a question */
  const question = "What did the president say about Justice Breyer?";
  const res = await chain.call({ question });
  console.log(res);
  /* Ask it a follow up question */
  const followUpRes = await chain.call({
    question: "Was that nice?",
  });
  console.log(followUpRes);
};

API Reference: ChatOpenAI from langchain/chat_models/openai, ConversationalRetrievalQAChain from langchain/chains, HNSWLib from langchain/vectorstores/hnswlib, OpenAIEmbeddings from langchain/embeddings/openai, RecursiveCharacterTextSplitter from langchain/text_splitter, BufferMemory from langchain/memory

In the above code snippet, the fromLLM method of the ConversationalRetrievalQAChain class has the following signature:

static fromLLM(
  llm: BaseLanguageModel,
  retriever: BaseRetriever,
  options?: {
    questionGeneratorChainOptions?: {
      llm?: BaseLanguageModel;
      template?: string;
    };
    qaChainOptions?: QAChainParams;
    returnSourceDocuments?: boolean;
  }
): ConversationalRetrievalQAChain

Here's an explanation of each of the attributes of the options object:

questionGeneratorChainOptions: An object that allows you to pass a custom template and LLM to the underlying question generation chain. If the template is provided, the ConversationalRetrievalQAChain will use this template to generate a question from the conversation context instead of using the question provided in the question parameter. Passing in a separate LLM (llm) here allows you to use a cheaper/faster model to create the condensed question while using a more powerful model for the final response, and can reduce unnecessary latency.

qaChainOptions: Options that allow you to customize the specific QA chain used in the final step. The default is the StuffDocumentsChain, but you can customize which chain is used by passing in a type parameter.
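To make the options shape concrete, here is a small self-contained sketch of an options object matching the signature above. The local type definitions are illustrative stand-ins for LangChain's exported types (BaseLanguageModel, QAChainParams), and the template text is a hypothetical example; in real code the object would be passed as the third argument to ConversationalRetrievalQAChain.fromLLM:

```typescript
// Illustrative stand-ins for the documented option types; LangChain's real
// QAChainParams union carries per-chain-type fields beyond what is shown here.
type QAChainType = "stuff" | "map_reduce" | "refine";

interface ConversationalRetrievalQAOptions {
  questionGeneratorChainOptions?: {
    llm?: unknown; // stand-in for BaseLanguageModel: a cheaper model can go here
    template?: string;
  };
  qaChainOptions?: { type?: QAChainType };
  returnSourceDocuments?: boolean;
}

// Example options: condense follow-ups with a custom template, answer with a
// map reduce QA chain, and return the retrieved source documents.
const options: ConversationalRetrievalQAOptions = {
  questionGeneratorChainOptions: {
    template:
      "Given the conversation below and a follow up question, rephrase the " +
      "follow up question as a standalone question.\n\n" +
      "Chat history: {chat_history}\nFollow up question: {question}\n" +
      "Standalone question:",
  },
  qaChainOptions: { type: "map_reduce" },
  returnSourceDocuments: true,
};
```

Pairing a small llm under questionGeneratorChainOptions with a stronger model as the first argument to fromLLM is the latency optimization described above: the cheap model only rewrites the question, while the strong model produces the final answer.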