| id | text |
|---|---|
4e9727215e95-2700 | import { OpenAIModerationChain, LLMChain } from "langchain/chains";import { PromptTemplate } from "langchain/prompts";import { OpenAI } from "langchain/llms/openai";// A string containing potentially offensive content from the userconst badString = "Bad naughty words from user";try { // Create a new instance of the Op... |
4e9727215e95-2701 | Paragraphs:
Skip to main content🦜️🔗 LangChainDocsUse casesAPILangSmithPython DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalOpenAI functions chainsAnalyze DocumentSelf-critique chain with constitutional AIModerationDynamically s... |
4e9727215e95-2702 | When you don't know the answer to a question you admit that you don't know.Here is a question:{input}`;const mathTemplate = `You are a very good mathematician. You are great at answering math questions. You are so good because you are able to break down hard problems into their component parts, answer the component par... |
4e9727215e95-2706 | You are great at answering questions about history in a concise and easy to understand manner. When you don't know the answer to a question you admit that you don't know.Here is a question:{input}`;const promptTemplates = [physicsTemplate, mathTemplate, historyTemplate];const multiPromptChain = MultiPromptChain.fromLLM... |
4e9727215e95-2707 | Dynamically selecting from multiple promptsThis notebook demonstrates how to use the RouterChain paradigm to create a chain that dynamically selects the prompt to use for a given input. Specifically we show how to use the MultiPromptChain to create a question-answering chain that selects the prompt which is most releva... |
4e9727215e95-2708 | When you don't know the answer to a question you admit that you don't know.Here is a question:{input}`;const promptTemplates = [physicsTemplate, mathTemplate, historyTemplate];const multiPromptChain = MultiPromptChain.fromLLMAndPrompts(llm, { promptNames, promptDescriptions, promptTemplates,});const testPromise1 = m... |
4e9727215e95-2709 | import { MultiPromptChain } from "langchain/chains";import { OpenAIChat } from "langchain/llms/openai";const llm = new OpenAIChat();const promptNames = ["physics", "math", "history"];const promptDescriptions = [ "Good for answering questions about physics", "Good for answering math questions", "Good for answering qu... |
4e9727215e95-2710 | ",});const testPromise2 = multiPromptChain.call({ input: "What is the derivative of x^2? ",});const testPromise3 = multiPromptChain.call({ input: "Who was the first president of the United States? ",});const [{ text: result1 }, { text: result2 }, { text: result3 }] = await Promise.all([testPromise1, testPromise2, te... |
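The `MultiPromptChain` rows above route an input question to one of several named prompts. Conceptually, the router maps an input to the prompt whose description fits best. The sketch below illustrates that routing idea only; it is a naive keyword-overlap heuristic, not LangChain's actual implementation (which asks an LLM to choose):

```typescript
// Illustrative sketch of prompt routing by description overlap.
// NOT LangChain's router: MultiPromptChain uses an LLM to pick the route.
type Route = { name: string; description: string };

function routePrompt(input: string, routes: Route[]): string {
  const words = new Set(input.toLowerCase().match(/[a-z]+/g) ?? []);
  let best = routes[0].name;
  let bestScore = -1;
  for (const r of routes) {
    const descWords = r.description.toLowerCase().match(/[a-z]+/g) ?? [];
    // Score = how many description words appear in the input.
    const score = descWords.filter((w) => words.has(w)).length;
    if (score > bestScore) {
      bestScore = score;
      best = r.name;
    }
  }
  return best;
}

// Same route names/descriptions as in the snippet above.
const routes: Route[] = [
  { name: "physics", description: "Good for answering questions about physics" },
  { name: "math", description: "Good for answering math questions" },
  { name: "history", description: "Good for answering questions about history" },
];
```

The real chain then runs the selected prompt's template against the input; here the point is only that routing is a classification step separate from answering.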
4e9727215e95-2711 | Specifically we show how to use the MultiRetrievalQAChain to create a question-answering chain that selects the retrieval QA chain which is most relevant for a given question, and then answers the question using it.import { MultiRetrievalQAChain } from "langchain/chains";import { OpenAIChat } from "langchain/llms/opena... |
4e9727215e95-2712 | We'll send him cheesy movies the worst we can find He'll have to sit and watch them all and we'll monitor his mind", "Now keep in mind Joel can't control where the movies begin or end Because he used those special parts to make his robot friends. Robot Roll Call Cambot Gypsy Tom Servo Croooow", "If you're wonderi... |
4e9727215e95-2713 | pay-or-play contracts We're zany to the max There's baloney in our slacks", "We're Animanie Totally insaney Here's the show's namey", "Animaniacs Those are the facts", ], { series: "Animaniacs" }, embeddings);const llm = new OpenAIChat();const retrieverNames = ["aqua teen", "mst3k", "animaniacs"];const retriev... |
4e9727215e95-2714 | ",});const [ { text: result1, sourceDocuments: sourceDocuments1 }, { text: result2, sourceDocuments: sourceDocuments2 }, { text: result3, sourceDocuments: sourceDocuments3 },] = await Promise.all([testPromise1, testPromise2, testPromise3]);console.log(sourceDocuments1, sourceDocuments2, sourceDocuments3);console.log... |
4e9727215e95-2720 | There was a guy named Joel not too different from you or me. He worked at Gizmonic Institute, just another face in a red jumpsuit", "He did a good job cleaning up the place but his bosses didn't like him so they shot him into space. We'll send him cheesy movies the worst we can find He'll have to sit and watch them ... |
4e9727215e95-2723 | Dynamically selecting from multiple retrieversThis notebook demonstrates how to use the RouterChain paradigm to create a chain that dynamically selects which Retrieval system to use. Specifically we show how to use the MultiRetrievalQAChain to create a question-answering chain that selects the retrieval QA chain which ... |
4e9727215e95-2727 | import { MultiRetrievalQAChain } from "langchain/chains";import { OpenAIChat } from "langchain/llms/openai";import { OpenAIEmbeddings } from "langchain/embeddings/openai";import { MemoryVectorStore } from "langchain/vectorstores/memory";const embeddings = new OpenAIEmbeddings();const aquaTeen = await MemoryVectorStore.... |
4e9727215e95-2728 | Robot Roll Call Cambot Gypsy Tom Servo Croooow", "If you're wondering how he eats and breathes and other science facts La la la just repeat to yourself it's just a show I should really just relax.
For Mystery Science Theater 3000", ], { series: "Mystery Science Theater 3000" }, embeddings);const animaniacs = awa... |
4e9727215e95-2731 | These are designed to be modular and useful regardless of how they are used.
Secondly, LangChain provides easy ways to incorporate these utilities into chains.Get startedMemory involves keeping a concept of state around throughout a user's interactions with a language model. A user's interactions with a language mode... |
4e9727215e95-2732 | This is a super lightweight wrapper which exposes convenience methods for saving Human messages, AI messages, and then fetching them all.Subclassing this class allows you to use different storage solutions, such as Redis, to keep persistent chat message histories.import { ChatMessageHistory } from "langchain/memory";co... |
4e9727215e95-2733 | This lets you easily pick up state from past conversations. In addition to the above technique, you can do:import { BufferMemory, ChatMessageHistory } from "langchain/memory";import { HumanMessage, AIMessage } from "langchain/schema";const pastMessages = [ new HumanMessage("My name's Jonas"), new AIMessage("N... |
4e9727215e95-2734 | "}const res2 = await chain.call({ input: "What's my name?" });console.log({ res2 });{response: ' You said your name is Jim. Is there anything else you would like to talk about? '}There are plenty of different types of memory, check out our examples to see more!Creating your own memory classThe BaseMemory interface has... |
4e9727215e95-2735 | The loadMemoryVariables method is responsible for returning the memory variables that are relevant for the current input values.abstract class BaseMemory { abstract loadMemoryVariables(values: InputValues): Promise<MemoryVariables>; abstract saveContext( inputValues: InputValues, outputValues: OutputValues ): ... |
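The abstract `BaseMemory` interface above can be satisfied by any storage backend. Below is a minimal self-contained sketch of a custom memory class; the `InputValues`/`OutputValues`/`MemoryVariables` types are local stand-ins for illustration, not LangChain's real exports:

```typescript
// Local stand-in types; LangChain's actual types live in the library.
type InputValues = Record<string, string>;
type OutputValues = Record<string, string>;
type MemoryVariables = Record<string, string>;

abstract class BaseMemory {
  abstract loadMemoryVariables(values: InputValues): Promise<MemoryVariables>;
  abstract saveContext(
    inputValues: InputValues,
    outputValues: OutputValues
  ): Promise<void>;
}

// A toy memory that keeps every turn as "Human: ... / AI: ..." lines.
class SimpleBufferMemory extends BaseMemory {
  private turns: string[] = [];

  // Return the variables that should be injected into the prompt.
  async loadMemoryVariables(_values: InputValues): Promise<MemoryVariables> {
    return { history: this.turns.join("\n") };
  }

  // Record one human/AI exchange after each chain call.
  async saveContext(inputValues: InputValues, outputValues: OutputValues) {
    this.turns.push(`Human: ${inputValues.input}`);
    this.turns.push(`AI: ${outputValues.response}`);
  }
}
```

Any class with these two methods can be handed to a chain as its `memory`; swapping the array for Redis or a database changes only the storage, not the contract.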
4e9727215e95-2740 | Get startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsMemoryHow-toIntegrationsAgentsCallbacksModulesGuidesEcosystemAdditional resourcesCommunity navigatorAPI reference
ModulesMemoryOn this pageMemory🚧 Docs under construction 🚧By default, Chains and Agents are stateless,
meaning that the... |
4e9727215e95-2745 | ModulesMemoryOn this pageMemory🚧 Docs under construction 🚧By default, Chains and Agents are stateless,
meaning that they treat each incoming query independently (like the underlying LLMs and chat models themselves).
In some applications, like chatbots, it is essential
to remember previous interactions, both in the... |
4e9727215e95-2749 | meaning that they treat each incoming query independently (like the underlying LLMs and chat models themselves).
In some applications, like chatbots, it is essential
to remember previous interactions, both in the short and long-term.
The Memory class does exactly that.LangChain provides memory components in two form... |
4e9727215e95-2754 | Subclassing this class allows you to use different storage solutions, such as Redis, to keep persistent chat message histories.
import { ChatMessageHistory } from "langchain/memory";const history = new ChatMessageHistory();await history.addUserMessage("Hi! ");await history.addAIChatMessage("What's up? ");const message... |
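As the row above notes, `ChatMessageHistory` is a lightweight wrapper: typed append and read over a message list. The stand-in below mirrors the method names from the snippet purely for illustration; it is not the library class, and the `StoredMessage` shape is an assumption:

```typescript
// Illustrative stand-in for ChatMessageHistory, not the LangChain class.
type StoredMessage = { role: "human" | "ai"; text: string };

class InMemoryChatHistory {
  private messages: StoredMessage[] = [];

  async addUserMessage(text: string) {
    this.messages.push({ role: "human", text });
  }

  async addAIChatMessage(text: string) {
    this.messages.push({ role: "ai", text });
  }

  // Return a copy so callers cannot mutate internal state.
  async getMessages(): Promise<StoredMessage[]> {
    return [...this.messages];
  }
}
```

Subclassing the real `ChatMessageHistory` replaces the array with a persistent store (e.g. Redis) while keeping exactly this save/fetch surface.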
4e9727215e95-2755 | We now show how to use this simple concept in a chain. We first showcase BufferMemory, a wrapper around ChatMessageHistory that extracts the messages into an input variable.
import { OpenAI } from "langchain/llms/openai";import { BufferMemory } from "langchain/memory";import { ConversationChain } from "langchain/chain... |
4e9727215e95-2756 | abstract class BaseChatMemory extends BaseMemory { chatHistory: ChatMessageHistory; abstract loadMemoryVariables(values: InputValues): Promise<MemoryVariables>;}
If you want to implement a more custom memory class, you can subclass BaseMemory and implement both loadMemoryVariables and saveContext methods. The saveCo... |
4e9727215e95-2758 | This lets you easily pick up state from past conversations:import { BufferMemory, ChatMessageHistory } from "langchain/memory";import { HumanMessage, AIMessage } from "langchain/schema";const pastMessages = [ new HumanMessage("My name's Jonas"), new AIMessage("Nice to meet you, Jonas! "),];const memory = new BufferMe... |
4e9727215e95-2762 | Conversation buffer memoryThis notebook shows how to use BufferMemory. This memory allows for storing of messages, then later formats the messages into a prompt input variable.We can first extract it as a string.import { OpenAI } from "langchain/llms/openai";import { BufferMemory } from "langchain/memory";import { Conv... |
4e9727215e95-2763 | We can first extract it as a string.
import { OpenAI } from "langchain/llms/openai";import { BufferMemory } from "langchain/memory";import { ConversationChain } from "langchain/chains";const model = new OpenAI({});const memory = new BufferMemory();const chain = new ConversationChain({ llm: model, memory: memory });con... |
4e9727215e95-2764 | The key thing to notice is that setting returnMessages: true makes the memory return a list of chat messages instead of a string.import { ConversationChain } from "langchain/chains";import { ChatOpenAI } from "langchain/chat_models/openai";import { ChatPromptTemplate, HumanMessagePromptTemplate, SystemMessagePromptT... |
4e9727215e95-2766 | ", }); console.log(response);};API Reference:ConversationChain from langchain/chainsChatOpenAI from langchain/chat_models/openaiChatPromptTemplate from langchain/promptsHumanMessagePromptTemplate from langchain/promptsSystemMessagePromptTemplate from langchain/promptsMessagesPlaceholder from langchain/promptsBufferMe... |
4e9727215e95-2769 | API Reference:ConversationChain from langchain/chainsChatOpenAI from langchain/chat_models/openaiChatPromptTemplate from langchain/promptsHumanMessagePromptTemplate from langchain/promptsSystemMessagePromptTemplate from langchain/promptsMessagesPlaceholder from langchain/promptsBufferMemory from langchain/memory
Conve... |
4e9727215e95-2773 | Conversation buffer window memoryConversationBufferWindowMemory keeps a list of the interactions of the conversation over time. It only uses the last K interactions. This can be useful for keeping a sliding window of the most recent interactions, so the buffer does not get too largeLet's first explore the basic functio... |
4e9727215e95-2774 | Let's first explore the basic functionality of this type of memory.
import { OpenAI } from "langchain/llms/openai";import { BufferWindowMemory } from "langchain/memory";import { ConversationChain } from "langchain/chains";const model = new OpenAI({});const memory = new BufferWindowMemory({ k: 1 });const chain = new Co... |
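The `k` parameter of `BufferWindowMemory` is simply a sliding window over human/AI turn pairs: only the last `k` pairs are surfaced to the prompt. That windowing logic can be sketched on its own (this is an illustration of the idea, not LangChain code):

```typescript
// One human/AI exchange.
type Turn = { human: string; ai: string };

// Keep only the last k turn pairs, like BufferWindowMemory({ k }).
function windowTurns(turns: Turn[], k: number): Turn[] {
  return turns.slice(Math.max(0, turns.length - k));
}

// Render the windowed turns the way a buffer memory would inject them.
function formatHistory(turns: Turn[]): string {
  return turns.map((t) => `Human: ${t.human}\nAI: ${t.ai}`).join("\n");
}
```

With `k: 1`, as in the snippet above, each new call sees only the single most recent exchange, so the buffer never grows with conversation length.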
4e9727215e95-2778 | Buffer Window MemoryBufferWindowMemory keeps track of the back-and-forths in conversation, and then uses a window of size k to surface the last k back-and-forths to use as memory.import { OpenAI } from "langchain/llms/openai";import { BufferWindowMemory } from "langchain/memory";import { ConversationChain } from "langc... |
4e9727215e95-2779 | Page Title: Entity memory | 🦜️🔗 Langchain
|
4e9727215e95-2780 | memory, }); const res1 = await chain.call({ input: "Hi! I'm Jim." }); console.log({ res1, memory: await memory.loadMemoryVariables({ input: "Who is Jim?" }), }); const res2 = await chain.call({ input: "I work in construction. What about you? ", }); console.log({ res2, memory: await memory.loadMemo... |
4e9727215e95-2781 | I exist entirely in digital space and am here to assist you with any questions or tasks you may have. Is there anything specific you need help with regarding your work at the Utica branch of Dunder Mifflin? ", memory: { entities: { Jim: 'Jim is a human named Jim who works in sales. ', Utica: 'Utic... |
4e9727215e95-2783 | }); console.log({ res1, memory: await memory.loadMemoryVariables({ input: "Who is Jim?" }), }); const res2 = await chain.call({ input: "I work in construction. What about you? ", }); console.log({ res2, memory: await memory.loadMemoryVariables({ input: "Who is Jim?" }), });};API Reference:OpenAI fr... |
4e9727215e95-2784 | Is there anything specific you need help with regarding your work at the Utica branch of Dunder Mifflin? ", memory: { entities: { Jim: 'Jim is a human named Jim who works in sales. ', Utica: 'Utica is the location of the branch of Dunder Mifflin where Jim works. ', 'Dunder Mifflin': 'Dunder... |
4e9727215e95-2785 | ModulesMemoryHow-toEntity memoryEntity memoryEntity Memory remembers given facts about specific entities in a conversation. It extracts information on entities (using an LLM) and builds up its knowledge about that entity over time (also using an LLM).Let's first walk through using this functionality.import { OpenAI } f... |
4e9727215e95-2786 | }), });};API Reference:OpenAI from langchain/llms/openaiEntityMemory from langchain/memoryENTITY_MEMORY_CONVERSATION_TEMPLATE from langchain/memoryLLMChain from langchain/chainsInspecting the Memory StoreYou can also inspect the memory store directly to see the current summary of each entity:import { OpenAI } from "l... |
4e9727215e95-2789 | ', 'Dunder Mifflin': 'Dunder Mifflin has a branch in Utica.' } } }*/API Reference:OpenAI from langchain/llms/openaiEntityMemory from langchain/memoryENTITY_MEMORY_CONVERSATION_TEMPLATE from langchain/memoryLLMChain from langchain/chains
Entity Memory remembers given facts about specific entities in a conver... |
4e9727215e95-2791 | You can also inspect the memory store directly to see the current summary of each entity:
import { OpenAI } from "langchain/llms/openai";import { EntityMemory, ENTITY_MEMORY_CONVERSATION_TEMPLATE,} from "langchain/memory";import { LLMChain } from "langchain/chains";const memory = new EntityMemory({ llm: new OpenAI(... |
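Under the hood, entity memory boils down to a per-entity summary store that is updated as new facts arrive, which is what the `entities: { Jim: ..., Utica: ... }` output above shows. The toy version below uses naive string concatenation as the update rule; the real `EntityMemory` uses an LLM for both entity extraction and summarization:

```typescript
// Toy per-entity summary store; real EntityMemory updates summaries with an LLM.
class ToyEntityStore {
  private entities = new Map<string, string>();

  // Append a new fact to an entity's running summary.
  addFact(entity: string, fact: string) {
    const prev = this.entities.get(entity);
    this.entities.set(entity, prev ? `${prev} ${fact}` : fact);
  }

  getSummary(entity: string): string | undefined {
    return this.entities.get(entity);
  }
}
```

Inspecting the memory store, as the docs show, amounts to reading these accumulated per-entity summaries back out.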
4e9727215e95-2792 | How to use multiple memory classes in the same chain
Page Title: How to use multiple memory classes in the same chain | 🦜️🔗 Langchain
|
4e9727215e95-2793 | To combine multiple memory classes, we can initialize the CombinedMemory class, and then use that.import { ChatOpenAI } from "langchain/chat_models/openai";import { BufferMemory, CombinedMemory, ConversationSummaryMemory,} from "langchain/memory";import { ConversationChain } from "langchain/chains";import { PromptTe... |
4e9727215e95-2794 | });console.log({ res1 });/* { res1: { response: "Hello Jim! It's nice to meet you. How can I assist you today?" } }*/const res2 = await chain.call({ input: "Can you tell me a joke?" });console.log({ res2 });/* { res2: { response: 'Why did the scarecrow win an award? Because he was outstanding in his ... |
4e9727215e95-2796 | The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.Summary of conversation:{conversation_summary}Current conversation:{chat_history_lines}Human: {input}AI:`;const PROMPT = new PromptTemplate({ inputVariables:... |
4e9727215e95-2798 | If the AI does not know the answer to a question, it truthfully says it does not know.Summary of conversation:{conversation_summary}Current conversation:{chat_history_lines}Human: {input}AI:`;const PROMPT = new PromptTemplate({ inputVariables: ["input", "conversation_summary", "chat_history_lines"], template: _DEFAUL... |
4e9727215e95-2799 | How to use multiple memory classes in the same chainIt is also possible to use multiple memory classes in the same chain. To combine multiple memory classes, we can initialize the CombinedMemory class, and then use that.import { ChatOpenAI } from "langchain/chat_models/openai";import { BufferMemory, CombinedMemory, ... |
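`CombinedMemory`'s job is to fan `loadMemoryVariables` out across several memories and merge the resulting records, so that each memory's variable (here `conversation_summary` and `chat_history_lines`) lands in the prompt template. A sketch of that merge step, using a hypothetical minimal memory interface rather than LangChain's real `BaseMemory`:

```typescript
// Hypothetical minimal interface; LangChain's BaseMemory differs in detail.
interface MiniMemory {
  loadMemoryVariables(
    values: Record<string, string>
  ): Promise<Record<string, string>>;
}

// Merge the variables from every memory into one record,
// analogous to what CombinedMemory does before filling the prompt.
async function loadCombined(
  memories: MiniMemory[],
  values: Record<string, string>
): Promise<Record<string, string>> {
  const merged: Record<string, string> = {};
  for (const m of memories) {
    Object.assign(merged, await m.loadMemoryVariables(values));
  }
  return merged;
}
```

Because the records are merged by key, each memory in the combination must expose distinct variable names, which is why the prompt lists `conversation_summary` and `chat_history_lines` separately.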