{
  "cells": [
    {
      "cell_type": "markdown",
      "id": "c298a5c9-b9af-481d-9eba-cbd65f987a8a",
      "metadata": {},
      "source": [
        "# How to use BaseChatMessageHistory with LangGraph\n",
        "\n",
        ":::info Prerequisites\n",
        "\n",
        "This guide assumes familiarity with the following concepts:\n",
        "\n",
        "- [Chat History](/docs/concepts/chat_history)\n",
        "- [RunnableWithMessageHistory](https://api.js.langchain.com/classes/_langchain_core.runnables.RunnableWithMessageHistory.html)\n",
        "- [LangGraph](https://langchain-ai.github.io/langgraphjs/concepts/high_level/)\n",
        "- [Memory](https://langchain-ai.github.io/langgraphjs/concepts/agentic_concepts/#memory)\n",
        "\n",
        ":::\n",
        "\n",
        "We recommend that new LangChain applications take advantage of the [built-in LangGraph persistence](https://langchain-ai.github.io/langgraphjs/concepts/persistence/) to implement memory.\n",
        "\n",
        "In some situations, users may need to keep using an existing persistence solution for chat message history.\n",
        "\n",
        "Here, we will show how to use [LangChain chat message histories](/docs/integrations/memory/) (implementations of [BaseChatMessageHistory](https://api.js.langchain.com/classes/_langchain_core.chat_history.BaseChatMessageHistory.html)) with LangGraph."
      ]
    },
    {
      "cell_type": "markdown",
      "id": "548bc988-167b-43f1-860a-d247e28b2b42",
      "metadata": {},
      "source": [
        "## Set up\n",
        "\n",
        "```typescript\n",
        "process.env.ANTHROPIC_API_KEY = 'YOUR_API_KEY'\n",
        "```\n",
        "\n",
        "```{=mdx}\n",
        "import Npm2Yarn from \"@theme/Npm2Yarn\"\n",
        "\n",
        "<Npm2Yarn>\n",
        "  @langchain/core @langchain/langgraph @langchain/anthropic\n",
        "</Npm2Yarn>\n",
        "```"
      ]
    },
    {
      "attachments": {},
      "cell_type": "markdown",
      "id": "c5e08659-b68c-48f2-8b33-e79b0c6999e1",
      "metadata": {},
      "source": [
        "## ChatMessageHistory\n",
        "\n",
        "A message history needs to be parameterized by a conversation ID, or perhaps by a 2-tuple of (user ID, conversation ID).\n",
        "\n",
        "Many of the [LangChain chat message histories](/docs/integrations/memory/) will have either a `sessionId` or some `namespace` to allow keeping track of different conversations. Refer to the specific implementations to check how they are parameterized.\n",
        "\n",
        "The built-in `InMemoryChatMessageHistory` does not contain such a parameterization, so we'll create an object to keep track of the message histories, keyed by session ID."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 1,
      "id": "28049308-2543-48e6-90d0-37a88951a637",
      "metadata": {},
      "outputs": [],
      "source": [
        "import { InMemoryChatMessageHistory } from \"@langchain/core/chat_history\";\n",
        "\n",
        "const chatsBySessionId: Record<string, InMemoryChatMessageHistory> = {};\n",
        "\n",
        "const getChatHistory = (sessionId: string) => {\n",
        "  let chatHistory: InMemoryChatMessageHistory | undefined = chatsBySessionId[sessionId];\n",
        "  if (!chatHistory) {\n",
        "    chatHistory = new InMemoryChatMessageHistory();\n",
        "    chatsBySessionId[sessionId] = chatHistory;\n",
        "  }\n",
        "  return chatHistory;\n",
        "};"
      ]
    },
    {
      "attachments": {},
      "cell_type": "markdown",
      "id": "94c53ce3-4212-41e6-8ad3-f0ab5df6130f",
      "metadata": {},
      "source": [
        "## Use with LangGraph\n",
        "\n",
        "Next, we'll set up a basic chat bot using LangGraph. If you're not familiar with LangGraph, see the [Quick Start Tutorial](https://langchain-ai.github.io/langgraphjs/tutorials/quickstart/).\n",
        "\n",
        "We'll create a [LangGraph node](https://langchain-ai.github.io/langgraphjs/concepts/low_level/#nodes) for the chat model, and manually manage the conversation history, taking into account the conversation ID passed as part of the RunnableConfig.\n",
        "\n",
        "The conversation ID can be passed as either part of the RunnableConfig (as we'll do here), or as part of the [graph state](https://langchain-ai.github.io/langgraphjs/concepts/low_level/#state)."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 2,
      "id": "d818e23f",
      "metadata": {},
      "outputs": [
        {
          "name": "stdout",
          "output_type": "stream",
          "text": [
            "hi! I'm bob\n",
            "Hello Bob! It's nice to meet you. How can I assist you today?\n",
            "what was my name?\n",
            "You said your name is Bob.\n"
          ]
        }
      ],
      "source": [
        "import { v4 as uuidv4 } from \"uuid\";\n",
        "import { ChatAnthropic } from \"@langchain/anthropic\";\n",
        "import { StateGraph, MessagesAnnotation, END, START } from \"@langchain/langgraph\";\n",
        "import { HumanMessage } from \"@langchain/core/messages\";\n",
        "import { RunnableConfig } from \"@langchain/core/runnables\";\n",
        "\n",
        "// Define a chat model\n",
        "const model = new ChatAnthropic({ modelName: \"claude-3-haiku-20240307\" });\n",
        "\n",
        "// Define the function that calls the model\n",
        "const callModel = async (\n",
        "  state: typeof MessagesAnnotation.State,\n",
        "  config: RunnableConfig\n",
        "): Promise<Partial<typeof MessagesAnnotation.State>> => {\n",
        "  if (!config.configurable?.sessionId) {\n",
        "    throw new Error(\n",
        "      \"Make sure that the config includes the following information: {'configurable': {'sessionId': 'some_value'}}\"\n",
        "    );\n",
        "  }\n",
        "\n",
        "  const chatHistory = getChatHistory(config.configurable.sessionId as string);\n",
        "\n",
        "  const messages = [...(await chatHistory.getMessages()), ...state.messages];\n",
        "\n",
        "  if (state.messages.length === 1) {\n",
        "    // First message, ensure it's in the chat history\n",
        "    await chatHistory.addMessage(state.messages[0]);\n",
        "  }\n",
        "\n",
        "  const aiMessage = await model.invoke(messages);\n",
        "\n",
        "  // Update the chat history\n",
        "  await chatHistory.addMessage(aiMessage);\n",
        "\n",
        "  return { messages: [aiMessage] };\n",
        "};\n",
        "\n",
        "// Define a new graph\n",
        "const workflow = new StateGraph(MessagesAnnotation)\n",
        "  .addNode(\"model\", callModel)\n",
        "  .addEdge(START, \"model\")\n",
        "  .addEdge(\"model\", END);\n",
        "\n",
        "const app = workflow.compile();\n",
        "\n",
        "// Create a unique session ID to identify the conversation\n",
        "const sessionId = uuidv4();\n",
        "const config = { configurable: { sessionId }, streamMode: \"values\" as const };\n",
        "\n",
        "const inputMessage = new HumanMessage(\"hi! I'm bob\");\n",
        "\n",
        "for await (const event of await app.stream({ messages: [inputMessage] }, config)) {\n",
        "  const lastMessage = event.messages[event.messages.length - 1];\n",
        "  console.log(lastMessage.content);\n",
        "}\n",
        "\n",
        "// Here, let's confirm that the AI remembers our name!\n",
        "const followUpMessage = new HumanMessage(\"what was my name?\");\n",
        "\n",
        "for await (const event of await app.stream({ messages: [followUpMessage] }, config)) {\n",
        "  const lastMessage = event.messages[event.messages.length - 1];\n",
        "  console.log(lastMessage.content);\n",
        "}"
      ]
    },
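    {
      "cell_type": "markdown",
      "id": "f7b2c8e1-3d54-4a1b-9c6e-2f8d0a4b7e31",
      "metadata": {},
      "source": [
        "As mentioned above, the conversation ID can also travel in the [graph state](https://langchain-ai.github.io/langgraphjs/concepts/low_level/#state) rather than the `RunnableConfig`. Here is a minimal sketch of that variant; the `StateWithSession` annotation and its `sessionId` channel are names we chose for illustration, not part of LangGraph's API:\n",
        "\n",
        "```typescript\n",
        "import { Annotation, MessagesAnnotation, StateGraph, START, END } from \"@langchain/langgraph\";\n",
        "\n",
        "// Extend the prebuilt messages state with a sessionId channel\n",
        "const StateWithSession = Annotation.Root({\n",
        "  ...MessagesAnnotation.spec,\n",
        "  sessionId: Annotation<string>,\n",
        "});\n",
        "\n",
        "// Same logic as callModel above, but reading the session ID from state\n",
        "const callModelWithStateSession = async (\n",
        "  state: typeof StateWithSession.State\n",
        "): Promise<Partial<typeof StateWithSession.State>> => {\n",
        "  if (!state.sessionId) {\n",
        "    throw new Error(\"Make sure the input includes a sessionId field.\");\n",
        "  }\n",
        "  const chatHistory = getChatHistory(state.sessionId);\n",
        "  const messages = [...(await chatHistory.getMessages()), ...state.messages];\n",
        "  if (state.messages.length === 1) {\n",
        "    // First message, ensure it's in the chat history\n",
        "    await chatHistory.addMessage(state.messages[0]);\n",
        "  }\n",
        "  const aiMessage = await model.invoke(messages);\n",
        "  await chatHistory.addMessage(aiMessage);\n",
        "  return { messages: [aiMessage] };\n",
        "};\n",
        "\n",
        "const stateApp = new StateGraph(StateWithSession)\n",
        "  .addNode(\"model\", callModelWithStateSession)\n",
        "  .addEdge(START, \"model\")\n",
        "  .addEdge(\"model\", END)\n",
        "  .compile();\n",
        "\n",
        "// Invoke with the session ID carried in the input state:\n",
        "// await stateApp.invoke({ messages: [inputMessage], sessionId });\n",
        "```\n",
        "\n",
        "Carrying the session ID in state keeps the graph's inputs self-describing, at the cost of mixing routing information into the conversation state."
      ]
    },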
    {
      "attachments": {},
      "cell_type": "markdown",
      "id": "da0536dd-9a0b-49e3-b0b6-e8c7abf3b1f9",
      "metadata": {},
      "source": [
        "## Using with RunnableWithMessageHistory\n",
        "\n",
        "This how-to guide used the `getMessages` and `addMessage` interface of `BaseChatMessageHistory` directly.\n",
        "\n",
        "Alternatively, you can use [RunnableWithMessageHistory](https://api.js.langchain.com/classes/_langchain_core.runnables.RunnableWithMessageHistory.html), as [LCEL](/docs/concepts/lcel/) can be used inside any [LangGraph node](https://langchain-ai.github.io/langgraphjs/concepts/low_level/#nodes).\n",
        "\n",
        "To do that, replace the following code:\n",
        "\n",
        "```typescript\n",
        "const callModel = async (\n",
        "  state: typeof MessagesAnnotation.State,\n",
        "  config: RunnableConfig\n",
        "): Promise<Partial<typeof MessagesAnnotation.State>> => {\n",
        "  // highlight-start\n",
        "  if (!config.configurable?.sessionId) {\n",
        "    throw new Error(\n",
        "      \"Make sure that the config includes the following information: {'configurable': {'sessionId': 'some_value'}}\"\n",
        "    );\n",
        "  }\n",
        "\n",
        "  const chatHistory = getChatHistory(config.configurable.sessionId as string);\n",
        "\n",
        "  const messages = [...(await chatHistory.getMessages()), ...state.messages];\n",
        "\n",
        "  if (state.messages.length === 1) {\n",
        "    // First message, ensure it's in the chat history\n",
        "    await chatHistory.addMessage(state.messages[0]);\n",
        "  }\n",
        "\n",
        "  const aiMessage = await model.invoke(messages);\n",
        "\n",
        "  // Update the chat history\n",
        "  await chatHistory.addMessage(aiMessage);\n",
        "  // highlight-end\n",
        "  return { messages: [aiMessage] };\n",
        "};\n",
        "```\n",
        "\n",
        "with the corresponding instance of `RunnableWithMessageHistory` defined in your current application:\n",
        "\n",
        "```typescript\n",
        "const runnable = new RunnableWithMessageHistory({\n",
        "  // ... configuration from existing code\n",
        "});\n",
        "\n",
        "const callModel = async (\n",
        "  state: typeof MessagesAnnotation.State,\n",
        "  config: RunnableConfig\n",
        "): Promise<Partial<typeof MessagesAnnotation.State>> => {\n",
        "  // RunnableWithMessageHistory takes care of reading the message history\n",
        "  // and updating it with the new human message and AI response.\n",
        "  const aiMessage = await runnable.invoke(state.messages, config);\n",
        "  return {\n",
        "    messages: [aiMessage]\n",
        "  };\n",
        "};\n",
        "```"
      ]
    }
  ],
  "metadata": {
    "kernelspec": {
      "display_name": "TypeScript",
      "language": "typescript",
      "name": "tslab"
    },
    "language_info": {
      "codemirror_mode": {
        "mode": "typescript",
        "name": "javascript",
        "typescript": true
      },
      "file_extension": ".ts",
      "mimetype": "text/typescript",
      "name": "typescript",
      "version": "3.7.2"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 5
}