{
 "cells": [
  {
   "cell_type": "markdown",
   "source": [
     "Many AI applications need to share context (conversation history) across multiple interactions. In LangGraph, this kind of memory can be added to any StateGraph via thread-level persistence: when building a graph, you enable state persistence by passing a checkpointer at compile time.\n",
    "\n"
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "f2b64f5446f56698"
  },
  {
   "cell_type": "code",
   "outputs": [],
   "source": [
    "from dotenv import load_dotenv\n",
    "from langchain_community.llms.tongyi import Tongyi\n",
    "from langgraph.constants import START\n",
    "from langgraph.graph import StateGraph, MessagesState\n",
    "\n",
    "load_dotenv()\n",
    "\n",
    "llm = Tongyi()\n",
    "\n",
     "def call_model(state: MessagesState):\n",
     "    response = llm.invoke(state[\"messages\"])\n",
     "    # Tongyi returns a plain string, so label it as an assistant message\n",
     "    return {\"messages\": [(\"assistant\", response)]}\n",
    "\n",
    "builder = StateGraph(MessagesState)\n",
    "builder.add_node(\"call_model\", call_model)\n",
    "builder.add_edge(START, \"call_model\")\n",
    "graph = builder.compile()"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2024-11-08T09:27:24.058449Z",
     "start_time": "2024-11-08T09:27:24.050999Z"
    }
   },
   "id": "b3f96d010fb5137f",
   "execution_count": 8
  },
  {
   "cell_type": "code",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "================================\u001B[1m Human Message \u001B[0m=================================\n",
      "\n",
      "hi! I'm bob\n",
      "================================\u001B[1m Human Message \u001B[0m=================================\n",
      "\n",
      "Hello Bob! It's nice to meet you. How can I assist you today? Is there anything specific you'd like to talk about or any questions you have?\n",
      "================================\u001B[1m Human Message \u001B[0m=================================\n",
      "\n",
      "what's my name?\n",
      "================================\u001B[1m Human Message \u001B[0m=================================\n",
      "\n",
      "I'm sorry, but I don't know your name. As an AI language model, I don't have access to personal information about you unless you've shared it with me in this conversation. If you'd like to tell me your name, I'd be happy to learn it!\n"
     ]
    },
    {
     "data": {
      "text/plain": "{'messages': [HumanMessage(content=\"what's my name?\", additional_kwargs={}, response_metadata={}, id='a858bd3d-cf1a-4105-a75b-f9b3539ce376'),\n  HumanMessage(content=\"I’m sorry, but I don't have access to personal information about you, and I don't know your name. Can you tell me what your name is?\", additional_kwargs={}, response_metadata={}, id='1af420af-87f3-4cc0-9bb9-1b259cbd125a')]}"
     },
     "execution_count": 9,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "input_message = {\"type\": \"user\", \"content\": \"hi! I'm bob\"}\n",
    "for chunk in graph.stream({\"messages\": [input_message]}, stream_mode=\"values\"):\n",
    "    chunk[\"messages\"][-1].pretty_print()\n",
    "\n",
    "input_message = {\"type\": \"user\", \"content\": \"what's my name?\"}\n",
     "for chunk in graph.stream({\"messages\": [input_message]}, stream_mode=\"values\"):\n",
    "    chunk[\"messages\"][-1].pretty_print()"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2024-11-08T09:27:32.130474Z",
     "start_time": "2024-11-08T09:27:24.060899Z"
    }
   },
   "id": "51a5185bd31a6913",
   "execution_count": 9
  },
  {
   "cell_type": "markdown",
   "source": [
    "## Add persistence\n",
     "To add persistence, we pass a [Checkpointer](https://langchain-ai.github.io/langgraph/reference/checkpoints/#langgraph.checkpoint.base.BaseCheckpointSaver) when compiling the graph.\n",
    "\n"
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "fd63cab4a042f445"
  },
  {
   "cell_type": "code",
   "outputs": [],
   "source": [
    "from langgraph.checkpoint.memory import MemorySaver\n",
    "\n",
    "memory = MemorySaver()\n",
    "graph = builder.compile(checkpointer=memory)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2024-12-02T09:39:37.051418Z",
     "start_time": "2024-12-02T09:39:35.162464Z"
    }
   },
   "id": "61ff0935e6c994cb",
   "execution_count": 1
  },
  {
   "cell_type": "markdown",
   "source": [
     "Now we can interact with the agent and see that it remembers previous messages!"
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "7e9d7e6fc0ef390e"
  },
  {
   "cell_type": "code",
   "outputs": [
    {
     "data": {
      "text/plain": "{'messages': [HumanMessage(content=\"hi! I'm bob\", additional_kwargs={}, response_metadata={}, id='03540951-13f8-469f-b884-ab7959433ce3'),\n  HumanMessage(content=\"Hello Bob! It's nice to meet you. How can I assist you today? Is there anything specific you'd like to chat about or any questions you have?\", additional_kwargs={}, response_metadata={}, id='00ed6048-29c5-43e5-ad7f-c7ee12fa0401')]}"
     },
     "execution_count": 11,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "config = {\"configurable\": {\"thread_id\": \"1\"}}\n",
    "input_message = {\"type\": \"user\", \"content\": \"hi! I'm bob\"}\n",
    "for chunk in graph.stream({\"messages\": [input_message]}, config, stream_mode=\"values\"):\n",
    "    chunk[\"messages\"][-1].pretty_print()"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2024-11-08T09:27:34.249706Z",
     "start_time": "2024-11-08T09:27:32.141912Z"
    }
   },
   "id": "a3657c8446cc7aa6",
   "execution_count": 11
  },
  {
   "cell_type": "markdown",
   "source": [
     "Continuing the conversation:"
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "7daea199eed717b1"
  },
  {
   "cell_type": "code",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "================================\u001B[1m Human Message \u001B[0m=================================\n",
      "\n",
      "what's my name?\n",
      "================================\u001B[1m Human Message \u001B[0m=================================\n",
      "\n",
      "Your name is Bob! You mentioned it when you said hello. How can I assist you today, Bob?\n"
     ]
    }
   ],
   "source": [
    "input_message = {\"type\": \"user\", \"content\": \"what's my name?\"}\n",
    "for chunk in graph.stream({\"messages\": [input_message]}, config, stream_mode=\"values\"):\n",
    "    chunk[\"messages\"][-1].pretty_print()"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2024-11-08T09:27:36.387630Z",
     "start_time": "2024-11-08T09:27:34.253050Z"
    }
   },
   "id": "1ed76679abf98b80",
   "execution_count": 12
  },
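  {
   "cell_type": "markdown",
   "source": [
    "We can also read the persisted state back directly with `get_state(config)`, which returns a `StateSnapshot` whose `values` field holds the accumulated messages. The sketch below is self-contained (a stub node stands in for the LLM so it runs without API access); with the graph above, `graph.get_state(config)` works the same way.\n"
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "b1c2d3e4f5a60718"
  },
  {
   "cell_type": "code",
   "outputs": [],
   "source": [
    "from langgraph.checkpoint.memory import MemorySaver\n",
    "from langgraph.constants import START\n",
    "from langgraph.graph import StateGraph, MessagesState\n",
    "\n",
    "# Stub node: a fixed reply stands in for the real LLM call\n",
    "def stub_model(state: MessagesState):\n",
    "    return {\"messages\": [(\"assistant\", \"Hello Bob!\")]}\n",
    "\n",
    "demo_builder = StateGraph(MessagesState)\n",
    "demo_builder.add_node(\"stub_model\", stub_model)\n",
    "demo_builder.add_edge(START, \"stub_model\")\n",
    "demo_graph = demo_builder.compile(checkpointer=MemorySaver())\n",
    "\n",
    "demo_config = {\"configurable\": {\"thread_id\": \"demo\"}}\n",
    "demo_graph.invoke({\"messages\": [(\"user\", \"hi! I'm bob\")]}, demo_config)\n",
    "\n",
    "# The checkpointer has saved both messages for this thread\n",
    "snapshot = demo_graph.get_state(demo_config)\n",
    "for message in snapshot.values[\"messages\"]:\n",
    "    message.pretty_print()"
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "c9d8e7f6a5b40312"
  },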
  {
   "cell_type": "markdown",
   "source": [
     "If we want to start a new conversation, we can pass in a different thread_id. All of the memory is gone!\n",
    "\n"
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "7132d9b0b5b0ed99"
  },
  {
   "cell_type": "code",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "================================\u001B[1m Human Message \u001B[0m=================================\n",
      "\n",
      "what's my name?\n",
      "================================\u001B[1m Human Message \u001B[0m=================================\n",
      "\n",
      "I'm sorry, but I don't have enough information to know your name. Could you please tell me what your name is?\n"
     ]
    }
   ],
   "source": [
    "input_message = {\"type\": \"user\", \"content\": \"what's my name?\"}\n",
    "for chunk in graph.stream(\n",
    "    {\"messages\": [input_message]},\n",
    "    {\"configurable\": {\"thread_id\": \"2\"}},\n",
    "    stream_mode=\"values\",\n",
    "):\n",
    "    chunk[\"messages\"][-1].pretty_print()"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2024-11-08T09:27:38.128544Z",
     "start_time": "2024-11-08T09:27:36.391690Z"
    }
   },
   "id": "d5391c77ba00aee6",
   "execution_count": 13
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
