{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "c20242c4-0010-4065-89f6-0e0b16c7da6e",
   "metadata": {
    "id": "c20242c4-0010-4065-89f6-0e0b16c7da6e"
   },
   "source": [
    "# Deployment\n",
    "\n",
    "## Review\n",
    "\n",
    "We built up to an agent with memory:\n",
    "\n",
    "* `act` - let the model call specific tools\n",
    "* `observe` - pass the tool output back to the model\n",
    "* `reason` - let the model reason about the tool output to decide what to do next (e.g., call another tool or just respond directly)\n",
    "* `persist state` - use an in-memory checkpointer to support long-running conversations with interruptions\n",
    "\n",
    "## Goals\n",
    "\n",
    "Now, we'll cover how to deploy our agent: first locally to LangGraph Studio, and then to `LangGraph Cloud`."
   ]
  },
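  {
   "cell_type": "markdown",
   "id": "added-agent-loop-sketch",
   "metadata": {},
   "source": [
    "The loop above can be sketched in plain Python. This is a minimal illustrative sketch, not LangGraph API: the `multiply` tool, the `model` stub, and the `checkpoints` dict are hypothetical stand-ins that show how act, observe, reason, and persisted state fit together.\n",
    "\n",
    "```python\n",
    "# Minimal sketch of the agent loop: act -> observe -> reason, with\n",
    "# per-thread state in a plain dict (standing in for a checkpointer).\n",
    "# All names here are illustrative; none of this is LangGraph API.\n",
    "\n",
    "checkpoints = {}  # thread_id -> message history\n",
    "\n",
    "def multiply(a, b):\n",
    "    # A 'tool' the agent may call.\n",
    "    return a * b\n",
    "\n",
    "tools = {'multiply': multiply}\n",
    "\n",
    "def model(messages):\n",
    "    # Stand-in for an LLM: request a tool call, then answer from its output.\n",
    "    last = messages[-1]\n",
    "    if last['role'] == 'tool':  # reason: tool output is in context, respond\n",
    "        return {'role': 'ai', 'content': 'The result is ' + str(last['content'])}\n",
    "    return {'role': 'ai', 'tool_call': ('multiply', (3, 2))}  # act\n",
    "\n",
    "def run(thread_id, user_text):\n",
    "    messages = checkpoints.setdefault(thread_id, [])  # persist state\n",
    "    messages.append({'role': 'human', 'content': user_text})\n",
    "    while True:\n",
    "        reply = model(messages)\n",
    "        messages.append(reply)\n",
    "        if 'tool_call' not in reply:\n",
    "            return reply['content']\n",
    "        name, args = reply['tool_call']\n",
    "        output = tools[name](*args)  # act: invoke the requested tool\n",
    "        messages.append({'role': 'tool', 'content': output})  # observe\n",
    "\n",
    "print(run('thread-1', 'Multiply 3 by 2.'))  # -> The result is 6\n",
    "```\n",
    "\n",
    "Because state is keyed by `thread_id`, a second call to `run` with the same id would continue the same conversation, which is what the checkpointer gives us for real."
   ]
  },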
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "f348498b-f277-4514-b163-fe5ed9afe6fa",
   "metadata": {
    "id": "f348498b-f277-4514-b163-fe5ed9afe6fa"
   },
   "outputs": [],
   "source": [
    "%%capture --no-stderr\n",
    "%pip install --quiet -U langgraph_sdk langchain_core"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e4d0f4a7-82ee-4458-bd9a-e246ce2dc4ae",
   "metadata": {
    "id": "e4d0f4a7-82ee-4458-bd9a-e246ce2dc4ae"
   },
   "source": [
    "## Concepts\n",
    "\n",
    "There are a few central concepts to understand:\n",
    "\n",
    "`LangGraph` --\n",
    "- Python and JavaScript library\n",
    "- Allows creation of agent workflows\n",
    "\n",
    "[`LangGraph Server`](https://langchain-ai.github.io/langgraph/concepts/langgraph_server/) --\n",
    "- Bundles the graph code\n",
    "- Provides a task queue for managing asynchronous operations\n",
    "- Offers persistence for maintaining state across interactions\n",
    "- [Deployment options](https://langchain-ai.github.io/langgraph/concepts/deployment_options/) include LangGraph Platform and self-hosted deployment.\n",
    "\n",
    "`LangGraph Platform` --\n",
    "- Hosted service for the LangGraph API\n",
    "- Allows deployment of graphs from GitHub repositories\n",
    "- Also provides monitoring and tracing for deployed graphs\n",
    "- Accessible via a unique URL for each deployment\n",
    "\n",
    "`LangGraph Studio` --\n",
    "- Integrated Development Environment (IDE) for LangGraph applications\n",
    "- Uses the API as its back-end, allowing real-time testing and exploration of graphs\n",
    "- Can be run locally or against a cloud deployment\n",
    "\n",
    "`LangGraph SDK` --\n",
    "- Python and JS library for programmatically interacting with LangGraph graphs\n",
    "- Provides a consistent interface for working with graphs, whether served locally or in the cloud\n",
    "- Allows creation of clients, access to assistants, thread management, and execution of runs\n",
    "\n",
    "`RemoteGraph` --\n",
    "- A client implementation for calling remote APIs that implement the LangGraph Server API specification\n",
    "\n",
    "\n",
    "\n",
    "## Testing Locally\n",
    "\n",
    "--\n",
    "\n",
    "**⚠️ DISCLAIMER**\n",
    "\n",
    "*Running the Studio desktop app currently requires a Mac. If you are not using a Mac, use the LangGraph CLI for this step.*\n",
    "\n",
    "*Also, if you are running this notebook in Google Colab, run a cloudflared or ngrok tunnel so that the locally served API is reachable from the Colab environment.*\n",
    "\n",
    "--\n",
    "\n",
    "We can easily connect to graphs that are served locally in LangGraph Studio!\n",
    "\n",
    "We do this via the `url` provided in the lower left corner of the Studio UI.\n",
    "\n",
    "![Screenshot 2024-08-23 at 1.17.05 PM.png](https://cdn.prod.website-files.com/65b8cd72835ceeacd4449a53/66dbad4f53080e6802cec34d_deployment%201.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "fUS7IO0DT01K",
   "metadata": {
    "id": "fUS7IO0DT01K"
   },
   "source": [
    "### Local Setup Options\n",
    "\n",
    "1. [Clone Repo](https://github.com/panaversity/learn-agentic-ai-fundamentals)\n",
    "2. Open [module-1/studio](https://github.com/panaversity/learn-agentic-ai-fundamentals/tree/main/03_langchain_ecosystem/langgraph/course-notebooks/module-1/studio) locally in VS Code and add environment variables.\n",
    "\n",
    "Now use whichever of these options suits your machine.\n",
    "\n",
    "#### 1. [LangGraph Studio App](https://langchain-ai.github.io/langgraph/cloud/how-tos/test_local_deployment/) (available only on macOS for now)\n",
    "\n",
    "#### 2. Run LangGraph Server Locally\n",
    "\n",
    "- Ensure the `LANGSMITH_API_KEY` environment variable is set in your `.env` file.\n",
    "- `langgraph up` runs the server in Docker, so Docker must be installed and running.\n",
    "\n",
    "```bash\n",
    "pip install langgraph-cli\n",
    "\n",
    "langgraph up\n",
    "```\n",
    "\n",
    "Once setup completes, open these in your browser:\n",
    "\n",
    "- Docs: http://127.0.0.1:8123/docs\n",
    "- [Access Studio](https://langchain-ai.github.io/langgraph/cloud/how-tos/test_local_deployment/#access-studio) in Browser: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:8123\n",
    "\n",
    "\n",
    "Now continue the following steps locally, or set up a tunnel to use the Google Colab environment:\n",
    "\n",
    "#### 3. Access the Local URL in Colab using ngrok"
   ]
  },
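  {
   "cell_type": "markdown",
   "id": "added-ngrok-sketch",
   "metadata": {},
   "source": [
    "For the Colab route, one way to expose the locally served API is an ngrok tunnel. This is a sketch assuming ngrok is installed and authenticated on the machine running the server; the port matches the local server default used above.\n",
    "\n",
    "```bash\n",
    "# Tunnel the local LangGraph Server (default port 8123) to a public URL\n",
    "ngrok http 8123\n",
    "```\n",
    "\n",
    "Copy the `https` forwarding URL that ngrok prints and use it as the `URL` in the next cells."
   ]
  },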
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "18b281d8-bd07-4721-922c-347838ceee6b",
   "metadata": {
    "id": "18b281d8-bd07-4721-922c-347838ceee6b"
   },
   "outputs": [],
   "source": [
    "from langgraph_sdk import get_client"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "id": "4c96f353-5dc3-41c8-a3e4-6bf07ca455f8",
   "metadata": {
    "id": "4c96f353-5dc3-41c8-a3e4-6bf07ca455f8"
   },
   "outputs": [],
   "source": [
    "# Replace this with your local server URL or your ngrok forwarding URL\n",
    "# URL = \"http://localhost:8123\" # Local Server URL\n",
    "URL = \"https://58f4-39-34-35-175.ngrok-free.app\"\n",
    "client = get_client(url=URL)\n",
    "\n",
    "# Search all hosted graphs\n",
    "assistants = await client.assistants.search()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "id": "6a1352fa-68ad-4963-890e-c95d93570917",
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "6a1352fa-68ad-4963-890e-c95d93570917",
    "outputId": "442794f9-6c97-4066-8fb7-59c765dcaef3"
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[{'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca',\n",
       "  'graph_id': 'agent',\n",
       "  'created_at': '2024-11-16T14:17:08.398955+00:00',\n",
       "  'updated_at': '2024-11-16T14:17:08.398955+00:00',\n",
       "  'config': {},\n",
       "  'metadata': {'created_by': 'system'},\n",
       "  'version': 1,\n",
       "  'name': 'agent'},\n",
       " {'assistant_id': '228f9934-0cdd-5383-92c8-ee8422522cc2',\n",
       "  'graph_id': 'router',\n",
       "  'created_at': '2024-11-16T14:17:08.383667+00:00',\n",
       "  'updated_at': '2024-11-16T14:17:08.383667+00:00',\n",
       "  'config': {},\n",
       "  'metadata': {'created_by': 'system'},\n",
       "  'version': 1,\n",
       "  'name': 'router'},\n",
       " {'assistant_id': '28d99cab-ad6c-5342-aee5-400bd8dc9b8b',\n",
       "  'graph_id': 'simple_graph',\n",
       "  'created_at': '2024-11-16T14:17:07.809767+00:00',\n",
       "  'updated_at': '2024-11-16T14:17:07.809767+00:00',\n",
       "  'config': {},\n",
       "  'metadata': {'created_by': 'system'},\n",
       "  'version': 1,\n",
       "  'name': 'simple_graph'}]"
      ]
     },
     "execution_count": 19,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "assistants"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "id": "ba9c28a0-d712-496c-b191-7d620589ba33",
   "metadata": {
    "id": "ba9c28a0-d712-496c-b191-7d620589ba33"
   },
   "outputs": [],
   "source": [
    "# We create a thread for tracking the state of our run\n",
    "thread = await client.threads.create()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "id": "R3DnA7lhYM26",
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "R3DnA7lhYM26",
    "outputId": "f01f85b8-0398-4b6d-f3ca-3cb76f27fcbd"
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "{'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83',\n",
       " 'created_at': '2024-11-16T14:31:07.223451+00:00',\n",
       " 'updated_at': '2024-11-16T14:31:07.223451+00:00',\n",
       " 'metadata': {},\n",
       " 'status': 'idle',\n",
       " 'config': {},\n",
       " 'values': None}"
      ]
     },
     "execution_count": 21,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "thread"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "2e7e4177-3644-43fa-a2f1-08f73292d1a6",
   "metadata": {
    "id": "2e7e4177-3644-43fa-a2f1-08f73292d1a6"
   },
   "source": [
    "Now, we can run our agent [with `client.runs.stream`](https://langchain-ai.github.io/langgraph/concepts/low_level/#stream-and-astream) with:\n",
    "\n",
    "* The `thread_id`\n",
    "* The `graph_id`\n",
    "* The `input`\n",
    "* The `stream_mode`\n",
    "\n",
    "We'll discuss streaming in depth in a future module.\n",
    "\n",
    "For now, just recognize that we are [streaming](https://langchain-ai.github.io/langgraph/cloud/how-tos/stream_values/) the full value of the state after each step of the graph with `stream_mode=\"values\"`.\n",
    "\n",
    "The state is captured in the `chunk.data`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "id": "VRnnTpd4YKq-",
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "VRnnTpd4YKq-",
    "outputId": "33f997bd-73ff-4095-c7da-b42841817674"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "StreamPart(event='events', data={'event': 'on_chain_start', 'data': {'input': {'messages': [{'id': None, 'name': None, 'type': 'human', 'content': 'Hi', 'example': False, 'additional_kwargs': {}, 'response_metadata': {}}]}}, 'name': 'agent', 'tags': [], 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https'}, 'parent_ids': []})\n",
      "StreamPart(event='events', data={'event': 'on_chain_start', 'data': {'input': {'messages': [{'content': 'Hi', 'additional_kwargs': {'example': False, 'additional_kwargs': {}, 'response_metadata': {}}, 'response_metadata': {}, 'type': 'human', 'name': None, 'id': 'ef46e6bd-c35f-41fd-98f2-a8cbe5aa51a8', 'example': False}]}}, 'name': 'assistant', 'tags': ['graph:step:1'], 'run_id': '2fdc9c02-21ba-45bf-b3a6-0acd83a52f97', 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https', 'langgraph_step': 1, 'langgraph_node': 'assistant', 'langgraph_triggers': ['start:assistant'], 'langgraph_path': ['__pregel_pull', 'assistant'], 'langgraph_checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9'}, 'parent_ids': ['1efa4277-6069-665c-b982-beee5bf90815']})\n",
      "StreamPart(event='events', data={'event': 'on_chat_model_start', 'data': {'input': {'messages': [[{'content': 'You are a helpful assistant tasked with writing performing arithmetic on a set of inputs.', 'additional_kwargs': {}, 'response_metadata': {}, 'type': 'system', 'name': None, 'id': None}, {'content': 'Hi', 'additional_kwargs': {'example': False, 'additional_kwargs': {}, 'response_metadata': {}}, 'response_metadata': {}, 'type': 'human', 'name': None, 'id': 'ef46e6bd-c35f-41fd-98f2-a8cbe5aa51a8', 'example': False}]]}}, 'name': 'ChatGoogleGenerativeAI', 'tags': ['seq:step:1'], 'run_id': 'bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https', 'langgraph_step': 1, 'langgraph_node': 'assistant', 'langgraph_triggers': ['start:assistant'], 'langgraph_path': ['__pregel_pull', 'assistant'], 'langgraph_checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9', 'checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9', 'ls_provider': 'google_genai', 'ls_model_name': 'models/gemini-1.5-flash', 'ls_model_type': 'chat', 'ls_temperature': 0.7}, 'parent_ids': ['1efa4277-6069-665c-b982-beee5bf90815', '2fdc9c02-21ba-45bf-b3a6-0acd83a52f97']})\n",
      "StreamPart(event='events', data={'event': 'on_chat_model_stream', 'data': {'chunk': {'content': 'Hello', 'additional_kwargs': {}, 'response_metadata': {'safety_ratings': []}, 'type': 'AIMessageChunk', 'name': None, 'id': 'run-bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 163, 'output_tokens': 0, 'total_tokens': 163, 'input_token_details': {'cache_read': 0}}, 'tool_call_chunks': []}}, 'run_id': 'bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'name': 'ChatGoogleGenerativeAI', 'tags': ['seq:step:1'], 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https', 'langgraph_step': 1, 'langgraph_node': 'assistant', 'langgraph_triggers': ['start:assistant'], 'langgraph_path': ['__pregel_pull', 'assistant'], 'langgraph_checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9', 'checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9', 'ls_provider': 'google_genai', 'ls_model_name': 'models/gemini-1.5-flash', 'ls_model_type': 'chat', 'ls_temperature': 0.7}, 'parent_ids': ['1efa4277-6069-665c-b982-beee5bf90815', '2fdc9c02-21ba-45bf-b3a6-0acd83a52f97']})\n",
      "StreamPart(event='events', data={'event': 'on_chat_model_stream', 'data': {'chunk': {'content': '! How may I help you today?\\n', 'additional_kwargs': {}, 'response_metadata': {'finish_reason': 'STOP', 'safety_ratings': []}, 'type': 'AIMessageChunk', 'name': None, 'id': 'run-bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 0, 'output_tokens': 10, 'total_tokens': 10, 'input_token_details': {'cache_read': 0}}, 'tool_call_chunks': []}}, 'run_id': 'bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'name': 'ChatGoogleGenerativeAI', 'tags': ['seq:step:1'], 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https', 'langgraph_step': 1, 'langgraph_node': 'assistant', 'langgraph_triggers': ['start:assistant'], 'langgraph_path': ['__pregel_pull', 'assistant'], 'langgraph_checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9', 'checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9', 'ls_provider': 'google_genai', 'ls_model_name': 'models/gemini-1.5-flash', 'ls_model_type': 'chat', 'ls_temperature': 0.7}, 'parent_ids': ['1efa4277-6069-665c-b982-beee5bf90815', '2fdc9c02-21ba-45bf-b3a6-0acd83a52f97']})\n",
      "StreamPart(event='events', data={'event': 'on_chat_model_end', 'data': {'output': {'content': 'Hello! How may I help you today?\\n', 'additional_kwargs': {}, 'response_metadata': {'safety_ratings': [], 'finish_reason': 'STOP'}, 'type': 'ai', 'name': None, 'id': 'run-bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 163, 'output_tokens': 10, 'total_tokens': 173, 'input_token_details': {'cache_read': 0}}}, 'input': {'messages': [[{'content': 'You are a helpful assistant tasked with writing performing arithmetic on a set of inputs.', 'additional_kwargs': {}, 'response_metadata': {}, 'type': 'system', 'name': None, 'id': None}, {'content': 'Hi', 'additional_kwargs': {'example': False, 'additional_kwargs': {}, 'response_metadata': {}}, 'response_metadata': {}, 'type': 'human', 'name': None, 'id': 'ef46e6bd-c35f-41fd-98f2-a8cbe5aa51a8', 'example': False}]]}}, 'run_id': 'bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'name': 'ChatGoogleGenerativeAI', 'tags': ['seq:step:1'], 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https', 'langgraph_step': 1, 'langgraph_node': 'assistant', 'langgraph_triggers': ['start:assistant'], 'langgraph_path': ['__pregel_pull', 'assistant'], 'langgraph_checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9', 'checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9', 'ls_provider': 'google_genai', 'ls_model_name': 'models/gemini-1.5-flash', 'ls_model_type': 'chat', 'ls_temperature': 0.7}, 'parent_ids': ['1efa4277-6069-665c-b982-beee5bf90815', '2fdc9c02-21ba-45bf-b3a6-0acd83a52f97']})\n",
      "StreamPart(event='events', data={'event': 'on_chain_start', 'data': {'input': {'messages': [{'content': 'Hi', 'additional_kwargs': {'example': False, 'additional_kwargs': {}, 'response_metadata': {}}, 'response_metadata': {}, 'type': 'human', 'name': None, 'id': 'ef46e6bd-c35f-41fd-98f2-a8cbe5aa51a8', 'example': False}, {'content': 'Hello! How may I help you today?\\n', 'additional_kwargs': {}, 'response_metadata': {'safety_ratings': [], 'finish_reason': 'STOP'}, 'type': 'ai', 'name': None, 'id': 'run-bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 163, 'output_tokens': 10, 'total_tokens': 173, 'input_token_details': {'cache_read': 0}}}]}}, 'name': 'tools_condition', 'tags': ['seq:step:4'], 'run_id': '691a1717-68fe-4fcc-89df-8339fb8611f7', 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https', 'langgraph_step': 1, 'langgraph_node': 'assistant', 'langgraph_triggers': ['start:assistant'], 'langgraph_path': ['__pregel_pull', 'assistant'], 'langgraph_checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9'}, 'parent_ids': ['1efa4277-6069-665c-b982-beee5bf90815', '2fdc9c02-21ba-45bf-b3a6-0acd83a52f97']})\n",
      "StreamPart(event='events', data={'event': 'on_chain_end', 'data': {'output': '__end__', 'input': {'messages': [{'content': 'Hi', 'additional_kwargs': {'example': False, 'additional_kwargs': {}, 'response_metadata': {}}, 'response_metadata': {}, 'type': 'human', 'name': None, 'id': 'ef46e6bd-c35f-41fd-98f2-a8cbe5aa51a8', 'example': False}, {'content': 'Hello! How may I help you today?\\n', 'additional_kwargs': {}, 'response_metadata': {'safety_ratings': [], 'finish_reason': 'STOP'}, 'type': 'ai', 'name': None, 'id': 'run-bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 163, 'output_tokens': 10, 'total_tokens': 173, 'input_token_details': {'cache_read': 0}}}]}}, 'run_id': '691a1717-68fe-4fcc-89df-8339fb8611f7', 'name': 'tools_condition', 'tags': ['seq:step:4'], 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https', 'langgraph_step': 1, 'langgraph_node': 'assistant', 'langgraph_triggers': ['start:assistant'], 'langgraph_path': ['__pregel_pull', 'assistant'], 'langgraph_checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9'}, 'parent_ids': ['1efa4277-6069-665c-b982-beee5bf90815', '2fdc9c02-21ba-45bf-b3a6-0acd83a52f97']})\n",
      "StreamPart(event='events', data={'event': 'on_chain_stream', 'run_id': '2fdc9c02-21ba-45bf-b3a6-0acd83a52f97', 'name': 'assistant', 'tags': ['graph:step:1'], 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https', 'langgraph_step': 1, 'langgraph_node': 'assistant', 'langgraph_triggers': ['start:assistant'], 'langgraph_path': ['__pregel_pull', 'assistant'], 'langgraph_checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9'}, 'data': {'chunk': {'messages': [{'content': 'Hello! How may I help you today?\\n', 'additional_kwargs': {}, 'response_metadata': {'safety_ratings': [], 'finish_reason': 'STOP'}, 'type': 'ai', 'name': None, 'id': 'run-bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 163, 'output_tokens': 10, 'total_tokens': 173, 'input_token_details': {'cache_read': 0}}}]}}, 'parent_ids': ['1efa4277-6069-665c-b982-beee5bf90815']})\n",
      "StreamPart(event='events', data={'event': 'on_chain_end', 'data': {'output': {'messages': [{'content': 'Hello! How may I help you today?\\n', 'additional_kwargs': {}, 'response_metadata': {'safety_ratings': [], 'finish_reason': 'STOP'}, 'type': 'ai', 'name': None, 'id': 'run-bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 163, 'output_tokens': 10, 'total_tokens': 173, 'input_token_details': {'cache_read': 0}}}]}, 'input': {'messages': [{'content': 'Hi', 'additional_kwargs': {'example': False, 'additional_kwargs': {}, 'response_metadata': {}}, 'response_metadata': {}, 'type': 'human', 'name': None, 'id': 'ef46e6bd-c35f-41fd-98f2-a8cbe5aa51a8', 'example': False}]}}, 'run_id': '2fdc9c02-21ba-45bf-b3a6-0acd83a52f97', 'name': 'assistant', 'tags': ['graph:step:1'], 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https', 'langgraph_step': 1, 'langgraph_node': 'assistant', 'langgraph_triggers': ['start:assistant'], 'langgraph_path': ['__pregel_pull', 'assistant'], 'langgraph_checkpoint_ns': 'assistant:c6a0d38a-5da3-9f8e-678e-e81e7d98aad9'}, 'parent_ids': ['1efa4277-6069-665c-b982-beee5bf90815']})\n",
      "StreamPart(event='events', data={'event': 'on_chain_end', 'data': {'output': {'messages': [{'content': 'Hi', 'additional_kwargs': {'example': False, 'additional_kwargs': {}, 'response_metadata': {}}, 'response_metadata': {}, 'type': 'human', 'name': None, 'id': 'ef46e6bd-c35f-41fd-98f2-a8cbe5aa51a8', 'example': False}, {'content': 'Hello! How may I help you today?\\n', 'additional_kwargs': {}, 'response_metadata': {'safety_ratings': [], 'finish_reason': 'STOP'}, 'type': 'ai', 'name': None, 'id': 'run-bf11dc93-6e14-4e2c-b290-792a486f5ef1', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 163, 'output_tokens': 10, 'total_tokens': 173, 'input_token_details': {'cache_read': 0}}}]}}, 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'name': 'agent', 'tags': [], 'metadata': {'created_by': 'system', 'run_attempt': 1, 'langgraph_version': '0.2.50', 'langgraph_plan': 'developer', 'langgraph_host': 'self-hosted', 'run_id': '1efa4277-6069-665c-b982-beee5bf90815', 'user_id': '', 'graph_id': 'agent', 'thread_id': '2163cddd-743f-4242-91be-65ad51b72a83', 'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca', 'x-forwarded-for': '35.245.160.114', 'x-forwarded-host': '58f4-39-34-35-175.ngrok-free.app', 'x-forwarded-proto': 'https'}, 'parent_ids': []})\n"
     ]
    }
   ],
   "source": [
    "from langchain_core.messages import HumanMessage\n",
    "\n",
    "# Input\n",
    "input = {\"messages\": [HumanMessage(content=\"Hi\")]}\n",
    "\n",
    "# Stream\n",
    "async for chunk in client.runs.stream(\n",
    "        thread['thread_id'],\n",
    "        \"agent\",\n",
    "        input=input,\n",
    "        stream_mode=\"events\",\n",
    "    ):\n",
    "    if chunk.data and chunk.event != \"metadata\":\n",
    "        print(chunk)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "id": "f65a4480-66b3-48bf-9158-191a7b8c1c18",
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "f65a4480-66b3-48bf-9158-191a7b8c1c18",
    "outputId": "005df9a0-8ee7-4f73-b2cb-9cb65b793750"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'content': 'Multiply 3 by 2.', 'additional_kwargs': {'example': False, 'additional_kwargs': {}, 'response_metadata': {}}, 'response_metadata': {}, 'type': 'human', 'name': None, 'id': 'd25e9583-5ea6-4195-9755-be5859bcfc46', 'example': False}\n",
      "{'content': '', 'additional_kwargs': {'function_call': {'name': 'multiply', 'arguments': '{\"a\": 3.0, \"b\": 2.0}'}}, 'response_metadata': {'prompt_feedback': {'block_reason': 0, 'safety_ratings': []}, 'finish_reason': 'STOP', 'safety_ratings': []}, 'type': 'ai', 'name': None, 'id': 'run-7ae2f824-0eba-474a-97b7-b72633093ef6-0', 'example': False, 'tool_calls': [{'name': 'multiply', 'args': {'a': 3.0, 'b': 2.0}, 'id': 'f2cbef44-3e9a-4b3f-9efa-b6412a7d66a2', 'type': 'tool_call'}], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 182, 'output_tokens': 3, 'total_tokens': 185, 'input_token_details': {'cache_read': 0}}}\n",
      "{'content': '6', 'additional_kwargs': {}, 'response_metadata': {}, 'type': 'tool', 'name': 'multiply', 'id': 'a5125fcc-439d-4370-8e0b-4ca197a40ba5', 'tool_call_id': 'f2cbef44-3e9a-4b3f-9efa-b6412a7d66a2', 'artifact': None, 'status': 'success'}\n",
      "{'content': 'The result is 6.\\n', 'additional_kwargs': {}, 'response_metadata': {'prompt_feedback': {'block_reason': 0, 'safety_ratings': []}, 'finish_reason': 'STOP', 'safety_ratings': []}, 'type': 'ai', 'name': None, 'id': 'run-b9de5bf3-6010-4cca-ac8c-851c098f0a42-0', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': {'input_tokens': 215, 'output_tokens': 7, 'total_tokens': 222, 'input_token_details': {'cache_read': 0}}}\n"
     ]
    }
   ],
   "source": [
    "from langchain_core.messages import HumanMessage\n",
    "\n",
    "# Input\n",
    "input = {\"messages\": [HumanMessage(content=\"Multiply 3 by 2.\")]}\n",
    "\n",
    "# Stream\n",
    "async for chunk in client.runs.stream(\n",
    "        thread['thread_id'],\n",
    "        \"agent\",\n",
    "        input=input,\n",
    "        stream_mode=\"values\",\n",
    "    ):\n",
    "    if chunk.data and chunk.event != \"metadata\":\n",
    "        print(chunk.data['messages'][-1])"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "tpZWZ6T_Zs0U",
   "metadata": {
    "id": "tpZWZ6T_Zs0U"
   },
   "source": [
    "#### **Note:** We will cover LangGraph Server (Agentic Infrastructure) in detail in Module 6."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "dfa8b850-750c-4054-95e4-1c457a12ec8a",
   "metadata": {
    "id": "dfa8b850-750c-4054-95e4-1c457a12ec8a"
   },
   "source": [
    "## Testing with Cloud (Optional)\n",
    "\n",
    "We can deploy to Cloud via LangSmith, as outlined [here](https://langchain-ai.github.io/langgraph/cloud/quick_start/#deploy-from-github-with-langgraph-cloud).\n",
    "\n",
    "### Create a New Repository on GitHub\n",
    "\n",
    "* Go to your GitHub account\n",
    "* Click on the \"+\" icon in the upper-right corner and select `\"New repository\"`\n",
    "* Name your repository (e.g., `langchain-academy`)\n",
    "\n",
    "### Add Your GitHub Repository as a Remote\n",
    "\n",
    "* Go back to your terminal where you cloned `langchain-academy` at the start of this course\n",
    "* Add your newly created GitHub repository as a remote\n",
    "\n",
    "```bash\n",
    "git remote add origin https://github.com/your-username/your-repo-name.git\n",
    "```\n",
    "* Push to it:\n",
    "```bash\n",
    "git push -u origin main\n",
    "```\n",
    "\n",
    "### Connect LangSmith to your GitHub Repository\n",
    "\n",
    "* Go to [LangSmith](https://smith.langchain.com/)\n",
    "* Click on the `Deployments` tab in the left LangSmith panel\n",
    "* Add `+ New Deployment`\n",
    "* Then, select the GitHub repository (e.g., `langchain-academy`) that you just created for the course\n",
    "* Point the `LangGraph API config file` at one of the `studio` directories\n",
    "* For example, for module-1 use: `module-1/studio/langgraph.json`\n",
    "* Set your API keys (e.g., you can just copy from your `module-1/studio/.env` file)\n",
    "\n",
    "![Screenshot 2024-09-03 at 11.35.12 AM.png](https://cdn.prod.website-files.com/65b8cd72835ceeacd4449a53/66dbad4fd61c93d48e5d0f47_deployment2.png)\n",
    "\n",
    "### Work with your deployment\n",
    "\n",
    "We can then interact with our deployment a few different ways:\n",
    "\n",
    "* With the [SDK](https://langchain-ai.github.io/langgraph/cloud/quick_start/#use-with-the-sdk), as before.\n",
    "* With [LangGraph Studio](https://langchain-ai.github.io/langgraph/cloud/quick_start/#interact-with-your-deployment-via-langgraph-studio).\n",
    "\n",
    "![Screenshot 2024-08-23 at 10.59.36 AM.png](https://cdn.prod.website-files.com/65b8cd72835ceeacd4449a53/66dbad4fa159a09a51d601de_deployment3.png)\n",
    "\n",
    "To use the SDK here in the notebook, simply ensure that `LANGSMITH_API_KEY` is set!"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "646ed351",
   "metadata": {
    "id": "646ed351"
   },
   "outputs": [],
   "source": [
    "import os, getpass\n",
    "\n",
    "def _set_env(var: str):\n",
    "    if not os.environ.get(var):\n",
    "        os.environ[var] = getpass.getpass(f\"{var}: \")\n",
    "\n",
    "_set_env(\"LANGSMITH_API_KEY\")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "97dda16c-c87f-4c03-b910-d647e83400b2",
   "metadata": {
    "id": "97dda16c-c87f-4c03-b910-d647e83400b2"
   },
   "outputs": [],
   "source": [
    "from langgraph_sdk import get_client\n",
    "\n",
    "# Replace this with the URL of your deployed graph\n",
    "URL = \"https://langchain-academy-8011c561878d50b1883f7ed11b32d720.default.us.langgraph.app\"\n",
    "client = get_client(url=URL)\n",
    "\n",
    "# Search all hosted graphs\n",
    "assistants = await client.assistants.search()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "aefa37c0-92fe-4e80-9d5a-80a77b1e3dae",
   "metadata": {
    "id": "aefa37c0-92fe-4e80-9d5a-80a77b1e3dae"
   },
   "outputs": [],
   "source": [
    "# Select the agent\n",
    "agent = assistants[0]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b810376e-f20f-443a-b1ca-d6793f358f82",
   "metadata": {
    "id": "b810376e-f20f-443a-b1ca-d6793f358f82",
    "outputId": "2bd618da-d79a-471e-849b-05e15a80588e"
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "{'assistant_id': 'fe096781-5601-53d2-b2f6-0d3403f7e9ca',\n",
       " 'graph_id': 'agent',\n",
       " 'created_at': '2024-08-23T17:58:02.722920+00:00',\n",
       " 'updated_at': '2024-08-23T17:58:02.722920+00:00',\n",
       " 'config': {},\n",
       " 'metadata': {'created_by': 'system'}}"
      ]
     },
     "execution_count": 38,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "agent"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "32d65d84-1bcf-4af4-a7c9-55e73d6c1947",
   "metadata": {
    "id": "32d65d84-1bcf-4af4-a7c9-55e73d6c1947",
    "outputId": "0ca9d52e-9c8d-43b3-bbd3-22bc91dffd8d"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'content': 'Multiply 3 by 2.', 'additional_kwargs': {'example': False, 'additional_kwargs': {}, 'response_metadata': {}}, 'response_metadata': {}, 'type': 'human', 'name': None, 'id': '8ea04559-f7d4-4c82-89d9-c60fb0502f21', 'example': False}\n",
      "{'content': '', 'additional_kwargs': {'tool_calls': [{'index': 0, 'id': 'call_EQoolxFaaSVU8HrTnCmffLk7', 'function': {'arguments': '{\"a\":3,\"b\":2}', 'name': 'multiply'}, 'type': 'function'}]}, 'response_metadata': {'finish_reason': 'tool_calls', 'model_name': 'gpt-4o-2024-05-13', 'system_fingerprint': 'fp_3aa7262c27'}, 'type': 'ai', 'name': None, 'id': 'run-b0ea5ddd-e9ba-4242-bb8c-80eb52466c76', 'example': False, 'tool_calls': [{'name': 'multiply', 'args': {'a': 3, 'b': 2}, 'id': 'call_EQoolxFaaSVU8HrTnCmffLk7', 'type': 'tool_call'}], 'invalid_tool_calls': [], 'usage_metadata': None}\n",
      "{'content': '6', 'additional_kwargs': {}, 'response_metadata': {}, 'type': 'tool', 'name': 'multiply', 'id': '1bf558e7-79ef-4f21-bb66-acafbd04677a', 'tool_call_id': 'call_EQoolxFaaSVU8HrTnCmffLk7', 'artifact': None, 'status': 'success'}\n",
      "{'content': '3 multiplied by 2 equals 6.', 'additional_kwargs': {}, 'response_metadata': {'finish_reason': 'stop', 'model_name': 'gpt-4o-2024-05-13', 'system_fingerprint': 'fp_3aa7262c27'}, 'type': 'ai', 'name': None, 'id': 'run-ecc4b6ad-af15-4a85-a76c-de2ed0ed8ed9', 'example': False, 'tool_calls': [], 'invalid_tool_calls': [], 'usage_metadata': None}\n"
     ]
    }
   ],
   "source": [
    "from langchain_core.messages import HumanMessage\n",
    "\n",
    "# We create a thread for tracking the state of our run\n",
    "thread = await client.threads.create()\n",
    "\n",
    "# Input\n",
    "input = {\"messages\": [HumanMessage(content=\"Multiply 3 by 2.\")]}\n",
    "\n",
    "# Stream\n",
    "async for chunk in client.runs.stream(\n",
    "    thread['thread_id'],\n",
    "    \"agent\",\n",
    "    input=input,\n",
    "    stream_mode=\"values\",\n",
    "):\n",
    "    if chunk.data and chunk.event != \"metadata\":\n",
    "        print(chunk.data['messages'][-1])"
   ]
  },
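  {
   "cell_type": "markdown",
   "id": "final-state-note",
   "metadata": {},
   "source": [
    "Once the run completes, we can also read the persisted thread state directly. This is a sketch using the `client` and `thread` objects created above, via the SDK's `client.threads.get_state` method:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "final-state-code",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Fetch the persisted state of the thread from the deployment\n",
    "state = await client.threads.get_state(thread['thread_id'])\n",
    "\n",
    "# The last message in the state should be the agent's final response\n",
    "print(state['values']['messages'][-1]['content'])"
   ]
  },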
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "445cb34d-c3b8-4446-a7e3-5fe938abf99b",
   "metadata": {
    "id": "445cb34d-c3b8-4446-a7e3-5fe938abf99b"
   },
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "colab": {
   "provenance": []
  },
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.12.4"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
