{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# How to stream tool calls\n",
    "\n",
    "When tools are called in a streaming context, \n",
    "[message chunks](https://api.js.langchain.com/classes/langchain_core_messages.AIMessageChunk.html) \n",
    "will be populated with [tool call chunk](https://api.js.langchain.com/types/langchain_core_messages_tool.ToolCallChunk.html) \n",
    "objects in a list via the `.tool_call_chunks` attribute. A `ToolCallChunk` includes \n",
    "optional string fields for the tool `name`, `args`, and `id`, and includes an optional \n",
    "integer field `index` that can be used to join chunks together. Fields are optional \n",
    "because portions of a tool call may be streamed across different chunks (e.g., a chunk \n",
    "that includes a substring of the arguments may have `undefined` values for the tool name and id).\n",
    "\n",
    "Because message chunks inherit from their parent message class, an \n",
    "[`AIMessageChunk`](https://api.js.langchain.com/classes/langchain_core_messages.AIMessageChunk.html) \n",
    "with tool call chunks will also include `.tool_calls` and `.invalid_tool_calls` fields. \n",
    "These fields are parsed best-effort from the message's tool call chunks.\n",
    "\n",
    "Note that not all providers currently support streaming for tool calls. Before we start, let's define our tools and our model."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "import { z } from \"zod\";\n",
    "import { tool } from \"@langchain/core/tools\";\n",
    "import { ChatOpenAI } from \"@langchain/openai\";\n",
    "\n",
    "const addTool = tool(async (input) => {\n",
    "  return input.a + input.b;\n",
    "}, {\n",
    "  name: \"add\",\n",
    "  description: \"Adds a and b.\",\n",
    "  schema: z.object({\n",
    "    a: z.number(),\n",
    "    b: z.number(),\n",
    "  }),\n",
    "});\n",
    "\n",
    "const multiplyTool = tool(async (input) => {\n",
    "  return input.a * input.b;\n",
    "}, {\n",
    "  name: \"multiply\",\n",
    "  description: \"Multiplies a and b.\",\n",
    "  schema: z.object({\n",
    "    a: z.number(),\n",
    "    b: z.number(),\n",
    "  }),\n",
    "});\n",
    "\n",
    "const tools = [addTool, multiplyTool];\n",
    "\n",
    "const model = new ChatOpenAI({\n",
    "  model: \"gpt-4o\",\n",
    "  temperature: 0,\n",
    "});\n",
    "\n",
    "const modelWithTools = model.bindTools(tools);"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Now let's define our query and stream our output:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '',\n",
      "    id: 'call_MdIlJL5CAYD7iz9gTm5lwWtJ',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: undefined,\n",
      "    args: '{\"a\"',\n",
      "    id: undefined,\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: undefined,\n",
      "    args: ': 3, ',\n",
      "    id: undefined,\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: undefined,\n",
      "    args: '\"b\": 1',\n",
      "    id: undefined,\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: undefined,\n",
      "    args: '2}',\n",
      "    id: undefined,\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'add',\n",
      "    args: '',\n",
      "    id: 'call_ihL9W6ylSRlYigrohe9SClmW',\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: undefined,\n",
      "    args: '{\"a\"',\n",
      "    id: undefined,\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: undefined,\n",
      "    args: ': 11,',\n",
      "    id: undefined,\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: undefined,\n",
      "    args: ' \"b\": ',\n",
      "    id: undefined,\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: undefined,\n",
      "    args: '49}',\n",
      "    id: undefined,\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[]\n",
      "[]\n"
     ]
    }
   ],
   "source": [
    "const query = \"What is 3 * 12? Also, what is 11 + 49?\";\n",
    "\n",
    "const stream = await modelWithTools.stream(query);\n",
    "\n",
    "for await (const chunk of stream) {\n",
    "  console.log(chunk.tool_call_chunks);\n",
    "}"
   ]
  },
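  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In the output above, chunks that share an `index` belong to the same tool call, and their `args` substrings accumulate in order. As a rough sketch (using hypothetical chunk values, not a real stream), joining two chunks by hand looks like this:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "// A rough sketch of how two chunks with the same index join.\n",
    "// These chunk values are hypothetical, not taken from a real stream.\n",
    "const firstChunk = { name: \"multiply\", args: \"\", id: \"call_abc123\", index: 0 };\n",
    "const secondChunk = { name: undefined, args: '{\"a\": 3, \"b\": 12}', id: undefined, index: 0 };\n",
    "\n",
    "// Later chunks fill in fields that earlier ones left undefined,\n",
    "// and the args substrings are concatenated.\n",
    "const joined = {\n",
    "  name: firstChunk.name ?? secondChunk.name,\n",
    "  id: firstChunk.id ?? secondChunk.id,\n",
    "  index: firstChunk.index,\n",
    "  args: (firstChunk.args ?? \"\") + (secondChunk.args ?? \"\"),\n",
    "};\n",
    "\n",
    "console.log(joined);"
   ]
  },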
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Note that adding message chunks will merge their corresponding tool call chunks. This is the principle by which LangChain's various [tool output parsers](/docs/how_to/output_parser_structured) support streaming.\n",
    "\n",
    "For example, below we accumulate tool call chunks:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\"',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, ',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, \"b\": 1',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, \"b\": 12}',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, \"b\": 12}',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  },\n",
      "  {\n",
      "    name: 'add',\n",
      "    args: '',\n",
      "    id: 'call_ufY7lDSeCQwWbdq1XQQ2PBHR',\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, \"b\": 12}',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  },\n",
      "  {\n",
      "    name: 'add',\n",
      "    args: '{\"a\"',\n",
      "    id: 'call_ufY7lDSeCQwWbdq1XQQ2PBHR',\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, \"b\": 12}',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  },\n",
      "  {\n",
      "    name: 'add',\n",
      "    args: '{\"a\": 11,',\n",
      "    id: 'call_ufY7lDSeCQwWbdq1XQQ2PBHR',\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, \"b\": 12}',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  },\n",
      "  {\n",
      "    name: 'add',\n",
      "    args: '{\"a\": 11, \"b\": ',\n",
      "    id: 'call_ufY7lDSeCQwWbdq1XQQ2PBHR',\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, \"b\": 12}',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  },\n",
      "  {\n",
      "    name: 'add',\n",
      "    args: '{\"a\": 11, \"b\": 49}',\n",
      "    id: 'call_ufY7lDSeCQwWbdq1XQQ2PBHR',\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, \"b\": 12}',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  },\n",
      "  {\n",
      "    name: 'add',\n",
      "    args: '{\"a\": 11, \"b\": 49}',\n",
      "    id: 'call_ufY7lDSeCQwWbdq1XQQ2PBHR',\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n",
      "[\n",
      "  {\n",
      "    name: 'multiply',\n",
      "    args: '{\"a\": 3, \"b\": 12}',\n",
      "    id: 'call_0zGpgVz81Ew0HA4oKblG0s0a',\n",
      "    index: 0,\n",
      "    type: 'tool_call_chunk'\n",
      "  },\n",
      "  {\n",
      "    name: 'add',\n",
      "    args: '{\"a\": 11, \"b\": 49}',\n",
      "    id: 'call_ufY7lDSeCQwWbdq1XQQ2PBHR',\n",
      "    index: 1,\n",
      "    type: 'tool_call_chunk'\n",
      "  }\n",
      "]\n"
     ]
    }
   ],
   "source": [
    "import { concat } from \"@langchain/core/utils/stream\";\n",
    "\n",
    "const stream = await modelWithTools.stream(query);\n",
    "\n",
    "let gathered = undefined;\n",
    "\n",
    "for await (const chunk of stream) {\n",
    "  gathered = gathered !== undefined ? concat(gathered, chunk) : chunk;\n",
    "  console.log(gathered.tool_call_chunks);\n",
    "}"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "At the end, we can see that the final aggregated tool call chunks still hold the gathered arguments as a raw (unparsed) string:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "string\n"
     ]
    }
   ],
   "source": [
    "console.log(typeof gathered.tool_call_chunks[0].args);"
   ]
  },
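  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Conceptually, the parsed entries in `.tool_calls` come from JSON-parsing this gathered raw string (the library does this best-effort internally). With a hypothetical raw string for illustration:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "// Conceptually, the parsed tool call arguments come from JSON-parsing\n",
    "// the gathered raw args string. A hypothetical raw string, for illustration:\n",
    "const rawArgs = '{\"a\": 3, \"b\": 12}';\n",
    "const parsedArgs = JSON.parse(rawArgs);\n",
    "\n",
    "console.log(typeof rawArgs);\n",
    "console.log(typeof parsedArgs);"
   ]
  },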
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "And, via the `.tool_calls` field, we can also see the fully parsed tool call arguments as an object:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "object\n"
     ]
    }
   ],
   "source": [
    "console.log(typeof gathered.tool_calls[0].args);"
   ]
  }
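,
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "From here, one option is to route each parsed tool call to the tool with the matching name and invoke it with the parsed arguments. This is a minimal sketch, not the only pattern; it assumes the `gathered`, `addTool`, and `multiplyTool` variables defined in the cells above:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "// Route each parsed tool call to the tool with the matching name.\n",
    "// Assumes `gathered`, `addTool`, and `multiplyTool` from the cells above.\n",
    "const toolsByName: Record<string, typeof addTool | typeof multiplyTool> = {\n",
    "  add: addTool,\n",
    "  multiply: multiplyTool,\n",
    "};\n",
    "\n",
    "for (const toolCall of gathered?.tool_calls ?? []) {\n",
    "  const selectedTool = toolsByName[toolCall.name];\n",
    "  if (selectedTool !== undefined) {\n",
    "    const result = await selectedTool.invoke(toolCall.args);\n",
    "    console.log(`${toolCall.name}:`, result);\n",
    "  }\n",
    "}"
   ]
  }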
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "TypeScript",
   "language": "typescript",
   "name": "tslab"
  },
  "language_info": {
   "codemirror_mode": {
    "mode": "typescript",
    "name": "javascript",
    "typescript": true
   },
   "file_extension": ".ts",
   "mimetype": "text/typescript",
   "name": "typescript",
   "version": "3.7.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
