{
 "cells": [
  {
   "cell_type": "markdown",
   "source": [
    "LangChain ships with a number of built-in callback handlers, but you will often want to create your own with custom logic.\n",
    "To create a custom callback handler, we first decide which events the handler should respond to and what it should do when each event fires. Then all that remains is to attach the handler to an object, for example through the constructor or at runtime.\n",
    "In the custom handler MyCustomHandler below, we implement the on_llm_new_token hook to print each token as it arrives, and attach the handler to the model object as a constructor callback.\n",
    ">See also: https://python.langchain.com/api_reference/core/callbacks/langchain_core.callbacks.base.BaseCallbackHandler.html#langchain-core-callbacks-base-basecallbackhandler"
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "f22ee4085a374cc3"
  },
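  {
   "cell_type": "markdown",
   "source": [
    "A handler can be attached either in the constructor, as in the example further below, or at request time via the `config` argument of `invoke`. The next cell is a minimal, self-contained sketch of the request-time variant; it assumes the same DashScope-compatible endpoint and `DASHSCOPE_API_KEY` environment variable, and the handler name `PrintTokenHandler` is illustrative. Request-scoped callbacks apply only to that single call, which is handy when you don't want the handler firing on every use of the model."
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "3a9c1d2e4b5f6071"
  },
  {
   "cell_type": "code",
   "outputs": [],
   "source": [
    "import os\n",
    "\n",
    "from dotenv import load_dotenv\n",
    "from langchain_core.callbacks import BaseCallbackHandler\n",
    "from langchain_core.prompts import ChatPromptTemplate\n",
    "from langchain_openai import ChatOpenAI\n",
    "\n",
    "load_dotenv()\n",
    "\n",
    "\n",
    "class PrintTokenHandler(BaseCallbackHandler):\n",
    "    def on_llm_new_token(self, token: str, **kwargs) -> None:\n",
    "        print(f\"token: {token}\")\n",
    "\n",
    "\n",
    "# No callbacks in the constructor this time\n",
    "llm = ChatOpenAI(\n",
    "    openai_api_key=os.getenv(\"DASHSCOPE_API_KEY\"),\n",
    "    openai_api_base=\"https://dashscope.aliyuncs.com/compatible-mode/v1\",\n",
    "    model_name=\"qwen-max\",\n",
    "    streaming=True,\n",
    ")\n",
    "\n",
    "prompt = ChatPromptTemplate.from_messages([\"Tell me a joke about {animal}\"])\n",
    "chain = prompt | llm\n",
    "# Request-scoped callbacks: passed via `config`, they apply to this call only\n",
    "response = chain.invoke(\n",
    "    {\"animal\": \"bears\"},\n",
    "    config={\"callbacks\": [PrintTokenHandler()]},\n",
    ")"
   ],
   "metadata": {
    "collapsed": false
   },
   "id": "4b8d2e3f5a697182",
   "execution_count": null
  },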
  {
   "cell_type": "code",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "My custom handler, token: \n",
      "My custom handler, token: Sure,\n",
      "My custom handler, token:  here\n",
      "My custom handler, token: 's\n",
      "My custom handler, token:  a bear-y amusing\n",
      "My custom handler, token:  joke for you:\n",
      "\n",
      "My custom handler, token: Why don't bears\n",
      "My custom handler, token:  like fast food?\n",
      "\n",
      "Because they can't\n",
      "My custom handler, token:  stand the wait in\n",
      "My custom handler, token:  line and prefer to\n",
      "My custom handler, token:  just go fishing!\n",
      "My custom handler, token: \n",
      "content=\"Sure, here's a bear-y amusing joke for you:\\n\\nWhy don't bears like fast food?\\n\\nBecause they can't stand the wait in line and prefer to just go fishing!\" additional_kwargs={} response_metadata={'finish_reason': 'stop', 'model_name': 'qwen-max'} id='run-7ea218e4-d1ee-418d-8a48-1199ba2b0565-0'\n"
     ]
    }
   ],
   "source": [
    "from langchain_community.callbacks import get_openai_callback\n",
    "import os\n",
    "\n",
    "from dotenv import load_dotenv\n",
    "from langchain_core.callbacks import BaseCallbackHandler\n",
    "from langchain_core.prompts import ChatPromptTemplate\n",
    "from langchain_openai import ChatOpenAI\n",
    "\n",
    "load_dotenv()\n",
    "\n",
    "\n",
    "class MyCustomHandler(BaseCallbackHandler):\n",
    "    def on_llm_new_token(self, token: str, **kwargs) -> None:\n",
    "        print(f\"My custom handler, token: {token}\")\n",
    "\n",
    "\n",
    "prompt = ChatPromptTemplate.from_messages([\"Tell me a joke about {animal}\"])\n",
    "# To enable streaming, we pass in `streaming=True` to the ChatModel constructor\n",
    "# Additionally, we pass in our custom handler as a list to the callbacks parameter\n",
    "llm = ChatOpenAI(\n",
    "    # If the environment variable is not configured, replace the line below with your Bailian API key: api_key=\"sk-xxx\",\n",
    "    openai_api_key=os.getenv(\"DASHSCOPE_API_KEY\"),\n",
    "    openai_api_base=\"https://dashscope.aliyuncs.com/compatible-mode/v1\",\n",
    "    model_name=\"qwen-max\",\n",
    "    temperature=0,\n",
    "    # Note: with streaming=True, response_metadata is not fully populated\n",
    "    streaming=True,\n",
    "    callbacks=[MyCustomHandler()],\n",
    ")\n",
    "\n",
    "chain = prompt | llm\n",
    "# Optionally, wrap the call in get_openai_callback to track token usage:\n",
    "# with get_openai_callback() as cb:\n",
    "#     response = chain.invoke({\"animal\": \"bears\"})\n",
    "response = chain.invoke({\"animal\": \"bears\"})\n",
    "\n",
    "print(response)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2024-11-06T05:55:37.278059Z",
     "start_time": "2024-11-06T05:55:34.419898Z"
    }
   },
   "id": "797ad8ffd1d4c43e",
   "execution_count": 10
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
