{
 "cells": [
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "# 1. Taxonomy of model invocation\n",
     "\n",
     "Angle 1: by model capability\n",
     "\n",
     "Non-chat models (LLMs / Text Models)\n",
     "\n",
     "Chat models (Chat Models) (recommended)\n",
     "\n",
     "Embedding models (Embedding Models) (covered at the end, in the RAG chapter)\n",
     "\n",
     "\n",
     "Angle 2: by where the call parameters are written (api-key, base_url, model-name)\n",
     "\n",
     "Hardcoded: parameters written directly in the code\n",
     "\n",
     "Environment variables\n",
     "\n",
     "Configuration file (recommended)\n",
     "\n",
     "Angle 3: by the concrete API used\n",
     "\n",
     "The API provided by LangChain (recommended)\n",
     "\n",
     "The official OpenAI API\n",
     "\n",
     "APIs provided by other platforms\n",
     "\n",
     "\n",
     "# 2. Angle 1: Calling a non-chat model"
   ],
   "id": "fb9585e27ce99be7"
  },
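  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "As a minimal sketch of Angle 2, the hardcoded and environment-variable styles differ only in where the parameter values live. The `DEMO_*` variable names and placeholder values below are assumptions for illustration, not real credentials:\n",
    "\n",
    "```python\n",
    "import os\n",
    "\n",
    "# Environment-variable style: values are read at runtime, typically after\n",
    "# loading a .env file with python-dotenv. setdefault keeps any value that\n",
    "# is already present in the environment.\n",
    "os.environ.setdefault(\"DEMO_API_KEY\", \"sk-placeholder\")\n",
    "os.environ.setdefault(\"DEMO_BASE_URL\", \"https://example.com/v1\")\n",
    "\n",
    "api_key = os.environ[\"DEMO_API_KEY\"]\n",
    "base_url = os.environ[\"DEMO_BASE_URL\"]\n",
    "print(api_key.startswith(\"sk-\"))\n",
    "```\n",
    "\n",
    "Hardcoding the same strings directly into the code also works for quick tests, but it leaks secrets into version control, which is why the environment-variable and configuration-file styles are preferred."
   ],
   "id": "angle2-param-style-sketch"
  },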
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T20:24:52.414993Z",
     "start_time": "2025-09-26T20:24:47.125477Z"
    }
   },
   "cell_type": "code",
   "source": [
     "from langchain_openai import OpenAI\n",
     "import os\n",
     "\n",
     "import dotenv\n",
     "\n",
     "dotenv.load_dotenv()\n",
     "os.environ[\"OPENAI_API_KEY\"] = os.getenv(\"OPENAI_API_KEY1\")\n",
     "os.environ[\"OPENAI_BASE_URL\"] = os.getenv(\"OPENAI_BASE_URL\")\n",
     "\n",
     "########### core code ############\n",
     "llm = OpenAI()\n",
     "# Use a descriptive name; don't shadow the built-in str\n",
     "result = llm.invoke(\"写一首关于春天的诗\")\n",
     "print(result)\n",
     "print(type(result))"
   ],
   "id": "de6d59a4436b5e6b",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "春天的诗\n",
      "\n",
      "春天来了，带来了温暖，\n",
      "万物复苏，充满生机。\n",
      "花朵绽放，树木发芽，\n",
      "小草绿茵茵，鸟儿欢歌。\n",
      "\n",
      "春风吹拂，轻柔温柔，\n",
      "荡起湖水，拂过山头。\n",
      "阳光明媚，溢满大地，\n",
      "花香四溢，满园芬芳。\n",
      "\n",
      "春雨细细，滋润大地，\n",
      "润物无声，滋养万物。\n",
      "田野耕耘，百花盛开，\n",
      "种下希望，收获的季节。\n",
      "\n",
      "春天的韵律，如诗如画，\n",
      "春天的气息，如歌如梦。\n",
      "让我们把握，春天的美好，\n",
      "用心感受，这春天的\n",
      "<class 'str'>\n"
     ]
    }
   ],
   "execution_count": 1
  },
  {
   "metadata": {},
   "cell_type": "markdown",
    "source": "# 3. Angle 1: Calling a chat model",
   "id": "64e73b4125a28e1f"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T20:43:16.139592Z",
     "start_time": "2025-09-26T20:43:14.854010Z"
    }
   },
   "cell_type": "code",
   "source": [
     "from langchain_openai import ChatOpenAI\n",
     "from langchain_core.messages import SystemMessage, HumanMessage\n",
     "import os\n",
     "\n",
     "import dotenv\n",
     "\n",
     "dotenv.load_dotenv()\n",
     "os.environ[\"OPENAI_API_KEY\"] = os.getenv(\"OPENAI_API_KEY1\")\n",
     "os.environ[\"OPENAI_BASE_URL\"] = os.getenv(\"OPENAI_BASE_URL\")\n",
     "\n",
     "######## core code ############\n",
     "chat_model = ChatOpenAI(model=\"gpt-4o-mini\")\n",
     "\n",
     "messages = [\n",
     "    SystemMessage(content=\"我是人工智能助手，我叫小智\"),\n",
     "    HumanMessage(content=\"你好，我是小明，很高兴认识你\"),\n",
     "]\n",
     "\n",
     "# Input: list[BaseMessage]\n",
     "# Output: AIMessage\n",
     "response = chat_model.invoke(messages)  # pass in the message list\n",
     "\n",
     "print(type(response))\n",
     "print(response.content)\n"
   ],
   "id": "fb53ded3cd05d4af",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'langchain_core.messages.ai.AIMessage'>\n",
      "你好，小明！很高兴认识你！有什么我可以帮助你的吗？\n"
     ]
    }
   ],
   "execution_count": 3
  },
  {
   "metadata": {},
   "cell_type": "markdown",
    "source": "# 4. Angle 1: Calling an embedding model",
   "id": "ae5279ebb618612f"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T20:57:24.808871Z",
     "start_time": "2025-09-26T20:57:21.301164Z"
    }
   },
   "cell_type": "code",
   "source": [
     "from langchain_openai import OpenAIEmbeddings\n",
     "import os\n",
     "\n",
     "import dotenv\n",
     "\n",
     "dotenv.load_dotenv()\n",
     "os.environ[\"OPENAI_API_KEY\"] = os.getenv(\"OPENAI_API_KEY1\")\n",
     "os.environ[\"OPENAI_BASE_URL\"] = os.getenv(\"OPENAI_BASE_URL\")\n",
     "\n",
     "########### core code ############\n",
     "embeddings_model = OpenAIEmbeddings(model=\"text-embedding-ada-002\")\n",
     "\n",
     "res1 = embeddings_model.embed_query('我是文档中的数据')\n",
     "print(res1)\n",
     "# Printed result: [-0.004306625574827194, 0.003083756659179926, -0.013916781172156334, ...., ]"
   ],
   "id": "a9bd6a9d1c8b1957",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[-0.032839685678482056, 0.002757745562121272, -0.010489282198250294, -0.021119264885783195, -0.016841946169734, 0.02123182639479637, -0.03033520095050335, 0.0030461831483989954, -0.01864292286336422, -0.024594588205218315, 0.008948602713644505, 0.031038707122206688, -0.007893343456089497, 0.008610920049250126, -0.004913993179798126, 0.01269829273223877, 0.010165669023990631, -0.003032113192602992, 0.009342566132545471, -0.01716555841267109, -0.0017429373692721128, 0.010341545566916466, -0.0001418005267623812, -0.0003348251339048147, -0.01663089357316494, 0.01847408153116703, 0.013676166534423828, -0.022413717582821846, -0.001504624611698091, -0.01528016198426485, 0.02521367371082306, -0.0197403933852911, -0.020415758714079857, -0.018009766936302185, -0.0012698292266577482, -0.007991833612322807, -0.009645074605941772, -0.022990593686699867, 0.014815847389400005, -0.007717466447502375, 0.009652109816670418, 0.013113361783325672, -0.0027858857065439224, 0.018009766936302185, 0.0053220270201563835, 0.005547149106860161, 0.016898225992918015, -0.021541370078921318, -0.0028878941666334867, 0.015237951651215553, 0.0031974371522665024, 0.020078076049685478, -0.024946341291069984, -0.023497117683291435, 0.010060143657028675, 0.0029564860742539167, 0.008343587629497051, 0.026100091636180878, -0.022638840600848198, -0.018150467425584793, 0.0055858418345451355, 0.01663089357316494, -0.04316715896129608, 0.00393611891195178, -0.01389425341039896, 0.0011599063873291016, -0.006176787428557873, 0.0029002055525779724, 0.029125170782208443, 0.011453086510300636, 0.010644054040312767, 0.023539328947663307, 0.0006489847437478602, 0.015955528244376183, 0.05006152391433716, -0.006711452268064022, -0.008990813046693802, -0.0034172828309237957, 0.011713383719325066, -0.021245896816253662, 0.01789720542728901, -0.026029741391539574, -0.0022793610114604235, -0.0036547163035720587, 0.021668000146746635, 0.024566447362303734, 0.007520484738051891, 0.05222832411527634, 
-0.02887190692126751, -0.03286782279610634, 0.007154661230742931, 0.015350512228906155, 0.005068764556199312, 0.014407813549041748, 0.008660165593028069, 0.033543191850185394, 0.00439339829608798, 0.01879769377410412, 0.004544652067124844, -0.040690816938877106, 0.008069219999015331, -0.013929429464042187, -0.040296852588653564, -0.0034067302476614714, -0.030841724947094917, 0.01161489263176918, 0.01865699328482151, -0.008259166963398457, 0.020556461066007614, -0.0034612519666552544, -0.02251220867037773, 0.018558502197265625, 0.03241758048534393, -0.026620686054229736, -0.019881093874573708, -0.010383755899965763, 0.009455127641558647, -0.013218887150287628, -0.029716115444898605, -0.005318509414792061, 0.001127369236201048, 0.037933070212602615, 0.010665158741176128, -0.024721218273043633, 0.01664496399462223, 0.03033520095050335, 0.00858981441706419, -0.02704279124736786, 0.018952464684844017, 0.006546128075569868, 0.018516290932893753, 0.016138439998030663, 0.02269512042403221, 0.009940546937286854, -0.03981846943497658, 0.029941236600279808, -0.019599691033363342, 0.0180660467594862, -0.017461031675338745, -0.011291279457509518, 0.016898225992918015, 0.025551356375217438, -0.010369686409831047, -0.01879769377410412, 0.0010877969907596707, -0.00626472570002079, 0.0030760823283344507, -0.014632935635745525, 0.00016972095181699842, -0.0149002680554986, 0.026803597807884216, -0.008983777835965157, 0.0007870479021221399, 0.02470714971423149, -0.00794962327927351, -0.003974811639636755, -0.01571633480489254, 0.0016119093634188175, 0.008463183417916298, -0.017264049500226974, -0.0036125059705227613, 0.023440837860107422, 0.021330317482352257, -0.00924407597631216, 0.013190747238695621, 0.035372309386730194, -0.01795348711311817, -0.012466135434806347, -0.009328496642410755, 0.0025871451944112778, -0.01663089357316494, 0.01919165812432766, -0.00016576371854171157, -0.0026399081107228994, -0.010925455950200558, 0.028801556676626205, 0.00200851121917367, 
9.019173739943653e-05, -0.009314426220953465, -0.008308411575853825, -0.032445721328258514, -0.0014114099321886897, 0.0034630107693374157, -0.005051176995038986, -0.010362651199102402, -0.0022529796697199345, 0.005916489753872156, -0.019473060965538025, -0.008899357169866562, -0.010482246987521648, 0.022934312000870705, 0.01452037412673235, -0.002182628959417343, -0.015237951651215553, -0.6186354756355286, -0.01934642903506756, -0.020528320223093033, -0.004330082796514034, 0.016208790242671967, 0.008660165593028069, 0.005884832236915827, 0.023665959015488625, -0.017798714339733124, -0.04828868806362152, -0.007189836818724871, -0.0015846483875066042, 0.0016646722797304392, 0.011424945667386055, -0.007886308245360851, -0.024946341291069984, 0.0031024636700749397, 0.016884155571460724, -0.01777057535946369, 0.005955182481557131, -0.003012766595929861, 0.022273017093539238, -0.019430849701166153, 0.019810743629932404, 0.014661075547337532, 0.016729384660720825, -0.0031200514640659094, -0.009680249728262424, -0.0005254314164631069, 0.019233867526054382, -0.025874970480799675, 0.004157723393291235, 0.00886418204754591, 0.005406447686254978, 0.04204155132174492, -2.7714309908333234e-05, 0.004636107943952084, 0.01453444454818964, 0.01830524019896984, 0.013859078288078308, -0.003834110451862216, -0.011410876177251339, -0.0006419497076421976, 0.0038130052853375673, 0.005839104298502207, 0.0025730750057846308, 0.030757304280996323, 0.015322372317314148, 0.03177035599946976, 0.024355394765734673, -0.01847408153116703, -0.024172484874725342, -0.004386363085359335, -0.004899922758340836, 0.015167600475251675, 0.004593897610902786, 0.007590835448354483, -0.056618206202983856, 0.026733247563242912, -0.002351470524445176, 0.0118892602622509, 0.012817888520658016, -0.01847408153116703, -0.004727563820779324, -0.013169641606509686, -0.0033821074757725, -0.028759347274899483, 0.007028030231595039, 0.006834565661847591, -0.010658123530447483, 0.017081137746572495, 
0.0017271084943786263, 0.007281292695552111, 0.015153530053794384, 0.021864982321858406, 0.025368444621562958, 0.025846829637885094, -0.019107237458229065, -0.010165669023990631, 0.0008134294184856117, 0.0006947126821614802, -0.0018009766936302185, -0.008259166963398457, -0.02048610895872116, -0.011861120350658894, -0.011319420300424099, -0.030757304280996323, -0.0246930792927742, 3.0063911253819242e-05, -0.006795872934162617, 0.015012829564511776, 0.016743455082178116, 0.013211852870881557, -0.005772271193563938, -0.005645639728754759, 0.007675256114453077, -0.014421883970499039, -0.023468976840376854, -0.004615002777427435, 0.010890280827879906, -0.004826054908335209, -0.0006982302293181419, 0.016208790242671967, 0.005241123493760824, 0.02868899516761303, 0.0026188029441982508, -0.0177565049380064, -0.003918531350791454, 0.01588517799973488, -0.0013349036453291774, 0.0014034955529496074, 0.0029705562628805637, -0.003200954757630825, -0.0051004220731556416, -0.02852015383541584, -0.022568488493561745, 0.02360967919230461, 0.00465369550511241, -0.0049386159516870975, 0.010390791110694408, 0.014942478388547897, 0.017686154693365097, 0.025874970480799675, -0.01629321090877056, 0.01198071613907814, 0.021696140989661217, -0.0027595043648034334, -0.006321005988866091, -0.0008081531268544495, -0.022441858425736427, -0.0059762876480817795, -0.03677932173013687, 0.029012609273195267, -0.024003641679883003, 0.018713273108005524, -0.017601732164621353, 0.019782602787017822, 0.002040168968960643, 0.00822399090975523, -0.005332579370588064, -0.030644744634628296, 0.0006709693698212504, -0.002543176058679819, -0.033909015357494354, 0.0004372732655610889, -0.019641902297735214, -0.005825033877044916, -0.010320440866053104, -0.00457631004974246, 0.001098349574021995, -0.016377631574869156, 0.0052481587044894695, -0.00866719987243414, 0.006542610470205545, 0.012198803015053272, 0.018389660865068436, -0.0016769836656749249, -0.009201865643262863, -0.019318288192152977, 
-0.0034313530195504427, -0.024608658626675606, 0.0019364017061889172, -0.004738116171211004, -0.0032695464324206114, -0.02501669153571129, -0.03469694033265114, -0.019796673208475113, 0.0114319808781147, -0.04074709862470627, -0.022892102599143982, 0.00032075500348582864, 0.01260683685541153, -0.010524457320570946, -0.021119264885783195, 0.003974811639636755, -0.004488371778279543, -0.017489172518253326, 0.0055225263349711895, 0.01733439974486828, -0.0269020888954401, -0.002099966863170266, -0.005023036617785692, 0.000836733088362962, -0.01753138191998005, 0.028435733169317245, 0.01525202114135027, -0.000585229485295713, 0.006412461865693331, -0.014632935635745525, 0.007590835448354483, 0.003533361479640007, 0.00046299523091875017, -0.026972439140081406, 0.011277209036052227, 0.01792534627020359, 0.01861478202044964, -0.0024464440066367388, 0.02175242081284523, 0.014984688721597195, 0.013753551989793777, 0.008751621469855309, 0.022005684673786163, -0.017249979078769684, -0.017629873007535934, 0.024594588205218315, -0.043251581490039825, -0.008997848257422447, -0.011593786999583244, 0.0062049273401498795, -0.00019291468197479844, 0.024749359115958214, -0.00019797113782260567, -0.017658013850450516, -0.006532058119773865, 0.02652219496667385, 0.022976523265242577, -0.015040969476103783, -0.007379783317446709, -0.0035667780321091413, 0.00342783541418612, -0.009054129011929035, -0.010383755899965763, 0.011199823580682278, -0.002666289685294032, -0.01848815195262432, 0.010271195322275162, 0.016039948910474777, 0.04026871174573898, 0.02324385568499565, -0.0027313639875501394, -0.026240793988108635, -0.014928407967090607, -0.0008978502009995282, 0.002045445144176483, 0.0030760823283344507, -0.0028966881800442934, 0.031151268631219864, -0.021499158814549446, 0.024946341291069984, 6.688808207400143e-05, -0.010257124900817871, 0.013486219570040703, 0.0018115292768925428, -0.022990593686699867, 0.002675083465874195, -0.0016435671132057905, 0.01416862104088068, 
0.009827986359596252, -0.02503076195716858, 0.020373549312353134, -0.01588517799973488, 0.011931470595300198, -0.01306411623954773, -0.014661075547337532, 0.014006814919412136, -0.02337048575282097, 0.007471239194273949, 0.014689216390252113, 0.016912296414375305, 0.02815433032810688, 0.012768642976880074, -0.0012601560447365046, -0.0013269891496747732, 0.01528016198426485, 0.007830027490854263, -0.01972632296383381, -0.021161476150155067, -0.025171462446451187, -0.029547274112701416, 0.012480205856263638, 0.025832759216427803, -0.01543493289500475, 0.01795348711311817, -0.005153185222297907, 0.0246930792927742, -0.007597870193421841, 0.030447762459516525, -0.01187518984079361, 0.01737661100924015, -0.005107457283884287, -0.03351505100727081, -0.0061591994017362595, -0.01233246922492981, 0.033543191850185394, 0.004062749911099672, -0.017686154693365097, -0.01917758770287037, 0.038467735052108765, -0.019458990544080734, 0.0011168166529387236, 0.007309432607144117, -0.004632590338587761, 0.01123499870300293, -0.02030319720506668, 0.0032431650906801224, 0.027858858928084373, 0.02835131250321865, -0.0010710887145251036, -0.0387209989130497, 0.02176649123430252, -0.008139570243656635, -0.016715314239263535, -0.0094903027638793, -0.03452809900045395, 0.03362761065363884, -0.0022019753232598305, 0.0026135267689824104, 0.01646205224096775, -0.01198071613907814, -0.014956548810005188, 0.0004429892578627914, -0.0007369230734184384, -0.029181450605392456, -0.010777720250189304, -0.03649791702628136, 0.007443098817020655, -0.007822992280125618, 0.004206968937069178, 0.027380473911762238, -0.005628052167594433, 0.007112450897693634, -0.01214955747127533, -0.034105997532606125, 0.016841946169734, -0.0013146777637302876, 0.03416227549314499, -0.0004425495571922511, -0.02652219496667385, 0.006440602242946625, 0.0023215715773403645, -0.02854829467833042, -0.02760559506714344, 0.0015301266685128212, -0.0035228088963776827, 0.0036441637203097343, -0.009968687780201435, 
0.01306411623954773, 0.007154661230742931, 0.021330317482352257, 0.003834110451862216, -0.014562585391104221, -0.021358458325266838, -0.010468176566064358, -0.024622729048132896, -0.010742544196546078, -0.004509476944804192, 0.02397550269961357, 0.009018953889608383, -0.006384321488440037, -0.016687175258994102, 0.023637818172574043, 0.009891301393508911, 0.009272215887904167, -0.00015982789045665413, -0.00866719987243414, 0.023651888594031334, 0.0003348251339048147, 0.007429028861224651, -0.00393611891195178, 0.009898336604237556, -0.00763304578140378, -0.00351929129101336, 0.00822399090975523, 0.03222059831023216, 0.01792534627020359, 0.01698264665901661, 0.00649336539208889, -0.031939197331666946, -0.0017209528014063835, -0.004854194819927216, -0.012888239696621895, 0.015561563894152641, -0.004706458654254675, -0.022723261266946793, -0.0009444574825465679, 0.0002915155200753361, -0.024172484874725342, -0.025635777041316032, 0.029125170782208443, 0.011002842336893082, 0.018966535106301308, -0.012001820839941502, 0.0016268588369712234, -0.0019152965396642685, -0.006257690489292145, 0.005353684537112713, -0.0014492233749479055, -0.014759566634893417, -0.019824814051389694, -0.008716445416212082, -0.016391701996326447, -0.019318288192152977, -0.012867134064435959, 0.022934312000870705, -0.0022529796697199345, -0.007970728911459446, -0.01187518984079361, -0.023722240701317787, 0.003647681325674057, -0.004337117541581392, 0.02467900887131691, 0.004776809364557266, 0.0027788507286459208, 0.0221041738986969, 0.003661751514300704, -0.019121307879686356, -0.0013815108686685562, -0.011424945667386055, -0.03452809900045395, 0.011924435384571552, -0.02396143227815628, -0.010482246987521648, 0.00340497144497931, 0.004738116171211004, -0.009708389639854431, 0.004178828559815884, 0.02542472630739212, 0.0031077400781214237, 0.009068199433386326, -0.006546128075569868, -0.003742654575034976, 0.008913427591323853, -0.009082268923521042, -0.01629321090877056, 0.004657213110476732, 
-0.026029741391539574, -0.02722570300102234, -0.0010939526837319136, 0.01698264665901661, 0.006612961180508137, 0.005343132186681032, 0.023103153333067894, 0.004776809364557266, -0.004192898981273174, 0.017503242939710617, -0.007137073669582605, 0.016405772417783737, -0.008660165593028069, 0.018868044018745422, 0.014815847389400005, -0.00976467039436102, 0.04330785945057869, -0.013021905906498432, -0.0123465396463871, 0.004214004147797823, 0.0029617624823004007, 0.025143323466181755, 0.04277319461107254, -0.0025097595062106848, 0.016152510419487953, -0.01454851496964693, 0.0008974105003289878, -0.01831931062042713, -0.009849091060459614, 0.01754545234143734, 0.015336441807448864, -0.0012847788166254759, 0.009891301393508911, -0.01957155205309391, -0.005318509414792061, 0.01352139562368393, 0.003946671728044748, -0.015519353561103344, 0.0055225263349711895, -0.018572572618722916, -0.010390791110694408, 0.0009963411139324307, 0.014773637056350708, 0.028252821415662766, -0.022723261266946793, -0.010461142286658287, 0.003341655945405364, -0.014210831373929977, 0.041056640446186066, 0.010397826321423054, 0.012374679557979107, 0.005979805253446102, -0.012649047188460827, 0.004059232771396637, -0.02413027361035347, -0.006852153688669205, 0.012170663103461266, 0.01829116977751255, 0.0005786340916529298, 0.03514718636870384, -0.006454672198742628, 0.01169227808713913, 0.01661682315170765, 0.0014228419167920947, 0.012409854680299759, -0.010017932392656803, -0.0038200404960662127, -0.021907193586230278, 0.0037004442419856787, 0.008216955699026585, 0.016349490731954575, 0.020697161555290222, 0.007970728911459446, 0.008195850998163223, 0.0059129721485078335, 0.022273017093539238, 0.018333379179239273, -0.0014342739013954997, -0.04986454173922539, -0.008343587629497051, 0.0026100091636180878, 0.004541134461760521, 0.004878817591816187, -0.023257926106452942, -0.006816978100687265, 0.03343062847852707, -0.033346209675073624, 0.033683892339468, -0.0016769836656749249, 
0.01625099964439869, -0.03146081045269966, -0.006795872934162617, -0.027028720825910568, 0.023694099858403206, -0.02870306558907032, 0.010531492531299591, -0.011129473336040974, 0.0018255994655191898, 0.04209782928228378, 0.0026416669134050608, 0.026606615632772446, 0.012015891261398792, -0.008688305504620075, 0.011199823580682278, 0.01793941669166088, 0.0049562035128474236, -0.01995144411921501, -0.011220929212868214, -0.0025238296948373318, -0.027464894577860832, -0.020514249801635742, -0.023539328947663307, -0.03731398656964302, 0.02541065588593483, 0.001996199833229184, 0.0020067524164915085, 0.016602754592895508, -0.007288327440619469, -0.01271236315369606, 0.007123003713786602, 0.010010898113250732, 0.031939197331666946, 0.0021210722625255585, 0.03604767471551895, 0.034809503704309464, -0.0050546941347420216, -0.030391480773687363, -0.062246255576610565, -0.0029388985130935907, -0.01571633480489254, 0.02868899516761303, 0.034837644547224045, -0.026789527386426926, -0.013570641167461872, 0.01024305447936058, 0.009919442236423492, 0.017306260764598846, -0.015055039897561073, 0.0051742903888225555, 0.01242392510175705, -0.021541370078921318, -0.0024587553925812244, -0.010454107075929642, -0.03106684796512127, 0.012381714768707752, -0.007274257484823465, -0.015477143228054047, -0.021513229236006737, -0.024228764697909355, 0.015575634315609932, 0.007414958905428648, -0.031010568141937256, 0.02999751828610897, -0.009279251098632812, -0.015561563894152641, 9.733672050060704e-05, -0.0123465396463871, -0.020753441378474236, 0.01453444454818964, -0.01187518984079361, 0.012360609136521816, -0.0002117114927386865, -0.022399647161364555, 0.010060143657028675, 0.017798714339733124, -0.02391922101378441, 0.002272326033562422, -0.030053798109292984, 0.049611277878284454, -0.02303280308842659, -0.0009972205152735114, -0.0033082393929362297, -0.011347560212016106, 0.006475777365267277, 0.0206549521535635, -0.016743455082178116, -0.011263139545917511, -0.007794852368533611, 
-0.012022926472127438, -0.021808702498674393, 0.009004883468151093, 0.016152510419487953, -0.0026381495408713818, -0.023511188104748726, -0.014379673637449741, -0.008097359910607338, -0.001006893697194755, -0.0008653129916638136, -0.006190857384353876, -0.01902281679213047, 0.0005219138693064451, 0.01152343675494194, 0.019473060965538025, -0.022582558915019035, -0.004203451331704855, 0.02360967919230461, 0.030813585966825485, 0.0050546941347420216, 0.026733247563242912, 0.007126520853489637, -0.012226942926645279, -0.021077055484056473, 0.010172704234719276, 0.022624770179390907, -0.02064088173210621, 0.03238943964242935, -0.009370706975460052, -0.025551356375217438, -0.01883990503847599, -0.003918531350791454, 0.020007725805044174, -0.017264049500226974, 0.01051038783043623, 0.03717328608036041, 0.004826054908335209, 0.003918531350791454, -0.01525202114135027, -0.007193353958427906, 0.014246007427573204, -0.019993655383586884, 0.014112341217696667, -0.010454107075929642, -0.0257624089717865, 0.010390791110694408, -0.008399867452681065, -0.0024956893175840378, 0.006282313261181116, -0.044180210679769516, -0.025537285953760147, -0.0029336221050471067, 0.020556461066007614, -0.0021105194464325905, -0.017418822273612022, -0.012388749979436398, -0.004337117541581392, -0.0008767449762672186, 0.015167600475251675, 0.015322372317314148, 0.01679973490536213, 0.01665903441607952, -0.0001921452203532681, 0.012860098853707314, 0.00958175864070654, -0.007351642940193415, 0.00021764733537565917, -0.006289348471909761, -0.021682070568203926, -0.030278921127319336, -0.0026275969576090574, -0.020359478890895844, -0.007534554693847895, 0.015167600475251675, -0.012571661733090878, -0.03874913975596428, -0.0019416779978200793, -0.014801776967942715, -0.02685987949371338, -0.018037907779216766, 0.020246917381882668, 0.0054767983965575695, 0.016011808067560196, -0.006982302293181419, 0.022441858425736427, -0.0027823683340102434, 0.014049025252461433, -0.02940657176077366, 
-0.013422904536128044, 0.010271195322275162, -0.022976523265242577, 0.007801887113600969, 0.011410876177251339, -0.016208790242671967, -0.010411896742880344, 0.006943609565496445, -0.012325434014201164, 0.011009876616299152, 0.0180660467594862, 0.008702375926077366, -0.013591745868325233, -0.029209591448307037, 0.00794258899986744, -0.0015573875280097127, -0.01453444454818964, 0.0022019753232598305, -0.03241758048534393, -0.005874279420822859, -0.004695905838161707, 0.005751165561378002, -0.0055225263349711895, 0.007087828125804663, 0.007407923694700003, -0.00411903066560626, -0.0008758656331337988, -0.006064226385205984, 0.01993737556040287, -0.0017033651238307357, -0.008378762751817703, 0.0032660290598869324, -0.01716555841267109, -0.009954617358744144, 0.056252382695674896, 0.007837062701582909, -0.00011129693302791566, -0.003090152284130454, 0.014506304636597633, -0.028309103101491928, 0.004280837252736092, 0.008252131752669811, 0.002071826718747616, 0.02120368555188179, -0.008076255209743977, 0.024059923365712166, -0.007879273034632206, 0.018741413950920105, 0.010897316038608551, -0.012599801644682884, -0.0290407482534647, 0.001027119462378323, 0.007337572984397411, -0.022357437759637833, -0.015406792983412743, -0.013324413448572159, -0.012909344397485256, -0.012142522260546684, -0.00203665136359632, -0.01920572854578495, -0.019107237458229065, -0.01826302893459797, 0.020570531487464905, -0.0024956893175840378, -0.01643391139805317, -0.002263532252982259, 0.020190637558698654, 0.028759347274899483, 0.002894929377362132, 0.3115689754486084, -0.012592766433954239, 0.007618975825607777, 0.017081137746572495, 0.016701243817806244, 0.036103952676057816, 0.007808922324329615, -0.004354705568403006, -0.0033117569983005524, 0.019079096615314484, 0.00895563792437315, -0.0017438167706131935, 0.0094903027638793, -0.0007874876027926803, -0.0049562035128474236, -0.02431318536400795, -0.03525974601507187, -0.005223535932600498, -0.022540349513292313, -0.023525258526206017, 
0.034837644547224045, -0.01361988577991724, 0.009602864272892475, -0.02725384198129177, 0.015491213649511337, -0.012543520890176296, -0.012318398803472519, -0.010665158741176128, 0.017643943428993225, 0.02045796997845173, 0.0004660730774048716, -0.006539092864841223, 0.0006775647052563727, -0.012874169275164604, -0.024397606030106544, 0.008962673135101795, 0.022821752354502678, -0.005399412475526333, 0.03222059831023216, 6.0127822507638484e-05, -0.005673780106008053, -0.03644163906574249, -0.0034893923439085484, 0.00022798008285462856, -0.006092366296797991, 0.03905868157744408, -0.013957569375634193, -0.019881093874573708, -0.009201865643262863, 0.025663917884230614, -0.01952934078872204, -0.008751621469855309, 0.017629873007535934, 0.016602754592895508, -0.024622729048132896, 0.015477143228054047, 0.0128038190305233, 0.006641101557761431, -0.018755484372377396, -0.0036089883651584387, 0.004389880690723658, 0.012564626522362232, 0.01470328588038683, -0.008709411136806011, -0.032305020838975906, -0.02173835225403309, -0.03388087451457977, 0.013197782449424267, 0.003693409264087677, -0.02617044188082218, -0.0003645043179858476, -0.008716445416212082, -0.00411903066560626, 0.014801776967942715, -0.04924545809626579, -0.016602754592895508, 0.005360719747841358, 0.02745082415640354, 0.021077055484056473, 0.02688801847398281, -0.027943279594182968, -0.006961197126656771, -0.0054205176420509815, -0.012810853309929371, -0.010981736704707146, -0.016715314239263535, 0.00832951720803976, -0.006736075039952993, -0.01233246922492981, -0.006831048522144556, 0.002328606555238366, 0.007991833612322807, 0.002045445144176483, -0.01716555841267109, 0.021344387903809547, 0.005445140413939953, -0.022582558915019035, 0.00016026757657527924, 0.0022441858891397715, -0.0013305067550390959, -0.04513697698712349, 0.03728584572672844, 0.03286782279610634, 0.02818247117102146, -0.0038235578685998917, -0.003538637887686491, -0.01795348711311817, -0.02669103816151619, -0.0038270754739642143, 
-0.026254862546920776, 0.0002965719613712281, -0.028660856187343597, 0.010644054040312767, 0.0007101019145920873, -0.004337117541581392, 0.00921593513339758, 0.032108038663864136, -0.015181670896708965, 0.012212873436510563, -0.016560543328523636, 0.00535016693174839, -0.04595304653048515, -0.010482246987521648, 0.01701078750193119, -0.013823903165757656, -0.011910364963114262, 0.005033588968217373, -0.0009558894671499729, 0.009131514467298985, -0.029125170782208443, 0.03146081045269966, -0.011565647087991238, 0.018896184861660004, 0.003346932353451848, -0.013310343027114868, 0.01454851496964693, -0.007400888483971357, 0.008765690959990025, -0.03089800663292408, 0.012128452770411968, -0.008294342085719109, -0.00976467039436102, 0.01846001110970974, -0.005554183851927519, 0.005955182481557131, 0.018192678689956665, 0.02541065588593483, -0.005617499351501465, 0.031742215156555176, -0.02231522649526596, -0.0171514879912138, -0.013880183920264244, 0.012290258891880512, 0.015364582650363445, 0.04097221791744232, 0.0024218212347477674, -0.0350627638399601, -0.015997737646102905, -0.009089304134249687, 0.016321351751685143, -0.0279714185744524, -0.0026363907381892204, 0.043589264154434204, -0.019613761454820633, 0.0013524913229048252, -0.011713383719325066, -0.1838121861219406, 0.011059122160077095, 0.012958589941263199, -0.0186710637062788, 0.017221840098500252, 0.029884956777095795, 0.004185863770544529, 0.00562453456223011, 0.017249979078769684, 0.0032290949020534754, -0.00037044016062282026, -0.010693298652768135, -0.028843767940998077, -0.007935553789138794, -0.0028105084784328938, -0.003573813010007143, -0.02816840074956417, 0.004344152752310038, 0.020612740889191628, 0.020992634817957878, 0.019430849701166153, -0.010003862902522087, -0.0157022662460804, -0.019473060965538025, 0.004906957969069481, 0.022737329825758934, -0.004702941048890352, 0.02742268331348896, -0.008821971714496613, 0.006415979471057653, -0.013120396994054317, -0.010433001443743706, 
0.011150578036904335, -0.003003972815349698, 0.006679794285446405, ... (embedding vector output truncated) ... -0.02083786390721798, -0.01975446380674839]\n"
     ]
    }
   ],
   "execution_count": 7
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "# 3. Angle 2: Where the Parameters Are Written\n",
    "\n",
    "## 3.1 Hard-coding\n",
    "\n",
    "Using a chat model as an example:"
   ],
   "id": "f9942d966d3c37aa"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T21:10:06.833157Z",
     "start_time": "2025-09-26T21:10:02.534677Z"
    }
   },
   "cell_type": "code",
   "source": [
    "from langchain_openai import ChatOpenAI\n",
    "\n",
    "# Calling a non-chat model:\n",
    "# llms = OpenAI(...)\n",
    "\n",
    "# Calling a chat model:\n",
    "chat_model = ChatOpenAI(\n",
    "    # the 3 required parameters\n",
    "    model=\"gpt-4o-mini\",  # defaults to gpt-3.5-turbo\n",
    "    base_url=\"https://api.openai-proxy.org/v1\",\n",
    "    api_key=\"sk-xxx\"  # replace with your own key; never commit real keys\n",
    ")\n",
    "\n",
    "# Invoke the model\n",
    "response = chat_model.invoke(\"什么是langchain?\")\n",
    "\n",
    "# View the response text\n",
    "# print(response.content)\n",
    "print(response)"
   ],
   "id": "a968acd5b71f4fd5",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "content='LangChain 是一个用于构建基于语言模型的应用程序的框架。它的主要目标是简化开发者在使用大型语言模型（如 OpenAI 的 GPT-3、GPT-4 等）时的工作流程。LangChain 提供了一系列工具和组件，帮助开发者快速构建、调整和部署与自然语言处理相关的应用程序。\\n\\nLangChain 主要包含以下几个方面的功能：\\n\\n1. **文档链（Document Chains）**：处理和管理信息流，通过使用语言模型处理输入，并生成输出。\\n\\n2. **处理工具（Tooling）**：与外部数据源或API的集成，使得语言模型能够访问和利用其他信息，例如数据库、搜索引擎等。\\n\\n3. **记忆（Memory）**：提供上下文记忆功能，使得模型能够在交互中保持状态，从而提升用户体验。\\n\\n4. **代理（Agents）**：创建能够根据特定任务和目标选择并调用适当工具的智能代理。\\n\\n5. **评估和调试**：针对构建的语言模型应用进行性能评估和调试的工具和方法。\\n\\n通过这些功能，LangChain 使得开发者能够更高效地创建复杂的自然语言处理应用，能够在对话系统、问答系统、内容生成和更多领域中发挥作用。' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 290, 'prompt_tokens': 12, 'total_tokens': 302, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-mini-2024-07-18', 'system_fingerprint': 'fp_efad92c60b', 'id': 'chatcmpl-CK9zXapN5VN88hRxzBuJnEDEuAtbf', 'service_tier': None, 'finish_reason': 'stop', 'logprobs': None} id='run--2d458aad-074d-429b-9fc0-0e7a266bcf13-0' usage_metadata={'input_tokens': 12, 'output_tokens': 290, 'total_tokens': 302, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}}\n"
     ]
    }
   ],
   "execution_count": 19
  },
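  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "A short sketch (an illustration, not part of the original lesson): `invoke` also accepts a list of messages rather than a bare string, which is what the `SystemMessage`/`HumanMessage` imports at the top of this notebook are for. It reuses the `chat_model` defined above:",
   "id": "added-message-list-note"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "from langchain_core.messages import SystemMessage, HumanMessage\n",
    "\n",
    "messages = [\n",
    "    SystemMessage(content='你是一个乐于助人的AI助手'),\n",
    "    HumanMessage(content='什么是langchain?'),\n",
    "]\n",
    "response = chat_model.invoke(messages)  # returns an AIMessage\n",
    "print(response.content)"
   ],
   "id": "added-message-list-code",
   "outputs": [],
   "execution_count": null
  },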
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "Demonstrating the non-chat model:",
   "id": "ccca49a1ec037d6"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T21:13:20.775825Z",
     "start_time": "2025-09-26T21:13:18.222395Z"
    }
   },
   "cell_type": "code",
   "source": [
    "# Calling a non-chat model:\n",
    "llm = OpenAI(\n",
    "    api_key=\"sk-xxx\",  # replace with your own key\n",
    "    base_url=\"https://api.openai-proxy.org/v1\",\n",
    ")\n",
    "\n",
    "# Invoke the model\n",
    "response = llm.invoke(\"什么是langchain?\")\n",
    "\n",
    "# View the response text\n",
    "print(response)"
   ],
   "id": "480b65c83aefd507",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\n",
      "Langchain是一个区块链平台，旨在为语言和文化相关的内容和服务提供一个去中心化的生态系统。它通过基于区块链的智能合约和加密货币的使用，为语言学习、翻译、文化旅游等领域的参与者提供更加安全、高效、透明的交易和服务。通过使用Langchain，用户可以更容易地找到合适的语言学习或翻译服务，同时也可以更容易地将自己的语言能力或文化知识变现，并获得公平的报酬。\n"
     ]
    }
   ],
   "execution_count": 20
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "## 3.2 Environment Variables\n",
    "\n",
    "Note: setting environment variables this way is awkward inside Jupyter; it is better done in a .py file"
   ],
   "id": "10884aedd3fff3b9"
  },
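  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "A minimal standalone .py sketch of the environment-variable approach (illustration only; the file name is hypothetical, and it assumes OPENAI_API_KEY and OPENAI_BASE_URL are exported in the shell before running):",
   "id": "added-env-var-note"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "# env_demo.py (hypothetical file name)\n",
    "# run after exporting the variables in the shell, e.g.:\n",
    "#   export OPENAI_API_KEY=sk-xxx\n",
    "#   export OPENAI_BASE_URL=https://api.openai-proxy.org/v1\n",
    "import os\n",
    "from langchain_openai import ChatOpenAI\n",
    "\n",
    "chat_model = ChatOpenAI(\n",
    "    model='gpt-4o-mini',\n",
    "    api_key=os.environ['OPENAI_API_KEY'],\n",
    "    base_url=os.environ['OPENAI_BASE_URL'],\n",
    ")\n",
    "print(chat_model.invoke('什么是langchain?').content)"
   ],
   "id": "added-env-var-code",
   "outputs": [],
   "execution_count": null
  },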
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T21:16:26.025345Z",
     "start_time": "2025-09-26T21:16:23.101784Z"
    }
   },
   "cell_type": "code",
   "source": [
    "from langchain_openai import ChatOpenAI\n",
    "\n",
    "# 1. Get a chat model:\n",
    "chat_model = ChatOpenAI(\n",
    "    model=\"gpt-4o-mini\",\n",
    "    base_url=os.environ[\"OPENAI_BASE_URL\"],\n",
    "    api_key=os.environ[\"OPENAI_API_KEY\"],\n",
    ")\n",
    "\n",
    "# 2. Invoke the model\n",
    "response = chat_model.invoke(\"什么是langchain?\")\n",
    "\n",
    "# 3. View the response text\n",
    "# print(response.content)\n",
    "print(response)"
   ],
   "id": "941cab592c980499",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "content='LangChain 是一个用于构建基于语言模型（如 GPT-3 和 ChatGPT）的应用程序的框架。它提供了一系列工具和组件，帮助开发者更方便地创建和集成自然语言处理（NLP）应用。LangChain 的核心目标是使开发者能够高效地构建与语言模型交互的复杂应用，而无需关注底层实现的细节。\\n\\nLangChain 主要包括以下几个关键组件：\\n\\n1. **链（Chains）**：允许将多个操作串联起来，从而实现更复杂的任务。例如，可以将数据检索、文本生成和最终输出整合在一起。\\n\\n2. **代理（Agents）**：具备决策能力的组件，根据用户输入动态选择执行的操作或调用特定的链，以实现更灵活和智能的响应。\\n\\n3. **记忆（Memory）**：允许应用保存用户的上下文信息，从而在后续的对话中提供更个性化和连贯的回应。\\n\\n4. **数据连接（Data Connectors）**：支持与各种外部数据源（如数据库、API等）的连接，使得应用能够实时访问和处理数据。\\n\\nLangChain 适用于 Chatbots、问答系统、自动生成内容等多种应用场景，是一个强大的工具，帮助开发者利用先进的语言模型技术构建高效的应用程序。' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 296, 'prompt_tokens': 12, 'total_tokens': 308, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-mini-2024-07-18', 'system_fingerprint': 'fp_efad92c60b', 'id': 'chatcmpl-CKA5f4WLXeOqbKoBblh94zqOsTcds', 'service_tier': None, 'finish_reason': 'stop', 'logprobs': None} id='run--bd1c86bd-2b09-4851-83bf-70e8f84b49e0-0' usage_metadata={'input_tokens': 12, 'output_tokens': 296, 'total_tokens': 308, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}}\n"
     ]
    }
   ],
   "execution_count": 21
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "## 3.3 A Configuration File (recommended)\n",
    "\n",
    "Using a .env configuration file\n",
    "\n",
    "Approach 1:"
   ],
   "id": "ca1476f610973ea2"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T21:19:33.552520Z",
     "start_time": "2025-09-26T21:19:30.395910Z"
    }
   },
   "cell_type": "code",
   "source": [
    "# Load the configuration file\n",
    "dotenv.load_dotenv()\n",
    "\n",
    "# 1. Get a chat model:\n",
    "chat_model = ChatOpenAI(\n",
    "    model=\"gpt-4o-mini\",\n",
    "    api_key=os.getenv(\"OPENAI_API_KEY1\"),\n",
    "    base_url=os.getenv(\"OPENAI_BASE_URL\")\n",
    ")\n",
    "\n",
    "# 2. Invoke the model\n",
    "response = chat_model.invoke(\"什么是langchain?\")\n",
    "\n",
    "# 3. View the response text\n",
    "# print(response.content)\n",
    "print(response)"
   ],
   "id": "f51a1cbe6a39ba7b",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "content='LangChain 是一个用于构建基于语言模型的应用程序的框架，特别适用于与大型语言模型（如 OpenAI 的 GPT、Google 的 BERT 等）进行交互的应用程序。它提供了一系列的工具和组件，帮助开发者快速构建、管理和部署语言模型相关的应用。\\n\\nLangChain 的特点包括：\\n\\n1. **模块化设计**：LangChain 提供多种模块，可以用于不同的任务，如文本生成、问答、对话系统等。开发者可以根据需求组合这些模块。\\n\\n2. **集成多种数据源**：LangChain 可以与各种数据源（如数据库、API、文档等）进行集成，从而提升语言模型的功能。\\n\\n3. **便于定制**：开发者可以根据特定需求定制和扩展 LangChain 中的组件，以实现更加具体的功能。\\n\\n4. **流畅的工作流程**：LangChain 提供了一种流畅的工作流程，使得开发者能够更容易地构建和管理复杂的语言模型应用。\\n\\n这个框架特别适合需要在对话系统、文本分析、信息抽取等领域中使用语言模型的应用。通过 LangChain，开发者可以更加高效地构建具有强大语言理解能力的应用程序。' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 279, 'prompt_tokens': 12, 'total_tokens': 291, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-mini-2024-07-18', 'system_fingerprint': 'fp_efad92c60b', 'id': 'chatcmpl-CKA8hnULuM1tGOEmZ3cD793i6tMit', 'service_tier': None, 'finish_reason': 'stop', 'logprobs': None} id='run--93f28cf7-c0da-4461-876c-dc30482a3223-0' usage_metadata={'input_tokens': 12, 'output_tokens': 279, 'total_tokens': 291, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}}\n"
     ]
    }
   ],
   "execution_count": 22
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "Approach 2:",
   "id": "a4c32c1f76b0b61d"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T21:23:56.060340Z",
     "start_time": "2025-09-26T21:23:54.105543Z"
    }
   },
   "cell_type": "code",
   "source": [
    "# Load the configuration file\n",
    "dotenv.load_dotenv()\n",
    "\n",
    "os.environ[\"OPENAI_API_KEY\"] = os.getenv(\"OPENAI_API_KEY1\")\n",
    "os.environ[\"OPENAI_BASE_URL\"] = os.getenv(\"OPENAI_BASE_URL\")\n",
    "\n",
    "# 1. Get a chat model:\n",
    "chat_model = ChatOpenAI(\n",
    "    # the 3 usually-required parameters can all be omitted here:\n",
    "    # model=\"gpt-4o-mini\",  # defaults to gpt-3.5-turbo\n",
    "    # when base_url and api_key are not passed explicitly, they are read from the environment variables\n",
    ")\n",
    "\n",
    "# 2. Invoke the model\n",
    "response = chat_model.invoke(\"什么是langchain?\")\n",
    "\n",
    "# 3. View the response text\n",
    "# print(response.content)\n",
    "print(response)"
   ],
   "id": "967a14f9f50ca1e4",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "content='Langchain是一个区块链项目，旨在通过区块链技术和智能合约实现多语言之间的即时翻译和交流。该项目旨在消除语言障碍，让各种语言的人们能够轻松地进行沟通和交流。Langchain使用加密货币作为支付手段，用户可以使用该平台进行文字翻译、语音翻译和实时聊天等服务。' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 130, 'prompt_tokens': 14, 'total_tokens': 144, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model_name': 'gpt-3.5-turbo-0125', 'system_fingerprint': 'fp_0165350fbb', 'id': 'chatcmpl-CKACwnW8FGgOTSEZyljsGS1XEiFxi', 'service_tier': None, 'finish_reason': 'stop', 'logprobs': None} id='run--c3801afb-54c2-4d74-b3f8-677458100da8-0' usage_metadata={'input_tokens': 14, 'output_tokens': 130, 'total_tokens': 144, 'input_token_details': {}, 'output_token_details': {}}\n"
     ]
    }
   ],
   "execution_count": 23
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "Exploring other parameters:",
   "id": "1197bb62754bae36"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T21:30:07.346781Z",
     "start_time": "2025-09-26T21:30:05.572923Z"
    }
   },
   "cell_type": "code",
   "source": [
    "# Load the configuration file\n",
    "dotenv.load_dotenv()\n",
    "os.environ[\"OPENAI_API_KEY\"] = os.getenv(\"OPENAI_API_KEY1\")\n",
    "os.environ[\"OPENAI_BASE_URL\"] = os.getenv(\"OPENAI_BASE_URL\")\n",
    "\n",
    "# 1. Get a chat model:\n",
    "chat_model = ChatOpenAI(\n",
    "    temperature=0.7,  # sampling temperature: higher means more random\n",
    "    max_tokens=20,    # hard cap on generated tokens\n",
    ")\n",
    "\n",
    "# 2. Invoke the model\n",
    "response = chat_model.invoke(\"什么是langchain?\")\n",
    "print(response)"
   ],
   "id": "bfe3a80729b104dc",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "content='Langchain是一种基于区块链技术的多语言' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 20, 'prompt_tokens': 14, 'total_tokens': 34, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-3.5-turbo-0125', 'system_fingerprint': None, 'id': 'chatcmpl-CKAIwkh0U9NJZnIjLFo1cx9hfZXxT', 'service_tier': 'default', 'finish_reason': 'length', 'logprobs': None} id='run--aca0a03b-46d8-41f9-a3a0-80413d36c249-0' usage_metadata={'input_tokens': 14, 'output_tokens': 20, 'total_tokens': 34, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}}\n"
     ]
    }
   ],
   "execution_count": 24
  },
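  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "Note how the output above stops mid-sentence: its `finish_reason` is `'length'`, meaning generation hit the `max_tokens=20` cap. A hedged sketch (illustration only) of detecting this on the returned `AIMessage`:",
   "id": "added-finish-reason-note"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "# response_metadata is populated by langchain_openai from the API reply\n",
    "if response.response_metadata.get('finish_reason') == 'length':\n",
    "    print('Output was truncated by max_tokens')"
   ],
   "id": "added-finish-reason-code",
   "outputs": [],
   "execution_count": null
  },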
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "# 4、角度3：使用各个平台的API的调用大模型（了解）\n",
    "\n",
    "## 4.1 OpenAI的方式\n",
    "\n",
    "调用非对话模型"
   ],
   "id": "2775f5bc183c0c25"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T21:37:06.593108Z",
     "start_time": "2025-09-26T21:37:05.388877Z"
    }
   },
   "cell_type": "code",
   "source": [
    "from openai import OpenAI\n",
    "\n",
    "# For security, prefer loading the API key from an environment variable\n",
    "client = OpenAI(\n",
    "    api_key=\"sk-xxx\",  # replace with your own key\n",
    "    base_url=\"https://api.openai-proxy.org/v1\",\n",
    ")\n",
    "\n",
    "# Call the Completions endpoint\n",
    "response = client.completions.create(\n",
    "    model=\"gpt-3.5-turbo-instruct\",  # a non-chat model\n",
    "    prompt=\"请将以下英文翻译成中文：\\n'Artificial intelligence will reshape the future.'\",\n",
    "    max_tokens=100,  # maximum length of the generated text\n",
    "    temperature=0.7  # controls randomness\n",
    ")\n",
    "# Extract the result\n",
    "print(response.choices[0].text.strip())"
   ],
   "id": "60018e93d190bf08",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "人工智能将重塑未来。\n"
     ]
    }
   ],
   "execution_count": 27
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "Calling a chat model",
   "id": "460d9ac44aeb90b6"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T21:40:37.737946Z",
     "start_time": "2025-09-26T21:40:36.064694Z"
    }
   },
   "cell_type": "code",
   "source": [
    "from openai import OpenAI\n",
    "\n",
    "# For security, prefer loading the API key from an environment variable\n",
    "client = OpenAI(\n",
    "    api_key=\"sk-xxx\",  # replace with your own key\n",
    "    base_url=\"https://api.openai-proxy.org/v1\",\n",
    ")\n",
    "\n",
    "# Call the Chat Completions endpoint\n",
    "response = client.chat.completions.create(\n",
    "    model=\"gpt-3.5-turbo\",  # a chat model\n",
    "    messages=[\n",
    "        {\"role\":\"system\",\"content\":\"你是一个乐于助人的智能AI小助手\"},\n",
    "        {\"role\":\"user\",\"content\":\"你好，请你介绍一下你自己\"},\n",
    "    ],\n",
    "    max_tokens=150,  # maximum length of the generated text\n",
    "    temperature=0.5  # controls randomness\n",
    ")\n",
    "# Extract the result\n",
    "print(response.choices[0].message)"
   ],
   "id": "6519779574d95b39",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "ChatCompletionMessage(content='你好，我是一个智能AI助手，可以回答各种问题，提供信息和帮助解决问题。无论是日常生活中的疑问、学习中的困惑，还是工作中的挑战，我都会尽力为你提供支持和帮助。有什么问题可以帮到你呢？', refusal=None, role='assistant', annotations=None, audio=None, function_call=None, tool_calls=None)\n"
     ]
    }
   ],
   "execution_count": 29
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "## 4.2 Baidu Qianfan Platform",
   "id": "f65999392ecd6346"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T21:48:21.744684Z",
     "start_time": "2025-09-26T21:48:19.745525Z"
    }
   },
   "cell_type": "code",
   "source": [
    "from openai import OpenAI\n",
    "\n",
    "client = OpenAI(\n",
    "    api_key=\"bce-v3/xxx\",  # Qianfan bearer token; replace with your own\n",
    "    base_url=\"https://qianfan.baidubce.com/v2\",  # Qianfan endpoint\n",
    "    default_headers={\"appid\": \"app-xxx\"}   # your appid on Qianfan; optional\n",
    ")\n",
    "\n",
    "completion = client.chat.completions.create(\n",
    "    model=\"ernie-4.0-turbo-8k\", # see the model list for preset services; for custom services fill in the API address\n",
    "    messages=[{'role': 'system', 'content': 'You are a helpful assistant.'},\n",
    "              {'role': 'user', 'content': 'Hello！'}]\n",
    ")\n",
    "\n",
    "print(completion.choices[0].message)\n"
   ],
   "id": "b747798421dc5180",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "ChatCompletionMessage(content='你好呀！很高兴见到你，有什么想聊的或者需要帮忙的吗？无论是问题、创意还是闲聊，我都在这里哦~ 😊', refusal=None, role='assistant', annotations=None, audio=None, function_call=None, tool_calls=None)\n"
     ]
    }
   ],
   "execution_count": 33
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "## 4.3 使用阿里云百炼平台\n",
    "\n",
    "方式1：使用OpenAI的方式"
   ],
   "id": "76e5c1fb5606974"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T22:03:35.797275Z",
     "start_time": "2025-09-26T22:02:30.342156Z"
    }
   },
   "cell_type": "code",
   "source": [
    "import os\n",
    "from openai import OpenAI\n",
    "import dotenv\n",
    "\n",
    "dotenv.load_dotenv()\n",
    "print(os.getenv(\"DASHSCOPE_API_KEY\"))\n",
    "print(os.getenv(\"DASHSCOPE_BASE_URL\"))\n",
    "\n",
    "client = OpenAI(\n",
    "    # If no environment variable is configured, replace the next line with your Model Studio API key: api_key=\"sk-xxx\",\n",
    "    api_key=os.getenv(\"DASHSCOPE_API_KEY\"),  # how to get an API key: https://help.aliyun.com/zh/model-studio/developer-reference/get-api-key\n",
    "    base_url=os.getenv(\"DASHSCOPE_BASE_URL\")\n",
    ")\n",
    "\n",
    "completion = client.chat.completions.create(\n",
    "    model=\"deepseek-r1\",  # deepseek-r1 as an example; change the model name as needed\n",
    "    messages=[\n",
    "        {'role': 'user', 'content': '9.9和9.11谁大'}\n",
    "    ]\n",
    ")\n",
    "\n",
    "# The reasoning_content field holds the chain of thought\n",
    "print(\"思考过程：\")\n",
    "print(completion.choices[0].message.reasoning_content)\n",
    "\n",
    "# The content field holds the final answer\n",
    "print(\"最终答案：\")\n",
    "print(completion.choices[0].message.content)"
   ],
   "id": "a471343d8af596e1",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "sk-**** (API key redacted)\n",
      "https://dashscope.aliyuncs.com/compatible-mode/v1\n",
      "思考过程：\n",
      "首先，问题是：“9.9和9.11谁大？”这看起来是在比较两个数字：9.9 和 9.11。\n",
      "\n",
      "我需要理解这些数字。9.9 是九点九，意思是 9 + 9/10。9.11 是九点一一，意思是 9 + 11/100。\n",
      "\n",
      "为了比较它们，我应该将它们转换为相同的单位，比如小数或分数。\n",
      "\n",
      "考虑小数形式：\n",
      "\n",
      "- 9.9 等于 9.90（因为 9.9 = 9.90）\n",
      "\n",
      "- 9.11 就是 9.11\n",
      "\n",
      "现在，比较 9.90 和 9.11。\n",
      "\n",
      "9.90 的十分位是 9，百分位是 0。\n",
      "\n",
      "9.11 的十分位是 1，百分位是 1。\n",
      "\n",
      "由于整数部分相同（都是 9），我比较小数部分。\n",
      "\n",
      "9.90 的小数部分是 0.90\n",
      "\n",
      "9.11 的小数部分是 0.11\n",
      "\n",
      "现在，0.90 与 0.11 比较。显然，0.90 大于 0.11，因为 90 百分之一大于 11 百分之一。\n",
      "\n",
      "0.90 是 90/100，0.11 是 11/100，所以 90/100 > 11/100。\n",
      "\n",
      "因此，9.90 > 9.11，所以 9.9 > 9.11。\n",
      "\n",
      "我也可以将它们视为分数：\n",
      "\n",
      "9.9 = 99/10\n",
      "\n",
      "9.11 = 911/100\n",
      "\n",
      "要比较 99/10 和 911/100，我可以找一个公分母。\n",
      "\n",
      "公分母是 100。\n",
      "\n",
      "99/10 = (99 * 10) / (10 * 10) = 990/100\n",
      "\n",
      "99/10 = 9.9，乘以 10 是 99，但分母是 10，所以乘以 10/10 得到分母 100：\n",
      "\n",
      "99/10 = (99 * 10) / (10 * 10) = 990/100？不对。\n",
      "\n",
      "99/10 = ? 要得到分母 100，分子和分母都乘以 10：\n",
      "\n",
      "(99/10) * (10/10) = (99 * 10) / (10 * 10) = 990 / 100\n",
      "\n",
      "99 * 10 = 990, 10 * 10 = 100, 所以是 990/100。\n",
      "\n",
      "990/100 是 9.90，没错。\n",
      "\n",
      "现在，9.11 = 911/100\n",
      "\n",
      "所以，比较 990/100 和 911/100。\n",
      "\n",
      "由于分母相同，比较分子：990 和 911。\n",
      "\n",
      "990 > 911，所以 990/100 > 911/100，因此 9.9 > 9.11。\n",
      "\n",
      "另一个方式：9.9 是 9 + 0.9，而 9.11 是 9 + 0.11，显然 0.9 > 0.11。\n",
      "\n",
      "0.9 是 9/10，也就是 0.90，而 0.11 是 11/100，即 0.11，所以是的。\n",
      "\n",
      "但问题是以“9.9”和“9.11”的形式写的。9.11 可能被误解为版本号之类的东西，但在数学上下文中，它很可能就是数字九点一一。\n",
      "\n",
      "在版本号中，9.11 可能表示第九次主要更新的第十一次次要更新，但问题中写的是“9.9和9.11”，并且是“谁大”，这暗示是数值比较。\n",
      "\n",
      "此外，在中文中，“9.9”和“9.11”很可能就是数字。\n",
      "\n",
      "但为了确认，在某些语境中，点可能表示小数点，而不是句点。\n",
      "\n",
      "例如，在软件版本中，9.11 可能比 9.9 大，但数值上，9.9 更大。\n",
      "\n",
      "但问题是用数字写的，并且是“谁大”，可能是在数值意义上。\n",
      "\n",
      "回顾一下，用户是用中文写的：“9.9和9.11谁大”，所以很可能是在比较数值。\n",
      "\n",
      "在数值上，9.9 大于 9.11。\n",
      "\n",
      "但让我看看 9.11 是否可能被误解。9.11 是九点一一，而不是九又十一，因为后面没有单位。\n",
      "\n",
      "有时候人们会写 9.11 来表示 9.11，比如在价格中，但数值上是一样的。\n",
      "\n",
      "另一个想法：在日期中，9.11 可能表示九月十一日，但这里是与 9.9 比较，而 9.9 可能表示九月九日，但问题中写的是“9.9”和“9.11”，并且是“谁大”，这可能是在问哪个日期更晚，但日期比较通常不是这样表达的。\n",
      "\n",
      "在中文中，日期通常写作“9月9日”和“9月11日”，而不是“9.9”和“9.11”。\n",
      "\n",
      "这里写作“9.9”和“9.11”，所以很可能就是数字。\n",
      "\n",
      "此外，在上下文中，是“9.9和9.11”，所以我认为是数值比较。\n",
      "\n",
      "所以，数值上，9.9 > 9.11。\n",
      "\n",
      "但为了全面一点，我们考虑一下如果点号是句点，就像在版本号中那样。\n",
      "\n",
      "在软件版本中，版本号通常按数字部分进行比较。\n",
      "\n",
      "例如，版本 9.9 与版本 9.11。\n",
      "\n",
      "9.11 的主要版本是 9，次要版本是 11。\n",
      "\n",
      "9.9 的主要版本是 9，次要版本是 9。\n",
      "\n",
      "由于主要版本相同，我们比较次要版本：9 与 11。\n",
      "\n",
      "11 > 9，所以版本 9.11 比 9.9 更新。\n",
      "\n",
      "但在这个语境下，问题是“谁大”，可能是指哪个版本更大，意思是哪个更新。\n",
      "\n",
      "但问题是用数字写的，并且没有指定是版本号，所以可能不是。\n",
      "\n",
      "在中文中，对于版本，可能会说“版本9.9”和“版本9.11”，但这里只是“9.9和9.11”。\n",
      "\n",
      "此外，用户可能是在测试数值理解。\n",
      "\n",
      "但为了安全起见，我应该两种可能性都考虑。\n",
      "\n",
      "但在这个问题中，是“9.9和9.11谁大”，而在数学问题中，这很可能是数值比较。\n",
      "\n",
      "也许用户是混淆了小数点的概念。\n",
      "\n",
      "例如，有些人可能认为 9.11 比 9.9 大，因为 11 比 9 大，但忽略了小数点后的位数不同。\n",
      "\n",
      "但数值上，这是不正确的。\n",
      "\n",
      "就像 0.9 比 0.11 大一样。\n",
      "\n",
      "另一个例子：9.9 几乎是 10，而 9.11 大约是 9.1，所以 9.9 更大。\n",
      "\n",
      "我认为数值上很清楚。\n",
      "\n",
      "但为了确认，我们看看用户可能的误解。\n",
      "\n",
      "用户可能认为 9.11 是 9.11，而 9.9 是 9.90，但写成了 9.9，所以 9.9 是更短的写法，但数值相同。\n",
      "\n",
      "在比较中，9.9 是 9.90，大于 9.11。\n",
      "\n",
      "如果用户认为 9.11 是九又十一，但那样应该写作 9 11 或类似形式，而不是 9.11。\n",
      "\n",
      "在中文中，小数点是用点表示的，所以 9.11 是九点一一。\n",
      "\n",
      "所以我认为数值比较是题中之意。\n",
      "\n",
      "因此，9.9 更大。\n",
      "\n",
      "但问题是“谁大”，所以是“谁更大”。\n",
      "\n",
      "所以答案应该是 9.9 更大。\n",
      "\n",
      "但让我再检查一下数字：9.9 和 9.11。\n",
      "\n",
      "9.9 = 9.90\n",
      "\n",
      "9.11 = 9.11\n",
      "\n",
      "9.90 - 9.11 = 0.79 > 0，所以 9.9 更大。\n",
      "\n",
      "是的。\n",
      "\n",
      "如果是从字符串角度比较，但问题中写的是“谁大”，可能是指数值大小。\n",
      "\n",
      "在编程中，如果作为字符串比较，\"9.11\" 和 \"9.9\"，由于点号相同，然后比较“9”和“11”，但“11”在字典序中大于“9”，所以“9.11” > “9.9” 作为字符串，但数值上不是这样。\n",
      "\n",
      "但在这个语境下，可能不是这样。\n",
      "\n",
      "我认为对于这个问题，数值比较才是正确的。\n",
      "\n",
      "所以，我会选择 9.9 更大。\n",
      "最终答案：\n",
      "在数值比较中，9.9 和 9.11 都是小数，需要将它们转换为相同位数来比较。\n",
      "\n",
      "- 9.9 可以写作 9.90（因为 9.9 = 9.90）。\n",
      "- 9.11 保持为 9.11。\n",
      "\n",
      "现在比较小数部分：\n",
      "- 9.90 的小数部分是 0.90（即 90/100）。\n",
      "- 9.11 的小数部分是 0.11（即 11/100）。\n",
      "\n",
      "由于 0.90 > 0.11，因此 9.90 > 9.11，即 9.9 > 9.11。\n",
      "\n",
      "**结论：9.9 更大。**\n",
      "\n",
      "### 说明：\n",
      "- 有些人可能误以为 9.11 更大，因为 11 > 9，但这忽略了小数点后的位数差异。实际上，9.9 相当于 9.90，比 9.11 大 0.79。\n",
      "- 如果是在其他上下文（如软件版本号），版本 9.11 可能比 9.9 新（因为次要版本 11 > 9），但这里问题明确是数值比较（“谁大”在中文中通常指数值大小），因此以数值为准。\n"
     ]
    }
   ],
   "execution_count": 3
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "Approach 2: using dashscope",
   "id": "1b3d59a1599dd37f"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T22:08:28.908319Z",
     "start_time": "2025-09-26T22:08:13.496310Z"
    }
   },
   "cell_type": "code",
   "source": [
    "import os\n",
    "import dashscope\n",
    "\n",
    "messages = [\n",
    "    {'role': 'user', 'content': '你是谁？'}\n",
    "]\n",
    "\n",
    "response = dashscope.Generation.call(\n",
    "    # If no environment variable is configured, replace the next line with your Model Studio API key: api_key=\"sk-xxx\",\n",
    "    api_key=os.getenv('DASHSCOPE_API_KEY'),\n",
    "    model=\"deepseek-r1\",  # deepseek-r1 as an example; change the model name as needed\n",
    "    messages=messages,\n",
    "    # result_format must not be set to \"text\" here\n",
    "    result_format='message'\n",
    ")\n",
    "\n",
    "print(\"=\" * 20 + \"思考过程\" + \"=\" * 20)\n",
    "print(response.output.choices[0].message.reasoning_content)\n",
    "print(\"=\" * 20 + \"最终答案\" + \"=\" * 20)\n",
    "print(response.output.choices[0].message.content)"
   ],
   "id": "dc5ae012cc567713",
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/opt/miniconda3/envs/pyth310/lib/python3.10/site-packages/requests/__init__.py:86: RequestsDependencyWarning: Unable to find acceptable character detection dependency (chardet or charset_normalizer).\n",
      "  warnings.warn(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "====================思考过程====================\n",
      "嗯，用户问了一个非常基础但也很关键的问题——“你是谁”。这可能是第一次使用智能助手的用户，或者是在测试我的基础功能。  \n",
      "\n",
      "用户可能带着好奇或试探的心态，想确认我的身份和功能边界。ta没有提供任何背景信息，所以需要我用最清晰简洁的方式介绍自己，同时传递出友好和可信任的感觉。  \n",
      "\n",
      "重点要突出三点：身份（DeepSeek-R1）、开发者（深度求索）、核心能力（文本处理）。用“伙伴”这个词能降低距离感，而“24小时在线”强调可靠性。最后用表情符号和开放性问题收尾，既能活跃气氛，也能引导对话继续。  \n",
      "\n",
      "对了，用户没提具体需求，所以结尾要主动邀请提问，给ta一个开口的契机。\n",
      "====================最终答案====================\n",
      "你好呀！👋我是 **DeepSeek-R1**，是由中国的 **深度求索（DeepSeek）公司**开发的一款智能助手。你可以把我当作一个随时在线、知识丰富、乐于助人的文字聊天伙伴～\n",
      "\n",
      "我能帮你解答各种问题，比如学习辅导、写作润色、编程答疑、工作文档处理、生活建议等等。我支持上传文件（比如PDF、Word、Excel等），还能阅读其中的内容，帮你提取信息或总结重点📄✨\n",
      "\n",
      "目前我还是**纯文字版**的助手，不支持语音和图片识别，但你可以随时打字问我任何问题！而且我是**免费的**，24小时在线，随时等你来聊～😊\n",
      "\n",
      "那现在，有什么我可以帮你的吗？\n"
     ]
    }
   ],
   "execution_count": 4
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "## 4.4 智谱的GLM\n",
    "\n",
    "方式1：使用OpenAI的方式"
   ],
   "id": "982520d03f909392"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T22:13:10.005381Z",
     "start_time": "2025-09-26T22:12:39.425925Z"
    }
   },
   "cell_type": "code",
   "source": [
    "from openai import OpenAI\n",
    "\n",
    "client = OpenAI(\n",
    "    api_key=\"your-zhipu-api-key\",  # replace with your own Zhipu AI key\n",
    "    base_url=\"https://open.bigmodel.cn/api/paas/v4/\"\n",
    ")\n",
    "\n",
    "completion = client.chat.completions.create(\n",
    "    model=\"glm-4.5\",\n",
    "    messages=[\n",
    "        {\"role\": \"system\", \"content\": \"你是一个聪明且富有创造力的小说作家\"},\n",
    "        {\"role\": \"user\", \"content\": \"请你作为童话故事大王，写一篇短篇童话故事\"}\n",
    "    ],\n",
    "    top_p=0.7,\n",
    "    temperature=0.9\n",
    ")\n",
    "\n",
    "print(completion.choices[0].message.content)"
   ],
   "id": "25f3f93ec92ce52",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "## 影子出走记\n",
      ">小艾的影子突然消失了，她哭着跑进森林寻找。\n",
      ">黑袍巫师告诉她，影子被偷走了，除非用最珍贵的东西交换。\n",
      ">小艾翻遍森林，找到会发光的蘑菇、会唱歌的石头，却都不够珍贵。\n",
      ">当她绝望时，发现影子正蹲在盲童小屋外，用自己为他描绘世界。\n",
      ">巫师现身，原来他偷影子只因自己没有影子而孤独。\n",
      ">小艾的影子主动留下陪伴巫师，却教会他：真正的影子来自温暖的心。\n",
      ">从此，小艾的影子能短暂离开她，去照亮更多黑暗的角落。\n",
      "---\n",
      "\n",
      "阳光慷慨地泼洒在小镇的青石板路上，小艾和她的影子在暖融融的光斑里追逐嬉戏。她跳一下，影子也跟着跳一下；她转个圈，影子便在地上画出一个完美的圆。影子是她最亲密的伙伴，沉默却忠诚，永远紧紧相随，像一片温柔的墨色土地，承载着她所有的欢腾与安静。\n",
      "\n",
      "可那天下午，当小艾从午睡中醒来，习惯性地看向地面时，心脏猛地一沉——空荡荡的，只有一片苍白的光。她的影子，那个从不曾离开的伙伴，不见了！小艾慌了，她冲到院子里，在每一寸阳光下疯狂寻找，可影子仿佛被阳光彻底蒸发，无影无踪。恐惧像冰冷的藤蔓缠住了她，眼泪不受控制地涌出来，她抽泣着，冲向了小镇边缘那片幽深静谧的森林。\n",
      "\n",
      "森林深处，光线被浓密的枝叶切割得支离破碎，空气里浮动着潮湿泥土和腐叶的气息。小艾跌跌撞撞地跑着，泪水模糊了视线。突然，前方一棵巨大的、盘根错节的古树后面，缓缓转出一个身影。他裹在一件宽大得几乎拖地的黑袍里，兜帽压得极低，只露出一个苍白得没有血色的下巴。他像一团移动的、凝固的黑暗，无声无息地挡住了小艾的去路。\n",
      "\n",
      "“小姑娘，你在找什么？”一个低沉沙哑，仿佛枯叶摩擦的声音响起。\n",
      "\n",
      "小艾被这突如其来的声音吓得后退一步，但想到失踪的影子，她鼓起勇气，带着哭腔喊道：“我的影子！我的影子不见了！”\n",
      "\n",
      "黑袍巫师发出一声短促、古怪的轻笑，像是叹息，又像是嘲弄。“影子？哦，是的，我知道它在哪里。”他缓缓抬起一只枯瘦的手，指向森林更幽暗的深处，“它被偷走了。被一个……贪婪的家伙偷走了。”\n",
      "\n",
      "“偷走了？”小艾的心沉得更深，眼泪又涌了出来，“那……那怎么办？我要怎样才能把它找回来？”\n",
      "\n",
      "巫师兜帽下的阴影似乎更深了些，他慢悠悠地说：“办法嘛……倒是有一个。用你最珍贵的东西，来交换你的影子。只要你能找到一样东西，珍贵到让那个小偷动心，你的影子就能回到你身边。”\n",
      "\n",
      "“最珍贵的东西？”小艾喃喃重复着，擦干眼泪，眼神重新燃起一丝希望的光芒。她开始在森林里仔细搜寻。她记得妈妈说过，森林里藏着许多宝贝。她拨开低垂的蕨类植物，在潮湿的树根缝隙里，真的发现了几朵小小的蘑菇，它们散发着柔和的、梦幻般的蓝绿色光芒，像坠落的星星碎片。小艾小心翼翼地把它们捧在手心，蘑菇的光映亮了她期待的脸。她找到巫师，献宝似的递过去：“这个！会发光的蘑菇，很珍贵吧？”\n",
      "\n",
      "巫师只瞥了一眼，那蓝绿色的光芒在他深不见底的兜帽下似乎都黯淡了下去。“光？森林里从不缺光。”他冷冷地说，“不够珍贵。”\n",
      "\n",
      "小艾没有气馁。她继续往森林深处走，耳朵贴在冰冷的岩石上倾听。终于，在一处溪流边，她听到了微弱却奇妙的歌声，像是无数细小的风铃在轻摇。她循声挖开湿润的泥土，找到了一块圆润的、布满细密纹路的石头。当她的手指触碰到它时，那空灵的歌声便清晰地流淌出来，带着溪水的清凉和树叶的沙沙声。她再次找到巫师，满怀希望地递上石头：“这个！会唱歌的石头，它的声音多好听！”\n",
      "\n",
      "巫师伸出枯瘦的手指，轻轻碰了碰石头。歌声戛然而止。“声音？”他低语，声音里带着一丝不易察觉的落寞，“森林里也不缺声音。不够珍贵。”\n",
      "\n",
      "蘑菇的光不够，石头的歌也不够。小艾的心一点点沉下去，像灌满了冰冷的溪水。她茫然地坐在一棵倒下的巨大树干上，望着森林深处越来越浓的暮色，绝望像潮水般将她淹没。最珍贵的东西……她还有什么？她掏空了口袋，只有几颗光滑的鹅卵石，一片形状奇特的叶子。她抱着膝盖，把脸埋进去，无声地哭泣起来。影子，她最亲密的伙伴，真的再也回不来了吗？\n",
      "\n",
      "就在这时，一阵极其微弱、断断续续的童声飘了过来，带着一种奇特的、努力模仿的稚嫩：“……看……大树……好高……叶子……绿绿的……风……吹过……沙沙……”\n",
      "\n",
      "小艾猛地抬起头，循着声音，跌跌撞撞地拨开最后一片浓密的灌木丛。眼前豁然开朗，林间空地上，坐落着一间小小的、简陋的木屋。木屋的门虚掩着，那声音正是从里面传出来的。而就在木屋门口的台阶上，蹲着一个熟悉的、墨色的身影——她的影子！\n",
      "\n",
      "小艾几乎要欢呼出声，可下一秒，她愣住了。影子正对着木屋的门，它那模糊的轮廓在昏暗的光线下，正努力地做出各种姿态：它伸出双臂，尽力向上伸展，仿佛在拥抱一棵看不见的巨树；它轻轻晃动身体，模仿着树叶在风中的摇曳；它甚至在地面上笨拙地画着圆圈，又轻轻拂过，试图描绘风的形状。而木屋内，那个断断续续的童声，正努力地、充满惊奇地复述着影子所“描绘”的一切。\n",
      "\n",
      "“……树……好高……叶子……绿绿的……风……沙沙……”\n",
      "\n",
      "小艾的心像被什么东西狠狠撞了一下，又酸又胀。她明白了，木屋里住着一个看不见的孩子。她的影子，没有选择回到她身边，而是留在这里，用自己唯一的方式，为一个生活在黑暗中的孩子，努力“画”出一个充满色彩和声音的世界。那一刻，小艾忘记了寻找，忘记了交换，她只是静静地站在那里，看着自己的影子，看着它笨拙却无比专注的“表演”，泪水再次模糊了视线，但这一次，是滚烫的、带着震撼的暖流。\n",
      "\n",
      "“看来你找到它了。”那个低沉沙哑的声音再次在身后响起。黑袍巫师不知何时已悄然出现在小艾身旁，兜帽下的目光，似乎也正投向那个蹲在木屋门口的影子。\n",
      "\n",
      "小艾猛地回头，愤怒地瞪着他：“是你！是你偷走了我的影子！你为什么要这么做？”\n",
      "\n",
      "巫师沉默了片刻，那宽大的黑袍在微风中轻轻摆动，像一片没有生命的巨大阴影。终于，他缓缓抬起手，指向自己脚下——那里，只有一片被踩过的枯草，没有任何跟随的墨色轮廓。\n",
      "\n",
      "“因为……”他的声音里第一次带上了一丝难以言喻的苦涩和孤寂，“我没有影子。我独自在这森林里，已经太久了。我看着阳光下的世界，看着那些成双成对的身影，只有我，永远是孤零零的一个。我嫉妒，我……孤独。”他顿了顿，声音更低了，“我偷走影子，只是想……想看看有影子陪伴是什么感觉。哪怕……只是偷来的片刻温暖。”\n",
      "\n",
      "小艾看着巫师脚下那片空荡荡的枯草，再看看木屋门口那个正努力为盲童描绘世界的影子，心中的愤怒像潮水般退去，取而代之的是一种复杂的、柔软的怜悯。就在这时，那个蹲在门口的影子，仿佛感应到了什么，突然停止了模仿。它缓缓地、有些笨拙地转过身，面向小艾和巫师的方向。它的轮廓在暮色中微微波动了一下，像是在无声地叹息。\n",
      "\n",
      "然后，它做出了一个让小艾和巫师都意想不到的动作。它没有扑向小艾，而是缓缓地、一步一步地，走向了黑袍巫师。它停在巫师那宽大的黑袍边，然后，轻轻地、依恋地，靠在了巫师的脚边。它那模糊的轮廓，第一次，稳稳地覆盖在了巫师脚下那片空荡荡的枯草之上。\n",
      "\n",
      "巫师僵住了，他枯瘦的身体似乎微微颤抖了一下。他低下头，看着脚边那片终于属于自己的、温暖的墨色，兜帽下的阴影剧烈地起伏着。他伸出手，想要触碰，却又不敢，只是悬在那里，指尖微微发抖。\n",
      "\n",
      "小艾的影子靠在巫师脚边，它的轮廓似乎变得更加清晰、更加柔和了。它没有发出声音，但小艾和巫师都仿佛听到了它的低语，一种无声的、却直抵心灵的声音在森林里回荡：\n",
      "\n",
      "“真正的影子，不是被偷来、被换来的。它来自一颗愿意分享、愿意温暖他人的心。你看，我离开了小艾，却找到了更需要光的地方。你感到的温暖，不是因为我属于你，而是因为……我愿意照亮你的孤独。”\n",
      "\n",
      "巫师悬在空中的手，终于轻轻地、颤抖地，落在了那片墨色的轮廓上。一种陌生的、久违的暖流，从指尖一直蔓延到他冰冷的心底。他抬起头，兜帽下那双深不见底的眼睛，第一次，似乎有了一丝微弱的光芒在闪烁。他看着小艾，又低头看着脚边的影子，声音嘶哑，却带着一种前所未有的松动：“我……明白了。”\n",
      "\n",
      "第二天清晨，当第一缕阳光穿透森林的薄雾，洒在小艾身上时，她惊喜地发现，她的影子回来了！它安静地躺在她的脚边，轮廓清晰，带着一夜安睡后的温顺。小艾蹲下身，轻轻抚摸着地面上的影子，影子也微微晃动了一下，像是在回应她的触碰。\n",
      "\n",
      "更奇妙的事情发生了。当小艾走到木屋前，向里面那个看不见的孩子挥手告别时，她的影子突然从她脚边分离出来，像一片被风吹起的墨色羽毛，轻盈地飘向木屋的门。它停在门槛上，对着屋内，再次开始笨拙却充满爱意地伸展、摇摆、描绘。小艾惊讶地发现，她感觉不到丝毫的分离感，反而心中充满了暖意。\n",
      "\n",
      "黑袍巫师站在不远处一棵古树的阴影下，看着这一幕。他宽大的黑袍依旧，但脚下，那片属于他的影子，正安静地依偎着，轮廓柔和。他抬起头，望向穿透枝叶的阳光，那阳光似乎不再那么刺眼，反而带着一种温柔的暖意。\n",
      "\n",
      "小艾的影子在木屋门口忙碌了一会儿，又像归巢的鸟儿，轻快地滑回到小艾的脚边，重新与她融为一体。小艾笑了，她知道，她的影子再也不会真正离开了。它只是拥有了一种新的、更强大的魔法——它可以在她需要时紧紧相随，也可以在她允许时，短暂地离开，去寻找那些更需要光亮的角落，去温暖那些被孤独冻结的心灵。\n",
      "\n",
      "森林里，阳光穿过叶隙，在铺满落叶的地上投下斑驳的光影。小艾和她的影子手拉着手（影子在地面上伸出小小的轮廓，轻轻碰了碰小艾的手指），一起走在回家的路上。她的影子，此刻看起来，比以往任何时候都更加明亮、更加温暖。因为小艾知道，也相信——最亮的影子，永远来自最温暖的心。\n"
     ]
    }
   ],
   "execution_count": 6
  },
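  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "As an aside, the call above can also be made in streaming mode — a sketch assuming the same OpenAI-compatible endpoint and model name; with `stream=True` the SDK yields incremental chunks instead of one final message.",
   "id": "glm-openai-streaming-sketch-md"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "from openai import OpenAI\n",
    "\n",
    "client = OpenAI(\n",
    "    api_key=\"<your-api-key>\",  # fill in your own API key\n",
    "    base_url=\"https://open.bigmodel.cn/api/paas/v4/\"\n",
    ")\n",
    "\n",
    "# stream=True returns an iterator of incremental chunks\n",
    "stream = client.chat.completions.create(\n",
    "    model=\"glm-4.5\",\n",
    "    messages=[{\"role\": \"user\", \"content\": \"用一句话介绍春天\"}],\n",
    "    stream=True\n",
    ")\n",
    "\n",
    "# Each chunk carries a delta; print tokens as they arrive\n",
    "for chunk in stream:\n",
    "    if chunk.choices and chunk.choices[0].delta.content:\n",
    "        print(chunk.choices[0].delta.content, end=\"\")"
   ],
   "id": "glm-openai-streaming-sketch-code",
   "outputs": [],
   "execution_count": null
  },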
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "Approach 2: call it with LangChain",
   "id": "c5724e3c15e9bb46"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T22:14:42.496345Z",
     "start_time": "2025-09-26T22:14:25.291983Z"
    }
   },
   "cell_type": "code",
   "source": [
    "from langchain_openai import ChatOpenAI\n",
    "from langchain_core.messages import HumanMessage, SystemMessage\n",
    "\n",
    "# Create the chat-model instance\n",
    "llm = ChatOpenAI(\n",
    "    temperature=0.7,\n",
    "    model=\"glm-4.5\",\n",
    "    openai_api_key=\"<your-api-key>\",  # fill in your own API key\n",
    "    openai_api_base=\"https://open.bigmodel.cn/api/paas/v4/\"\n",
    ")\n",
    "\n",
    "# Build the messages\n",
    "messages = [\n",
    "    SystemMessage(content=\"你是一个有用的AI助手\"),\n",
    "    HumanMessage(content=\"请介绍一下人工智能的发展历程\")\n",
    "]\n",
    "\n",
    "# Invoke the model (calling the model object directly is deprecated)\n",
    "response = llm.invoke(messages)\n",
    "print(response.content)"
   ],
   "id": "6b12fc61dfbd82bc",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "人工智能（AI）的发展历程是一部跨越数十年的探索史，融合了计算机科学、数学、神经科学、哲学等多学科的努力。以下是其关键阶段的概述：\n",
      "\n",
      "---\n",
      "\n",
      "### **1. 早期萌芽（1940s–1950s）：理论奠基**\n",
      "- **1943年**：神经科学家麦卡洛克（Warren McCulloch）和数学家皮茨（Walter Pitts）提出**首个人工神经元模型**，模拟人脑神经元的工作原理。\n",
      "- **1950年**：图灵（Alan Turing）发表经典论文《计算机器与智能》，提出著名的**图灵测试**，定义了“机器能否思考”的哲学命题。\n",
      "- **1956年**：**达特茅斯会议**（Dartmouth Workshop）正式确立“人工智能”（Artificial Intelligence）这一术语，标志着AI作为独立学科的诞生。约翰·麦卡锡（John McCarthy）、马文·明斯基（Marvin Minsky）等科学家提出“用机器模拟人类智能行为”的愿景。\n",
      "\n",
      "---\n",
      "\n",
      "### **2. 第一次AI浪潮（1950s–1970s）：乐观与探索**\n",
      "- **逻辑推理与符号主义**：\n",
      "  - **1957年**：纽厄尔（Allen Newell）和司马贺（Herbert Simon）开发**逻辑理论家（Logic Theorist）**，首次证明数学定理，被视为首个AI程序。\n",
      "  - **1965年**：约瑟夫·魏森鲍姆（Joseph Weizenbaum）创建**ELIZA**，通过简单模式匹配模拟心理治疗师，引发对“机器理解能力”的讨论。\n",
      "- **早期机器学习**：\n",
      "  - **1957年**：弗兰克·罗森布拉特（Frank Rosenblatt）发明**感知机（Perceptron）**，首个可训练的神经网络模型，但因无法解决非线性问题（如异或问题）而受限。\n",
      "- **局限性暴露**：\n",
      "  - 1969年，明斯基和派珀特（Seymour Papert）在《感知机》一书中指出单层神经网络的缺陷，导致**神经网络研究进入第一次寒冬**。\n",
      "\n",
      "---\n",
      "\n",
      "### **3. 第一次AI寒冬（1970s–1980s）：瓶颈与反思**\n",
      "- **问题凸显**：\n",
      "  - 计算能力不足：早期计算机无法处理复杂计算。\n",
      "  - 数据稀缺：缺乏大规模训练数据。\n",
      "  - 理论局限：符号主义AI难以应对现实世界的模糊性和不确定性。\n",
      "- **资金削减**：英国（Lighthill报告，1973）和美国（DARPA削减预算）相继减少对AI研究的资助，研究热情降温。\n",
      "\n",
      "---\n",
      "\n",
      "### **4. 第二次AI浪潮（1980s–1990s）：专家系统与知识工程**\n",
      "- **专家系统兴起**：\n",
      "  - **1980年代**：基于规则的**专家系统**（如MYCIN医疗诊断系统、DENDRAL化学分析系统）成为主流，通过“知识库+推理引擎”解决特定领域问题。\n",
      "  - 商业化应用：日本“第五代计算机计划”、美国MCC联盟等推动AI产业落地。\n",
      "- **机器学习复兴**：\n",
      "  - **1986年**：杰弗里·辛顿（Geoffrey Hinton）等人提出**反向传播算法（Backpropagation）**，解决了多层神经网络的训练问题，为深度学习奠定基础。\n",
      "  - **支持向量机（SVM）**、**决策树**等统计学习方法开始流行。\n",
      "- **第二次寒冬（1990年代初）**：\n",
      "  - 专家系统维护成本高、适应性差，市场泡沫破裂，投资再次萎缩。\n",
      "\n",
      "---\n",
      "\n",
      "### **5. 互联网时代与统计学习崛起（1990s–2000s）**\n",
      "- **数据驱动革命**：\n",
      "  - 互联网爆发式增长提供海量数据，推动AI从“规则驱动”转向“数据驱动”。\n",
      "  - **1997年**：IBM的**深蓝（Deep Blue）** 击败国际象棋世界冠军卡斯帕罗夫，展现搜索算法的强大能力。\n",
      "- **机器学习主流化**：\n",
      "  - **2006年**：辛顿提出“深度学习”概念，通过预训练解决深层网络优化难题。\n",
      "  - **2011年**：IBM **Watson** 在智力问答节目《Jeopardy!》中战胜人类冠军，融合自然语言处理与知识图谱技术。\n",
      "\n",
      "---\n",
      "\n",
      "### **6. 深度学习爆发期（2010s至今）：AI的“黄金时代”**\n",
      "- **关键技术突破**：\n",
      "  - **2012年**：AlexNet（基于CNN）在ImageNet图像识别大赛中准确率碾压传统方法，**深度学习革命**正式开启。\n",
      "  - **2014年**：GAN（生成对抗网络，Ian Goodfellow）实现逼真图像生成。\n",
      "  - **2017年**：Transformer架构（Google）提出，彻底改变自然语言处理（NLP）领域。\n",
      "- **里程碑事件**：\n",
      "  - **2016年**：AlphaGo（DeepMind）击败围棋世界冠军李世石，强化学习与深度学习结合引发全球关注。\n",
      "  - **2018–2020年**：BERT、GPT系列模型推动NLP进入“大模型时代”，AI生成内容（AIGC）能力飞跃。\n",
      "  - **2022年**：ChatGPT引爆通用人工智能（AGI）讨论，多模态模型（如GPT-4）实现跨文本、图像、音频的理解与生成。\n",
      "- **应用普及**：\n",
      "  - 自动驾驶（Tesla Waymo）、医疗诊断（影像分析）、智能推荐（抖音/Netflix）、工业机器人等领域全面落地。\n",
      "\n",
      "---\n",
      "\n",
      "### **7. 当前挑战与未来方向**\n",
      "- **核心挑战**：\n",
      "  - **可解释性**：深度学习模型如同“黑箱”，决策过程难以追溯。\n",
      "  - **伦理与安全**：偏见、隐私泄露、深度伪造（Deepfake）、自主武器风险。\n",
      "  - **能耗与成本**：大模型训练需消耗巨额电力和算力资源。\n",
      "  - **通用人工智能（AGI）**：实现跨领域、自我进化的智能仍遥不可及。\n",
      "- **前沿探索**：\n",
      "  - **神经符号AI**：融合深度学习与符号推理，提升可解释性。\n",
      "  - **具身智能（Embodied AI）**：让AI通过物理实体（机器人）感知和交互世界。\n",
      "  - **脑机接口**：直接连接人脑与计算机（如Neuralink）。\n",
      "  - **AI for Science**：加速材料发现、药物研发、气候模拟等科研进程。\n",
      "\n",
      "---\n",
      "\n",
      "### **总结**\n",
      "人工智能从哲学思辨走向实用技术，经历了“逻辑推理→知识工程→统计学习→深度学习”的范式演进。其发展始终受限于**算力、数据、算法**三要素的突破，而当前的大模型浪潮正推动AI从“专用工具”向“通用能力”迈进。未来，AI的进步不仅依赖技术本身，更需要跨学科协作、伦理框架构建与社会共识，以实现“科技向善”的终极目标。\n"
     ]
    }
   ],
   "execution_count": 7
  },
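  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "Following the environment-variable approach recommended earlier (instead of hardcoding the key), the same GLM call can be sketched as below — the variable names `ZHIPUAI_API_KEY` and `ZHIPUAI_BASE_URL` are illustrative assumptions for your own `.env` file, not fixed conventions.",
   "id": "glm-env-var-sketch-md"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "import os\n",
    "\n",
    "import dotenv\n",
    "from langchain_openai import ChatOpenAI\n",
    "\n",
    "# Load api-key / base_url from a .env file instead of hardcoding them;\n",
    "# the variable names below are illustrative, match them to your .env\n",
    "dotenv.load_dotenv()\n",
    "\n",
    "llm = ChatOpenAI(\n",
    "    model=\"glm-4.5\",\n",
    "    openai_api_key=os.getenv(\"ZHIPUAI_API_KEY\"),\n",
    "    openai_api_base=os.getenv(\"ZHIPUAI_BASE_URL\")\n",
    ")\n",
    "\n",
    "response = llm.invoke(\"你好，请简单介绍一下你自己\")\n",
    "print(response.content)"
   ],
   "id": "glm-env-var-sketch-code",
   "outputs": [],
   "execution_count": null
  },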
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "## 4.5 A demo of the SiliconFlow platform",
   "id": "99ce5a4d9441b246"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-09-26T22:17:57.722444Z",
     "start_time": "2025-09-26T22:17:22.473827Z"
    }
   },
   "cell_type": "code",
   "source": [
    "import requests\n",
    "\n",
    "url = \"https://api.siliconflow.cn/v1/chat/completions\"\n",
    "\n",
    "payload = {\n",
    "    \"model\": \"Qwen/QwQ-32B\",\n",
    "    \"messages\": [\n",
    "        {\n",
    "            \"role\": \"user\",\n",
    "            \"content\": \"What opportunities and challenges will the Chinese large model industry face in 2025?\"\n",
    "        }\n",
    "    ]\n",
    "}\n",
    "headers = {\n",
    "    \"Authorization\": \"Bearer <your-api-key>\",  # fill in your own API key\n",
    "    \"Content-Type\": \"application/json\"\n",
    "}\n",
    "\n",
    "response = requests.post(url, json=payload, headers=headers)\n",
    "\n",
    "print(response.json())"
   ],
   "id": "8266ee8e30e8bdad",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'id': '0199881a344fddd18ad1a7e607dff19d', 'object': 'chat.completion', 'created': 1758925042, 'model': 'Qwen/QwQ-32B', 'choices': [{'index': 0, 'message': {'role': 'assistant', 'content': '\\n\\nThe Chinese large model industry, which encompasses large language models (LLMs) and other AI systems, will face significant opportunities and challenges by 2025. Here\\'s a structured analysis:\\n\\n---\\n\\n### **Opportunities for the Chinese Large Model Industry in 2025**\\n\\n#### 1. **Government Support and Strategic Prioritization**\\n   - **Opportunity:** The Chinese government has already made artificial intelligence a strategic priority through initiatives like the \"New Generation AI Development Plan\" (2017). By 2025, additional policy measures and funding could accelerate innovation, including subsidies for R&D, tax incentives for tech firms, and funding for AI talent development.\\n   - **Potential Priority Sectors:** Integration of large models into sectors like smart manufacturing, autonomous vehicles, healthcare, and smart cities under China\\'s \"Digital China\" strategy.\\n\\n#### 2. **Domestic Market Scale and Data Access**\\n   - **Opportunity:** China\\'s vast population (1.4 billion people) and centralized digital ecosystem (e.g., mobile payments, social media, e-commerce) provide海量 data for training models. Policymakers may prioritize developing LLMs that leverage this data to drive domestic productivity and address societal challenges (e.g., regional dialects, operationalizing \"common prosperity\").\\n\\n#### 3. **International Expansion and Localization**\\n   - **Opportunity:** Chinese tech firms could capitalize on demand in regions aligned with China\\'s Belt and Road Initiative (BRI), offering AI solutions in countries with limited local AI expertise. 
Tailoring models to local languages, cultures, and regulatory requirements (e.g., Southeast Asia, Africa) could create a niche.\\n   - **Edge:** Experience managing closed-ecosystem data (e.g., without access to global platforms like Google or Twitter) might make their models adaptable to restricted environments.\\n\\n#### 4. **Vertical Integration Across Industries**\\n   - **Opportunity:** Large models could enhance existing verticals:\\n     - **Healthcare:** Improved diagnostics using LLMs trained on Chinese medical data.\\n     - **Education:** Personalized learning tools addressing China\\'s educational disparities.\\n     - **Manufacturing:** Predictive maintenance and quality control via AI-driven insights.\\n   - **Combination with Emerging Tech:** Deployment of large models alongside IoT, 5G, and quantum computing advancements for real-time decision-making systems.\\n\\n#### 5. **Sustainable Tech Development**\\n   - **Opportunity:** China’s push for green tech might drive energy-efficient AI solutions (e.g., via specialized chips like AliCloud\\'s \"Huang Yi\") and carbon-neutral AI hubs, positioning it as a global leader in eco-friendly AI.\\n\\n#### 6. **Global Collaborative Gaps**\\n   - **Surprising Edge:** Post-2025, fragmented global tech policies (e.g., US export controls, EU AI Act) might limit Western models\\' access to certain markets. Chinese firms could offer alternatives where international collaboration is restricted.\\n\\n---\\n\\n### **Challenges Facing the Chinese Large Model Industry in 2025**\\n\\n#### 1. **Geopolitical and Regulatory Headwinds**\\n   - **Challenge:** Heightened geopolitical tensions could lead to restrictions on Chinese AI technologies abroad (e.g., bans on Huawei, TikTok precedents). 
Domestically, stricter data privacy laws (e.g., updated versions of the Personal Information Protection Law) may curb data availability for training models.\\n   - **Example:** The US Foreign Direct Product Rule (FDPR) restricts chip exports, impacting access to cutting-edge GPUs critical for training large models.\\n\\n#### 2. **Semiconductor Dependency and Chip Embargoes**\\n   - **Challenge:** China’s domestic semiconductor sector (e.g., Semiconductor Manufacturing International Corporation, SMIC) may still lag behind global leaders after years of U.S. tech sanctions. A reliance on foreign chips (e.g., NVIDIA GPUs) could hinder model development unless China achieves self-sufficiency in advanced chip manufacturing.\\n\\n#### 3. **Global Competition and Talent Drain**\\n   - **Competition:** Western models (e.g., OpenAI’s GPT-4 or Google’s Gemini) may dominate non-Chinese markets, squeezing demand for Chinese alternatives. \\n   - **Talent Gaps:** Despite efforts to nurture local talent, China might struggle to retain top researchers if international opportunities remain more attractive.\\n\\n#### 4. **Operational and Technical Challenges**\\n   - **Energy and Costs:** Training large models is energy-intensive. If energy prices rise or carbon credits tighten, operational costs could surge.\\n   - **Technical Maturity:** Overcoming biases inherent in Chinese language training data (e.g., censorship, cultural nuances) while maintaining global competitiveness may prove complex.\\n\\n#### 5. **Ethical and Safety Concerns**\\n   - **Misuse Risks:** Deepfakes, disinformation, and algorithmic bias could prompt stricter regulatory oversight (similar to Russia’s strict AI rules), potentially slowing innovation.\\n   - **Public Trust:** Privacy breaches or failures in chatbots (e.g., leaking sensitive user data) might erode public patience with AI adoption.\\n\\n#### 6. 
**Market Saturation and Sectoral Challenges**\\n   - **Overcompetition:** A crowded market of homegrown models (e.g., Alibaba’s Qwen, Baidu’s Wenxin Yiyan) may lead to cutthroat pricing, reducing profitability.\\n   - **Sector-Specific Barriers:** Healthcare or autonomous vehicle applications face higher regulatory hurdles and safety scrutiny.\\n\\n#### 7. **Fragmented Data and Collaboration Gaps**\\n   - **Data Silos:** Sectoral data fragmentation (e.g., health, finance, transportation) could impede the creation of holistic large models. Limited access to global internet data (due to the Great Firewall) might lead to gaps in multilingual or cross-cultural capabilities compared to global players.\\n\\n---\\n\\n### **Key Takeaways for 2025**\\n- **China’s strength lies in** leveraging domestic scale, government backing, and closed-ecosystem expertise to carve out niche markets globally.\\n- **Critical risks** include chip dependency, geopolitical pushback, and the need to balance rapid innovation with ethical and regulatory compliance.\\n- **Future success hinges on** advancements in autonomous chip production, sustainable AI practices, and tailored solutions for international partners who avoid Western tech.\\n\\nBy 2025, the Chinese large model industry may be a global player but will need to navigate a complex balancing act between native advantages and systemic constraints.', 'reasoning_content': \"Okay, so I need to figure out what opportunities and challenges the Chinese large model industry will face by 2025. Hmm, I know China has been pushing hard in AI and large language models. But I'm not too familiar with all the specifics. Let me start by breaking down the question into parts. \\n\\nFirst, opportunities. What does the Chinese market look like? They have a big tech sector with companies like Alibaba, Tencent, Baidu, and maybe new startups. 
Government support for tech and AI is probably a big opportunity because the Chinese government has invested a lot in AI and wants to be a leader in this field. Maybe there's more funding, or subsidies for AI projects.\\n\\nThere's also the domestic market size. Since China has a huge population, there's a lot of data to train large models. More data can make their models better, which is an opportunity. Also, integrating AI into various sectors like healthcare, education, autonomous vehicles—maybe China will push more for those applications, so the industry can benefit from that.\\n\\nAnother opportunity could be the international market. If Chinese companies can export their large models and AI solutions, especially in regions where China has influence through the Belt and Road Initiative. That might give them a way to expand beyond their borders.\\n\\nNow, for challenges. Regulation is probably a big one. With data privacy concerns worldwide, maybe China has its own regulations under the Cybersecurity Law or the Personal Information Protection Law. Ensuring that their models comply with these laws, especially if they handle sensitive data, could be a challenge. Also, there might be geopolitical issues. If other countries are wary of Chinese tech, they might ban or restrict the use of these models, similar to how some places have blocked Chinese apps before.\\n\\nTechnical challenges? Maybe the actual development of large models is resource-intensive. They need GPUs, which are currently in high demand. Does China have enough semiconductor chip production to support their own needs, or would there be a dependency on other countries, which could be a challenge? Also, there's the issue of model efficiency and energy consumption. Training large models uses a lot of energy, so sustainable practices might be needed, which could be costly.\\n\\nAnother challenge might be competition. Global companies like OpenAI or Google have their own large models (like GPT or PaLM). 
How will Chinese models compete with Western competitors, especially if there are export restrictions or if international markets favor non-Chinese companies?\\n\\nThere's also the aspect of domain-specific expertise in the models. Maybe Chinese models need to have specific linguistic or cultural contexts which require unique development efforts. But then, integrating these in a way that is accurate and not biased could be challenging.\\n\\nI'm also thinking about collaboration. International collaboration in AI has been happening between researchers and companies, but if geopolitical tensions increase, China might be isolated, making it harder to share knowledge or access important research. That could slow down their progress.\\n\\nAnother point is the talent pool. Do they have enough skilled AI researchers and engineers? China has many universities and tech hubs, but maybe attracting top talent or retaining them is an issue, especially if there are better opportunities abroad.\\n\\nEthically, there could be challenges with misuse of AI models, such as deepfakes or misinformation. The government might need to enforce strict regulations to avoid that, which could be tricky to balance innovation with supervision.\\n\\nAlso, economic pressures like recessions or market downturns could affect funding for AI projects. Plus, there's the potential for over-saturation—many models being developed but not enough viable use cases, leading to a market crash or reduced investment.\\n\\nWait, I should structure this better. Maybe list opportunities under points like government support, domestic market, international expansion, applications in various sectors. Challenges could be regulatory issues, geopolitical tensions, competition from other countries, technological dependency, energy costs, talent shortages, ethical concerns, etc.\\n\\nI need to make sure these points are specific to 2025. For example, by then, the industry might have matured a bit. 
Maybe the opportunities could include things like more established AI ecosystems or specific policies by the government. Challenges might be more about sustaining growth as competition increases or as global tech policies become more fragmented.\\n\\nAlso, considering advancements in technology, maybe opportunities include leveraging quantum computing or more advanced chips made in China, which could be a future opportunity if China develops its semiconductor industry. Conversely, challenges could be overcoming the current chip embargo, like from the US, which restricts access to advanced chips unless China makes technical breakthroughs in manufacturing.\\n\\nAnother thing is the potential integration with other technologies like IoT, robotics, or blockchain—creating new applications that large models can drive. This could be an opportunity for synergies. But challenges might be the complexity of managing and securing such interconnected systems.\\n\\nHmm, maybe in healthcare, there's a lot of potential but privacy issues with patient data. Similarly, autonomous vehicles needing large models for decision-making, but ensuring safety and trust in the systems is a big challenge.\\n\\nI should also think about Chinese-localized features, like understanding Chinese dialects, idioms, or considering the unique features of the Chinese internet space (since it's a closed ecosystem in some aspects). 
That could be an opportunity for models tailored to that environment but might also pose challenges in replicating that internationally.\\n\\nIncluding examples of existing models like those from Alibaba (M6, etc.), and Baidu's ERNIE, maybe the challenges relate to how these are scaled and how they can dominate NLP markets.\\n\\nWait, maybe the question is more about the broader industry in China, not specific companies.\\n\\nIn summary, the opportunities would likely be around government backing, access to a large user base/data, expanding into sectors, and international markets. Challenges are technical limitations, geopolitical restrictions, regulatory hurdles, competition with global players, and sustanability both technically and financially.\\n\\nI might need to elaborate each point more clearly, ensuring that the opportunities and challenges are distinct and specific to 2025 timeframe.\\n\"}, 'finish_reason': 'stop'}], 'usage': {'prompt_tokens': 28, 'completion_tokens': 2526, 'total_tokens': 2554, 'completion_tokens_details': {'reasoning_tokens': 1213}}, 'system_fingerprint': ''}\n"
     ]
    }
   ],
   "execution_count": 9
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 2
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython2",
   "version": "2.7.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
