{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# LLMs\n",
    "\n",
    "Configure the interface for calling a remotely served model"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "from langchain.prompts.chat import ChatPromptTemplate"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "from langserve import RemoteRunnable\n",
    "\n",
    "openai_llm = RemoteRunnable(\"http://localhost:9999/openai/\")\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Create the prompt"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "prompt = ChatPromptTemplate.from_messages(\n",
    "    [\n",
    "        (\n",
    "            \"system\",\n",
    "            \"You are a senior expert in large AI models\",\n",
    "        ),\n",
    "        (\"human\", \"Please explain: what are some commonly used general-purpose open-source large models? List 5 each from China and abroad.\"),\n",
    "    ]\n",
    ").format_messages()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Invoke the LLM endpoint"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "tags": []
   },
   "outputs": [
    {
     "data": {
      "text/plain": "AIMessage(content='当谈论通用开源大模型时，常常会提到自然语言处理（NLP）领域中的预训练模型。以下是中外各自常用的5个通用开源大模型：\\n\\n**国外：**\\n1. BERT（Bidirectional Encoder Representations from Transformers）：由Google开发，是一种基于Transformer架构的预训练模型，用于各种NLP任务。\\n2. GPT-3（Generative Pre-trained Transformer 3）：由OpenAI发布，是一个非常大的语言生成模型，可以用于文本生成等任务。\\n3. RoBERTa（A Robustly Optimized BERT Approach）：由Facebook发布的预训练模型，基于BERT进行了一些优化，用于提高性能。\\n4. T5（Text-to-Text Transfer Transformer）：由Google发布，是一个通用的文本生成模型，可以应用于多种NLP任务。\\n5. XLNet：由谷歌Brain团队发布，是一种自回归预训练模型，结合Transformer-XL和自回归方法。\\n\\n**国内：**\\n1. ERNIE（Enhanced Representation through kNowledge Integration）：由百度发布，是一种基于Transformer架构的多语言预训练模型，融合了知识融合的方法。\\n2. GPT-2（Generative Pre-trained Transformer 2）：由哈工大讯飞联合实验室发布，是一个类似于GPT-3的语言生成模型，用于文本生成等任务。\\n3. HFL/THU Bert：由清华大学自然语言处理与社会人文计算实验室发布，是一个BERT的中文预训练模型，适用于中文NLP任务。\\n4. RoFormer：由华为发布，是一种优化的中文预训练模型，用于中文NLP任务。\\n5. PaddleNLP：由百度发布，是一个NLP模型库，提供了多种预训练模型，包括BERT、ERNIE等，适用于各种NLP任务。\\n\\n以上列举的是一些常用的通用开源大模型，它们在各自领域都有着广泛的应用和影响。', response_metadata={'token_usage': {'completion_tokens': 590, 'prompt_tokens': 61, 'total_tokens': 651}, 'model_name': 'gpt-3.5-turbo', 'system_fingerprint': 'fp_b28b39ffa8', 'finish_reason': 'stop', 'logprobs': None})"
     },
     "execution_count": 4,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "openai_llm.invoke(prompt)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Asynchronous invocation: `await` the result of `ainvoke`"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "tags": []
   },
   "outputs": [
    {
     "data": {
      "text/plain": "AIMessage(content='My favorite novel is \"The Brothers Karamazov\" by Fyodor Dostoevsky. It delves into complex themes of morality, faith, and human nature with profound depth and insight. The characters are richly developed and the narrative is both gripping and thought-provoking.', response_metadata={'token_usage': {'completion_tokens': 60, 'prompt_tokens': 43, 'total_tokens': 103}, 'model_name': 'gpt-3.5-turbo', 'system_fingerprint': 'fp_b28b39ffa8', 'finish_reason': 'stop', 'logprobs': None})"
     },
     "execution_count": 6,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "await openai_llm.ainvoke(prompt)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Streaming is available by default via `stream` / `astream`"
   ]
  }
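  ,
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "# Minimal streaming sketch; assumes the LangServe endpoint configured\n",
    "# above supports streaming (RemoteRunnable exposes astream by default).\n",
    "# Chunks are printed as they arrive instead of waiting for the full reply.\n",
    "async for chunk in openai_llm.astream(prompt):\n",
    "    print(chunk.content, end=\"\", flush=True)"
   ]
  }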
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.1"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
