{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Download Ollama"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Enable AutoDL's network acceleration before downloading (skip this step on other platforms):\n",
    "\n",
    "source /etc/network_turbo\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "curl -fsSL https://ollama.com/install.sh | sh"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Configure the Ollama environment"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "### CPU / GPU loading (auto-detected)\n",
     "Add the following lines to /etc/profile (e.g. with vim /etc/profile), then reload the file:\n",
     "\n",
     "export OLLAMA_HOST=\"0.0.0.0:6006\"\n",
     "\n",
     "export OLLAMA_MODELS=/root/autodl-tmp/models\n",
     "\n",
     "source /etc/profile\n",
     "\n",
     "echo $OLLAMA_HOST\n"
   ]
  },
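  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Once the service is running, clients reach the OpenAI-compatible endpoint at the port set in OLLAMA_HOST. A minimal sketch (assuming the host:port form used above) that derives the client base URL from that variable:\n",
    "\n",
    "```python\n",
    "import os\n",
    "\n",
    "# Fall back to the value configured above if the variable is not exported here.\n",
    "host = os.environ.get('OLLAMA_HOST', '0.0.0.0:6006')\n",
    "port = host.rsplit(':', 1)[1]\n",
    "base_url = f'http://localhost:{port}/v1'\n",
    "print(base_url)\n",
    "```\n"
   ]
  },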
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "### GPU loading, single or multiple GPUs\n",
     "\n",
     "Add the following lines to /etc/profile (e.g. with vim /etc/profile), then reload the file:\n",
     "\n",
     "export OLLAMA_HOST=\"0.0.0.0:6006\"\n",
     "\n",
     "export OLLAMA_GPU_LAYER=cuda\n",
     "\n",
     "export OLLAMA_NUM_GPU=2\n",
     "\n",
     "export CUDA_VISIBLE_DEVICES=0,1   # which GPUs Ollama may use\n",
     "\n",
     "export OLLAMA_SCHED_SPREAD=1      # spread the model across all visible GPUs\n",
     "\n",
     "export OLLAMA_KEEP_ALIVE=-1      # keep models loaded in memory indefinitely\n",
     "\n",
     "export OLLAMA_MODELS=/root/autodl-tmp/models\n",
     "\n",
     "source /etc/profile\n",
     "\n",
     "echo $OLLAMA_HOST"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Start the Ollama service"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "ollama serve"
   ]
  },
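  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick way to verify the server is up is Ollama's REST API: GET /api/tags lists the locally available models. A minimal sketch, assuming the port 6006 configured above:\n",
    "\n",
    "```python\n",
    "import json\n",
    "import urllib.request\n",
    "\n",
    "url = 'http://localhost:6006/api/tags'\n",
    "try:\n",
    "    with urllib.request.urlopen(url, timeout=5) as resp:\n",
    "        names = [m['name'] for m in json.load(resp)['models']]\n",
    "        print('serving:', names)\n",
    "except OSError as exc:\n",
    "    print('Ollama is not reachable:', exc)\n",
    "```\n"
   ]
  },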
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Pull a model from the official registry"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "ollama run qwen3:8b   # downloads the model on first use, then opens an interactive prompt"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Calling the local server via the OpenAI SDK"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<think>\n",
       "Okay, the user wants a corny joke (冷笑话, literally a 'cold joke'). First, I need to understand what that is: such jokes are usually built on puns or wordplay, with a humorous but possibly obscure or unexpected punchline, and often require some cultural background or knowledge of specific words to land.\n",
       "\n",
       "Next, consider the user's needs. They may just want a light laugh, or be testing my sense of humor, or need a joke to liven up a social occasion. The user gave no specific scenario, so staying generic matters.\n",
       "\n",
       "Then, make sure the joke fits everyday Chinese usage and avoids anything that could be misread or offensive, for example by using common puns on 'cold' or familiar everyday scenes to keep it relatable.\n",
       "\n",
       "Also, given the conversation history, keep the answer concise and direct with no extra explanation, since the user only wants a joke, not deep analysis. Avoid complex structure so it stays easy to follow.\n",
       "\n",
       "Finally, check that the joke has enough surprise and punning, for instance playing on 'cold' and the genre itself, so it fits the definition and is still fun.\n",
       "\n",
       "In short: build a brief, punny, easy-to-understand corny joke with an unexpected ending that meets the request and lands as humor.\n",
       "</think>\n",
       "\n",
       "Why don't polar bears ever get lost?  \n",
       "Because they have 'North Pole' navigation! ❄️\n"
     ]
    }
   ],
   "source": [
    "import os\n",
    "from openai import OpenAI\n",
    "\n",
     "# client = OpenAI(\n",
     "#     # If no environment variable is configured, replace the next line with your Bailian API key: api_key=\"sk-xxx\",\n",
     "#     api_key=os.getenv(\"DASHSCOPE_API_KEY\"),  # How to get an API key: https://help.aliyun.com/zh/model-studio/developer-reference/get-api-key\n",
     "#     base_url=\"https://dashscope.aliyuncs.com/compatible-mode/v1\",\n",
     "# )\n",
     "\n",
     "client = OpenAI(\n",
     "    api_key=\"na\",  # Ollama ignores the key, but the SDK requires a non-empty value\n",
     "    base_url=\"http://localhost:6006/v1\",  # port must match the OLLAMA_HOST configured above\n",
     ")\n",
    "\n",
    "\n",
    "completion = client.chat.completions.create(\n",
    "    model=\"qwen3:8b\", \n",
    "    messages=[\n",
    "        {'role': 'system', 'content': 'You are a helpful assistant.'},\n",
    "        {'role': 'user', 'content': 'Tell me a corny joke'}\n",
    "        ]\n",
    ")\n",
    "print(completion.choices[0].message.content)"
   ]
  },
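  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The same request can be reproduced without the SDK, which is handy for debugging with curl. A sketch of the raw JSON payload that the /v1/chat/completions endpoint receives (field names follow the OpenAI chat API):\n",
    "\n",
    "```python\n",
    "import json\n",
    "\n",
    "# The request body the SDK builds under the hood for the call above.\n",
    "payload = {\n",
    "    'model': 'qwen3:8b',\n",
    "    'messages': [\n",
    "        {'role': 'system', 'content': 'You are a helpful assistant.'},\n",
    "        {'role': 'user', 'content': 'Tell me a corny joke'},\n",
    "    ],\n",
    "    'stream': False,\n",
    "}\n",
    "print(json.dumps(payload, indent=2))\n",
    "```\n",
    "\n",
    "POSTing the printed JSON to http://localhost:6006/v1/chat/completions with a Content-Type: application/json header should return the same completion.\n"
   ]
  },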
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Deploying Ollama with open-webui"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "open-webui looks for Ollama on its default port, so reset OLLAMA_HOST to 11434 in /etc/profile and reload it:\n",
    "\n",
    "export OLLAMA_HOST=\"0.0.0.0:11434\"\n",
    "\n",
    "source /etc/profile\n",
    "\n",
    "echo $OLLAMA_HOST"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Note: the following steps are NOT needed here\n",
    "export HF_ENDPOINT=https://hf-mirror.com\n",
    "export ENABLE_OLLAMA_API=False\n",
    "export OPENAI_API_BASE_URL=http://127.0.0.1:5000/v1\n",
    "export DEFAULT_MODELS=\"Qwen3-8B\""
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Simply start open-webui"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Run the Ollama server in one terminal (or in the background), then start the web UI:\n",
    "\n",
    "ollama serve\n",
    "\n",
    "open-webui serve --port 6006"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "ame",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.12.0"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
