{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "249de1ab982b1be",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "# 初始介绍 \n",
    "Tiktoken是一个Python库，由OpenAI开发，用于计算一个给定文本字符串在特定的Transformer语言模型（例如GPT-3）中会使用多少个tokens。这个库可以帮助开发者更好地理解他们的文本输入将如何被语言模型处理，特别是在考虑到OpenAI API的使用成本时，这个库可以帮助开发者预估他们的API使用成本。\n",
    "\n",
    "在Transformer模型中，文本被分解成tokens，这些tokens可以是一个字符，一个词，或者一个子词，这取决于模型的训练方式。在GPT-3中，一个token通常是一个字符或者一个子词。由于OpenAI的API计费是基于token数量的，所以理解你的文本输入将使用多少tokens是很重要的。\n",
    "\n",
    "Faiss是一个由Facebook AI Research开发的库，用于高效地处理和搜索大规模的向量数据。在许多AI和机器学习应用中，我们经常需要处理高维度的向量数据，并需要快速地找到与给定向量最相似的向量。例如，在推荐系统中，我们可能需要找到与用户的兴趣向量最相似的商品向量；在自然语言处理中，我们可能需要找到与给定词向量最相似的词向量。这种操作被称为“最近邻搜索”。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "4889b79b698f8778",
   "metadata": {
    "ExecuteTime": {
     "end_time": "2024-05-30T14:48:11.883913Z",
     "start_time": "2024-05-30T14:48:04.212050Z"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Defaulting to user installation because normal site-packages is not writeable\n",
      "Looking in indexes: https://mirrors.aliyun.com/pypi/simple/\n",
      "Requirement already satisfied: openai in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (1.30.5)\n",
      "Requirement already satisfied: sniffio in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from openai) (1.3.1)\n",
      "Requirement already satisfied: httpx<1,>=0.23.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from openai) (0.27.0)\n",
      "Requirement already satisfied: tqdm>4 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from openai) (4.66.4)\n",
      "Requirement already satisfied: anyio<5,>=3.5.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from openai) (4.4.0)\n",
      "Requirement already satisfied: pydantic<3,>=1.9.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from openai) (2.7.2)\n",
      "Requirement already satisfied: distro<2,>=1.7.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from openai) (1.9.0)\n",
      "Requirement already satisfied: typing-extensions<5,>=4.7 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from openai) (4.12.0)\n",
      "Requirement already satisfied: idna>=2.8 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from anyio<5,>=3.5.0->openai) (3.7)\n",
      "Requirement already satisfied: exceptiongroup>=1.0.2 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from anyio<5,>=3.5.0->openai) (1.2.1)\n",
      "Requirement already satisfied: certifi in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from httpx<1,>=0.23.0->openai) (2024.2.2)\n",
      "Requirement already satisfied: httpcore==1.* in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from httpx<1,>=0.23.0->openai) (1.0.5)\n",
      "Requirement already satisfied: h11<0.15,>=0.13 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->openai) (0.14.0)\n",
      "Requirement already satisfied: pydantic-core==2.18.3 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from pydantic<3,>=1.9.0->openai) (2.18.3)\n",
      "Requirement already satisfied: annotated-types>=0.4.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from pydantic<3,>=1.9.0->openai) (0.7.0)\n",
      "\n",
      "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m A new release of pip is available: \u001B[0m\u001B[31;49m23.0.1\u001B[0m\u001B[39;49m -> \u001B[0m\u001B[32;49m24.0\u001B[0m\n",
      "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m To update, run: \u001B[0m\u001B[32;49mpip3.10 install --upgrade pip\u001B[0m\n",
      "Note: you may need to restart the kernel to use updated packages.\n",
      "Defaulting to user installation because normal site-packages is not writeable\n",
      "Looking in indexes: https://mirrors.aliyun.com/pypi/simple/\n",
      "Requirement already satisfied: langchain in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (0.2.1)\n",
      "Requirement already satisfied: SQLAlchemy<3,>=1.4 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (2.0.30)\n",
      "Requirement already satisfied: PyYAML>=5.3 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (6.0.1)\n",
      "Requirement already satisfied: langchain-text-splitters<0.3.0,>=0.2.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (0.2.0)\n",
      "Requirement already satisfied: requests<3,>=2 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (2.32.3)\n",
      "Requirement already satisfied: langchain-core<0.3.0,>=0.2.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (0.2.3)\n",
      "Requirement already satisfied: async-timeout<5.0.0,>=4.0.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (4.0.3)\n",
      "Requirement already satisfied: langsmith<0.2.0,>=0.1.17 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (0.1.67)\n",
      "Requirement already satisfied: pydantic<3,>=1 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (2.7.2)\n",
      "Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (8.3.0)\n",
      "Requirement already satisfied: numpy<2,>=1 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (1.26.4)\n",
      "Requirement already satisfied: aiohttp<4.0.0,>=3.8.3 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain) (3.9.5)\n",
      "Requirement already satisfied: yarl<2.0,>=1.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.9.4)\n",
      "Requirement already satisfied: frozenlist>=1.1.1 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.4.1)\n",
      "Requirement already satisfied: aiosignal>=1.1.2 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.3.1)\n",
      "Requirement already satisfied: attrs>=17.3.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (23.2.0)\n",
      "Requirement already satisfied: multidict<7.0,>=4.5 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (6.0.5)\n",
      "Requirement already satisfied: jsonpatch<2.0,>=1.33 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain-core<0.3.0,>=0.2.0->langchain) (1.33)\n",
      "Requirement already satisfied: packaging<24.0,>=23.2 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langchain-core<0.3.0,>=0.2.0->langchain) (23.2)\n",
      "Requirement already satisfied: orjson<4.0.0,>=3.9.14 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from langsmith<0.2.0,>=0.1.17->langchain) (3.10.3)\n",
      "Requirement already satisfied: typing-extensions>=4.6.1 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from pydantic<3,>=1->langchain) (4.12.0)\n",
      "Requirement already satisfied: annotated-types>=0.4.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from pydantic<3,>=1->langchain) (0.7.0)\n",
      "Requirement already satisfied: pydantic-core==2.18.3 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from pydantic<3,>=1->langchain) (2.18.3)\n",
      "Requirement already satisfied: charset-normalizer<4,>=2 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from requests<3,>=2->langchain) (3.3.2)\n",
      "Requirement already satisfied: urllib3<3,>=1.21.1 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from requests<3,>=2->langchain) (2.2.1)\n",
      "Requirement already satisfied: certifi>=2017.4.17 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from requests<3,>=2->langchain) (2024.2.2)\n",
      "Requirement already satisfied: idna<4,>=2.5 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from requests<3,>=2->langchain) (3.7)\n",
      "Requirement already satisfied: jsonpointer>=1.9 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from jsonpatch<2.0,>=1.33->langchain-core<0.3.0,>=0.2.0->langchain) (2.4)\n",
      "\n",
      "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m A new release of pip is available: \u001B[0m\u001B[31;49m23.0.1\u001B[0m\u001B[39;49m -> \u001B[0m\u001B[32;49m24.0\u001B[0m\n",
      "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m To update, run: \u001B[0m\u001B[32;49mpip3.10 install --upgrade pip\u001B[0m\n",
      "Note: you may need to restart the kernel to use updated packages.\n",
      "Defaulting to user installation because normal site-packages is not writeable\n",
      "Looking in indexes: https://mirrors.aliyun.com/pypi/simple/\n",
      "Requirement already satisfied: faiss-cpu in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (1.8.0)\n",
      "Requirement already satisfied: numpy in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from faiss-cpu) (1.26.4)\n",
      "\n",
      "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m A new release of pip is available: \u001B[0m\u001B[31;49m23.0.1\u001B[0m\u001B[39;49m -> \u001B[0m\u001B[32;49m24.0\u001B[0m\n",
      "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m To update, run: \u001B[0m\u001B[32;49mpip3.10 install --upgrade pip\u001B[0m\n",
      "Note: you may need to restart the kernel to use updated packages.\n",
      "Defaulting to user installation because normal site-packages is not writeable\n",
      "Looking in indexes: https://mirrors.aliyun.com/pypi/simple/\n",
      "Requirement already satisfied: tiktoken in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (0.7.0)\n",
      "Requirement already satisfied: requests>=2.26.0 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from tiktoken) (2.32.3)\n",
      "Requirement already satisfied: regex>=2022.1.18 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from tiktoken) (2024.5.15)\n",
      "Requirement already satisfied: charset-normalizer<4,>=2 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from requests>=2.26.0->tiktoken) (3.3.2)\n",
      "Requirement already satisfied: idna<4,>=2.5 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from requests>=2.26.0->tiktoken) (3.7)\n",
      "Requirement already satisfied: urllib3<3,>=1.21.1 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from requests>=2.26.0->tiktoken) (2.2.1)\n",
      "Requirement already satisfied: certifi>=2017.4.17 in /Users/wenzhixin/Library/Python/3.10/lib/python/site-packages (from requests>=2.26.0->tiktoken) (2024.2.2)\n",
      "\n",
      "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m A new release of pip is available: \u001B[0m\u001B[31;49m23.0.1\u001B[0m\u001B[39;49m -> \u001B[0m\u001B[32;49m24.0\u001B[0m\n",
      "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m To update, run: \u001B[0m\u001B[32;49mpip3.10 install --upgrade pip\u001B[0m\n",
      "Note: you may need to restart the kernel to use updated packages.\n"
     ]
    }
   ],
   "source": [
    "%pip install openai\n",
    "%pip install langchain\n",
    "%pip install faiss-cpu\n",
    "%pip install tiktoken"
   ]
  },
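  {
   "cell_type": "markdown",
   "id": "tiktoken_faiss_demo_md",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "As a minimal sketch of the two libraries introduced above (using the packages installed in the previous cell): `tiktoken` counts the tokens of a string under a chosen encoding, and Faiss answers a tiny nearest-neighbor query against a flat L2 index. The encoding name `cl100k_base`, the dimension `d`, and the random vectors are illustrative choices, not requirements."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "tiktoken_faiss_demo",
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "import tiktoken\n",
    "import numpy as np\n",
    "import faiss\n",
    "\n",
    "# Token counting: look up an encoding and encode a string\n",
    "enc = tiktoken.get_encoding(\"cl100k_base\")\n",
    "print(len(enc.encode(\"LangChain is awesome.\")))\n",
    "\n",
    "# Nearest-neighbor search: index 100 random 64-dimensional vectors,\n",
    "# then find the 5 vectors closest to the first one\n",
    "d = 64\n",
    "xb = np.random.random((100, d)).astype(\"float32\")\n",
    "index = faiss.IndexFlatL2(d)\n",
    "index.add(xb)\n",
    "distances, indices = index.search(xb[:1], 5)\n",
    "print(indices[0])  # the query vector itself comes back first, at distance 0\n"
   ]
  },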
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "initial_id",
   "metadata": {
    "ExecuteTime": {
     "end_time": "2024-05-30T14:48:57.183957Z",
     "start_time": "2024-05-30T14:48:37.834363Z"
    },
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "import os\n",
    "import getpass\n",
    "os.environ[\"OPENAI_API_KEY\"] = getpass.getpass(\"输入openAi-key\")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a64d9ddb7d1fce32",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "# 1.Document Load 文件加载\n",
    "创建了一个Hacker News加载器（HNLoader），并指定了要加载的Hacker News文章的URL\n",
    "打印出前两个评论的前150个字符。\n",
    "\n",
    "Hacker News是一个由Y Combinator创办的社区网站，用户可以在上面分享和讨论计算机科学和创业相关的新闻和资源。它的用户群体主要是技术人员和创业者，因此在Hacker News上可以找到很多高质量的技术和创业相关的内容。\n",
    "\n",
    "Hacker News的文档加载器（HNLoader）是一个专门用来从Hacker News网站上加载数据的工具。它可以从Hacker News的帖子中提取出评论和其他相关信息。这对于需要使用Hacker News数据的研究和开发工作来说是非常有用的。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "id": "df3ebc46c09a4efd",
   "metadata": {
    "ExecuteTime": {
     "end_time": "2024-05-30T14:56:20.751038Z",
     "start_time": "2024-05-30T14:56:15.023166Z"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "find 76 comment\n",
      "here is sample: \n",
      "\n",
      " Ozzie_osman on Jan 18, 2023  \n",
      "             | next [–] \n",
      "\n",
      "LangChain is awesome. For people not sure what it's doing, large language models (LLMs) are veOzzie_osman on Jan 18, 2023  \n",
      "             | parent | next [–] \n",
      "\n",
      "Also, another library to check out is GPT Index (https://github.com/jerryjliu/gpt_ind\n"
     ]
    }
   ],
   "source": [
    "from langchain.document_loaders import HNLoader\n",
    "loader = HNLoader(\"https://news.ycombinator.com/item?id=34422627\")\n",
    "\n",
    "data = loader.load()\n",
    "print(f\"find {len(data)} comment\")\n",
    "\n",
    "print(f\"here is sample: \\n\\n {\n",
    "''.join([x.page_content[:150] for x in data[:2]])\n",
    "}\")\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "7964c873510cc3e7",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "# 2.Text Splitters 文件分割器\n",
    "\n",
    "\n",
    "\n",
    "\"块的大小\"（chunk_size）：这个参数定义了每个分割出来的文本块的大小。这是因为语言模型处理文本时，通常有一个最大的输入长度限制。例如，OpenAI的GPT-3模型的最大输入长度是2048个token。如果你有一个非常长的文档，比如一本书，你不能直接将整本书的内容输入到模型中，因为这会超过模型的最大输入长度。因此，你需要将长文档分割成多个小块，每个小块的长度都不超过模型的最大输入长度。\n",
    "\n",
    "\"块的重叠\"（chunk_overlap）：这个参数定义了相邻的两个文本块之间有多少字符是重叠的。这是因为当你将一个长文档分割成多个小块时，如果没有重叠，那么在分割点的地方可能会丢失一些上下文信息。例如，如果一个句子恰好在两个块的边界上，那么这个句子可能会被切断，导致模型无法理解这个句子的完整含义。通过设置一个合适的重叠长度，可以保证模型在处理每个小块时，都能获取到足够的上下文信息。"
   ]
  },
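  {
   "cell_type": "markdown",
   "id": "chunk_overlap_demo_md",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "The interplay of `chunk_size` and `chunk_overlap` can be sketched in plain Python. This simplified version cuts purely by character count; `RecursiveCharacterTextSplitter` additionally prefers natural boundaries such as paragraphs and sentences. The function name `split_with_overlap` is illustrative, not part of any library."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "chunk_overlap_demo",
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "def split_with_overlap(text, chunk_size, chunk_overlap):\n",
    "    # Each chunk starts chunk_size - chunk_overlap characters after the previous one,\n",
    "    # so consecutive chunks share exactly chunk_overlap characters\n",
    "    step = chunk_size - chunk_overlap\n",
    "    return [text[i:i + chunk_size] for i in range(0, len(text), step)]\n",
    "\n",
    "for chunk in split_with_overlap(\"abcdefghij\", chunk_size=4, chunk_overlap=2):\n",
    "    print(chunk)  # abcd, cdef, efgh, ghij, ij\n"
   ]
  },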
  {
   "cell_type": "code",
   "execution_count": 10,
   "id": "f96aa86f568ac49b",
   "metadata": {
    "ExecuteTime": {
     "end_time": "2024-05-30T15:06:25.158918Z",
     "start_time": "2024-05-30T15:06:25.152617Z"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "doc is 1 characters\n"
     ]
    }
   ],
   "source": [
    "from langchain.text_splitter import RecursiveCharacterTextSplitter\n",
    "\n",
    "with open(\"./file/splitDoc.txt\") as f:\n",
    "    doc = f.read()\n",
    "    \n",
    "print(f\"doc is {len([doc])} characters\")\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "id": "b5deea12cd9d57d7",
   "metadata": {
    "ExecuteTime": {
     "end_time": "2024-05-30T15:06:37.968301Z",
     "start_time": "2024-05-30T15:06:37.961899Z"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "You have 5 documents\n"
     ]
    }
   ],
   "source": [
    "text_spliter = RecursiveCharacterTextSplitter(\n",
    "      #块的大小，一些LLM无法接受过大的文件块，需要对其进行切分\n",
    "    chunk_size=100,\n",
    "    #块与块之间的重叠字符数，保持块与块之间上下文完整性\n",
    "    chunk_overlap=20\n",
    ")\n",
    "documents = text_spliter.create_documents([doc])\n",
    "print (f\"You have {len(documents)} documents\")\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ce7b2ffd60e2d74f",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "# 3.Retrievers 搜索器"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "id": "f68e8768b7207f5b",
   "metadata": {
    "ExecuteTime": {
     "end_time": "2024-05-30T15:43:30.476939Z",
     "start_time": "2024-05-30T15:43:20.889543Z"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "噫吁嚱，危乎高哉！蜀道之难，难于上青天！\n",
      "\n",
      "连。上有六龙回日之高标，下有冲波逆折之回\n"
     ]
    }
   ],
   "source": [
    "from langchain.document_loaders import TextLoader\n",
    "from langchain.text_splitter import RecursiveCharacterTextSplitter\n",
    "from langchain.llms import OpenAI\n",
    "from langchain.vectorstores import FAISS\n",
    "from langchain.embeddings import OpenAIEmbeddings\n",
    "\n",
    "loader = TextLoader(\"./file/splitDoc.txt\")\n",
    "documents = loader.load()\n",
    "\n",
    "# 文本切割器\n",
    "text_spliter = RecursiveCharacterTextSplitter(\n",
    "    chunk_size=100,\n",
    "    chunk_overlap=20\n",
    ")\n",
    "# 文本切成小块\n",
    "texts = text_spliter.split_documents(documents)\n",
    "\n",
    "#选择嵌入模型\n",
    "embedding = OpenAIEmbeddings()\n",
    "# persist_directory = 'db'\n",
    "\n",
    "# 将文本保存到向量数据库\n",
    "db = FAISS.from_documents(texts, embedding)\n",
    "\n",
    "# 文件搜索器\n",
    "retriever = db.as_retriever()\n",
    "\n",
    "docs = retriever.get_relevant_documents(\"天梯\")\n",
    "\n",
    "print(\"\\n\\n\".join([x.page_content[:20] for x in docs[:2]]))\n"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
