{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "2fb7902a-736f-446b-b974-66ca653ad255",
   "metadata": {},
   "source": [
    "# OpenSearch + DeepSeek 实现 RAG 知识库"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "0ea794ca-b5fe-4277-9f15-4a66c8b3cee6",
   "metadata": {},
   "source": [
    "# 传统的 RAG 方案"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "d71dbbef-cf71-4f36-a74c-484eacd968a5",
   "metadata": {},
   "source": [
    "![image-tradition](RAG-Tradition.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "792f9a4d-8453-486d-bbac-55c051881fa4",
   "metadata": {},
   "source": [
    "<br>\n",
    "<br>\n",
    "<br>\n",
    "<br>\n",
    "<br>"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ea45dc0b-07cb-4bcf-a5a7-f309a9c668fa",
   "metadata": {},
   "source": [
    "# OpenSearch 实现 RAG 知识库方案架构"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "086875cc-0697-43b2-bf30-31fefe8c0ce5",
   "metadata": {},
   "source": [
    "![arch](RAG-OpenSearch.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "67b4a094-6001-4f17-a577-ab103fedf44d",
   "metadata": {},
   "source": [
    "# 操作步骤"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "f006640f-3cea-42c4-92e4-88492dcea915",
   "metadata": {},
   "source": [
    "1. 环境准备\n",
    "2. 设置相关权限（IAM Role/IAM Policy/OpenSearch Security）\n",
    "3. 在 OpenSearch 创建 ML Connector(DeepSeek Connector/Embedding Connector)\n",
    "4. 在 OpenSearch 创建 Model(DeepSeek Model/Embedding Model),从 OpenSearch 测试 DeepSeek Model\n",
    "5. 导入知识库\n",
    "6. 在 OpenSearch 创建 Search Pipeline\n",
    "7. 测试问答"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "33874cf7-9cc3-4d8a-ba03-f7da4044c61c",
   "metadata": {},
   "source": [
    "# 1. 环境准备"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a1f06152-b36a-477b-8e7e-30d2916b6fef",
   "metadata": {},
   "source": [
    "## 1.1 前提条件"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "280b26bd-6562-444e-b579-edf42bd2febe",
   "metadata": {},
   "source": [
    "* 预先在 Sagemaker 中部署好两个模型：1. DeepSeek, 2. BGE Embedding\n",
    "* 创建 OpenSearch 集群"
   ]
  },
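  {
   "cell_type": "markdown",
   "id": "c3a9b1d2-5e6f-4a7b-8c9d-0e1f2a3b4c5d",
   "metadata": {},
   "source": [
    "As a quick sanity check (a sketch; the endpoint names here are the ones configured in section 1.3 below and may differ in your account), you can confirm that both SageMaker endpoints are `InService` before proceeding:\n",
    "```python\n",
    "import boto3\n",
    "\n",
    "sm = boto3.client('sagemaker', region_name='cn-north-1')\n",
    "for name in ['DeepSeek-R1-Distill-Qwen-7B-g5-endpoint',\n",
    "             'bge-m3-2025-03-30-12-58-53-929-auto-endpoint']:\n",
    "    status = sm.describe_endpoint(EndpointName=name)['EndpointStatus']\n",
    "    print(name, status)\n",
    "```"
   ]
  },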
  {
   "cell_type": "markdown",
   "id": "98e8966a-d239-4d46-b2c4-7ca443b530f3",
   "metadata": {},
   "source": [
    "## 1.2 初始化"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "29c917c3-3603-4a0c-8cce-fe16b0a9b0e8",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "# 安装依赖\n",
    "!pip install opensearch-py\n",
    "!pip install requests-aws4auth"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "fd5a358c-76cd-48d5-a709-55c19a43201b",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "import boto3\n",
    "import json\n",
    "from botocore.exceptions import ClientError\n",
    "\n",
    "def get_opensearch_credentials(secret_name, region_name=\"cn-north-1\"):\n",
    "    \"\"\"\n",
    "    从AWS Secret Manager获取OpenSearch的凭证\n",
    "    \n",
    "    参数:\n",
    "        secret_name (str): Secret Manager中存储凭证的密钥名称\n",
    "        region_name (str): AWS区域名称，默认为us-east-1\n",
    "        \n",
    "    返回:\n",
    "        dict: 包含用户名和密码的字典\n",
    "    \"\"\"\n",
    "    # 创建Secrets Manager客户端\n",
    "    session = boto3.session.Session()\n",
    "    client = session.client(\n",
    "        service_name='secretsmanager',\n",
    "        region_name=region_name\n",
    "    )\n",
    "    \n",
    "    try:\n",
    "        # 获取密钥\n",
    "        get_secret_value_response = client.get_secret_value(\n",
    "            SecretId=secret_name\n",
    "        )\n",
    "    except ClientError as e:\n",
    "        # 处理可能的错误\n",
    "        if e.response['Error']['Code'] == 'DecryptionFailureException':\n",
    "            raise Exception(\"无法解密密钥值\")\n",
    "        elif e.response['Error']['Code'] == 'InternalServiceErrorException':\n",
    "            raise Exception(\"内部服务错误\")\n",
    "        elif e.response['Error']['Code'] == 'InvalidParameterException':\n",
    "            raise Exception(\"参数无效\")\n",
    "        elif e.response['Error']['Code'] == 'InvalidRequestException':\n",
    "            raise Exception(\"请求无效\")\n",
    "        elif e.response['Error']['Code'] == 'ResourceNotFoundException':\n",
    "            raise Exception(f\"找不到指定的密钥: {secret_name}\")\n",
    "        else:\n",
    "            raise e\n",
    "    else:\n",
    "        # 密钥可能是字符串或二进制，需要处理两种情况\n",
    "        if 'SecretString' in get_secret_value_response:\n",
    "            secret = get_secret_value_response['SecretString']\n",
    "            return json.loads(secret)\n",
    "        else:\n",
    "            # 如果是二进制，需要解码\n",
    "            decoded_binary_secret = base64.b64decode(get_secret_value_response['SecretBinary'])\n",
    "            return json.loads(decoded_binary_secret)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b24b6d22-af38-4a07-9d13-98c340ad92f0",
   "metadata": {},
   "source": [
    "## 1.3 设置环境变量"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "823e5c9b-50fc-45b8-8ca7-d72d6bc7b1dd",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "# 设置环境变量\n",
    "import os\n",
    "\n",
    "os.environ[\"ACCOUNT_ID\"] = \"245788323638\"\n",
    "\n",
    "# 在 Sagemaker 中部署的 DeepSeek 模型名称\n",
    "os.environ[\"LLM_MODEL_NAME\"] = \"DeepSeek-R1-Distill-Qwen-7B-g5-endpoint\"\n",
    "\n",
    "# 在 Sagemaker 中部署的 Embedding 模型名称\n",
    "os.environ[\"EMBEDDING_MODEL_NAME\"] = \"bge-m3-2025-03-30-12-58-53-929-auto-endpoint\"\n",
    "\n",
    "# 部署的 Reggion，cn-north-1 -> BJS 北京 Region\n",
    "os.environ[\"DEEPSEEK_AWS_REGION\"] = \"cn-north-1\"\n",
    "\n",
    "# OpenSearch 的访问配置\n",
    "os.environ[\"OPENSEARCH_SERVICE_DOMAIN_ENDPOINT\"] = \"https://search-deepseek-demo-ajierlgzomqjnbpqggl3qzqqky.cn-north-1.es.amazonaws.com.cn\"\n",
    "os.environ[\"OPENSEARCH_SERVICE_ADMIN_USER\"] = \"admin\"\n",
    "os.environ[\"OPENSEARCH_SERVICE_ADMIN_PASSWORD\"] = get_opensearch_credentials(\"opensearch-credentials\", os.environ[\"DEEPSEEK_AWS_REGION\"]).get(\"password\")\n",
    "\n",
    "# 模型相关\n",
    "os.environ[\"SAGEMAKER_LLM_ENDPOINT_INFERENCE_ARN\"] = f\"arn:aws-cn:sagemaker:{os.environ['DEEPSEEK_AWS_REGION']}:{os.environ['ACCOUNT_ID']}:endpoint/{os.environ['LLM_MODEL_NAME']}\"\n",
    "os.environ[\"SAGEMAKER_EMBEDDING_MODEL_ENDPOINT_INFERENCE_ARN\"] = f\"arn:aws-cn:sagemaker:{os.environ['DEEPSEEK_AWS_REGION']}:{os.environ['ACCOUNT_ID']}:endpoint/{os.environ['EMBEDDING_MODEL_NAME']}\"\n",
    "os.environ[\"SAGEMAKER_MODEL_INFERENCE_ENDPOINT\"] = f\"https://runtime.sagemaker.{os.environ['DEEPSEEK_AWS_REGION']}.amazonaws.com.cn/endpoints/{os.environ['LLM_MODEL_NAME']}/invocations\"\n",
    "os.environ[\"OPENSEARCH_SERVICE_DOMAIN_ARN\"] = f\"arn:aws-cn:es:{os.environ['DEEPSEEK_AWS_REGION']}:{os.environ['ACCOUNT_ID']}:domain/deepseek-demo\"\n",
    "os.environ[\"SAGEMAKER_EMBEDDING_MODEL_INFERENCE_ENDPOINT\"] = f\"https://runtime.sagemaker.{os.environ['DEEPSEEK_AWS_REGION']}.amazonaws.com.cn/endpoints/{os.environ['EMBEDDING_MODEL_NAME']}/invocations\"\n",
    "\n",
    "print('SAGEMAKER_LLM_ENDPOINT_INFERENCE_ARN:' + os.environ[\"SAGEMAKER_LLM_ENDPOINT_INFERENCE_ARN\"])\n",
    "print('SAGEMAKER_EMBEDDING_MODEL_ENDPOINT_INFERENCE_ARN:' + os.environ[\"SAGEMAKER_LLM_ENDPOINT_INFERENCE_ARN\"])\n",
    "print('SAGEMAKER_MODEL_INFERENCE_ENDPOINT:' + os.environ[\"SAGEMAKER_MODEL_INFERENCE_ENDPOINT\"])\n",
    "print('OPENSEARCH_SERVICE_DOMAIN_ARN:' + os.environ[\"OPENSEARCH_SERVICE_DOMAIN_ARN\"])\n",
    "print('SAGEMAKER_EMBEDDING_MODEL_INFERENCE_ENDPOINT:' + os.environ[\"SAGEMAKER_EMBEDDING_MODEL_INFERENCE_ENDPOINT\"])"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "2bea7683-f043-42ca-94c9-643c4b18d56e",
   "metadata": {},
   "source": [
    "# 2.设置相关权限（IAM Role/IAM Policy/OpenSearch Security）"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "6d0b5405-80e4-4fe4-b31c-9baedffb9e8a",
   "metadata": {},
   "source": [
    "## 2.1 创建 IAM 角色 invoke_deepseek_role"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "8d589ec5-53b0-4cbf-91be-bb27be20d8e4",
   "metadata": {},
   "source": [
    "创建角色 invoke_deepseek_role，用于 OpenSearch Connector 远程调用 Sagemaker endpoint"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b584bc61-95a0-4071-b4e3-d85566eee5e4",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "import boto3\n",
    "import json\n",
    "import os\n",
    "\n",
    "\n",
    "# The script will create a role and policy with the names below. It\n",
    "# reads the ARN for the SageMaker endpoint from the environment.\n",
    "invoke_deepseek_policy_name = 'invoke_deepseek_policy'\n",
    "invoke_deepseek_role_name = 'invoke_deepseek_role'\n",
    "sagemaker_llm_model_inference_endpoint = os.environ['SAGEMAKER_LLM_ENDPOINT_INFERENCE_ARN']\n",
    "sagemaker_embedding_model_inference_endpoint = os.environ['SAGEMAKER_EMBEDDING_MODEL_ENDPOINT_INFERENCE_ARN']\n",
    "\n",
    "\n",
    "# Allows invoke endpoint\n",
    "policy = {\n",
    "  \"Version\": \"2012-10-17\",\n",
    "  \"Statement\": [\n",
    "    {\n",
    "      \"Effect\": \"Allow\",\n",
    "      \"Action\": [\n",
    "        \"sagemaker:InvokeEndpoint\"\n",
    "      ],\n",
    "      \"Resource\": [\n",
    "        sagemaker_llm_model_inference_endpoint,\n",
    "        sagemaker_embedding_model_inference_endpoint\n",
    "      ]\n",
    "    }\n",
    "  ]\n",
    "}\n",
    "\n",
    "\n",
    "# Allows OpenSearch Service to assume the role. The role and policy\n",
    "# together allow OpenSearch Service to call SageMaker to invoke\n",
    "# DeepSeek to generate text.\n",
    "trust_relationship = {\n",
    "  \"Version\": \"2012-10-17\",\n",
    "  \"Statement\": [\n",
    "    {\n",
    "      \"Effect\": \"Allow\",\n",
    "      \"Principal\": {\n",
    "        \"Service\": \"es.amazonaws.com\"\n",
    "      },\n",
    "      \"Action\": \"sts:AssumeRole\"\n",
    "    }\n",
    "  ]\n",
    "}\n",
    "\n",
    "\n",
    "# Check for an existing policy with the same name, and error out\n",
    "# if it exists.\n",
    "iam = boto3.client('iam')\n",
    "sts = boto3.client('sts')\n",
    "existing_policy = None\n",
    "try:\n",
    "  # This constructs an ARN for the policy based on the current\n",
    "  # account. If you need to run this for another account, you can \n",
    "  # change the account ID below.\n",
    "  account_id = sts.get_caller_identity()['Account']\n",
    "  policy_arn = f'arn:aws-cn:iam::{account_id}:policy/{invoke_deepseek_policy_name}'\n",
    "  existing_policy = iam.get_policy(PolicyArn=policy_arn)['Policy']\n",
    "  if existing_policy:\n",
    "    raise Exception(f\"Policy {invoke_deepseek_policy_name} already exists. Please set another policy name\")\n",
    "except iam.exceptions.NoSuchEntityException:\n",
    "  pass\n",
    "\n",
    "\n",
    "# Check for an existing policy with the same name, and error out\n",
    "# if it exists.\n",
    "existing_role = None\n",
    "try:\n",
    "  existing_role = iam.get_role(RoleName=invoke_deepseek_role_name)\n",
    "  if existing_role:\n",
    "    raise Exception(f\"Role {invoke_deepseek_role_name} already exists. Please set another role name\")\n",
    "except iam.exceptions.NoSuchEntityException:\n",
    "  pass\n",
    "\n",
    "\n",
    "# Create the policy\n",
    "policy = iam.create_policy(\n",
    "  PolicyName=invoke_deepseek_policy_name,\n",
    "  PolicyDocument=json.dumps(policy)\n",
    ")\n",
    "policy_arn = policy['Policy']['Arn']\n",
    "\n",
    "\n",
    "# Create the role, with the policy document just created.\n",
    "role = iam.create_role(\n",
    "  RoleName=invoke_deepseek_role_name,\n",
    "  AssumeRolePolicyDocument=json.dumps(trust_relationship)\n",
    ")\n",
    "role_arn = role['Role']['Arn']\n",
    "iam.attach_role_policy(\n",
    "  RoleName=invoke_deepseek_role_name,\n",
    "  PolicyArn=policy_arn\n",
    ")\n",
    "\n",
    "print(f'Created policy {policy_arn}')\n",
    "print(f'Created role {role_arn}')\n",
    "\n",
    "print(f'\\nPlease execute the following command\\nos.environ[\"INVOKE_DEEPSEEK_ROLE\"] = \"{role_arn}\"\\n')\n",
    "\n",
    "os.environ[\"INVOKE_DEEPSEEK_ROLE\"] = role_arn"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a82e05bb-3907-475e-839a-869dff518130",
   "metadata": {},
   "source": [
    "## 2.2 创建 IAM 角色 create_deepseek_connector_role"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b79d375d-34ed-47f1-82eb-540a109e361f",
   "metadata": {},
   "source": [
    "创建一个用于操作 OpenSearch 的角色，后续将使用该角色向 OpenSearch 发送创建/删除/导入数据等操作"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "d4ec5e7b-bf6f-423f-933c-be6fa7675573",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "# Copyright 2025 Norris\n",
    "# MIT-0\n",
    "#\n",
    "'''\n",
    "This module constructs an IAM role and policy that enables your account\n",
    "to create an OpenSearch connector in your Amazon OpenSearch Service domain.\n",
    "'''\n",
    "\n",
    "import boto3\n",
    "import json\n",
    "import os\n",
    "\n",
    "\n",
    "# This script will create a role and policy document with \n",
    "# the following names.\n",
    "create_connector_policy_name = 'create_deepseek_connector_policy'\n",
    "create_connector_role_name = 'create_deepseek_connector_role'\n",
    "\n",
    "# Read environment variables for the invoke role, and the domain ARNs.\n",
    "invoke_connector_role_arn = os.environ['INVOKE_DEEPSEEK_ROLE']\n",
    "opensearch_service_domain_arn = os.environ['OPENSEARCH_SERVICE_DOMAIN_ARN']\n",
    "\n",
    "def get_current_role_arn():\n",
    "    sts_client = boto3.client('sts')\n",
    "    try:\n",
    "        # 获取当前身份信息\n",
    "        response = sts_client.get_caller_identity()\n",
    "        \n",
    "        # 返回的 ARN 会是角色的 ARN，如果代码在使用角色的环境中运行\n",
    "        arn = response['Arn']\n",
    "        \n",
    "        # 如果需要提取角色名称\n",
    "        if ':assumed-role/' in arn:\n",
    "            # 从 ARN 中提取角色名称\n",
    "            role_name = arn.split(':assumed-role/')[1].split('/')[0]\n",
    "            # 构建标准的 IAM 角色 ARN\n",
    "            account_id = response['Account']\n",
    "            role_arn = f\"arn:aws-cn:iam::{account_id}:role/service-role/{role_name}\"\n",
    "            return role_arn\n",
    "        \n",
    "        return arn\n",
    "    except Exception as e:\n",
    "        print(f\"获取当前角色 ARN 时出错: {e}\")\n",
    "        return None\n",
    "\n",
    "# 使用示例\n",
    "current_role_arn = get_current_role_arn()\n",
    "\n",
    "\n",
    "# This policy will allow post operations on the OpenSearch Service\n",
    "# domain. It adds a pass role so that OpenSearch can validate the\n",
    "# connector.\n",
    "policy = {\n",
    "  \"Version\": \"2012-10-17\",\n",
    "  \"Statement\": [\n",
    "    {\n",
    "      \"Effect\": \"Allow\",\n",
    "      \"Action\": \"iam:PassRole\",\n",
    "      \"Resource\": invoke_connector_role_arn\n",
    "    },\n",
    "    {\n",
    "      \"Effect\": \"Allow\",\n",
    "      \"Action\": \"es:ESHttpPost\",\n",
    "      \"Resource\": opensearch_service_domain_arn\n",
    "    }\n",
    "  ]\n",
    "}\n",
    "\n",
    "\n",
    "# Pulls the current user ARN from Boto's entity resolution, based\n",
    "# on either aws configure, or environment variables. This role,\n",
    "# with the policy above enables you to call OpenSearch's\n",
    "# create_connector API\n",
    "# current_user_arn = boto3.resource('iam').CurrentUser().arn\n",
    "trust_relationship = {\n",
    "  \"Version\": \"2012-10-17\",\n",
    "  \"Statement\": [\n",
    "    {\n",
    "      \"Effect\": \"Allow\",\n",
    "      \"Principal\": {\n",
    "        \"AWS\": current_role_arn\n",
    "      },\n",
    "      \"Action\": \"sts:AssumeRole\"\n",
    "    }\n",
    "  ]\n",
    "}\n",
    "\n",
    "\n",
    "iam = boto3.client('iam')\n",
    "sts = boto3.client('sts')\n",
    "\n",
    "\n",
    "# First validate that the script won't overwrite an existing role or policy. If you receive\n",
    "# either exception, change the global variable above to give it a different name\n",
    "#\n",
    "# Validate that the script won't overwrite an existing policy.  \n",
    "existing_policy = None\n",
    "try:\n",
    "  account_id = sts.get_caller_identity()['Account']\n",
    "  policy_arn = f'arn:aws-cn:iam::{account_id}:policy/{create_connector_policy_name}'\n",
    "  existing_policy = iam.get_policy(PolicyArn=policy_arn)['Policy']\n",
    "  if existing_policy:\n",
    "    raise Exception(f\"Policy {create_connector_policy_name} already exists. Please set another policy name\")\n",
    "except iam.exceptions.NoSuchEntityException:\n",
    "  # The policy document does not exist. That's the expected result, so there's\n",
    "  # nothing additional to do\n",
    "  pass\n",
    "\n",
    "\n",
    "# Validate that the script won't overwrite an existing role.\n",
    "existing_role = None\n",
    "try:\n",
    "  existing_role = iam.get_role(RoleName=create_connector_role_name)\n",
    "  if existing_role:\n",
    "    raise Exception(f\"Role {create_connector_role_name} already exists. Please set another role name\")\n",
    "except iam.exceptions.NoSuchEntityException:\n",
    "  # The role does not exist. That's the expected result, so there's\n",
    "  # nothing additional to do\n",
    "  pass\n",
    "\n",
    "\n",
    "# Create the policy and role. Note, in actual usage, you should wrap these calls\n",
    "# in try/except blocks and validate the responses. \n",
    "#\n",
    "# Create the policy\n",
    "policy = iam.create_policy(\n",
    "  PolicyName=create_connector_policy_name,\n",
    "  PolicyDocument=json.dumps(policy)\n",
    ")\n",
    "policy_arn = policy['Policy']['Arn']\n",
    "\n",
    "\n",
    "# Create the role\n",
    "role = iam.create_role(\n",
    "  RoleName=create_connector_role_name,\n",
    "  AssumeRolePolicyDocument=json.dumps(trust_relationship)\n",
    ")\n",
    "role_arn = role['Role']['Arn']\n",
    "iam.attach_role_policy(\n",
    "  RoleName=create_connector_role_name,\n",
    "  PolicyArn=policy_arn\n",
    ")\n",
    "\n",
    "print(f'Created policy {policy_arn}')\n",
    "print(f'Created role {role_arn}')\n",
    "# print(f'\\nPlease execute the following command\\nos.environ[\"CREATE_DEEPSEEK_CONNECTOR_ROLE\"] = \"{role_arn}\"\\n')\n",
    "\n",
    "os.environ[\"CREATE_DEEPSEEK_CONNECTOR_ROLE\"] = role_arn"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "d5da0e03-946b-4fd8-baef-ea8a92f5285e",
   "metadata": {},
   "source": [
    "## 2.3 配置 OpenSearch 中的安全策略"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a0b8c91c-d295-4524-8708-e7ffd912b04b",
   "metadata": {},
   "source": [
    "将 create_deepseek_connector_role 添加到 OpenSearch 的安全策略中"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "166ca38b-7217-4475-8772-e7b9ff5b9db6",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "import boto3\n",
    "from opensearchpy import OpenSearch\n",
    "import os\n",
    "\n",
    "\n",
    "# Read the configuration from environment variables.\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "create_deepseek_connector_role = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "lambda_invoke_ml_commons_role_name = 'LambdaInvokeOpenSearchMLCommonsRole'\n",
    "opensearch_port = 443\n",
    "\n",
    "\n",
    "# Ensure the endpoint matches the contract for the opensearch-py client. Endpoints\n",
    "# are specified without the leading URL scheme or trailing slashes.\n",
    "if opensearch_service_api_endpoint.startswith('https://'):\n",
    "  opensearch_service_api_endpoint = opensearch_service_api_endpoint[len('https://'):]\n",
    "if opensearch_service_api_endpoint.endswith('/'):\n",
    "  opensearch_service_api_endpoint = opensearch_service_api_endpoint[:-1]\n",
    "\n",
    "\n",
    "# Construct the backend roles. OpenSearch's fine-grained access control will detect\n",
    "# signed traffic and map these entities to the ml_full_access role.\n",
    "sts = boto3.client('sts')\n",
    "account_id = sts.get_caller_identity()['Account']\n",
    "lambda_invoke_ml_commons_role_arn = f'arn:aws-cn:iam::{account_id}:role/{lambda_invoke_ml_commons_role_name}'\n",
    "role_mapping = {\n",
    "  \"backend_roles\": [create_deepseek_connector_role,\n",
    "                    lambda_invoke_ml_commons_role_arn]\n",
    "}\n",
    "\n",
    "hosts = [{\"host\": opensearch_service_api_endpoint, \"port\": opensearch_port}]\n",
    "client = OpenSearch(\n",
    "    hosts=hosts,\n",
    "    http_auth=(opensearch_user_name, opensearch_user_password),\n",
    "    use_ssl=True,\n",
    "    verify_certs=False,\n",
    "    ssl_assert_hostname=False,\n",
    "    ssl_show_warn=False,\n",
    ")\n",
    "# client.security.create_role_mapping('ml_full_access', body=role_mapping)\n",
    "\n",
    "# 使用 transport.perform_request 方法直接调用 API\n",
    "client.transport.perform_request(\n",
    "    'PUT',\n",
    "    '/_plugins/_security/api/rolesmapping/ml_full_access',\n",
    "    body=role_mapping\n",
    ")\n",
    "\n",
    "# 获取角色映射\n",
    "response = client.transport.perform_request(\n",
    "    'GET',\n",
    "    '/_plugins/_security/api/rolesmapping/ml_full_access'\n",
    ")\n",
    "print(f'ml_full_access role mapping is now {response}')\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a72ae126-12c5-46d6-80ac-4c27d68f425f",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "source": [
    "#### 更新 trusted_connector_endpoints_regex 的正则表达式\n",
    "需要在 **OpenSearch Dev tools** 中 执行以下命令，重新设置 ml_connector url 正则表达式规则，来支持亚马逊云科技中国区的URL。\n",
    "```PUT _cluster/settings\n",
    "PUT _cluster/settings\n",
    "{\n",
    "  \"persistent\": {\n",
    "    \"plugins.ml_commons.trusted_connector_endpoints_regex\": [\n",
    "      \"https://runtime\\\\.sagemaker\\\\.cn-north-1\\\\.amazonaws\\\\.com\\\\.cn/endpoints/.*\"\n",
    "    ]\n",
    "  }\n",
    "}\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "dd5ac452-22bc-4db4-88cb-471f6db07251",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "# Copyright 2025 Norris\n",
    "# MIT-0\n",
    "#\n",
    "# 更新 trusted_connector_endpoints_regex 的正则表达式\n",
    "\n",
    "import os\n",
    "import requests\n",
    "\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "\n",
    "# Set up user/password auth\n",
    "userauth = (opensearch_user_name, opensearch_user_password)\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "\n",
    "########################################################################################\n",
    "# update endpoints regex\n",
    "path = f'/_cluster/settings'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "payload = {\n",
    "  \"persistent\": {\n",
    "    \"plugins.ml_commons.trusted_connector_endpoints_regex\": [\n",
    "      \"https://runtime\\\\.sagemaker\\\\.cn-north-1\\\\.amazonaws\\\\.com\\\\.cn/endpoints/.*\"\n",
    "    ]\n",
    "  }\n",
    "}\n",
    "r = requests.put(url, auth=userauth, json=payload, headers=headers)\n",
    "\n",
    "reponse = r.json()\n",
    "print(f'reponse: {reponse}')"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e79a5703-ca8b-4d5b-82bd-2f37c32fef4c",
   "metadata": {},
   "source": [
    "# 3. 创建 ML Connector"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ec6a5e9e-e38d-4e4a-81cf-469fcd8f6c37",
   "metadata": {},
   "source": [
    "## 3.1 创建 DeepSeek Model Connector"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "60991378-e16b-496f-b470-c09d3efe3526",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "import boto3\n",
    "import json\n",
    "import os\n",
    "import requests\n",
    "from requests_aws4auth import AWS4Auth\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "invoke_role_arn = os.environ['INVOKE_DEEPSEEK_ROLE']\n",
    "create_deepseek_connector_role_arn = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "sagemaker_endpoint_url = os.environ['SAGEMAKER_MODEL_INFERENCE_ENDPOINT']\n",
    "\n",
    "\n",
    "# Create the AWS4Auth object that will sign the create connector API call. \n",
    "credentials = boto3.client('sts').assume_role(\n",
    "  RoleArn=create_deepseek_connector_role_arn,\n",
    "  RoleSessionName='create_connector_session'\n",
    ")['Credentials']\n",
    "awsauth = AWS4Auth(credentials['AccessKeyId'],\n",
    "                   credentials['SecretAccessKey'],\n",
    "                   region,\n",
    "                   'es',\n",
    "                   session_token=credentials['SessionToken'])\n",
    "\n",
    "\n",
    "# Prepare the API call parameters. \n",
    "path = '/_plugins/_ml/connectors/_create'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "# See the documentation \n",
    "# https://opensearch.org/docs/latest/ml-commons-plugin/remote-models/blueprints/ for \n",
    "# details on the connector payload, and additional blueprints for other models.\n",
    "payload = {\n",
    "  \"name\": \"DeepSeek R1 model connector v2\",\n",
    "  \"description\": \"Connector for my Sagemaker DeepSeek model\",\n",
    "  \"version\": \"1.0\",\n",
    "  \"protocol\": \"aws_sigv4\",\n",
    "  \"credential\": {\n",
    "    \"roleArn\": invoke_role_arn\n",
    "  },\n",
    "  \"parameters\": {\n",
    "    \"service_name\": \"sagemaker\",\n",
    "    \"region\": region,\n",
    "    \"do_sample\": True,\n",
    "    \"top_p\": 0.7,\n",
    "    \"temperature\": 0.5,\n",
    "    \"max_tokens\": 1024\n",
    "  },\n",
    "  \"actions\": [\n",
    "    {\n",
    "      \"action_type\": \"PREDICT\",\n",
    "      \"method\": \"POST\",\n",
    "      \"url\": sagemaker_endpoint_url,\n",
    "      \"headers\": {\n",
    "        \"content-type\": \"application/json\"\n",
    "      },\n",
    "      \"request_body\": \"\"\"{ \"prompt\": \"${parameters.inputs}\", \"temperature\": ${parameters.temperature}, \"top_p\": ${parameters.top_p}, \"max_tokens\": ${parameters.max_tokens} }\"\"\",\n",
    "      \"post_process_function\": \"\"\"\n",
    "            try {\n",
    "            // 从响应中提取生成的文本\n",
    "            def completion = \"\";\n",
    "            \n",
    "            // 检查是否存在 choices 数组\n",
    "            if (params.choices.size() > 0) {\n",
    "              // 获取第一个选择的文本\n",
    "              completion = params.choices[0].text;\n",
    "              \n",
    "            } else {\n",
    "              completion = \"无法从模型响应中提取文本\";\n",
    "            }\n",
    "            \n",
    "            \n",
    "            def json = \"{\" +\n",
    "                             \"\\\\\"name\\\\\": \\\\\"response\\\\\",\"+\n",
    "                            \"\\\\\"dataAsMap\\\\\": {\" +\n",
    "                             \"\\\\\"completion\\\\\":\\\\\"\" + escape(completion) + \"\\\\\"}\" +\n",
    "                             \"}\";\n",
    "            return json;\n",
    "            \n",
    "        \n",
    "          } catch (Exception e) {\n",
    "            // 返回错误信息\n",
    "            return '{' +\n",
    "                     '\"name\": \"response\",' +\n",
    "                     '\"dataAsMap\": {' +\n",
    "                        '\"completion\": \"处理错误: ' + escape(e) + '\"' +\n",
    "                     '}' +\n",
    "                   '}';\n",
    "          }\n",
    "        \"\"\"\n",
    "    }\n",
    "  ]\n",
    "}\n",
    "\n",
    "# This ignores errors and doesn't check the result. In real use,\n",
    "# you should wrap this code with try/except blocks and check the\n",
    "# response status code and the response body for errors.\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "r = requests.post(url, auth=awsauth, json=payload, headers=headers)\n",
    "print(r.text)\n",
    "connector_id = json.loads(r.text)['connector_id']\n",
    "\n",
    "\n",
    "# print(connector_id)\n",
    "print(f'DEEPSEEK_CONNECTOR_ID=\"{connector_id}\"\\n')\n",
    "os.environ[\"DEEPSEEK_CONNECTOR_ID\"] = connector_id"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "0261986f-35b2-44e6-b930-b79729247c5e",
   "metadata": {},
   "source": [
    "## 3.2 创建 BGE Emebedding Model Connector"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "1f5fbe2e-586b-4302-a4b5-aba9085c6cb0",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "import boto3\n",
    "import json\n",
    "import os\n",
    "import requests \n",
    "from requests_aws4auth import AWS4Auth\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "invoke_role_arn = os.environ['INVOKE_DEEPSEEK_ROLE']\n",
    "create_deepseek_connector_role_arn = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "sagemaker_endpoint_url = os.environ['SAGEMAKER_EMBEDDING_MODEL_INFERENCE_ENDPOINT']\n",
    "\n",
    "\n",
    "# Create the AWS4Auth object that will sign the create connector API call. \n",
    "credentials = boto3.client('sts').assume_role(\n",
    "    RoleArn=create_deepseek_connector_role_arn,\n",
    "    RoleSessionName='create_connector_session'\n",
    ")['Credentials']\n",
    "awsauth = AWS4Auth(credentials['AccessKeyId'], \n",
    "                   credentials['SecretAccessKey'], \n",
    "                   region, \n",
    "                   'es', \n",
    "                   session_token=credentials['SessionToken'])\n",
    "\n",
    "\n",
    "# Prepare the API call parameters. \n",
    "path = '/_plugins/_ml/connectors/_create'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "# See the documentation \n",
    "# https://opensearch.org/docs/latest/ml-commons-plugin/remote-models/blueprints/ for \n",
    "# details on the connector payload, and additional blueprints for other models.\n",
    "payload = {\n",
    "  \"name\": \"BEG Embedding model connector\",\n",
    "  \"description\": \"Connector for Sagemaker BEG model\",\n",
    "  \"version\": \"1.0\",\n",
    "  \"protocol\": \"aws_sigv4\",\n",
    "  \"credential\": {\n",
    "    \"roleArn\": invoke_role_arn\n",
    "  },\n",
    "  \"parameters\": {\n",
    "    \"service_name\": \"sagemaker\",\n",
    "    \"region\": region\n",
    "  },\n",
    "    \"actions\": [\n",
    "        {\n",
    "            \"action_type\": \"PREDICT\",\n",
    "            \"method\": \"POST\",\n",
    "            \"url\": sagemaker_endpoint_url,\n",
    "            \"headers\": {\n",
    "                \"content-type\": \"application/json\"\n",
    "            },\n",
    "            \"request_body\": \"{ \\\"input\\\": \\\"${parameters.input}\\\"}\",\n",
    "            \"pre_process_function\": \"\"\"\n",
    "                StringBuilder builder = new StringBuilder();\n",
    "                builder.append(\"\\\\\"\");\n",
    "                builder.append(params.text_docs[0]);\n",
    "                builder.append(\"\\\\\"\");\n",
    "                def parameters = \"{\" +\"\\\\\"input\\\\\":\" + builder + \"}\";\n",
    "                return \"{\" +\"\\\\\"parameters\\\\\":\" + parameters + \"}\";\n",
    "                \"\"\",\n",
    "            \"post_process_function\": \"\"\"\n",
    "                  def name = \"sentence_embedding\";\n",
    "                  def dataType = \"FLOAT32\";\n",
    "                  if (params.data[0].embedding == null || params.data[0].embedding.length == 0) {\n",
    "                    return params.message;\n",
    "                  }\n",
    "                  def shape = [params.data[0].embedding.length];\n",
    "                  def json = \"{\" +\n",
    "                             \"\\\\\"name\\\\\":\\\\\"\" + name + \"\\\\\",\" +\n",
    "                             \"\\\\\"data_type\\\\\":\\\\\"\" + dataType + \"\\\\\",\" +\n",
    "                             \"\\\\\"shape\\\\\":\" + shape + \",\" +\n",
    "                             \"\\\\\"data\\\\\":\" + params.data[0].embedding +\n",
    "                             \"}\";\n",
    "                  return json;\n",
    "                \"\"\"\n",
    "        }\n",
    "    ]\n",
    "}\n",
    "\n",
    "# This ignores errors and doesn't check the result. In real use,\n",
    "# you should wrap this code with try/except blocks and check the\n",
    "# response status code and the response body for errors.\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "r = requests.post(url, auth=awsauth, json=payload, headers=headers)\n",
    "print(r.text)\n",
    "connector_id = r.json()['connector_id']\n",
    "\n",
    "\n",
    "print(connector_id)\n",
    "print(f'EMBEDDING_CONNECTOR_ID=\"{connector_id}\"\\n')\n",
    "os.environ[\"EMBEDDING_CONNECTOR_ID\"] = connector_id"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1f7822d6-94dd-4e5d-8075-8ba564b929db",
   "metadata": {},
   "source": [
    "# 4. Create ML Models"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "172dc93c-c76a-40c8-bf22-279f52ea1733",
   "metadata": {},
   "source": [
    "## 4.1 Register the DeepSeek Model"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "0d53aa76-9a55-4293-8a46-b2f58301faeb",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "import os\n",
    "import requests\n",
    "\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "connector_id = os.environ['DEEPSEEK_CONNECTOR_ID']\n",
    "create_deepseek_connector_role = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "\n",
    "\n",
    "# Set up user/password auth\n",
    "userauth = (opensearch_user_name, opensearch_user_password)\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "\n",
    "\n",
    "########################################################################################\n",
    "# Register the model\n",
    "path = '/_plugins/_ml/models/_register'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "payload = {\n",
    "  \"name\": \"Sagemaker DeepSeek R1 model\",\n",
    "  \"function_name\": \"remote\",\n",
    "  \"description\": \"DeepSeek R1 model on Sagemaker\",\n",
    "  \"connector_id\": connector_id\n",
    "}\n",
    "r = requests.post(url, auth=userauth, json=payload, headers=headers)\n",
    "\n",
    "model_id = r.json()['model_id']\n",
    "print(f'model_id: {model_id}')\n",
    "\n",
    "\n",
    "########################################################################################\n",
    "# Deploy the model\n",
    "path = f'/_plugins/_ml/models/{model_id}/_deploy'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "r = requests.post(url, auth=userauth, headers=headers)\n",
    "\n",
    "\n",
    "print(f'DEEPSEEK_MODEL_ID=\"{model_id}\"\\n')\n",
    "os.environ[\"DEEPSEEK_MODEL_ID\"] = model_id"
   ]
  },
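  {
   "cell_type": "markdown",
   "id": "3c1f8a2e-7b4d-4e9a-9c10-5d6e7f8a9b0c",
   "metadata": {},
   "source": [
    "The register and deploy calls above do not check the HTTP response. A minimal sketch of that error handling; the `check_response` helper is our own naming for illustration, not an OpenSearch client API."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "4d2e9b3f-8c5e-4f0b-8d21-6e7f8a9b0c1d",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Hypothetical helper (our own naming): raise with details when an\n",
    "# OpenSearch REST call fails, otherwise return the parsed JSON body.\n",
    "def check_response(r):\n",
    "    if r.status_code >= 300:\n",
    "        raise RuntimeError(f'OpenSearch API call failed ({r.status_code}): {r.text}')\n",
    "    return r.json()\n",
    "\n",
    "\n",
    "# Example with the register call above:\n",
    "# model_id = check_response(requests.post(url, auth=userauth, json=payload, headers=headers))['model_id']"
   ]
  },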
  {
   "cell_type": "markdown",
   "id": "583d97f8-d5fe-410b-a2cf-21c65bf94f36",
   "metadata": {},
   "source": [
    "## 4.2 Register the BGE Embedding Model"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "8b8caf0d-981b-442d-8599-4a122714ae24",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "import os\n",
    "import requests\n",
    "\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "connector_id = os.environ['EMBEDDING_CONNECTOR_ID']\n",
    "create_deepseek_connector_role = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "\n",
    "\n",
    "# Set up user/password auth\n",
    "userauth = (opensearch_user_name, opensearch_user_password)\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "\n",
    "\n",
    "########################################################################################\n",
    "# Register the model\n",
    "path = '/_plugins/_ml/models/_register'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "payload = {\n",
    "  \"name\": \"Sagemaker BGE m3 model\",\n",
    "  \"function_name\": \"remote\",\n",
    "  \"description\": \"BGE m3 model on Sagemaker\",\n",
    "  \"connector_id\": connector_id\n",
    "}\n",
    "r = requests.post(url, auth=userauth, json=payload, headers=headers)\n",
    "\n",
    "model_id = r.json()['model_id']\n",
    "print(f'model_id: {model_id}')\n",
    "\n",
    "\n",
    "########################################################################################\n",
    "# Deploy the model\n",
    "path = f'/_plugins/_ml/models/{model_id}/_deploy'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "r = requests.post(url, auth=userauth, headers=headers)\n",
    "\n",
    "\n",
    "print(f'EMBEDDING_MODEL_ID=\"{model_id}\"\\n')\n",
    "os.environ[\"EMBEDDING_MODEL_ID\"] = model_id"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "4788cc66-6e04-4946-83e2-75e87a50aca7",
   "metadata": {},
   "source": [
    "#### Show Model IDs and Connector IDs"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "70427b75-57d2-4bcb-b0e0-d8e8c66cecd7",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "print(\"#####  LLM - DEEPSEEK #####\")\n",
    "print(\"DEEPSEEK_CONNECTOR_ID:\" + os.environ[\"DEEPSEEK_CONNECTOR_ID\"])\n",
    "print(\"DEEPSEEK_MODEL_ID:\" + os.environ[\"DEEPSEEK_MODEL_ID\"])\n",
    "print(\"\")\n",
    "print(\"#####  EMBEDDING MODEL - BGE #####\")\n",
    "print(\"EMBEDDING_CONNECTOR_ID:\" + os.environ[\"EMBEDDING_CONNECTOR_ID\"])\n",
    "print(\"EMBEDDING_MODEL_ID:\" + os.environ[\"EMBEDDING_MODEL_ID\"])\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "507fb621-b76c-4ae5-acb1-3951689bd30a",
   "metadata": {},
   "source": [
    "## 4.3 Test the Model"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "37e245cf-5fe0-4957-a431-2e943ebb960e",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "source": [
    "Run the following command in Dev Tools in OpenSearch Dashboards:\n",
    "```shell\n",
    "POST _plugins/_ml/models/<model-id>/_predict\n",
    "{  \n",
    "   \"parameters\": {    \n",
    "      \"inputs\": \"OpenSearch Serverless 是什么，和OpenSearch集群模式有什么区别，使用 OpenSearch Serverless，还需要管理服务器资源么？\"\n",
    "   }\n",
    "}\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "32611e0e-6ea5-4516-bc44-c7efb8d15708",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "# Copyright 2025 Norris\n",
    "# MIT-0\n",
    "#\n",
    "\n",
    "import os\n",
    "import requests\n",
    "from pprint import pprint\n",
    "\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "connector_id = os.environ['EMBEDDING_CONNECTOR_ID']\n",
    "create_deepseek_connector_role = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "deepseek_model_id = os.environ[\"DEEPSEEK_MODEL_ID\"]\n",
    "\n",
    "# Set up user/password auth\n",
    "userauth = (opensearch_user_name, opensearch_user_password)\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "\n",
    "\n",
    "########################################################################################\n",
    "# Test LLM model\n",
    "path = f'/_plugins/_ml/models/{deepseek_model_id}/_predict'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "payload = {\n",
    "  \"parameters\": {    \n",
    "      \"inputs\": \"OpenSearch Serverless 是什么，和OpenSearch集群模式有什么区别，使用 OpenSearch Serverless，还需要管理服务器资源么？\"\n",
    "   }\n",
    "}\n",
    "r = requests.post(url, auth=userauth, json=payload, headers=headers)\n",
    "\n",
    "response_content = r.json()['inference_results'][0]['output'][0]['dataAsMap']\n",
    "pprint(f'response: {response_content}')"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ca9b2acc-c3e2-425f-af5b-e1c30af29ee2",
   "metadata": {},
   "source": [
    "# 5. Load the Knowledge Base Data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "35b37eb3-43d0-4a03-aa02-e5bacd4e8bb7",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "from opensearchpy import OpenSearch\n",
    "import os\n",
    "\n",
    "opensearch_port = 443\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "embedding_model_id = os.environ['EMBEDDING_MODEL_ID']\n",
    "index_name = \"opensearch_kl_index\"\n",
    "\n",
    "\n",
    "# Ensure the endpoint matches the contract for the opensearch-py client. Endpoints\n",
    "# are specified without the leading URL scheme or trailing slashes.\n",
    "if opensearch_service_api_endpoint.startswith('https://'):\n",
    "  opensearch_service_api_endpoint = opensearch_service_api_endpoint[len('https://'):]\n",
    "if opensearch_service_api_endpoint.endswith('/'):\n",
    "  opensearch_service_api_endpoint = opensearch_service_api_endpoint[:-1]\n",
    "\n",
    "\n",
    "# The mapping sets kNN to true to enable vector search for the index. It defines\n",
    "# the text field as type text, and a text_embedding field that uses the FAISS engine\n",
    "# for storage and retrieval, using the HNSW algorithm.\n",
    "mapping = {\n",
    "    \"settings\": {\n",
    "        \"index\": {\n",
    "            \"knn\": True,\n",
    "            \"number_of_shards\": 1,\n",
    "            \"number_of_replicas\": 1\n",
    "        }\n",
    "    },\n",
    "    \"mappings\": {\n",
    "        \"properties\": {\n",
    "            \"text\": {\"type\": \"text\"},\n",
    "            \"text_embedding\": {\n",
    "              \"type\": \"knn_vector\",\n",
    "              \"dimension\": 1024,\n",
    "              \"method\": {\n",
    "                  \"name\": \"hnsw\",\n",
    "                  \"space_type\": \"l2\",\n",
    "                  \"engine\": \"faiss\",\n",
    "                  \"parameters\": {\"ef_construction\": 128, \"m\": 24},\n",
    "              }\n",
    "            }\n",
    "        }\n",
    "    }\n",
    "}\n",
    "\n",
    "\n",
    "# The data for the knowledge base.\n",
    "population_data = [\n",
    "  {\"index\": {\"_index\": index_name, \"_id\": \"1\"}},\n",
    "  {\"text\": \"Amazon OpenSearch Service 是一项托管服务，可以轻松地在 AWS 云中部署、操作和扩展 OpenSearch 集群。 OpenSearch 服务域是 OpenSearch 集群的同义词。域是包含您指定的设置、实例类型、实例计数和存储资源的集群。亚马逊 OpenSearch 服务支持 OpenSearch 传统的 Elasticsearch OSS（最高 7.10，即该软件的最终开源版本）。创建域时，您可以选择使用哪种搜索引擎。\"},\n",
    "  {\"index\": {\"_index\": index_name, \"_id\": \"2\"}},\n",
    "  {\"text\": \"OpenSearch是一个完全开源的搜索和分析引擎，用于日志分析、实时应用程序监控和点击流分析等用例。\"},\n",
    "  {\"index\": {\"_index\": index_name, \"_id\": \"3\"}},\n",
    "  {\"text\": \"Amazon OpenSearch 服务的特点\\\\n大量 CPU、内存和存储容量配置，也称为实例类型，包括具有成本效益的 Graviton 实例。\\\\n支持多达 1002 个数据节点\\\\n高达 25 PB 的连接存储\\\\n为只读数据提供经济实惠UltraWarm的冷存储\"},\n",
    "  {\"index\": {\"_index\": index_name, \"_id\": \"4\"}},\n",
    "  {\"text\": \"\"\"OpenSearch 与亚马逊 OpenSearch 服务相比，何时使用该服务\\\\n对于开源OpenSearch 您的组织愿意手动监控和维护自行预置的集群，并且拥有具备相应技能的人员。\n",
    "您需要对代码拥有完全的编译级别控制。\n",
    "您的组织希望或非常独特地使用开源软件。\n",
    "您执行多云战略，需要不特定于供应商的技术。\n",
    "您的团队有能力解决任何关键的生产问题。\n",
    "您希望能够灵活地以任何需要的方式使用、修改和扩展产品。\n",
    "您希望在新功能发布后立即使用这些功能。\"\"\"},\n",
    "  {\"index\": {\"_index\": index_name, \"_id\": \"5\"}},\n",
    "  {\"text\": \"\"\"OpenSearch 与亚马逊 OpenSearch 服务相比，何时使用该服务，对于Amazon OpenSearch托管服务，您不想手动管理、监控和维护基础设施。\n",
    "您想利用 Amazon S3 的持久性和低成本优势，通过跨存储层进行数据分层，从而管理不断增加的分析成本。\n",
    "你想利用与其他数据库的集成， AWS 服务 例如 DynamoDB、Amazon DocumentDB（兼容 MongoDB）、IAM 和。 CloudWatch CloudFormation\n",
    "在预防性维护和生产期间出现问题时，您需要轻松获得帮助。 支持\n",
    "您想利用自我修复、主动维护、韧性和备份等功能。\"\"\"},\n",
    "  {\"index\": {\"_index\": index_name, \"_id\": \"6\"}},\n",
    "  {\"text\": \"Amazon OpenSearch Serverless 是 Amazon OpenSearch 服务的按需无服务器选项，它消除了配置、配置和调整 OpenSearch 集群的操作复杂性。它非常适合那些不愿自行管理集群或缺乏用于大规模部署的专用资源和专业知识的组织。借助 OpenSearch Serverless，您可以搜索和分析大量数据，而无需管理底层基础架构。\"},\n",
    "  {\"index\": {\"_index\": index_name, \"_id\": \"7\"}},\n",
    "  {\"text\": \"OpenSearch Serverless 集合是一组 OpenSearch 索引，它们协同工作以支持特定的工作负载或用例。与需要手动配置的自我管理 OpenSearch 集群相比，集合简化了操作。\"},\n",
    "    {\"index\": {\"_index\": index_name, \"_id\": \"8\"}},\n",
    "  {\"text\": \"OpenSearch Serverless 支持与 OpenSearch 开源套件相同的采集和查询 API 操作，因此您可以继续使用现有的客户端和应用程序。您的客户端必须与 OpenSearch 2.x 兼容，才能使用 OpenSearch Serverless。\"},\n",
    "    {\"index\": {\"_index\": index_name, \"_id\": \"9\"}},\n",
    "  {\"text\": \"Amazon OpenSearch Ingestion 是一个完全托管的无服务器数据收集器，可将实时日志、指标和跟踪数据流式传输到亚马逊 OpenSearch 服务域和 OpenSearch 无服务器集合。\"},\n",
    "  {\"index\": {\"_index\": index_name, \"_id\": \"10\"}},\n",
    "  {\"text\": \"借助 OpenSearch Ingestion，您不再需要像 Logstash 或 Jaeger 这样的第三方工具来摄取数据。您可以将数据生成器配置为将数据发送到 OpenSearch Ingestion，然后它会自动将其传输到您的指定域或集合。您也可以在交付数据之前转换数据。\"}\n",
    "]\n",
    "\n",
    "\n",
    "# OpenSearch ingest pipelines let you define processors to apply to your documents at ingest\n",
    "# time. The text_embedding processor lets you select a source field and a destination field\n",
    "# for the embedding. At ingest, OpenSearch will call the model, via its model id, to \n",
    "# generate the embedding.\n",
    "ingest_pipeline_definition = {\n",
    "  \"processors\": [\n",
    "    {\n",
    "      \"text_embedding\": {\n",
    "        \"model_id\": embedding_model_id,\n",
    "        \"field_map\": {\n",
    "            \"text\": \"text_embedding\"\n",
    "        }\n",
    "      }\n",
    "    }\n",
    "  ]\n",
    "}\n",
    "\n",
    "\n",
    "# Set up for the client to call OpenSearch Service\n",
    "hosts = [{\"host\": opensearch_service_api_endpoint, \"port\": opensearch_port}]\n",
    "client = OpenSearch(\n",
    "    hosts=hosts,\n",
    "    http_auth=(opensearch_user_name, opensearch_user_password),\n",
    "    use_ssl=True,\n",
    "    verify_certs=False,\n",
    "    ssl_assert_hostname=False,\n",
    "    ssl_show_warn=False,\n",
    ")\n",
    "\n",
    "\n",
    "# Check whether an index already exists with the chosen name. If this raises,\n",
    "# change the index_name variable above, and use the same index name in the\n",
    "# query cells below.\n",
    "if client.indices.exists(index=index_name):\n",
    "  raise Exception(f'Index {index_name} already exists. Choose a different index_name above, and use the same name in the query cells below.')\n",
    "\n",
    "# This code does not validate the response. In actual use, you should wrap this\n",
    "# block in try/except and validate the response. \n",
    "client.indices.create(index=index_name, body=mapping)\n",
    "r = client.ingest.put_pipeline(id=\"embedding_pipeline\", \n",
    "                               body=ingest_pipeline_definition)\n",
    "client.bulk(index=index_name, \n",
    "            body=population_data,\n",
    "            pipeline=\"embedding_pipeline\", \n",
    "            refresh=True)\n",
    "\n",
    "print(f'Loaded data into {index_name}')"
   ]
  },
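  {
   "cell_type": "markdown",
   "id": "5e3f0c4a-9d6f-4a1c-9e32-7f8a9b0c1d2e",
   "metadata": {},
   "source": [
    "The `population_data` list above interleaves bulk action lines and document lines by hand. As a sketch, the same structure can be built from a plain list of texts; the `make_bulk_body` helper is our own naming, not an opensearch-py API."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "6f4a1d5b-0e7a-4b2d-8f43-8a9b0c1d2e3f",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Hypothetical helper (our own naming): build an OpenSearch bulk body with\n",
    "# one action line plus one document line per text, using sequential ids.\n",
    "def make_bulk_body(index_name, texts):\n",
    "    body = []\n",
    "    for i, text in enumerate(texts, start=1):\n",
    "        body.append({'index': {'_index': index_name, '_id': str(i)}})\n",
    "        body.append({'text': text})\n",
    "    return body\n",
    "\n",
    "\n",
    "# Two texts produce four entries: two action lines and two documents.\n",
    "print(make_bulk_body('opensearch_kl_index', ['doc one', 'doc two']))"
   ]
  },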
  {
   "cell_type": "markdown",
   "id": "ef3cf5bb-394f-4f2e-beda-9657de3e87c6",
   "metadata": {},
   "source": [
    "# 6. Create the Search Pipeline"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ad81c524-2432-41f7-9d53-ab0e35c5252b",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "source": [
    "**Retrieval-augmented generation (RAG) processor**\n",
    "\n",
    "The RAG processor is a search results processor, used mainly for retrieval-augmented generation in conversational search.<br>\n",
    "* It combines the retrieval capability of the vector engine with the generative capability of an LLM: instead of hand-coding a complex pipeline, a single configuration gives you an end-to-end retrieve-then-generate flow.\n",
    "* It hides low-level details such as vector search, prompt assembly, and model invocation, and sends the assembled prompt to the large language model (LLM).\n",
    "* It automatically injects the documents retrieved by OpenSearch into the LLM prompt as context, improving the accuracy and relevance of the generated answer and reducing hallucinations."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "5574a3db-e223-49f8-a38e-34aa65fba085",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "source": [
    "Run the following command in Dev Tools in OpenSearch Dashboards:\n",
    "```\n",
    "PUT /_search/pipeline/my-conversation-search-pipeline-deepseek-zh\n",
    "{\n",
    "  \"response_processors\": [\n",
    "    {\n",
    "      \"retrieval_augmented_generation\": {\n",
    "        \"tag\": \"Demo pipeline\",\n",
    "        \"description\": \"Demo pipeline Using DeepSeek R1\",\n",
    "        \"model_id\": \"<llm-model-id>\",\n",
    "        \"context_field_list\": [\n",
    "          \"text\"\n",
    "        ],\n",
    "        \"system_prompt\": \"你是一个智能助手.\",\n",
    "        \"user_instructions\": \"针对给定的问题，用少于 200 个字给出简洁、翔实的答案\"\n",
    "      }\n",
    "    }\n",
    "  ]\n",
    "}\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c48eabc0-4b16-45f3-8c59-a3d108158370",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "# Copyright 2025 Norris\n",
    "# MIT-0\n",
    "#\n",
    "\n",
    "import os\n",
    "import requests\n",
    "\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "connector_id = os.environ['EMBEDDING_CONNECTOR_ID']\n",
    "create_deepseek_connector_role = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "deepseek_model_id = os.environ[\"DEEPSEEK_MODEL_ID\"]\n",
    "\n",
    "# Set up user/password auth\n",
    "userauth = (opensearch_user_name, opensearch_user_password)\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "\n",
    "\n",
    "########################################################################################\n",
    "# Create Search Pipeline\n",
    "path = '/_search/pipeline/my-conversation-search-pipeline-deepseek-zh'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "payload = {\n",
    "  \"response_processors\": [\n",
    "    {\n",
    "      \"retrieval_augmented_generation\": {\n",
    "        \"tag\": \"Demo pipeline\",\n",
    "        \"description\": \"Demo pipeline Using DeepSeek R1\",\n",
    "        \"model_id\": deepseek_model_id,\n",
    "        \"context_field_list\": [\n",
    "          \"text\"\n",
    "        ],\n",
    "        \"system_prompt\": \"你是一个智能助手.\",\n",
    "        \"user_instructions\": \"针对给定的问题，用少于 200 个字给出简洁、翔实的答案，请使用中文回答\"\n",
    "      }\n",
    "    }\n",
    "  ]\n",
    "}\n",
    "r = requests.put(url, auth=userauth, json=payload, headers=headers)\n",
    "\n",
    "response = r.json()\n",
    "print(f'response: {response}')"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b294f779-57de-4e62-a5a4-d04e9bb1cd47",
   "metadata": {},
   "source": [
    "# 7. Test Question Answering"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "9013729a-c2fd-4b44-8736-a9c3ff1057b6",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "source": [
    "Run the following command in Dev Tools in OpenSearch Dashboards:\n",
    "```\n",
    "GET opensearch_kl_index/_search?search_pipeline=my-conversation-search-pipeline-deepseek-zh\n",
    "{\n",
    "  \"query\": {\n",
    "    \"neural\": {\n",
    "      \"text_embedding\": {\n",
    "        \"query_text\": \"OpenSearch Serverless 是什么，和OpenSearch集群模式有什么区别，使用 OpenSearch Serverless，还需要管理服务器资源么？\",\n",
    "        \"model_id\": \"<embedding-model-id>\",\n",
    "        \"k\": 5\n",
    "      }\n",
    "    }\n",
    "  },\n",
    "  \"size\": 2,\n",
    "  \"_source\": [\n",
    "    \"text\"\n",
    "  ],\n",
    "  \"ext\": {\n",
    "    \"generative_qa_parameters\": {\n",
    "      \"llm_model\": \"bedrock/claude\",\n",
    "      \"llm_question\": \"OpenSearch Serverless 是什么，和OpenSearch集群模式有什么区别，使用 OpenSearch Serverless，还需要管理服务器资源么？\",\n",
    "      \"context_size\": 5,\n",
    "      \"timeout\": 15\n",
    "    }\n",
    "  }\n",
    "}\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "023c3d53-f4f0-4232-98a1-9e2a4eacf39c",
   "metadata": {},
   "outputs": [],
   "source": [
    "question = \"OpenSearch Serverless 是什么，和OpenSearch集群模式有什么区别，使用 OpenSearch Serverless，还需要管理服务器资源么？\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "251211cb-eacd-4759-93c6-702d4ad11501",
   "metadata": {},
   "outputs": [],
   "source": [
    "question = \"能否再详细解释一下\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "e401ea23-8125-4913-9bf5-87d5c0498a87",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    },
    "scrolled": true
   },
   "outputs": [],
   "source": [
    "# Copyright 2025 Norris\n",
    "# MIT-0\n",
    "#\n",
    "\n",
    "# Test\n",
    "\n",
    "import os\n",
    "import requests\n",
    "from pprint import pprint\n",
    "\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "connector_id = os.environ['EMBEDDING_CONNECTOR_ID']\n",
    "create_deepseek_connector_role = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "embedding_model_id = os.environ[\"EMBEDDING_MODEL_ID\"]\n",
    "\n",
    "# Set up user/password auth\n",
    "userauth = (opensearch_user_name, opensearch_user_password)\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "\n",
    "\n",
    "\n",
    "########################################################################################\n",
    "# Run the search through the search pipeline\n",
    "path = '/opensearch_kl_index/_search?search_pipeline=my-conversation-search-pipeline-deepseek-zh'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "payload = {\n",
    "  \"query\": {\n",
    "    \"neural\": {\n",
    "      \"text_embedding\": {\n",
    "        \"query_text\": question,\n",
    "        \"model_id\": embedding_model_id,\n",
    "        \"k\": 5\n",
    "      }\n",
    "    }\n",
    "  },\n",
    "  \"size\": 2,\n",
    "  \"_source\": [\n",
    "    \"text\"\n",
    "  ],\n",
    "  \"ext\": {\n",
    "    \"generative_qa_parameters\": {\n",
    "      \"llm_model\": \"bedrock/claude\",\n",
    "      \"llm_question\": question,\n",
    "      \"context_size\": 5,\n",
    "      \"timeout\": 15\n",
    "    }\n",
    "  }\n",
    "}\n",
    "response = requests.post(url, auth=userauth, json=payload, headers=headers)\n",
    "\n",
    "pprint(response.json())"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "792b509a-adde-452a-b591-d40072b81939",
   "metadata": {},
   "source": [
    "<br>\n",
    "<br>\n",
    "<br>\n",
    "<br>"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "4a4e378f-82a5-4b92-812f-d47638c81fdc",
   "metadata": {},
   "source": [
    "# 8. Conversational Search (Create Conversation Memory)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "37670adc-1aa2-45cb-9acf-f8c079e24aef",
   "metadata": {},
   "source": [
    "Conversational search lets you ask questions in natural language and refine the answers with follow-up questions, so the exchange becomes a conversation between you and the large language model (LLM).<br>For this to work, the model must remember the context of the whole conversation rather than answering each question in isolation.\n",
    "Run the following command in Dev Tools in OpenSearch Dashboards:\n",
    "```\n",
    "POST /_plugins/_ml/memory/\n",
    "{\n",
    "  \"name\": \"Conversation about DeepSeek Demo\"\n",
    "}\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "d017a18d-8839-4f83-9381-9b6ce8824127",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Copyright 2025 Norris\n",
    "# MIT-0\n",
    "#\n",
    "\n",
    "import os\n",
    "import requests\n",
    "\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "connector_id = os.environ['EMBEDDING_CONNECTOR_ID']\n",
    "create_deepseek_connector_role = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "deepseek_model_id = os.environ[\"DEEPSEEK_MODEL_ID\"]\n",
    "\n",
    "# Set up user/password auth\n",
    "userauth = (opensearch_user_name, opensearch_user_password)\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "\n",
    "\n",
    "########################################################################################\n",
    "# Create a conversation memory\n",
    "path = '/_plugins/_ml/memory/'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "payload = {\n",
    "  \"name\": \"Conversation about DeepSeek Demo\"\n",
    "}\n",
    "r = requests.post(url, auth=userauth, json=payload, headers=headers)\n",
    "\n",
    "memory_id = r.json()['memory_id']\n",
    "print(f'memory_id: {memory_id}')\n",
    "os.environ[\"MEMORY_ID\"] = memory_id"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "d8ea9085-a341-41c2-8e0e-a5e06dabd0d3",
   "metadata": {},
   "source": [
    "### Search with Memory"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "ed9d6183-38c0-4273-a2b5-0efba083e9c7",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "question = \"OpenSearch Serverless 是什么，和OpenSearch集群模式有什么区别，使用 OpenSearch Serverless，还需要管理服务器资源么？\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "668f7cc8-f3da-482e-97a7-3ccc4062e6bd",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "question = \"能否再详细解释一下\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c3fae4a2-2d83-46b1-bacd-0c4eed891faa",
   "metadata": {
    "jupyter": {
     "source_hidden": true
    }
   },
   "outputs": [],
   "source": [
    "# Copyright 2025 Norris\n",
    "# MIT-0\n",
    "#\n",
    "\n",
    "# Test\n",
    "\n",
    "import os\n",
    "import requests\n",
    "from pprint import pprint\n",
    "\n",
    "\n",
    "opensearch_service_api_endpoint = os.environ['OPENSEARCH_SERVICE_DOMAIN_ENDPOINT']\n",
    "opensearch_user_name = os.environ['OPENSEARCH_SERVICE_ADMIN_USER']\n",
    "opensearch_user_password = os.environ['OPENSEARCH_SERVICE_ADMIN_PASSWORD']\n",
    "region = os.environ['DEEPSEEK_AWS_REGION']\n",
    "connector_id = os.environ['EMBEDDING_CONNECTOR_ID']\n",
    "create_deepseek_connector_role = os.environ['CREATE_DEEPSEEK_CONNECTOR_ROLE']\n",
    "embedding_model_id = os.environ[\"EMBEDDING_MODEL_ID\"]\n",
    "memory_id = os.environ[\"MEMORY_ID\"]\n",
    "\n",
    "# Set up user/password auth\n",
    "userauth = (opensearch_user_name, opensearch_user_password)\n",
    "headers = {\"Content-Type\": \"application/json\"}\n",
    "\n",
    "\n",
    "\n",
    "########################################################################################\n",
    "# Run the conversational search with memory\n",
    "path = '/opensearch_kl_index/_search?search_pipeline=my-conversation-search-pipeline-deepseek-zh'\n",
    "url = opensearch_service_api_endpoint + path\n",
    "payload = {\n",
    "  \"query\": {\n",
    "    \"neural\": {\n",
    "      \"text_embedding\": {\n",
    "        \"query_text\": question,\n",
    "        \"model_id\": embedding_model_id,\n",
    "        \"k\": 5\n",
    "      }\n",
    "    }\n",
    "  },\n",
    "  \"size\": 2,\n",
    "  \"_source\": [\n",
    "    \"text\"\n",
    "  ],\n",
    "  \"ext\": {\n",
    "    \"generative_qa_parameters\": {\n",
    "      \"llm_model\": \"bedrock/claude\",\n",
    "      \"llm_question\": question,\n",
    "      \"memory_id\": memory_id,\n",
    "      \"context_size\": 2,\n",
    "      \"message_size\": 2,\n",
    "      \"timeout\": 15\n",
    "    }\n",
    "  }\n",
    "}\n",
    "response = requests.post(url, auth=userauth, json=payload, headers=headers)\n",
    "\n",
    "pprint(response.json())"
   ]
  },
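  {
   "cell_type": "markdown",
   "id": "7a5b2e6c-1f8b-4c3e-9a54-9b0c1d2e3f4a",
   "metadata": {},
   "source": [
    "The full search response above mixes the retrieved hits with the generated answer. A sketch of pulling out just the answer, assuming the `ext.retrieval_augmented_generation.answer` layout of the RAG processor response; the sample dict below is illustrative, not captured output."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "8b6c3f7d-2a9c-4d4f-8b65-0c1d2e3f4a5b",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Extract only the generated answer from a conversational search response.\n",
    "# The nested path assumes the RAG processor's response shape; the sample\n",
    "# dict below is illustrative, not captured output.\n",
    "def extract_answer(response_body):\n",
    "    return (response_body.get('ext', {})\n",
    "                         .get('retrieval_augmented_generation', {})\n",
    "                         .get('answer'))\n",
    "\n",
    "\n",
    "sample = {'ext': {'retrieval_augmented_generation': {'answer': 'OpenSearch Serverless ...'}}}\n",
    "print(extract_answer(sample))\n",
    "# With the live response above: extract_answer(response.json())"
   ]
  },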
  {
   "cell_type": "markdown",
   "id": "c06a8b28-6622-43b8-90e2-8b35efda1748",
   "metadata": {},
   "source": [
    "#### View the Messages in Memory\n",
    "```\n",
    "GET /_plugins/_ml/memory/<memory-id>/messages\n",
    "```"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
