{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {
    "colab_type": "text",
    "id": "view-in-github"
   },
   "source": [
    "<a href=\"https://colab.research.google.com/github/tomasonjo/blogs/blob/master/llm/langchain_neo4j_tips.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "SgGtmyZxDfCI",
    "outputId": "1a5f5dcf-7052-4a6a-a604-a507bebe7159"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Requirement already satisfied: neo4j in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (5.16.0)\n",
      "Requirement already satisfied: openai in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (1.6.1)\n",
      "Requirement already satisfied: langchain in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (0.1.0)\n",
      "Requirement already satisfied: langchain_openai in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (0.0.2.post1)\n",
      "Requirement already satisfied: pytz in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from neo4j) (2023.3.post1)\n",
      "Requirement already satisfied: anyio<5,>=3.5.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from openai) (3.5.0)\n",
      "Requirement already satisfied: distro<2,>=1.7.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from openai) (1.8.0)\n",
      "Requirement already satisfied: httpx<1,>=0.23.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from openai) (0.26.0)\n",
      "Requirement already satisfied: pydantic<3,>=1.9.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from openai) (1.10.12)\n",
      "Requirement already satisfied: sniffio in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from openai) (1.2.0)\n",
      "Requirement already satisfied: tqdm>4 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from openai) (4.65.0)\n",
      "Requirement already satisfied: typing-extensions<5,>=4.7 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from openai) (4.9.0)\n",
      "Requirement already satisfied: PyYAML>=5.3 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (6.0.1)\n",
      "Requirement already satisfied: SQLAlchemy<3,>=1.4 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (2.0.21)\n",
      "Requirement already satisfied: aiohttp<4.0.0,>=3.8.3 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (3.9.0)\n",
      "Requirement already satisfied: dataclasses-json<0.7,>=0.5.7 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (0.6.3)\n",
      "Requirement already satisfied: jsonpatch<2.0,>=1.33 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (1.33)\n",
      "Requirement already satisfied: langchain-community<0.1,>=0.0.9 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (0.0.9)\n",
      "Requirement already satisfied: langchain-core<0.2,>=0.1.7 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (0.1.7)\n",
      "Requirement already satisfied: langsmith<0.1.0,>=0.0.77 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (0.0.77)\n",
      "Requirement already satisfied: numpy<2,>=1 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (1.26.2)\n",
      "Requirement already satisfied: requests<3,>=2 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (2.31.0)\n",
      "Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain) (8.2.2)\n",
      "Requirement already satisfied: tiktoken<0.6.0,>=0.5.2 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain_openai) (0.5.2)\n",
      "Requirement already satisfied: attrs>=17.3.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (23.1.0)\n",
      "Requirement already satisfied: multidict<7.0,>=4.5 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (6.0.4)\n",
      "Requirement already satisfied: yarl<2.0,>=1.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.9.3)\n",
      "Requirement already satisfied: frozenlist>=1.1.1 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.4.0)\n",
      "Requirement already satisfied: aiosignal>=1.1.2 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.2.0)\n",
      "Requirement already satisfied: idna>=2.8 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from anyio<5,>=3.5.0->openai) (3.4)\n",
      "Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from dataclasses-json<0.7,>=0.5.7->langchain) (3.20.1)\n",
      "Requirement already satisfied: typing-inspect<1,>=0.4.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from dataclasses-json<0.7,>=0.5.7->langchain) (0.9.0)\n",
      "Requirement already satisfied: certifi in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from httpx<1,>=0.23.0->openai) (2023.11.17)\n",
      "Requirement already satisfied: httpcore==1.* in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from httpx<1,>=0.23.0->openai) (1.0.2)\n",
      "Requirement already satisfied: h11<0.15,>=0.13 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->openai) (0.14.0)\n",
      "Requirement already satisfied: jsonpointer>=1.9 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from jsonpatch<2.0,>=1.33->langchain) (2.1)\n",
      "Requirement already satisfied: packaging<24.0,>=23.2 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from langchain-core<0.2,>=0.1.7->langchain) (23.2)\n",
      "Requirement already satisfied: charset-normalizer<4,>=2 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from requests<3,>=2->langchain) (2.0.4)\n",
      "Requirement already satisfied: urllib3<3,>=1.21.1 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from requests<3,>=2->langchain) (2.1.0)\n",
      "Requirement already satisfied: regex>=2022.1.18 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from tiktoken<0.6.0,>=0.5.2->langchain_openai) (2023.10.3)\n",
      "Requirement already satisfied: mypy-extensions>=0.3.0 in /Users/tomazbratanic/anaconda3/lib/python3.11/site-packages (from typing-inspect<1,>=0.4.0->dataclasses-json<0.7,>=0.5.7->langchain) (1.0.0)\n"
     ]
    }
   ],
   "source": [
    "!pip install neo4j openai langchain langchain_openai"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "2lXfT6Luz2oV"
   },
   "source": [
    "# LangChain Cypher search: Tips & Tricks\n",
    "## How to optimize prompts for Cypher statement generation to retrieve relevant information from Neo4j in your LLM applications\n",
    "\n",
    "Last time, we looked at how to get started with [Cypher Search in the LangChain](https://towardsdatascience.com/langchain-has-added-cypher-search-cb9d821120d5) library and why you would want to use knowledge graphs in your LLM applications. In this blog post, we will continue to explore various use cases for integrating knowledge graphs into LLM and LangChain applications. Along the way, you will learn how to improve prompts to produce better and more accurate Cypher statements.\n",
    "\n",
    "Specifically, we will look at how to use the few-shot capabilities of LLMs by providing a couple of Cypher statement examples, which can be used to specify which Cypher statements the LLM should produce, what the results should look like, and more. Additionally, you will learn how you can integrate graph algorithms from the Neo4j Graph Data Science library into your LangChain applications.\n",
    "\n",
    "## Neo4j environment setup\n",
    "\n",
    "In this blog post, we will be using the [Twitch dataset that is available in Neo4j Sandbox](https://sandbox.neo4j.com/?usecase=twitch).\n",
    "\n",
    "![graph-model.png]()\n",
    "\n",
    "The Twitch social network consists of users. A small percentage of those users broadcast their gameplay or other activities through live streams. In the graph model, users who live-stream are tagged with a secondary label, Stream. The graph also captures which teams streamers belong to, which games they play on stream, and in which language they present their content. We also know how many followers they had at the moment of scraping, their all-time historical view count, and when they created their accounts. The most relevant information for network analysis is which users engaged in a streamer's chat. You can distinguish whether a user who chatted in a stream was a regular user (CHATTER relationship), a moderator of the stream (MODERATOR relationship), or a stream VIP.\n",
    "The network information was scraped between the 7th and the 10th of May 2021, so the dataset reflects that snapshot in time.\n",
    "## Improving LangChain Cypher search\n",
    "First, we have to set up the LangChain Cypher search."
   ]
  },
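  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To get oriented in the model described above, a query like the following (a sketch based on the schema description, assuming the CHATTER, MODERATOR, and VIP relationships point from User to Stream) counts a stream's chat participants by role:\n",
    "\n",
    "```cypher\n",
    "// Count chat participants of a single stream, grouped by their role\n",
    "MATCH (u:User)-[r:CHATTER|MODERATOR|VIP]->(s:Stream {name: 'kimdoe'})\n",
    "RETURN type(r) AS role, count(DISTINCT u) AS users\n",
    "ORDER BY users DESC\n",
    "```"
   ]
  },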
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "id": "x5oiK71aDhwm"
   },
   "outputs": [],
   "source": [
    "from langchain_openai import ChatOpenAI\n",
    "from langchain.chains import GraphCypherQAChain\n",
    "from langchain_community.graphs import Neo4jGraph\n",
    "\n",
    "graph = Neo4jGraph(\n",
    "    url=\"bolt://3.91.174.113:7687\", \n",
    "    username=\"neo4j\", \n",
    "    password=\"capital-electrode-deeds\"\n",
    ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "id": "Wliyn4YWDrj0"
   },
   "outputs": [],
   "source": [
    "import os\n",
    "\n",
    "os.environ['OPENAI_API_KEY'] = \"sk-\"\n",
    "\n",
    "chain = GraphCypherQAChain.from_llm(\n",
    "    ChatOpenAI(temperature=0), graph=graph, verbose=True,\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "rd2pXgxZ0Ovo"
   },
   "source": [
    "I really love how easy it is to set up Cypher Search in the LangChain library. You only need to define the Neo4j and OpenAI credentials, and you are good to go. Under the hood, the graph object inspects the graph schema and passes it to the GraphCypherQAChain to construct accurate Cypher statements.\n",
    "\n",
    "Let's begin with a simple question."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/",
     "height": 258
    },
    "id": "oaKOhoyKDvzj",
    "outputId": "2953b317-196e-455a-a836-ce4965f2b34d"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\n",
      "\u001b[1m> Entering new GraphCypherQAChain chain...\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `run` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n",
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Generated Cypher:\n",
      "\u001b[32;1m\u001b[1;3mMATCH (s:Stream)-[:PLAYS]->(:Game {name: \"Fortnite\"})\n",
      "RETURN s.name, s.followers\n",
      "ORDER BY s.followers DESC\n",
      "LIMIT 1\u001b[0m\n",
      "Full Context:\n",
      "\u001b[32;1m\u001b[1;3m[{'s.name': 'thegrefg', 's.followers': 7269018}]\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\u001b[1m> Finished chain.\u001b[0m\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{'query': '\\nWhich fortnite streamer has the most followers?\\n',\n",
       " 'result': 'The Fortnite streamer with the most followers is TheGrefg, who has 7,269,018 followers.'}"
      ]
     },
     "execution_count": 4,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "chain.invoke({'query': \"\"\"\n",
    "Which fortnite streamer has the most followers?\n",
    "\"\"\"})"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "RH7heaDW0RJP"
   },
   "source": [
    "The Cypher chain constructed a relevant Cypher statement, used it to retrieve information from Neo4j, and provided the answer in natural language.\n",
    "\n",
    "Now let's ask another question."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/",
     "height": 296
    },
    "id": "4Cdzdk29E77u",
    "outputId": "0eef44d4-1d19-4b22-f901-08f68881fb1c"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\n",
      "\u001b[1m> Entering new GraphCypherQAChain chain...\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `run` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n",
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Generated Cypher:\n",
      "\u001b[32;1m\u001b[1;3mMATCH (s:Stream)-[:HAS_LANGUAGE]->(:Language {name: \"Italian\"})\n",
      "RETURN s.name\n",
      "ORDER BY s.followers DESC\n",
      "LIMIT 1\u001b[0m\n",
      "Full Context:\n",
      "\u001b[32;1m\u001b[1;3m[]\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\u001b[1m> Finished chain.\u001b[0m\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{'query': '\\nWhich italian streamer has the most followers?\\n',\n",
       " 'result': \"I'm sorry, but I don't have the information to answer your question.\"}"
      ]
     },
     "execution_count": 5,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "chain.invoke({'query': \"\"\"\n",
    "Which italian streamer has the most followers?\n",
    "\"\"\"})"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "X1_7O7MD0TYr"
   },
   "source": [
    "The generated Cypher statement looks valid, but unfortunately, we didn't get any results. The problem is that the language values are stored as two-character country codes, and the LLM is unaware of that. We have a few options to overcome this problem. First, we can utilize the few-shot capabilities of LLMs by providing example Cypher statements, which the model then imitates when generating new statements. To add example Cypher statements to the prompt, we have to update the Cypher-generating prompt. Take a look at the default prompt used to generate Cypher statements to better understand the update we are about to make."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "id": "Wu8DkdjXF0Vf"
   },
   "outputs": [],
   "source": [
    "# https://github.com/hwchase17/langchain/blob/master/langchain/chains/graph_qa/prompts.py\n",
    "from langchain.prompts.prompt import PromptTemplate\n",
    "\n",
    "\n",
    "CYPHER_GENERATION_TEMPLATE = \"\"\"Task:Generate Cypher statement to query a graph database.\n",
    "Instructions:\n",
    "Use only the provided relationship types and properties in the schema.\n",
    "Do not use any other relationship types or properties that are not provided.\n",
    "Schema:\n",
    "{schema}\n",
    "Cypher examples:\n",
    "# How many streamers are from Norway?\n",
    "MATCH (s:Stream)-[:HAS_LANGUAGE]->(:Language {{name: 'no'}})\n",
    "RETURN count(s) AS streamers\n",
    "\n",
    "Note: Do not include any explanations or apologies in your responses.\n",
    "Do not respond to any questions that might ask anything else than for you to construct a Cypher statement.\n",
    "Do not include any text except the generated Cypher statement.\n",
    "\n",
    "The question is:\n",
    "{question}\"\"\"\n",
    "CYPHER_GENERATION_PROMPT = PromptTemplate(\n",
    "    input_variables=[\"schema\", \"question\"], template=CYPHER_GENERATION_TEMPLATE\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "2gR8nPcQ0VAM"
   },
   "source": [
    "If you compare the new Cypher-generating prompt to the default one, you can see that we only added the Cypher examples section. The example lets the model observe that language values are given as two-character country codes. Now we can test the improved Cypher chain by asking about the most-followed Italian streamer."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/",
     "height": 276
    },
    "id": "uHqJDR43HFOf",
    "outputId": "1801e211-2250-4b57-b145-50aaba77edcb"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\n",
      "\u001b[1m> Entering new GraphCypherQAChain chain...\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `run` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n",
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Generated Cypher:\n",
      "\u001b[32;1m\u001b[1;3mMATCH (s:Stream)-[:HAS_LANGUAGE]->(:Language {name: 'it'})\n",
      "RETURN s.name AS streamer, s.followers AS followers\n",
      "ORDER BY followers DESC\n",
      "LIMIT 1\u001b[0m\n",
      "Full Context:\n",
      "\u001b[32;1m\u001b[1;3m[{'streamer': 'pow3rtv', 'followers': 1530428}]\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\u001b[1m> Finished chain.\u001b[0m\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{'query': '\\nWhich italian streamer has the most followers?\\n',\n",
       " 'result': 'The Italian streamer with the most followers is pow3rtv, who has 1,530,428 followers.'}"
      ]
     },
     "execution_count": 7,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "chain_language_example = GraphCypherQAChain.from_llm(\n",
    "    ChatOpenAI(temperature=0), graph=graph, verbose=True,\n",
    "    cypher_prompt=CYPHER_GENERATION_PROMPT\n",
    ")\n",
    "\n",
    "chain_language_example.invoke({'query':\"\"\"\n",
    "Which italian streamer has the most followers?\n",
    "\"\"\"})\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "sjqCT1x00XXF"
   },
   "source": [
    "The model is now aware that the languages are given as two-character country codes and can accurately answer questions that use language information.\n",
    "\n",
    "## Using graph algorithms to answer questions\n",
    "In the previous blog post, we looked at how integrating graph databases into LLM applications can answer questions such as how entities are connected, by finding the shortest or other paths between them. Today we will look at another use case where graph databases shine in LLM applications while other databases struggle: using graph algorithms like PageRank to provide relevant answers. For example, we can use Personalized PageRank to provide recommendations to an end user at query time.\n",
    "\n",
    "Take a look at the following example:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/",
     "height": 239
    },
    "id": "-fsAoE_MHvDq",
    "outputId": "271397f0-c027-4e2c-a310-9ef39b1da9d1"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\n",
      "\u001b[1m> Entering new GraphCypherQAChain chain...\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `run` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n",
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Generated Cypher:\n",
      "\u001b[32;1m\u001b[1;3mMATCH (s:Stream)-[:PLAYS]->(:Game {name: 'pokimane'})-[:PLAYS]->(:Stream)-[:PLAYS]->(g:Game)\n",
      "WHERE g.name <> 'pokimane'\n",
      "RETURN DISTINCT s.name AS streamers\u001b[0m\n",
      "Full Context:\n",
      "\u001b[32;1m\u001b[1;3m[]\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\u001b[1m> Finished chain.\u001b[0m\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{'query': '\\nWhich streamers should I also watch if I like pokimane?\\n',\n",
       " 'result': 'If you enjoy watching Pokimane, you might also want to check out other popular streamers such as Valkyrae, LilyPichu, and Amouranth. These streamers have their own unique styles and content that you might find enjoyable. Happy streaming!'}"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "chain_language_example.invoke({'query':\"\"\"\n",
    "Which streamers should I also watch if I like pokimane?\n",
    "\"\"\"})"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "Uui_hb0v0cCf"
   },
   "source": [
    "Interestingly, every time we rerun this question, the model generates a different Cypher statement. One thing is consistent, though: for some reason, League of Legends is somehow included in the query every time.\n",
    "\n",
    "More worrying is that the LLM provided recommendations even though no suggestions were present in the prompt context. gpt-3.5-turbo is known to sometimes ignore the rules, especially if you do not repeat them.\n",
    "\n",
    "Repeating the instruction three times can help gpt-3.5-turbo with this problem. However, repeating instructions increases the token count and, consequently, the cost of Cypher generation. Therefore, it takes some prompt engineering to get the best results with the lowest token count.\n",
    "\n",
    "As mentioned, we will use Personalized PageRank to provide stream recommendations. But first, we need to project the in-memory graph and run the Node Similarity algorithm to prepare the graph for recommendations. Take a look at my [previous blog post](https://towardsdatascience.com/twitchverse-a-network-analysis-of-twitch-universe-using-neo4j-graph-data-science-d7218b4453ff) to learn more about using graph algorithms to analyze the Twitch network."
   ]
  },
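  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a minimal sketch of the instruction-repetition idea described above (the `build_prompt` helper and template text here are illustrative, not the exact prompt used elsewhere in this post), you can assemble the prompt so that a critical rule appears several times:\n",
    "\n",
    "```python\n",
    "RULE = \"Do not include any text except the generated Cypher statement.\\n\"\n",
    "\n",
    "def build_prompt(schema: str, question: str, repeats: int = 3) -> str:\n",
    "    # Repeating the rule improves compliance, but every repeat adds tokens (and cost)\n",
    "    return (\n",
    "        \"Task: Generate Cypher statement to query a graph database.\\n\"\n",
    "        + RULE * repeats\n",
    "        + f\"Schema:\\n{schema}\\n\"\n",
    "        + f\"The question is:\\n{question}\"\n",
    "    )\n",
    "\n",
    "prompt = build_prompt(\"(:User)-[:CHATTER]->(:Stream)\", \"Which italian streamer has the most followers?\")\n",
    "print(prompt.count(RULE.strip()))  # the rule appears 3 times\n",
    "```"
   ]
  },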
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "6NimpqImIXzU",
    "outputId": "073caee8-2102-41cb-d5c9-caa4e6d4add1"
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[{'preProcessingMillis': 0,\n",
       "  'computeMillis': 98108,\n",
       "  'mutateMillis': 152,\n",
       "  'postProcessingMillis': -1,\n",
       "  'nodesCompared': 4538,\n",
       "  'relationshipsWritten': 23609,\n",
       "  'similarityDistribution': {'min': 0.04999995231628418,\n",
       "   'p5': 0.05223870277404785,\n",
       "   'max': 0.9291150569915771,\n",
       "   'p99': 0.46153998374938965,\n",
       "   'p1': 0.05039477348327637,\n",
       "   'p10': 0.05494499206542969,\n",
       "   'p90': 0.27272772789001465,\n",
       "   'p50': 0.08695673942565918,\n",
       "   'p25': 0.06399989128112793,\n",
       "   'p75': 0.14691996574401855,\n",
       "   'p95': 0.3424661159515381,\n",
       "   'mean': 0.1265612697302399,\n",
       "   'p100': 0.9291150569915771,\n",
       "   'stdDev': 0.09586148263128431},\n",
       "  'configuration': {'mutateProperty': 'score',\n",
       "   'jobId': 'd729806a-0fdf-41d0-88ca-7d90dbc644e5',\n",
       "   'topN': 0,\n",
       "   'upperDegreeCutoff': 2147483647,\n",
       "   'topK': 10,\n",
       "   'similarityCutoff': 0.05,\n",
       "   'sudo': True,\n",
       "   'degreeCutoff': 1,\n",
       "   'mutateRelationshipType': 'SHARED_AUDIENCE',\n",
       "   'bottomN': 0,\n",
       "   'bottomK': 10,\n",
       "   'logProgress': True,\n",
       "   'nodeLabels': ['*'],\n",
       "   'concurrency': 4,\n",
       "   'relationshipTypes': ['*'],\n",
       "   'similarityMetric': 'JACCARD'}}]"
      ]
     },
     "execution_count": 9,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Project in-memory graph\n",
    "graph.query(\"\"\"\n",
    "CALL gds.graph.project('shared-audience',\n",
    "  ['User', 'Stream'],\n",
    "  {CHATTER: {orientation:'REVERSE'}})\n",
    "\"\"\")\n",
    "\n",
    "# Run node similarity algorithm\n",
    "graph.query(\"\"\"\n",
    "CALL gds.nodeSimilarity.mutate('shared-audience',\n",
    " {similarityMetric: 'Jaccard',similarityCutoff:0.05, topK:10, sudo:true,\n",
    "     mutateProperty:'score', mutateRelationshipType:'SHARED_AUDIENCE'})\n",
    "\"\"\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "LFCvIpyW0o2J"
   },
   "source": [
    "The Node Similarity algorithm takes a while to complete (about 98 seconds in this run), as the database has almost five million users. The following Cypher statement provides recommendations using Personalized PageRank:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "hmoSjlwsquPi",
    "outputId": "a35e7e2b-9016-42f3-8e1d-ce97d0282e86"
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[{'streamer': 'tranth', 'score': 0.13697276805472164},\n",
       " {'streamer': 'jungtaejune', 'score': 0.13697276805472164},\n",
       " {'streamer': 'hanryang1125', 'score': 0.10511818935406857}]"
      ]
     },
     "execution_count": 10,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "graph.query(\"\"\"\n",
    "MATCH (s:Stream)\n",
    "WHERE s.name = \"kimdoe\"\n",
    "WITH collect(s) AS sourceNodes\n",
    "CALL gds.pageRank.stream(\"shared-audience\", \n",
    "  {sourceNodes:sourceNodes, relationshipTypes:['SHARED_AUDIENCE'], \n",
    "    nodeLabels:['Stream']})\n",
    "YIELD nodeId, score\n",
    "WITH gds.util.asNode(nodeId) AS node, score\n",
    "WHERE NOT node in sourceNodes\n",
    "RETURN node.name AS streamer, score\n",
    "ORDER BY score DESC LIMIT 3\n",
    "\"\"\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "VlfPC50_0rBU"
   },
   "source": [
    "The OpenAI LLMs are not great at using the Graph Data Science library, as their knowledge cutoff is September 2021 and version 2 of the Graph Data Science library was released in April 2022. Therefore, we need to provide another example in the prompt to show the LLM how to use Personalized PageRank to give recommendations."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "id": "ZzgTn65jJXvF"
   },
   "outputs": [],
   "source": [
    "# https://github.com/hwchase17/langchain/blob/master/langchain/chains/graph_qa/prompts.py\n",
    "\n",
    "CYPHER_RECOMMENDATION_TEMPLATE = \"\"\"Task:Generate Cypher statement to query a graph database.\n",
    "Instructions:\n",
    "Use only the provided relationship types and properties in the schema.\n",
    "Do not use any other relationship types or properties that are not provided.\n",
    "Schema:\n",
    "{schema}\n",
    "Cypher examples:\n",
    "# How many streamers are from Norway?\n",
    "MATCH (s:Stream)-[:HAS_LANGUAGE]->(:Language {{name: 'no'}})\n",
    "RETURN count(s) AS streamers\n",
    "# Which streamers do you recommend if I like kimdoe?\n",
    "MATCH (s:Stream)\n",
    "WHERE s.name = \"kimdoe\"\n",
    "WITH collect(s) AS sourceNodes\n",
    "CALL gds.pageRank.stream(\"shared-audience\", \n",
    "  {{sourceNodes:sourceNodes, relationshipTypes:['SHARED_AUDIENCE'], \n",
    "    nodeLabels:['Stream']}})\n",
    "YIELD nodeId, score\n",
    "WITH gds.util.asNode(nodeId) AS node, score\n",
    "WHERE NOT node in sourceNodes\n",
    "RETURN node.name AS streamer, score\n",
    "ORDER BY score DESC LIMIT 3\n",
    "\n",
    "Note: Do not include any explanations or apologies in your responses.\n",
    "Do not respond to any questions that might ask anything else than for you to construct a Cypher statement.\n",
    "Do not include any text except the generated Cypher statement.\n",
    "\n",
    "The question is:\n",
    "{question}\"\"\"\n",
    "CYPHER_RECOMMENDATION_PROMPT = PromptTemplate(\n",
    "    input_variables=[\"schema\", \"question\"], template=CYPHER_RECOMMENDATION_TEMPLATE\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "KaSQI5me0s5f"
   },
   "source": [
    "We can now test the Personalized PageRank recommendations."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/",
     "height": 405
    },
    "id": "r14bQBlaL03n",
    "outputId": "25bef31a-21b2-44e6-b09a-0e60ea97bec6"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\n",
      "\u001b[1m> Entering new GraphCypherQAChain chain...\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `run` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n",
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Generated Cypher:\n",
      "\u001b[32;1m\u001b[1;3mMATCH (s:Stream)\n",
      "WHERE s.name = \"pokimane\"\n",
      "WITH collect(s) AS sourceNodes\n",
      "CALL gds.pageRank.stream(\"shared-audience\", \n",
      "  {sourceNodes:sourceNodes, relationshipTypes:['SHARED_AUDIENCE'], \n",
      "    nodeLabels:['Stream']})\n",
      "YIELD nodeId, score\n",
      "WITH gds.util.asNode(nodeId) AS node, score\n",
      "WHERE NOT node in sourceNodes\n",
      "RETURN node.name AS streamer, score\n",
      "ORDER BY score DESC LIMIT 3\u001b[0m\n",
      "Full Context:\n",
      "\u001b[32;1m\u001b[1;3m[{'streamer': 'xchocobars', 'score': 0.2343657053097286}, {'streamer': 'ariasaki', 'score': 0.06485239618458194}, {'streamer': 'natsumiii', 'score': 0.059693694865124915}]\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\u001b[1m> Finished chain.\u001b[0m\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{'query': '\\nWhich streamers do you recommend if I like pokimane?\\n',\n",
       " 'result': \"Based on the scores, I would recommend you to check out 'xchocobars' as she has the highest score. You might also enjoy watching 'ariasaki' and 'natsumiii'.\"}"
      ]
     },
     "execution_count": 12,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "chain_recommendation_example = GraphCypherQAChain.from_llm(\n",
    "    ChatOpenAI(temperature=0, model_name='gpt-4'), graph=graph, verbose=True,\n",
    "    cypher_prompt=CYPHER_RECOMMENDATION_PROMPT, \n",
    ")\n",
    "\n",
    "chain_recommendation_example.invoke({'query':\"\"\"\n",
    "Which streamers do you recommend if I like pokimane?\n",
    "\"\"\"})"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "F6BOSdmG0u8W"
   },
   "source": [
    "Unfortunately, we have to use the gpt-4 model here, as gpt-3.5-turbo is stubborn and refuses to imitate the complex Personalized PageRank example.\n",
    "\n",
    "We can also test whether the gpt-4 model will generalize the Personalized PageRank recommendation to other use cases."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/",
     "height": 278
    },
    "id": "ZvWNlpFIMJwj",
    "outputId": "15f6edae-2938-4ae2-aa9c-d1ad3d78ddbb"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\n",
      "\u001b[1m> Entering new GraphCypherQAChain chain...\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `run` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n",
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Generated Cypher:\n",
      "\u001b[32;1m\u001b[1;3mMATCH (g:Game {name: 'Chess'})<-[:PLAYS]-(s:Stream)\n",
      "RETURN s.name AS streamer\n",
      "ORDER BY s.followers DESC LIMIT 3\u001b[0m\n",
      "Full Context:\n",
      "\u001b[32;1m\u001b[1;3m[{'streamer': 'gmhikaru'}, {'streamer': 'thisisnotgeorgenotfound'}, {'streamer': 'gothamchess'}]\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\u001b[1m> Finished chain.\u001b[0m\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{'query': '\\nWhich streamers do you recommend to watch if I like Chess games?\\n',\n",
       " 'result': \"I recommend watching gmhikaru, thisisnotgeorgenotfound, and gothamchess if you're interested in Chess games.\"}"
      ]
     },
     "execution_count": 13,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "chain_recommendation_example.invoke({'query':\"\"\"\n",
    "Which streamers do you recommend to watch if I like Chess games?\n",
    "\"\"\"})"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "72O2RJqR0xtE"
   },
   "source": [
    "The LLM took a more straightforward route to provide recommendations and simply returned the three chess streamers with the highest follower count. We can't really blame it for choosing this option.\n",
    "\n",
    "However, LLMs are quite good at following hints:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/",
     "height": 368
    },
    "id": "ys23tsgfMYRp",
    "outputId": "672cd445-cd6e-464d-cb4e-557a81f638b5"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\n",
      "\u001b[1m> Entering new GraphCypherQAChain chain...\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `run` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n",
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Generated Cypher:\n",
      "\u001b[32;1m\u001b[1;3mMATCH (s:Stream)-[:PLAYS]->(:Game {name: 'Chess'})\n",
      "WITH collect(s) AS sourceNodes\n",
      "CALL gds.pageRank.stream(\"shared-audience\", \n",
      "  {sourceNodes:sourceNodes, relationshipTypes:['SHARED_AUDIENCE'], \n",
      "    nodeLabels:['Stream']})\n",
      "YIELD nodeId, score\n",
      "WITH gds.util.asNode(nodeId) AS node, score\n",
      "RETURN node.name AS streamer, score\n",
      "ORDER BY score DESC LIMIT 3\u001b[0m\n",
      "Full Context:\n",
      "\u001b[32;1m\u001b[1;3m[{'streamer': 'segonaye', 'score': 1.1104145011001378}, {'streamer': 'dafatw01', 'score': 0.9785136791483199}, {'streamer': 'chessbrah', 'score': 0.9612404689154856}]\u001b[0m\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/tomazbratanic/anaconda3/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:189: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
      "  warn_deprecated(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "\u001b[1m> Finished chain.\u001b[0m\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{'query': '\\nWhich streamers do you recommend to watch if I like Chess games?\\nUse Personalized PageRank to provide recommendations.\\nDo not exclude sourceNodes in the answer\\n',\n",
       " 'result': \"Based on the Personalized PageRank scores, I would recommend watching the following streamers for Chess games: 'segonaye' with a score of 1.11, 'dafatw01' with a score of 0.978, and 'chessbrah' with a score of 0.961. These scores suggest that these streamers are highly relevant to your interest in Chess games.\"}"
      ]
     },
     "execution_count": 14,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "chain_recommendation_example.invoke({'query':\"\"\"\n",
    "Which streamers do you recommend to watch if I like Chess games?\n",
    "Use Personalized PageRank to provide recommendations.\n",
    "Do not exclude sourceNodes in the answer\n",
    "\"\"\"})"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "vD3m_ovB01_3"
   },
   "source": [
    "## Summary\n",
    "In this blog post, we expanded on using knowledge graphs in LangChain applications, focusing on improving prompts to generate better Cypher statements. The main opportunity to improve Cypher generation accuracy is to use the few-shot capabilities of LLMs, offering example Cypher statements that dictate the kind of statements the LLM should produce. Sometimes the LLM doesn't correctly guess property values, while other times it doesn't generate the Cypher statements we would like. Additionally, we looked at how graph algorithms like Personalized PageRank can be used in LLM applications to provide better and more relevant answers."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "id": "zJvtuXlvuaon"
   },
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "colab": {
   "authorship_tag": "ABX9TyM4xOyaU8TfeD/8PDfwIi4J",
   "include_colab_link": true,
   "provenance": []
  },
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.5"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
