{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "bFj2ZAU6BtiR"
   },
   "source": [
    "<h1 align=\"center\">\n",
    "  <a href=\"https://portkey.ai\">\n",
    "    <img width=\"300\" src=\"https://analyticsindiamag.com/wp-content/uploads/2023/08/Logo-on-white-background.png\" alt=\"portkey\">\n",
    "  </a>\n",
    "</h1>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "5Mfpp2FLAYEA"
   },
   "source": [
    "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1SiyWV8ER-Gp2GEkMr9aA3KhebdeEJHBK?usp=sharing)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "ekkvItFsWyQL"
   },
   "source": [
    "# Portkey + nCompass Technologies"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "x8RrtT5yYEev"
   },
   "source": [
     "[Portkey](https://app.portkey.ai/) is the Control Panel for AI apps. With its popular AI Gateway and Observability Suite, hundreds of teams ship reliable, cost-efficient, and fast apps.\n",
    "\n",
     "With Portkey, you can:\n",
     "\n",
     " - Connect to 150+ models through a unified API,\n",
     " - View 40+ metrics & logs for all requests,\n",
     " - Enable semantic caching to reduce latency & costs,\n",
     " - Implement automatic retries & fallbacks for failed requests,\n",
     " - Add custom tags to requests for better tracking and analysis, and more.\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "n1kPDnimZIXb"
   },
   "source": [
    "## Quickstart\n",
    "\n",
     "Since Portkey is fully compatible with the OpenAI signature, you can connect to the Portkey AI Gateway through the OpenAI client.\n",
    "\n",
     "- Set the `base_url` to `PORTKEY_GATEWAY_URL`\n",
     "- Add `default_headers` built with the `createHeaders` helper method to supply the headers Portkey needs."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "rQSrrUBwkmOP"
   },
   "source": [
    "You will need Portkey and nCompass API keys to run this notebook.\n",
    "\n",
    "- Sign up for Portkey and generate your API key [here](https://app.portkey.ai/).\n",
    "- Get your nCompass key [here](https://app.ncompass.tech/api-settings).\n",
     "- Check out the full list of models available through nCompass' API [here](https://ncompass.tech/models).\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "executionInfo": {
     "elapsed": 20046,
     "status": "ok",
     "timestamp": 1714220024728,
     "user": {
      "displayName": "Satvik Paramkusham",
      "userId": "09992778153373457651"
     },
     "user_tz": -330
    },
    "id": "fffx7Tc2ghTR",
    "outputId": "09ca4f38-4ad3-415d-ac40-94b92377230f"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m62.3/62.3 kB\u001b[0m \u001b[31m1.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
      "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m311.6/311.6 kB\u001b[0m \u001b[31m9.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
      "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m75.6/75.6 kB\u001b[0m \u001b[31m7.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
      "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m12.7/12.7 MB\u001b[0m \u001b[31m35.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
      "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m77.9/77.9 kB\u001b[0m \u001b[31m5.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
      "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m58.3/58.3 kB\u001b[0m \u001b[31m2.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
      "\u001b[?25h"
     ]
    }
   ],
   "source": [
    "!pip install -qU portkey-ai openai"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "ptP4L78HlBUL"
   },
   "source": [
     "## With the OpenAI Client"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "executionInfo": {
     "elapsed": 11755,
     "status": "ok",
     "timestamp": 1714220347821,
     "user": {
      "displayName": "Satvik Paramkusham",
      "userId": "09992778153373457651"
     },
     "user_tz": -330
    },
    "id": "Yz7e9rokcCj0",
    "outputId": "23c67962-4166-4bb1-cd37-9ca81495a9aa"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Nice to meet you! I'm LLaMA, a large language model trained by a team of researcher at Meta AI. My primary function is to generate human-like responses to a wide range of questions and topics, from science and history to entertainment and culture.\n",
      "\n",
      "I'm not a human, but rather an artificial intelligence designed to simulate conversation and answer questions to the best of my knowledge. I've been trained on a massive dataset of text from the internet and can respond in multiple languages.\n",
      "\n",
      "I can help with things like:\n",
      "\n",
      "* Answering questions on a variety of topics\n",
      "* Generating text on a given topic or subject\n",
      "* Translating text from one language to another\n",
      "* Summarizing long pieces of text into shorter, more digestible versions\n",
      "* Offering suggestions or ideas for creative projects\n",
      "* Even just having a conversation and chatting about your day or interests!\n",
      "\n",
      "So, what's on your mind? Want to chat about something specific or just see where the conversation takes us?\n"
     ]
    }
   ],
   "source": [
    "from openai import OpenAI\n",
    "from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders\n",
    "from google.colab import userdata\n",
    "\n",
    "client = OpenAI(\n",
     "    api_key=userdata.get('NCOMPASS_API_KEY'),  # your nCompass API key, stored as a Colab secret\n",
    "    base_url=PORTKEY_GATEWAY_URL,\n",
    "    default_headers=createHeaders(\n",
    "        provider=\"ncompass\",\n",
     "        api_key=userdata.get('PORTKEY_API_KEY'),  # your Portkey API key, stored as a Colab secret\n",
    "    )\n",
    ")\n",
    "\n",
    "chat_complete = client.chat.completions.create(\n",
    "    model=\"meta-llama/Llama-3.3-70B-Instruct\",\n",
    "    messages=[{\"role\": \"user\",\n",
    "               \"content\": \"Who are you?\"}],\n",
    ")\n",
    "\n",
    "print(chat_complete.choices[0].message.content)"
   ]
  },
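  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Building on the client above, here is a minimal sketch of adding automatic retries and a model fallback through a gateway config. The config schema (`strategy`, `targets`, `retry`) and the `config` parameter of `createHeaders` are assumptions based on Portkey's gateway configs, and the `gpt-4o-mini` fallback target is purely illustrative; check Portkey's config documentation for the exact shape."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# A sketch (assumed Portkey config schema): try nCompass first,\n",
    "# fall back to an illustrative OpenAI model, retry up to 3 times.\n",
    "portkey_config = {\n",
    "    \"strategy\": {\"mode\": \"fallback\"},\n",
    "    \"targets\": [\n",
    "        {\"provider\": \"ncompass\",\n",
    "         \"override_params\": {\"model\": \"meta-llama/Llama-3.3-70B-Instruct\"}},\n",
    "        {\"provider\": \"openai\",\n",
    "         \"override_params\": {\"model\": \"gpt-4o-mini\"}},  # illustrative fallback\n",
    "    ],\n",
    "    \"retry\": {\"attempts\": 3},\n",
    "}\n",
    "\n",
    "resilient_client = OpenAI(\n",
    "    api_key=userdata.get('NCOMPASS_API_KEY'),\n",
    "    base_url=PORTKEY_GATEWAY_URL,\n",
    "    default_headers=createHeaders(\n",
    "        api_key=userdata.get('PORTKEY_API_KEY'),\n",
    "        config=portkey_config,\n",
    "    ),\n",
    ")"
   ]
  },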
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "wMhOOkaLqkSp"
   },
   "source": [
    "## Observability with Portkey"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "LzECvXQaqr6Z"
   },
   "source": [
     "By routing requests through Portkey, you can track metrics such as token usage, latency, and cost for every request.\n",
    "\n",
    "Here's a screenshot of the dashboard you get with Portkey!"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "W80YWspqr7s0"
   },
   "source": [
    "![Screenshot 2024-04-10 at 4.32.34 PM.png]()\n"
   ]
  },
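  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To make individual requests easier to find in these views, you can tag them. A minimal sketch, assuming the `trace_id` and `metadata` parameters of `createHeaders`; the tag values below are placeholders."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# A sketch: attach a trace id and custom metadata so this request can\n",
    "# be filtered in Portkey's logs. `trace_id` and `metadata` are assumed\n",
    "# parameters of createHeaders; the tag values are placeholders.\n",
    "tagged_client = OpenAI(\n",
    "    api_key=userdata.get('NCOMPASS_API_KEY'),\n",
    "    base_url=PORTKEY_GATEWAY_URL,\n",
    "    default_headers=createHeaders(\n",
    "        provider=\"ncompass\",\n",
    "        api_key=userdata.get('PORTKEY_API_KEY'),\n",
    "        trace_id=\"ncompass-demo\",\n",
    "        metadata={\"_user\": \"notebook-user\", \"env\": \"dev\"},\n",
    "    ),\n",
    ")"
   ]
  },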
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "id": "yqrIH5mY01BI"
   },
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "colab": {
   "provenance": []
  },
  "kernelspec": {
   "display_name": "Python 3",
   "name": "python3"
  },
  "language_info": {
   "name": "python"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 0
}
