{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "9fd54a32",
   "metadata": {},
   "source": [
    "<a href=\"https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/fireworks_cookbook.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "9e3a8796-edc8-43f2-94ad-fe4fb20d70ed",
   "metadata": {},
   "source": [
    "# Fireworks Function Calling Cookbook\n",
    "\n",
    "Fireworks.ai supports function calling for its LLMs, similar to OpenAI. This lets users directly describe the set of tools/functions available and have the model dynamically pick the right function calls to invoke, without complex prompting on the user's part.\n",
    "\n",
    "Since our Fireworks LLM directly subclasses OpenAI, we can use our existing abstractions with Fireworks.\n",
    "\n",
    "We show this on three levels: directly on the model API, as part of a Pydantic Program (structured output extraction), and as part of an agent."
   ]
  },
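  {
   "cell_type": "markdown",
   "id": "a7c31f02",
   "metadata": {},
   "source": [
    "Under the hood, each tool is described to the model as an OpenAI-style JSON schema. Below is a minimal, hand-written sketch of the kind of definition that helpers such as `to_openai_tool` generate from a Pydantic model (the exact generated output may differ slightly):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "a7c31f03",
   "metadata": {},
   "outputs": [],
   "source": [
    "import json\n",
    "\n",
    "# Hand-written OpenAI-style tool definition (a sketch; real helpers\n",
    "# derive this JSON schema from the Pydantic model automatically)\n",
    "song_tool = {\n",
    "    \"type\": \"function\",\n",
    "    \"function\": {\n",
    "        \"name\": \"Song\",\n",
    "        \"description\": \"A song with name and artist\",\n",
    "        \"parameters\": {\n",
    "            \"type\": \"object\",\n",
    "            \"properties\": {\n",
    "                \"name\": {\n",
    "                    \"type\": \"string\",\n",
    "                    \"description\": \"The name of the song\",\n",
    "                },\n",
    "                \"artist\": {\n",
    "                    \"type\": \"string\",\n",
    "                    \"description\": \"The artist who performed the song\",\n",
    "                },\n",
    "            },\n",
    "            \"required\": [\"name\", \"artist\"],\n",
    "        },\n",
    "    },\n",
    "}\n",
    "print(json.dumps(song_tool, indent=2))"
   ]
  },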
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3f6f8702",
   "metadata": {},
   "outputs": [],
   "source": [
    "%pip install llama-index-llms-fireworks"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "83ea30ee",
   "metadata": {},
   "outputs": [],
   "source": [
    "%pip install llama-index"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b070abb8-fa3f-4892-b23e-3ae91d0bf340",
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "\n",
    "os.environ[\"FIREWORKS_API_KEY\"] = \"fw_3ZkvBpQyjRzbicpihhrihaEP\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5497a17f-1099-4baf-884a-3620705be350",
   "metadata": {},
   "outputs": [],
   "source": [
    "from llama_index.llms.fireworks import Fireworks\n",
    "\n",
    "## define fireworks model, for a list of function calling models see: https://app.fireworks.ai/models/?filter=LLM&functionCalling=true\n",
    "llm = Fireworks(\n",
    "    model=\"accounts/fireworks/models/deepseek-v3p1-terminus\", temperature=0\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b007403c-6b7a-420c-92f1-4171d05ed9bb",
   "metadata": {},
   "source": [
    "## Function Calling on the LLM Module\n",
    "\n",
    "You can directly input function calls on the LLM module."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "015c2d39",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "ChatCompletion(id='07921e74-5dca-409c-a4d3-1a2e0c7cd1e7', choices=[Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content='```json\\n{\\n  \"name\": \"Halo\",\\n  \"artist\": \"Beyoncé\"\\n}\\n```', refusal=None, role='assistant', annotations=None, audio=None, function_call=None, tool_calls=None))], created=1761704700, model='accounts/fireworks/models/kimi-k2-instruct-0905', object='chat.completion', service_tier=None, system_fingerprint=None, usage=CompletionUsage(completion_tokens=25, prompt_tokens=145, total_tokens=170, completion_tokens_details=None, prompt_tokens_details=PromptTokensDetails(audio_tokens=None, cached_tokens=0)))\n"
     ]
    }
   ],
   "source": [
    "import os\n",
    "import json\n",
    "from openai import OpenAI\n",
    "from pydantic import BaseModel, Field\n",
    "from llama_index.llms.openai.utils import to_openai_tool\n",
    "\n",
    "\n",
    "class Song(BaseModel):\n",
    "    \"\"\"A song with name and artist\"\"\"\n",
    "\n",
    "    name: str = Field(description=\"The name of the song\")\n",
    "    artist: str = Field(description=\"The artist who performed the song\")\n",
    "\n",
    "\n",
    "song_fn = to_openai_tool(Song)\n",
    "\n",
    "# Initialize Fireworks client\n",
    "client = OpenAI(\n",
    "    api_key=os.environ.get(\"FIREWORKS_API_KEY\"),\n",
    "    base_url=\"https://api.fireworks.ai/inference/v1\",\n",
    ")\n",
    "\n",
    "response = client.chat.completions.create(\n",
    "    model=\"accounts/fireworks/models/kimi-k2-instruct-0905\",\n",
    "    messages=[{\"role\": \"user\", \"content\": \"Generate a song from Beyonce\"}],\n",
    "    tools=[song_fn],\n",
    "    temperature=0.1,\n",
    ")\n",
    "\n",
    "print(response)\n",
    "\n",
    "if response.choices[0].message.tool_calls:\n",
    "    tool_call = response.choices[0].message.tool_calls[0]\n",
    "    print(f\"\\nTool called: {tool_call.function.name}\")\n",
    "\n",
    "    # Parse the arguments to get structured output\n",
    "    arguments = json.loads(tool_call.function.arguments)\n",
    "    print(f\"Arguments: {arguments}\")\n",
    "\n",
    "    # Create Song instance from the structured output\n",
    "    song = Song(**arguments)\n",
    "    print(f\"\\nExtracted Song:\")\n",
    "    print(f\"Name: {song.name}\")\n",
    "    print(f\"Artist: {song.artist}\")"
   ]
  }
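  ,
  {
   "cell_type": "markdown",
   "id": "b9d42e10",
   "metadata": {},
   "source": [
    "Some models return the JSON inline in `message.content` (often wrapped in a fenced code block) instead of populating `tool_calls`, as in the output above. A small fallback parser, sketched here with only the standard library, still recovers the structured output in that case:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b9d42e11",
   "metadata": {},
   "outputs": [],
   "source": [
    "import json\n",
    "import re\n",
    "\n",
    "\n",
    "def extract_json_payload(content: str) -> dict:\n",
    "    \"\"\"Extract the JSON object from a chat reply, ignoring any code fence.\"\"\"\n",
    "    match = re.search(r\"\\{.*\\}\", content, re.DOTALL)\n",
    "    if match is None:\n",
    "        raise ValueError(\"no JSON object found in reply\")\n",
    "    return json.loads(match.group(0))\n",
    "\n",
    "\n",
    "# A reply shaped like the output above\n",
    "reply = '```json\\n{\\n  \"name\": \"Halo\",\\n  \"artist\": \"Beyoncé\"\\n}\\n```'\n",
    "print(extract_json_payload(reply))"
   ]
  }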
 ],
 "metadata": {
  "colab": {
   "provenance": []
  },
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
