{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "0",
   "metadata": {},
   "source": [
    "# V1/V2 Client Compatibility Demonstration\n",
    "\n",
    "This notebook demonstrates that AG2's V2 client architecture (ModelClientV2) is fully compatible with V1 clients (ModelClient). Multiple client versions can work together seamlessly in the same group chat by specifying different `llm_config` settings.\n",
    "\n",
    "## Key Concept: Client Version Determined by `llm_config`\n",
    "\n",
    "The client version is controlled by the `api_type` field in `llm_config`:\n",
    "\n",
    "- **V2 Client**: `\"api_type\": \"openai_v2\"` - Returns rich `UnifiedResponse` with typed content blocks\n",
    "- **V1 Clients**: `\"api_type\": \"google\"` or default OpenAI config - Uses legacy response format\n",
    "\n",
    "This example demonstrates a group chat with mixed client versions working together."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "1",
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "\n",
    "# Configure LLM to use V2 client\n",
    "llm_config = {\n",
    "    \"config_list\": [\n",
    "        {\n",
    "            \"api_type\": \"openai_v2\",  # <-- Key: use V2 client architecture\n",
    "            \"model\": \"gpt-4o-mini\",\n",
    "            \"api_key\": os.getenv(\"OPENAI_API_KEY\"),\n",
    "        }\n",
    "    ],\n",
    "    \"temperature\": 0.3,\n",
    "}\n",
    "\n",
    "llm_config_v1_gemini = {\n",
    "    \"config_list\": [\n",
    "        {\n",
    "            \"api_type\": \"google\",  # <-- Key: use V1 client architecture\n",
    "            \"model\": \"gemini-2.5-flash\",\n",
    "            \"api_key\": os.getenv(\"GEMINI_API_KEY\"),\n",
    "        }\n",
    "    ],\n",
    "    \"temperature\": 0.3,\n",
    "}\n",
    "\n",
    "llm_config_v1_oai = {\n",
    "    \"config_list\": [\n",
    "        {\n",
    "            \"model\": \"gpt-4o-mini\",\n",
    "            \"api_key\": os.getenv(\"OPENAI_API_KEY\"),\n",
    "        }\n",
    "    ],\n",
    "    \"temperature\": 0.3,\n",
    "}"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "2",
   "metadata": {},
   "source": [
    "## Configuration: Three LLM Configs with Different Client Versions\n",
    "\n",
    "Above, we configured three different `llm_config` settings demonstrating the V1 and V2 client versions:\n",
    "\n",
    "1. **`llm_config`** - V2 OpenAI client using `\"api_type\": \"openai_v2\"`\n",
    "2. **`llm_config_v1_gemini`** - V1 Gemini client using `\"api_type\": \"google\"`\n",
    "3. **`llm_config_v1_oai`** - V1 OpenAI client using default configuration (no explicit `api_type`)"
   ]
  },
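  {
   "cell_type": "markdown",
   "id": "2a",
   "metadata": {},
   "source": [
    "As a quick sanity check, we can classify each config entry by its `api_type` field. The helper below is purely illustrative (it is not part of the AG2 API): an `api_type` ending in `_v2` selects the V2 client, while `\"google\"` or a missing `api_type` falls back to a V1 client."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "2b",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative helper (not part of the AG2 API): infer the client\n",
    "# generation from the api_type field of a single config entry.\n",
    "def client_generation(config_entry: dict) -> str:\n",
    "    api_type = config_entry.get(\"api_type\", \"openai\")\n",
    "    return \"V2\" if api_type.endswith(\"_v2\") else \"V1\"\n",
    "\n",
    "print(client_generation({\"api_type\": \"openai_v2\", \"model\": \"gpt-4o-mini\"}))  # V2\n",
    "print(client_generation({\"api_type\": \"google\", \"model\": \"gemini-2.5-flash\"}))  # V1\n",
    "print(client_generation({\"model\": \"gpt-4o-mini\"}))  # V1 (default OpenAI)"
   ]
  },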
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Group chat among agents to create a 4th-grade lesson plan.\n",
    "# The speaking order is chosen automatically by the GroupChatManager and\n",
    "# should follow Teacher > Planner > Reviewer > Teacher (repeating if necessary).\n",
    "\n",
    "# Import the agent and group chat classes\n",
    "from autogen import ConversableAgent, GroupChat, GroupChatManager\n",
    "\n",
    "# The three llm_config dictionaries defined above are reused here, mixing\n",
    "# V1 and V2 clients within a single group chat.\n",
    "\n",
    "# Planner agent setup\n",
    "planner_message = \"Create lesson plans for 4th grade. Use format: <title>, <learning_objectives>, <script>\"\n",
    "planner = ConversableAgent(\n",
    "    name=\"planner_agent\", llm_config=llm_config, system_message=planner_message, description=\"Creates lesson plans\"\n",
    ")\n",
    "\n",
    "# Reviewer agent setup\n",
    "reviewer_message = \"Review lesson plans against 4th grade curriculum. Provide max 3 changes.\"\n",
    "reviewer = ConversableAgent(\n",
    "    name=\"reviewer_agent\",\n",
    "    llm_config=llm_config_v1_gemini,\n",
    "    system_message=reviewer_message,\n",
    "    description=\"Reviews lesson plans\",\n",
    ")\n",
    "\n",
    "# Teacher agent setup\n",
    "teacher_message = \"Choose topics and work with planner and reviewer. Say DONE! when finished.\"\n",
    "teacher = ConversableAgent(\n",
    "    name=\"teacher_agent\",\n",
    "    llm_config=llm_config_v1_oai,\n",
    "    system_message=teacher_message,\n",
    ")\n",
    "\n",
    "# Setup group chat\n",
    "groupchat = GroupChat(agents=[teacher, planner, reviewer], speaker_selection_method=\"auto\", messages=[])\n",
    "\n",
    "# Create manager\n",
    "# At each turn, the manager will check if the message contains DONE! and end the chat if so\n",
    "# Otherwise, it will select the next appropriate agent using its LLM\n",
    "manager = GroupChatManager(\n",
    "    name=\"group_manager\",\n",
    "    groupchat=groupchat,\n",
    "    llm_config=llm_config,\n",
    "    is_termination_msg=lambda x: \"DONE!\" in (x.get(\"content\", \"\") or \"\").upper(),\n",
    ")"
   ]
  },
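  {
   "cell_type": "markdown",
   "id": "3a",
   "metadata": {},
   "source": [
    "The termination predicate passed to the manager above can be exercised on its own with plain message dicts. Regardless of which client version produced a reply, AG2 hands the check a message dict, so a single predicate covers all three agents:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3b",
   "metadata": {},
   "outputs": [],
   "source": [
    "# The same check the manager applies at each turn: a case-insensitive\n",
    "# search for DONE!, tolerating messages whose content is None.\n",
    "def is_done(msg: dict) -> bool:\n",
    "    return \"DONE!\" in (msg.get(\"content\", \"\") or \"\").upper()\n",
    "\n",
    "print(is_done({\"content\": \"Lesson plan approved. DONE!\"}))  # True\n",
    "print(is_done({\"content\": \"Still revising the script.\"}))  # False\n",
    "print(is_done({\"content\": None}))  # False (e.g. tool-call messages)"
   ]
  },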
  {
   "cell_type": "markdown",
   "id": "4",
   "metadata": {},
   "source": [
    "## Group Chat with Mixed V1/V2 Clients\n",
    "\n",
    "Now we'll create a group chat with agents using different client versions:\n",
    "\n",
    "- **Planner Agent**: Uses V2 OpenAI client (`llm_config`)\n",
    "- **Reviewer Agent**: Uses V1 Gemini client (`llm_config_v1_gemini`)\n",
    "- **Teacher Agent**: Uses V1 OpenAI client (`llm_config_v1_oai`)\n",
    "- **Group Manager**: Uses V2 OpenAI client (`llm_config`)\n",
    "\n",
    "This demonstrates that V1 and V2 clients can work together seamlessly in the same conversation."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Start the conversation\n",
    "chat_result = teacher.initiate_chat(recipient=manager, message=\"Let's teach the kids about the solar system.\")\n",
    "\n",
    "# Print each message with its speaker so the transcript is easy to read\n",
    "for message in chat_result.chat_history:\n",
    "    print(f\"{message.get('name', message.get('role', '?'))}:\\n{message.get('content', '')}\\n\")"
   ]
  }
 ],
 "metadata": {
  "front_matter": {
   "description": "Client V1/V2 Compatibility",
   "tags": [
    "openai",
    "clients"
   ]
  },
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
