{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "8e1ec46b",
   "metadata": {},
   "source": [
    "# Using multiple Kafka clusters"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "dcfccd0f",
   "metadata": {},
   "outputs": [],
   "source": [
    "# | hide\n",
    "\n",
    "import platform\n",
    "import pytest\n",
    "from IPython.display import Markdown as md\n",
    "\n",
    "from pydantic import BaseModel, Field\n",
    "\n",
    "from fastkafka import FastKafka\n",
    "from fastkafka.testing import Tester, ApacheKafkaBroker, run_script_and_cancel"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "61526c5c",
   "metadata": {},
   "source": [
    "This guide shows you how to connect a single FastKafka application to multiple Kafka clusters, consuming from and producing to topics on each of them. \n",
    "Let's dive in and take your Kafka-powered app to the next level!"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "099c41ef",
   "metadata": {},
   "source": [
    "### Test message\n",
    "\n",
    "To showcase the functionalities of FastKafka and illustrate the concepts discussed, we can use a simple test message called `TestMsg`. Here's the definition of the `TestMsg` class:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c4828bd1",
   "metadata": {},
   "outputs": [],
   "source": [
    "class TestMsg(BaseModel):\n",
    "    msg: str = Field(...)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "79d89a52",
   "metadata": {},
   "source": [
    "## Defining multiple broker configurations\n",
    "\n",
    "When building a FastKafka application, you may need to consume messages from multiple Kafka clusters, each with its own broker configuration. FastKafka provides the flexibility to target a different cluster per endpoint using the `brokers` argument in the `@app.consumes` decorator. Let's explore an example code snippet:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "53fb0f9b",
   "metadata": {},
   "outputs": [],
   "source": [
    "from pydantic import BaseModel, Field\n",
    "\n",
    "from fastkafka import FastKafka\n",
    "\n",
    "\n",
    "class TestMsg(BaseModel):\n",
    "    msg: str = Field(...)\n",
    "\n",
    "\n",
    "kafka_brokers_1 = dict(\n",
    "    development=dict(url=\"dev.server_1\", port=9092),\n",
    "    production=dict(url=\"prod.server_1\", port=9092),\n",
    ")\n",
    "kafka_brokers_2 = dict(\n",
    "    development=dict(url=\"dev.server_2\", port=9092),\n",
    "    production=dict(url=\"prod.server_1\", port=9092),\n",
    ")\n",
    "\n",
    "app = FastKafka(kafka_brokers=kafka_brokers_1, bootstrap_servers_id=\"development\")\n",
    "\n",
    "\n",
    "@app.consumes(topic=\"preprocessed_signals\")\n",
    "async def on_preprocessed_signals_1(msg: TestMsg):\n",
    "    print(f\"Received on s1: {msg=}\")\n",
    "    await to_predictions_1(msg)\n",
    "\n",
    "\n",
    "@app.consumes(topic=\"preprocessed_signals\", brokers=kafka_brokers_2)\n",
    "async def on_preprocessed_signals_2(msg: TestMsg):\n",
    "    print(f\"Received on s2: {msg=}\")\n",
    "    await to_predictions_2(msg)\n",
    "\n",
    "\n",
    "@app.produces(topic=\"predictions\")\n",
    "async def to_predictions_1(msg: TestMsg) -> TestMsg:\n",
    "    return msg\n",
    "\n",
    "\n",
    "@app.produces(topic=\"predictions\", brokers=kafka_brokers_2)\n",
    "async def to_predictions_2(msg: TestMsg) -> TestMsg:\n",
    "    return msg"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a8fed8cc",
   "metadata": {},
   "source": [
    "In this example, the application has two consumes endpoints, both of which consume events from the `preprocessed_signals` topic: `on_preprocessed_signals_1` consumes events using the `kafka_brokers_1` configuration, while `on_preprocessed_signals_2` consumes events using the `kafka_brokers_2` configuration.\n",
    "When producing, `to_predictions_1` produces to the `predictions` topic on the `kafka_brokers_1` cluster, while `to_predictions_2` produces to the `predictions` topic on the `kafka_brokers_2` cluster.\n",
    "\n",
    "\n",
    "#### How it works\n",
    "\n",
    "The `kafka_brokers_1` configuration represents the primary cluster, while `kafka_brokers_2` serves as an alternative cluster specified in the decorator.\n",
    "\n",
    "Using the `FastKafka` class, the `app` object is initialized with the primary broker configuration (`kafka_brokers_1`). By default, an `@app.consumes` decorator without the `brokers` argument consumes messages from the `preprocessed_signals` topic on `kafka_brokers_1`.\n",
    "\n",
    "To consume messages from a different cluster, the `@app.consumes` decorator includes the `brokers` argument. This allows explicit specification of the broker cluster in the `on_preprocessed_signals_2` function, enabling consumption from the same topic but using the `kafka_brokers_2` configuration.\n",
    "\n",
    "The `brokers` argument can also be used in the `@app.produces` decorator to define multiple broker clusters for message production.\n",
    "\n",
    "It's important to ensure that all broker configurations have the same required settings as the primary cluster to ensure consistent behavior."
   ]
  },
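  {
   "cell_type": "markdown",
   "id": "0f3a9c2e",
   "metadata": {},
   "source": [
    "To make the configuration concrete: each entry in a `kafka_brokers` dictionary resolves to a `host:port` bootstrap server string, which is what shows up in the consumer logs later in this guide (e.g. `'bootstrap_servers': 'dev.server_1:9092'`). The helper below is a hypothetical sketch of that mapping, not part of FastKafka's API:\n",
    "\n",
    "```python\n",
    "# Hypothetical helper: resolve one entry of a kafka_brokers dict into the\n",
    "# \"host:port\" string that a Kafka client uses as its bootstrap server.\n",
    "def bootstrap_servers(kafka_brokers: dict, broker_id: str) -> str:\n",
    "    entry = kafka_brokers[broker_id]\n",
    "    return f\"{entry['url']}:{entry['port']}\"\n",
    "\n",
    "\n",
    "kafka_brokers_1 = dict(\n",
    "    development=dict(url=\"dev.server_1\", port=9092),\n",
    "    production=dict(url=\"prod.server_1\", port=9092),\n",
    ")\n",
    "\n",
    "print(bootstrap_servers(kafka_brokers_1, \"development\"))  # dev.server_1:9092\n",
    "```"
   ]
  },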
  {
   "cell_type": "markdown",
   "id": "3c8d8b00",
   "metadata": {},
   "source": [
    "## Testing the application\n",
    "\n",
    "To test our FastKafka 'mirroring' application, we can use our testing framework. Let's take a look at how it's done:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "843030d2",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "23-06-23 12:15:51.156 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker._patch_consumers_and_producers(): Patching consumers and producers!\n",
      "23-06-23 12:15:51.157 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker starting\n",
      "23-06-23 12:15:51.157 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'dev.server_1:9092'}'\n",
      "23-06-23 12:15:51.158 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:15:51.158 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'dev.server_2:9092'}'\n",
      "23-06-23 12:15:51.159 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:15:51.178 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'dev.server_1:9092'}'\n",
      "23-06-23 12:15:51.178 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:15:51.179 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'dev.server_2:9092'}'\n",
      "23-06-23 12:15:51.180 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:15:51.180 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:15:51.180 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'dev.server_1:9092'}\n",
      "23-06-23 12:15:51.181 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:15:51.181 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:15:51.182 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:15:51.182 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['preprocessed_signals']\n",
      "23-06-23 12:15:51.182 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "23-06-23 12:15:51.186 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:15:51.187 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'dev.server_2:9092'}\n",
      "23-06-23 12:15:51.187 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:15:51.188 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:15:51.188 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:15:51.189 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['preprocessed_signals']\n",
      "23-06-23 12:15:51.189 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "23-06-23 12:15:51.189 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:15:51.190 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'dev.server_1:9092'}\n",
      "23-06-23 12:15:51.190 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:15:51.190 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:15:51.191 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:15:51.191 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['predictions']\n",
      "23-06-23 12:15:51.191 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "23-06-23 12:15:51.192 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:15:51.192 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'dev.server_2:9092'}\n",
      "23-06-23 12:15:51.193 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:15:51.193 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:15:51.193 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:15:51.194 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['predictions']\n",
      "23-06-23 12:15:51.194 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "Received on s1: msg=TestMsg(msg='signal_s1')\n",
      "Received on s2: msg=TestMsg(msg='signal_s2')\n",
      "23-06-23 12:15:56.181 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:15:56.181 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:15:56.182 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:15:56.182 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:15:56.182 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:15:56.183 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:15:56.183 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:15:56.183 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:15:56.184 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:15:56.184 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:15:56.185 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:15:56.185 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:15:56.185 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:15:56.186 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:15:56.186 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:15:56.186 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:15:56.188 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker stopping\n"
     ]
    }
   ],
   "source": [
    "from fastkafka.testing import Tester\n",
    "\n",
    "async with Tester(app) as tester:\n",
    "    # Send TestMsg to topic/broker pair on_preprocessed_signals_1 is consuming from\n",
    "    await tester.mirrors[app.on_preprocessed_signals_1](TestMsg(msg=\"signal_s1\"))\n",
    "    # Assert on_preprocessed_signals_1 consumed sent message\n",
    "    await app.awaited_mocks.on_preprocessed_signals_1.assert_called_with(\n",
    "        TestMsg(msg=\"signal_s1\"), timeout=5\n",
    "    )\n",
    "    # Assert app has produced a prediction\n",
    "    await tester.mirrors[app.to_predictions_1].assert_called_with(\n",
    "        TestMsg(msg=\"signal_s1\"), timeout=5\n",
    "    )\n",
    "\n",
    "    # Send TestMsg to topic/broker pair on_preprocessed_signals_2 is consuming from\n",
    "    await tester.mirrors[app.on_preprocessed_signals_2](TestMsg(msg=\"signal_s2\"))\n",
    "    # Assert on_preprocessed_signals_2 consumed sent message\n",
    "    await app.awaited_mocks.on_preprocessed_signals_2.assert_called_with(\n",
    "        TestMsg(msg=\"signal_s2\"), timeout=5\n",
    "    )\n",
    "    # Assert app has produced a prediction\n",
    "    await tester.mirrors[app.to_predictions_2].assert_called_with(\n",
    "        TestMsg(msg=\"signal_s2\"), timeout=5\n",
    "    )"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "2ec0bd7f",
   "metadata": {},
   "source": [
    "The `tester.mirrors` dictionary allows specifying the desired topic/broker combination for sending the test messages, which is especially useful when working with multiple Kafka clusters. \n",
    "This ensures that the data is sent to the appropriate topic/broker based on the consuming function, and consumed from the appropriate topic/broker based on the producing function."
   ]
  },
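  {
   "cell_type": "markdown",
   "id": "b7d41a9f",
   "metadata": {},
   "source": [
    "Conceptually, `tester.mirrors` maps each application endpoint to a helper that operates on the same topic/broker pair from the opposite side: a producer mirror for a consuming endpoint, and a consumer mock for a producing endpoint. The toy sketch below (plain Python, not FastKafka's implementation) illustrates only that mapping idea:\n",
    "\n",
    "```python\n",
    "# Toy sketch of the \"mirror\" idea: each consuming endpoint gets a matching\n",
    "# function that sends to the same (topic, broker) pair, so a test can drive it.\n",
    "mirrors = {}\n",
    "sent_messages = []\n",
    "\n",
    "\n",
    "def register_consumer_mirror(endpoint_name, topic, broker):\n",
    "    def mirror(msg):\n",
    "        # The real framework would publish msg to (topic, broker);\n",
    "        # here we only record what would be sent.\n",
    "        sent_messages.append((topic, broker, msg))\n",
    "\n",
    "    mirrors[endpoint_name] = mirror\n",
    "\n",
    "\n",
    "register_consumer_mirror(\"on_preprocessed_signals_1\", \"preprocessed_signals\", \"kafka_brokers_1\")\n",
    "register_consumer_mirror(\"on_preprocessed_signals_2\", \"preprocessed_signals\", \"kafka_brokers_2\")\n",
    "\n",
    "mirrors[\"on_preprocessed_signals_1\"](\"signal_s1\")\n",
    "print(sent_messages)  # [('preprocessed_signals', 'kafka_brokers_1', 'signal_s1')]\n",
    "```"
   ]
  },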
  {
   "cell_type": "markdown",
   "id": "e397ac22",
   "metadata": {},
   "source": [
    "## Running the application\n",
    "\n",
    "You can run your application using the `fastkafka run` CLI command in the same way that you would run a single-cluster app.\n",
    "\n",
    "To start your app, copy the code above into `multi_cluster_example.py` and run it with:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "ca3c1c72",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/markdown": [
       "Now we can run the app. Copy the code above in multi_cluster_example.py, adjust your server configurations, and run it by running\n",
       "```shell\n",
       "fastkafka run --num-workers=1 --kafka-broker=development multi_cluster_example:app\n",
       "```"
      ],
      "text/plain": [
       "<IPython.core.display.Markdown object>"
      ]
     },
     "execution_count": null,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# | echo: false\n",
    "\n",
    "script_file = \"multi_cluster_example.py\"\n",
    "filename = script_file.split(\".py\")[0]\n",
    "cmd = f\"fastkafka run --num-workers=1 --kafka-broker=development {filename}:app\"\n",
    "md(\n",
    "    f\"Now we can run the app. Copy the code above in {script_file}, adjust your server configurations, and run it by running\\n```shell\\n{cmd}\\n```\"\n",
    ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "e34daab3",
   "metadata": {},
   "outputs": [],
   "source": [
    "# | hide\n",
    "\n",
    "multi_cluster_example = \"\"\"\n",
    "from pydantic import BaseModel, Field\n",
    "\n",
    "from fastkafka import FastKafka\n",
    "\n",
    "class TestMsg(BaseModel):\n",
    "    msg: str = Field(...)\n",
    "\n",
    "kafka_brokers_1 = dict(\n",
    "    development=dict(url=\"<url_of_your_kafka_bootstrap_server_1>\", port=<port_of_your_kafka_bootstrap_server_1>),\n",
    "    production=dict(url=\"prod.server_1\", port=9092),\n",
    ")\n",
    "kafka_brokers_2 = dict(\n",
    "    development=dict(url=\"<url_of_your_kafka_bootstrap_server_2>\", port=<port_of_your_kafka_bootstrap_server_2>),\n",
    "    production=dict(url=\"prod.server_1\", port=9092),\n",
    ")\n",
    "\n",
    "app = FastKafka(kafka_brokers=kafka_brokers_1)\n",
    "\n",
    "\n",
    "@app.consumes(topic=\"preprocessed_signals\")\n",
    "async def on_preprocessed_signals_1(msg: TestMsg):\n",
    "    print(f\"Received on s1: {msg=}\")\n",
    "\n",
    "\n",
    "@app.consumes(topic=\"preprocessed_signals\", brokers=kafka_brokers_2)\n",
    "async def on_preprocessed_signals_2(msg: TestMsg):\n",
    "    print(f\"Received on s2: {msg=}\")\n",
    "\"\"\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "9192d56a",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "23-06-23 12:16:04.473 [INFO] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.start(): entering...\n",
      "23-06-23 12:16:04.475 [WARNING] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.start(): (<_UnixSelectorEventLoop running=True closed=False debug=False>) is already running!\n",
      "23-06-23 12:16:04.475 [WARNING] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.start(): calling nest_asyncio.apply()\n",
      "23-06-23 12:16:04.476 [INFO] fastkafka._components.test_dependencies: Java is already installed.\n",
      "23-06-23 12:16:04.477 [INFO] fastkafka._components.test_dependencies: But not exported to PATH, exporting...\n",
      "23-06-23 12:16:04.706 [INFO] fastkafka._components.test_dependencies: Kafka is installed.\n",
      "23-06-23 12:16:04.707 [INFO] fastkafka._components.test_dependencies: But not exported to PATH, exporting...\n",
      "23-06-23 12:16:04.708 [INFO] fastkafka._testing.apache_kafka_broker: Starting zookeeper...\n",
      "23-06-23 12:16:05.426 [INFO] fastkafka._testing.apache_kafka_broker: Starting kafka...\n",
      "23-06-23 12:16:07.330 [INFO] fastkafka._testing.apache_kafka_broker: Local Kafka broker up and running on 127.0.0.1:24092\n",
      "23-06-23 12:16:08.909 [INFO] fastkafka._testing.apache_kafka_broker: <class 'fastkafka.testing.ApacheKafkaBroker'>.start(): returning 127.0.0.1:24092\n",
      "23-06-23 12:16:08.910 [INFO] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.start(): exited.\n",
      "23-06-23 12:16:08.910 [INFO] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.start(): entering...\n",
      "23-06-23 12:16:08.911 [WARNING] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.start(): (<_UnixSelectorEventLoop running=True closed=False debug=False>) is already running!\n",
      "23-06-23 12:16:08.911 [WARNING] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.start(): calling nest_asyncio.apply()\n",
      "23-06-23 12:16:08.912 [INFO] fastkafka._components.test_dependencies: Java is already installed.\n",
      "23-06-23 12:16:09.025 [INFO] fastkafka._components.test_dependencies: Kafka is installed.\n",
      "23-06-23 12:16:09.026 [INFO] fastkafka._testing.apache_kafka_broker: Starting zookeeper...\n",
      "Port 2181 is already in use\n",
      "23-06-23 12:16:09.027 [INFO] fastkafka._testing.apache_kafka_broker: zookeeper startup failed, generating a new port and retrying...\n",
      "23-06-23 12:16:09.027 [INFO] fastkafka._testing.apache_kafka_broker: zookeeper new port=42347\n",
      "23-06-23 12:16:09.723 [INFO] fastkafka._testing.apache_kafka_broker: Starting kafka...\n",
      "23-06-23 12:16:11.649 [INFO] fastkafka._testing.apache_kafka_broker: Local Kafka broker up and running on 127.0.0.1:24093\n",
      "23-06-23 12:16:13.244 [INFO] fastkafka._testing.apache_kafka_broker: <class 'fastkafka.testing.ApacheKafkaBroker'>.start(): returning 127.0.0.1:24093\n",
      "23-06-23 12:16:13.245 [INFO] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.start(): exited.\n",
      "23-06-23 12:16:19.562 [INFO] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.stop(): entering...\n",
      "23-06-23 12:16:19.563 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Terminating the process 181770...\n",
      "23-06-23 12:16:21.150 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Process 181770 terminated.\n",
      "23-06-23 12:16:21.151 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Terminating the process 181382...\n",
      "23-06-23 12:16:22.486 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Process 181382 terminated.\n",
      "23-06-23 12:16:22.488 [INFO] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.stop(): exited.\n",
      "23-06-23 12:16:22.489 [INFO] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.stop(): entering...\n",
      "23-06-23 12:16:22.489 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Terminating the process 180408...\n",
      "23-06-23 12:16:24.073 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Process 180408 terminated.\n",
      "23-06-23 12:16:24.073 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Terminating the process 180019...\n",
      "23-06-23 12:16:25.408 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Process 180019 terminated.\n",
      "23-06-23 12:16:25.410 [INFO] fastkafka._testing.apache_kafka_broker: ApacheKafkaBroker.stop(): exited.\n"
     ]
    }
   ],
   "source": [
    "# | hide\n",
    "\n",
    "with ApacheKafkaBroker(\n",
    "    topics=[\"preprocessed_signals\"], apply_nest_asyncio=True, listener_port=24092\n",
    ") as bootstrap_server_1, ApacheKafkaBroker(\n",
    "    topics=[\"preprocessed_signals\"], apply_nest_asyncio=True, listener_port=24093\n",
    ") as bootstrap_server_2:\n",
    "    server_url_1 = bootstrap_server_1.split(\":\")[0]\n",
    "    server_port_1 = bootstrap_server_1.split(\":\")[1]\n",
    "    server_url_2 = bootstrap_server_2.split(\":\")[0]\n",
    "    server_port_2 = bootstrap_server_2.split(\":\")[1]\n",
    "    exit_code, output = await run_script_and_cancel(\n",
    "        script=multi_cluster_example.replace(\n",
    "            \"<url_of_your_kafka_bootstrap_server_1>\", server_url_1\n",
    "        )\n",
    "        .replace(\"<port_of_your_kafka_bootstrap_server_1>\", server_port_1)\n",
    "        .replace(\"<url_of_your_kafka_bootstrap_server_2>\", server_url_2)\n",
    "        .replace(\"<port_of_your_kafka_bootstrap_server_2>\", server_port_2),\n",
    "        script_file=script_file,\n",
    "        cmd=cmd,\n",
    "        cancel_after=5,\n",
    "    )\n",
    "\n",
    "    expected_returncode = [0, 1]\n",
    "    assert exit_code in expected_returncode, output.decode(\"UTF-8\")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "2b6055d0",
   "metadata": {},
   "source": [
    "In your app logs, you should see your app starting up and your two consumer functions connecting to the two different Kafka clusters."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "dacc3dd5",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[182747]: 23-06-23 12:16:14.092 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "[182747]: 23-06-23 12:16:14.092 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': '127.0.0.1:24092'}\n",
      "[182747]: 23-06-23 12:16:14.092 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "[182747]: 23-06-23 12:16:14.092 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': '127.0.0.1:24093'}\n",
      "[182747]: 23-06-23 12:16:14.131 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "[182747]: 23-06-23 12:16:14.131 [INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'preprocessed_signals'})\n",
      "[182747]: 23-06-23 12:16:14.131 [INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'preprocessed_signals'}\n",
      "[182747]: 23-06-23 12:16:14.131 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "[182747]: 23-06-23 12:16:14.136 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "[182747]: 23-06-23 12:16:14.136 [INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'preprocessed_signals'})\n",
      "[182747]: 23-06-23 12:16:14.136 [INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'preprocessed_signals'}\n",
      "[182747]: 23-06-23 12:16:14.136 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "[182747]: 23-06-23 12:16:14.141 [INFO] aiokafka.consumer.group_coordinator: Metadata for topic has changed from {} to {'preprocessed_signals': 1}. \n",
      "[182747]: 23-06-23 12:16:14.141 [INFO] aiokafka.consumer.group_coordinator: Metadata for topic has changed from {} to {'preprocessed_signals': 1}. \n",
      "Starting process cleanup, this may take a few seconds...\n",
      "23-06-23 12:16:18.294 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Terminating the process 182747...\n",
      "[182747]: 23-06-23 12:16:19.380 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "[182747]: 23-06-23 12:16:19.380 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "[182747]: 23-06-23 12:16:19.380 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "[182747]: 23-06-23 12:16:19.380 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:16:19.471 [INFO] fastkafka._components._subprocess: terminate_asyncio_process(): Process 182747 terminated.\n",
      "\n"
     ]
    }
   ],
   "source": [
    "# | echo: false\n",
    "\n",
    "print(output.decode(\"UTF-8\"))"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "770d36aa",
   "metadata": {},
   "source": [
    "## Application documentation\n",
    "\n",
    "At the moment, documentation for multi-cluster apps is not yet implemented, but it is under development and you can expect it soon!"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "fd591d33",
   "metadata": {},
   "source": [
    "## Examples on how to use multiple broker configurations"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1825a024",
   "metadata": {},
   "source": [
    "### Example #1\n",
    "\n",
    "In this section, we'll explore how you can effectively forward topics between different Kafka clusters, enabling seamless data synchronization for your applications.\n",
    "\n",
    "Imagine having two Kafka clusters, namely `kafka_brokers_1` and `kafka_brokers_2`, each hosting its own set of topics and messages. Now, if you want to forward a specific topic (in this case: `preprocessed_signals`) from `kafka_brokers_1` to `kafka_brokers_2`, FastKafka provides an elegant solution.\n",
    "\n",
    "Let's examine the code snippet that configures our application for topic forwarding:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "f8b84d48",
   "metadata": {},
   "outputs": [],
   "source": [
    "from pydantic import BaseModel, Field\n",
    "\n",
    "from fastkafka import FastKafka\n",
    "\n",
    "class TestMsg(BaseModel):\n",
    "    msg: str = Field(...)\n",
    "\n",
    "kafka_brokers_1 = dict(localhost=dict(url=\"server_1\", port=9092))\n",
    "kafka_brokers_2 = dict(localhost=dict(url=\"server_2\", port=9092))\n",
    "\n",
    "app = FastKafka(kafka_brokers=kafka_brokers_1)\n",
    "\n",
    "\n",
    "@app.consumes(topic=\"preprocessed_signals\")\n",
    "async def on_preprocessed_signals_original(msg: TestMsg):\n",
    "    await to_preprocessed_signals_forward(msg)\n",
    "\n",
    "\n",
    "@app.produces(topic=\"preprocessed_signals\", brokers=kafka_brokers_2)\n",
    "async def to_preprocessed_signals_forward(data: TestMsg) -> TestMsg:\n",
    "    return data"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "18ceaf23",
   "metadata": {},
   "source": [
    "Here's how it works: our FastKafka application is configured to consume messages from `kafka_brokers_1` and process them in the `on_preprocessed_signals_original` function. To forward these messages to `kafka_brokers_2`, we define the `to_preprocessed_signals_forward` function as a producer, seamlessly producing the processed messages to the `preprocessed_signals` topic within the `kafka_brokers_2` cluster."
   ]
  },
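  {
   "cell_type": "markdown",
   "id": "c2e85f10",
   "metadata": {},
   "source": [
    "Assuming the forwarding code above is saved in a file named `forwarding_example.py` (a hypothetical filename for this sketch), the app can be started with the same `fastkafka run` command as before, selecting the `localhost` broker configuration defined in the example:\n",
    "\n",
    "```shell\n",
    "fastkafka run --num-workers=1 --kafka-broker=localhost forwarding_example:app\n",
    "```"
   ]
  },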
  {
   "cell_type": "markdown",
   "id": "2e71340e",
   "metadata": {},
   "source": [
    "#### Testing"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "3adc619a",
   "metadata": {},
   "source": [
    "To test our FastKafka forwarding application, we can use our testing framework. Let's take a look at the testing code snippet:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5b6868e9",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "23-06-23 12:16:31.689 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker._patch_consumers_and_producers(): Patching consumers and producers!\n",
      "23-06-23 12:16:31.690 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker starting\n",
      "23-06-23 12:16:31.691 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'server_2:9092'}'\n",
      "23-06-23 12:16:31.691 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:16:31.701 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'server_1:9092'}'\n",
      "23-06-23 12:16:31.702 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:16:31.702 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:16:31.703 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'server_1:9092'}\n",
      "23-06-23 12:16:31.703 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:16:31.704 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:16:31.704 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:16:31.704 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['preprocessed_signals']\n",
      "23-06-23 12:16:31.706 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "23-06-23 12:16:31.706 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:16:31.707 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'server_2:9092'}\n",
      "23-06-23 12:16:31.707 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:16:31.708 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:16:31.708 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:16:31.709 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['preprocessed_signals']\n",
      "23-06-23 12:16:31.709 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "23-06-23 12:16:35.703 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:16:35.703 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:16:35.704 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:16:35.704 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:16:35.705 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:16:35.705 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:16:35.706 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:16:35.707 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:16:35.707 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker stopping\n"
     ]
    }
   ],
   "source": [
    "from fastkafka.testing import Tester\n",
    "\n",
    "async with Tester(app) as tester:\n",
    "    await tester.mirrors[app.on_preprocessed_signals_original](TestMsg(msg=\"signal\"))\n",
    "    await tester.mirrors[app.to_preprocessed_signals_forward].assert_called(timeout=5)"
   ]
  },
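  {
   "cell_type": "markdown",
   "id": "ed-bootstrap-sketch",
   "metadata": {},
   "source": [
    "Notice in the log output above how each broker configuration ends up as a `bootstrap_servers` string such as `server_1:9092` or `server_2:9092`. Conceptually, the mapping just joins the url and port of every broker entry; the helper below is our own illustrative sketch of that flattening, not a FastKafka function:\n",
    "\n",
    "```python\n",
    "def to_bootstrap_servers(kafka_brokers: dict) -> str:\n",
    "    # Join every url:port pair in the broker config, comma-separated.\n",
    "    parts = []\n",
    "    for broker in kafka_brokers.values():\n",
    "        parts.append(str(broker['url']) + ':' + str(broker['port']))\n",
    "    return ','.join(parts)\n",
    "\n",
    "kafka_brokers_1 = dict(localhost=dict(url='server_1', port=9092))\n",
    "kafka_brokers_2 = dict(localhost=dict(url='server_2', port=9092))\n",
    "\n",
    "print(to_bootstrap_servers(kafka_brokers_1))  # server_1:9092\n",
    "print(to_bootstrap_servers(kafka_brokers_2))  # server_2:9092\n",
    "```"
   ]
  },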
  {
   "cell_type": "markdown",
   "id": "819d38c7",
   "metadata": {},
   "source": [
    "With the help of the **Tester** object, we can simulate and verify the behavior of our FastKafka application. Here's how it works:\n",
    "\n",
    "1. We create an instance of the **Tester** by passing in our *app* object, which represents our FastKafka application.\n",
    "\n",
    "2. Using the **tester.mirrors** dictionary, we can send a message to a specific Kafka broker and topic combination. In this case, we use `tester.mirrors[app.on_preprocessed_signals_original]` to send a TestMsg message with the content \"signal\" to the appropriate Kafka broker and topic.\n",
    "\n",
    "3. After sending the message, we can perform assertions on the mirrored function using `tester.mirrors[app.to_preprocessed_signals_forward].assert_called(timeout=5)`. This assertion ensures that the mirrored function has been called within a specified timeout period (in this case, 5 seconds)."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "3237efbe",
   "metadata": {},
   "source": [
    "### Example #2\n",
    "\n",
    "In this section, we'll explore how you can effortlessly consume data from multiple sources, process it, and aggregate the results into a single topic on a specific cluster.\n",
    "\n",
    "Imagine you have two Kafka clusters: **kafka_brokers_1** and **kafka_brokers_2**, each hosting its own set of topics and messages. Now, what if you want to consume data from both clusters, perform some processing, and produce the results to a single topic on **kafka_brokers_1**? FastKafka has got you covered!\n",
    "\n",
    "Let's take a look at the code snippet that configures our application for aggregating multiple clusters:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "a38fc478",
   "metadata": {},
   "outputs": [],
   "source": [
    "from pydantic import BaseModel, Field\n",
    "\n",
    "from fastkafka import FastKafka\n",
    "\n",
    "class TestMsg(BaseModel):\n",
    "    msg: str = Field(...)\n",
    "\n",
    "kafka_brokers_1 = dict(localhost=dict(url=\"server_1\", port=9092))\n",
    "kafka_brokers_2 = dict(localhost=dict(url=\"server_2\", port=9092))\n",
    "\n",
    "app = FastKafka(kafka_brokers=kafka_brokers_1)\n",
    "\n",
    "\n",
    "@app.consumes(topic=\"preprocessed_signals\")\n",
    "async def on_preprocessed_signals_1(msg: TestMsg):\n",
    "    print(f\"Default: {msg=}\")\n",
    "    await to_predictions(msg)\n",
    "\n",
    "\n",
    "@app.consumes(topic=\"preprocessed_signals\", brokers=kafka_brokers_2)\n",
    "async def on_preprocessed_signals_2(msg: TestMsg):\n",
    "    print(f\"Specified: {msg=}\")\n",
    "    await to_predictions(msg)\n",
    "\n",
    "\n",
    "@app.produces(topic=\"predictions\")\n",
    "async def to_predictions(prediction: TestMsg) -> TestMsg:\n",
    "    print(f\"Sending prediction: {prediction}\")\n",
    "    return [prediction]"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "95a1642f",
   "metadata": {},
   "source": [
    "Here's the idea: our FastKafka application is set to consume messages from the topic \"preprocessed_signals\" on **kafka_brokers_1** cluster, as well as from the same topic on **kafka_brokers_2** cluster. We have two consuming functions, `on_preprocessed_signals_1` and `on_preprocessed_signals_2`, that handle the messages from their respective clusters. These functions perform any required processing, in this case, just calling the to_predictions function.\n",
    "\n",
    "The exciting part is that the to_predictions function acts as a producer, sending the processed results to the \"predictions\" topic on **kafka_brokers_1 cluster**. By doing so, we effectively aggregate the data from multiple sources into a single topic on a specific cluster.\n",
    "\n",
    "This approach enables you to consume data from multiple Kafka clusters, process it, and produce the aggregated results to a designated topic. Whether you're generating predictions, performing aggregations, or any other form of data processing, FastKafka empowers you to harness the full potential of multiple clusters."
   ]
  },
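  {
   "cell_type": "markdown",
   "id": "ed-fanin-sketch",
   "metadata": {},
   "source": [
    "Reduced to its essence, this aggregation pattern is a fan-in: two independent consumers feed one shared producer target. A minimal plain-Python sketch of that shape, with `asyncio.Queue` objects standing in for the topics (all names are illustrative, not part of the FastKafka API):\n",
    "\n",
    "```python\n",
    "import asyncio\n",
    "\n",
    "async def consume_into(source: asyncio.Queue, predictions: asyncio.Queue) -> None:\n",
    "    # One consumer per source cluster; both feed the same predictions topic.\n",
    "    while True:\n",
    "        msg = await source.get()\n",
    "        if msg is None:  # sentinel: stop this consumer\n",
    "            break\n",
    "        await predictions.put(msg)\n",
    "\n",
    "async def main() -> list:\n",
    "    cluster_1 = asyncio.Queue()    # preprocessed_signals on kafka_brokers_1\n",
    "    cluster_2 = asyncio.Queue()    # preprocessed_signals on kafka_brokers_2\n",
    "    predictions = asyncio.Queue()  # predictions on kafka_brokers_1\n",
    "    tasks = [\n",
    "        asyncio.create_task(consume_into(cluster_1, predictions)),\n",
    "        asyncio.create_task(consume_into(cluster_2, predictions)),\n",
    "    ]\n",
    "    await cluster_1.put('signal-a')\n",
    "    await cluster_2.put('signal-b')\n",
    "    await cluster_1.put(None)\n",
    "    await cluster_2.put(None)\n",
    "    await asyncio.gather(*tasks)\n",
    "    # Drain the aggregate topic; sort because task interleaving is unordered.\n",
    "    return sorted(predictions.get_nowait() for _ in range(predictions.qsize()))\n",
    "\n",
    "print(asyncio.run(main()))  # ['signal-a', 'signal-b']\n",
    "```"
   ]
  },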
  {
   "cell_type": "markdown",
   "id": "d80755a2",
   "metadata": {},
   "source": [
    "#### Testing\n",
    "\n",
    "Let's take a look at the testing code snippet:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "aadbdd9e",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "23-06-23 12:16:41.222 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker._patch_consumers_and_producers(): Patching consumers and producers!\n",
      "23-06-23 12:16:41.223 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker starting\n",
      "23-06-23 12:16:41.224 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'server_1:9092'}'\n",
      "23-06-23 12:16:41.224 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:16:41.239 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'server_1:9092'}'\n",
      "23-06-23 12:16:41.239 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:16:41.240 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'server_2:9092'}'\n",
      "23-06-23 12:16:41.240 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:16:41.241 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:16:41.241 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'server_1:9092'}\n",
      "23-06-23 12:16:41.241 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:16:41.242 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:16:41.242 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:16:41.242 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['preprocessed_signals']\n",
      "23-06-23 12:16:41.243 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "23-06-23 12:16:41.243 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:16:41.245 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'server_2:9092'}\n",
      "23-06-23 12:16:41.245 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:16:41.245 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:16:41.246 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:16:41.246 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['preprocessed_signals']\n",
      "23-06-23 12:16:41.247 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "23-06-23 12:16:41.247 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:16:41.248 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'server_1:9092'}\n",
      "23-06-23 12:16:41.248 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:16:41.249 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:16:41.249 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:16:41.249 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['predictions']\n",
      "23-06-23 12:16:41.249 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "Default: msg=TestMsg(msg='signal')\n",
      "Sending prediction: msg='signal'\n",
      "Specified: msg=TestMsg(msg='signal')\n",
      "Sending prediction: msg='signal'\n",
      "23-06-23 12:16:45.241 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:16:45.242 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:16:45.242 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:16:45.242 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:16:45.243 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:16:45.243 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:16:45.244 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:16:45.245 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:16:45.245 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:16:45.245 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:16:45.246 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:16:45.246 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:16:45.247 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker stopping\n"
     ]
    }
   ],
   "source": [
    "from fastkafka.testing import Tester\n",
    "\n",
    "async with Tester(app) as tester:\n",
    "    await tester.mirrors[app.on_preprocessed_signals_1](TestMsg(msg=\"signal\"))\n",
    "    await tester.mirrors[app.on_preprocessed_signals_2](TestMsg(msg=\"signal\"))\n",
    "    await tester.on_predictions.assert_called(timeout=5)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "4ebba677",
   "metadata": {},
   "source": [
    "Here's how the code above works:\n",
    "\n",
    "1. Within an `async with` block, create an instance of the Tester by passing in your app object, representing your FastKafka application.\n",
    "\n",
    "2. Using the tester.mirrors dictionary, you can send messages to specific Kafka broker and topic combinations. In this case, we use `tester.mirrors[app.on_preprocessed_signals_1]` and `tester.mirrors[app.on_preprocessed_signals_2]` to send TestMsg messages with the content \"signal\" to the corresponding Kafka broker and topic combinations.\n",
    "\n",
    "3. After sending the messages, you can perform assertions on the **on_predictions** function using `tester.on_predictions.assert_called(timeout=5)`. This assertion ensures that the on_predictions function has been called within a specified timeout period (in this case, 5 seconds)."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "aaf82425",
   "metadata": {},
   "source": [
    "### Example #3\n",
    "\n",
    "In some scenarios, you may need to produce messages to multiple Kafka clusters simultaneously. FastKafka simplifies this process by allowing you to configure your application to produce messages to multiple clusters effortlessly. Let's explore how you can achieve this:\n",
    "\n",
    "Consider the following code snippet that demonstrates producing messages to multiple clusters:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "0e48106b",
   "metadata": {},
   "outputs": [],
   "source": [
    "from pydantic import BaseModel, Field\n",
    "\n",
    "from fastkafka import FastKafka\n",
    "\n",
    "class TestMsg(BaseModel):\n",
    "    msg: str = Field(...)\n",
    "\n",
    "kafka_brokers_1 = dict(localhost=dict(url=\"server_1\", port=9092))\n",
    "kafka_brokers_2 = dict(localhost=dict(url=\"server_2\", port=9092))\n",
    "\n",
    "app = FastKafka(kafka_brokers=kafka_brokers_1)\n",
    "\n",
    "\n",
    "@app.consumes(topic=\"preprocessed_signals\")\n",
    "async def on_preprocessed_signals(msg: TestMsg):\n",
    "    print(f\"{msg=}\")\n",
    "    await to_predictions_1(TestMsg(msg=\"prediction\"))\n",
    "    await to_predictions_2(TestMsg(msg=\"prediction\"))\n",
    "\n",
    "\n",
    "@app.produces(topic=\"predictions\")\n",
    "async def to_predictions_1(prediction: TestMsg) -> TestMsg:\n",
    "    print(f\"Sending prediction to s1: {prediction}\")\n",
    "    return [prediction]\n",
    "\n",
    "\n",
    "@app.produces(topic=\"predictions\", brokers=kafka_brokers_2)\n",
    "async def to_predictions_2(prediction: TestMsg) -> TestMsg:\n",
    "    print(f\"Sending prediction to s2: {prediction}\")\n",
    "    return [prediction]"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "dc670be1",
   "metadata": {},
   "source": [
    "Here's what you need to know about producing to multiple clusters:\n",
    "\n",
    "1. We define two Kafka broker configurations: **kafka_brokers_1** and **kafka_brokers_2**, representing different clusters with their respective connection details.\n",
    "\n",
    "2. We create an instance of the FastKafka application, specifying **kafka_brokers_1** as the primary cluster for producing messages.\n",
    "\n",
    "3. The `on_preprocessed_signals` function serves as a consumer, handling incoming messages from the \"preprocessed_signals\" topic. Within this function, we invoke two producer functions: `to_predictions_1` and `to_predictions_2`.\n",
    "\n",
    "4. The `to_predictions_1` function sends predictions to the \"predictions\" topic on *kafka_brokers_1* cluster.\n",
    "\n",
    "5. Additionally, the `to_predictions_2` function sends the same predictions to the \"predictions\" topic on *kafka_brokers_2* cluster. This allows for producing the same data to multiple clusters simultaneously.\n",
    "\n",
    "By utilizing this approach, you can seamlessly produce messages to multiple Kafka clusters, enabling you to distribute data across different environments or leverage the strengths of various clusters.\n",
    "\n",
    "Feel free to customize the producer functions as per your requirements, performing any necessary data transformations or enrichment before sending the predictions.\n",
    "\n",
    "With FastKafka, producing to multiple clusters becomes a breeze, empowering you to harness the capabilities of multiple environments effortlessly."
   ]
  },
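  {
   "cell_type": "markdown",
   "id": "ed-fanout-sketch",
   "metadata": {},
   "source": [
    "The mirror image of the previous example, this pattern is a fan-out: one consumer hands each message to several producer targets. A minimal plain-Python sketch, with `asyncio.Queue` objects standing in for the \"predictions\" topic on each cluster (all names are illustrative, not part of the FastKafka API):\n",
    "\n",
    "```python\n",
    "import asyncio\n",
    "\n",
    "async def on_preprocessed(msg: str, sinks: list) -> None:\n",
    "    # Mirror every consumed message to the predictions topic on each cluster.\n",
    "    for sink in sinks:\n",
    "        await sink.put(msg)\n",
    "\n",
    "async def main() -> tuple:\n",
    "    predictions_1 = asyncio.Queue()  # predictions on kafka_brokers_1\n",
    "    predictions_2 = asyncio.Queue()  # predictions on kafka_brokers_2\n",
    "    await on_preprocessed('prediction', [predictions_1, predictions_2])\n",
    "    return (predictions_1.get_nowait(), predictions_2.get_nowait())\n",
    "\n",
    "print(asyncio.run(main()))  # ('prediction', 'prediction')\n",
    "```"
   ]
  },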
  {
   "cell_type": "markdown",
   "id": "0c1caf66",
   "metadata": {},
   "source": [
    "#### Testing\n",
    "\n",
    "Let's take a look at the testing code snippet:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "66fdc528",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "23-06-23 12:16:49.903 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker._patch_consumers_and_producers(): Patching consumers and producers!\n",
      "23-06-23 12:16:49.904 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker starting\n",
      "23-06-23 12:16:49.904 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'server_1:9092'}'\n",
      "23-06-23 12:16:49.905 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:16:49.905 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'server_2:9092'}'\n",
      "23-06-23 12:16:49.906 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:16:49.921 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'server_1:9092'}'\n",
      "23-06-23 12:16:49.921 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()\n",
      "23-06-23 12:16:49.921 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:16:49.922 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'server_1:9092'}\n",
      "23-06-23 12:16:49.922 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:16:49.923 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:16:49.923 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:16:49.924 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['preprocessed_signals']\n",
      "23-06-23 12:16:49.924 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "23-06-23 12:16:49.924 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:16:49.925 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'server_1:9092'}\n",
      "23-06-23 12:16:49.925 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:16:49.926 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:16:49.926 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:16:49.926 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['predictions']\n",
      "23-06-23 12:16:49.927 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "23-06-23 12:16:49.927 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...\n",
      "23-06-23 12:16:49.928 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'auto_offset_reset': 'earliest', 'max_poll_records': 100, 'bootstrap_servers': 'server_2:9092'}\n",
      "23-06-23 12:16:49.928 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()\n",
      "23-06-23 12:16:49.928 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.\n",
      "23-06-23 12:16:49.929 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called\n",
      "23-06-23 12:16:49.929 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['predictions']\n",
      "23-06-23 12:16:49.929 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.\n",
      "msg=TestMsg(msg='signal')\n",
      "Sending prediction to s1: msg='prediction'\n",
      "Sending prediction to s2: msg='prediction'\n",
      "23-06-23 12:16:53.922 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:16:53.922 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:16:53.923 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:16:53.923 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:16:53.923 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:16:53.924 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:16:53.924 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:16:53.925 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called\n",
      "23-06-23 12:16:53.925 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.\n",
      "23-06-23 12:16:53.925 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.\n",
      "23-06-23 12:16:53.926 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:16:53.926 [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called\n",
      "23-06-23 12:16:53.926 [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker stopping\n"
     ]
    }
   ],
   "source": [
    "from fastkafka.testing import Tester\n",
    "\n",
    "async with Tester(app) as tester:\n",
    "    await tester.to_preprocessed_signals(TestMsg(msg=\"signal\"))\n",
    "    await tester.mirrors[to_predictions_1].assert_called(timeout=5)\n",
    "    await tester.mirrors[to_predictions_2].assert_called(timeout=5)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "99617426",
   "metadata": {},
   "source": [
    "Here's how you can perform the necessary tests:\n",
    "\n",
    "1. Within an async with block, create an instance of the **Tester** by passing in your app object, representing your FastKafka application.\n",
    "\n",
    "2. Using the `tester.to_preprocessed_signals` method, you can send a TestMsg message with the content \"signal\".\n",
    "\n",
    "3. After sending the message, you can perform assertions on the to_predictions_1 and to_predictions_2 functions using `tester.mirrors[to_predictions_1].assert_called(timeout=5)` and `tester.mirrors[to_predictions_2].assert_called(timeout=5)`. These assertions ensure that the respective producer functions have produced data to their respective topic/broker combinations.\n",
    "\n",
    "By employing this testing approach, you can verify that the producing functions correctly send messages to their respective clusters. The testing framework provided by FastKafka enables you to ensure the accuracy and reliability of your application's producing logic."
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "python3",
   "language": "python",
   "name": "python3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
