---
title: "Agents"
id: agents
slug: "/agents"
description: "This page explains how to create an AI agent in Haystack capable of retrieving information, generating responses, and taking actions using various Haystack components."
---

# Agents

This page explains how to create an AI agent in Haystack capable of retrieving information, generating responses, and taking actions using various Haystack components.

## What’s an AI Agent?

An AI agent is a system that can:

- Understand user input (text, image, audio, and other queries),
- Retrieve relevant information (documents or structured data),
- Generate intelligent responses (using LLMs like OpenAI or Hugging Face models),
- Perform actions (calling APIs, fetching live data, executing functions).

### Understanding AI Agents

AI agents are autonomous systems that use large language models (LLMs) to make decisions and solve complex tasks. They interact with their environment using tools, memory, and reasoning.

### What Makes an AI Agent

An AI agent is more than a chatbot. It actively plans, chooses the right tools and executes tasks to achieve a goal. Unlike traditional software, it adapts to new information and refines its process as needed.

1. **LLM as the Brain**: The agent's core is an LLM, which understands context, processes natural language and serves as the central intelligence system.
2. **Tools for Interaction**: Agents connect to external tools, APIs, and databases to gather information and take action.
3. **Memory for Context**: Short-term memory helps track conversations, while long-term memory stores knowledge for future interactions.
4. **Reasoning and Planning**: Agents break down complex problems, come up with step-by-step action plans, and adapt based on new data and feedback.

### How AI Agents Work

An AI agent starts with a prompt that defines its role and objectives. It decides when to use tools, gathers data, and refines its approach through loops of reasoning and action. It evaluates progress and adjusts its strategy to improve results.

For example, a customer service agent answers queries using a database. If it lacks an answer, it fetches real-time data, summarizes it, and provides a response. A coding assistant understands project requirements, suggests solutions, and even writes code.
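The reason-act loop described above can be sketched in plain Python. This is a conceptual illustration with a mocked LLM and a single mock tool, not Haystack code; all names here are invented:

```python
# Conceptual sketch of an agent's reason-act loop.
# `fake_llm` stands in for a real LLM; the tool is mocked.

def get_weather(city: str) -> str:
    """Mock tool that would normally call a weather API."""
    return f"Sunny, 22°C in {city}"

TOOLS = {"get_weather": get_weather}

def fake_llm(messages: list[dict]) -> dict:
    """Decide whether to call a tool or answer directly (mocked)."""
    last = messages[-1]
    if last["role"] == "user" and "weather" in last["content"].lower():
        return {"tool": "get_weather", "args": {"city": "Berlin"}}
    if last["role"] == "tool":
        return {"answer": f"Here is what I found: {last['content']}"}
    return {"answer": "I can help with weather questions."}

def run_agent(user_query: str) -> str:
    memory = [{"role": "user", "content": user_query}]  # short-term memory
    while True:
        decision = fake_llm(memory)                     # reasoning step
        if "tool" in decision:                          # action step
            result = TOOLS[decision["tool"]](**decision["args"])
            memory.append({"role": "tool", "content": result})
        else:
            return decision["answer"]

print(run_agent("How is the weather in Berlin?"))
# Here is what I found: Sunny, 22°C in Berlin
```

The examples later on this page show how Haystack's `Agent` and `ToolInvoker` implement this same loop with real LLMs and tools.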

## Key Components

### Agents

Haystack has a universal [Agent](../pipeline-components/agents-1/agent.mdx) component that interacts with chat-based LLMs and tools to solve complex queries. It requires a Chat Generator that supports tool calling and can be customized to fit your needs. Check out the [Agent](../pipeline-components/agents-1/agent.mdx) documentation, or the [example](#tool-calling-agent) below to see how it works.

### Additional Components

You can also build an AI agent in Haystack yourself, using three main elements in a pipeline:

- [Chat Generators](../pipeline-components/generators.mdx) to generate tool calls (with tool name and arguments) or assistant responses with an LLM,
- [`Tool`](../tools/tool.mdx) class that lets the LLM perform actions such as running a pipeline or calling an external API, connecting it to the outside world,
- [`ToolInvoker`](../pipeline-components/tools/toolinvoker.mdx) component to execute tool calls generated by an LLM. It parses the LLM's tool-calling responses and invokes the appropriate tool with the correct arguments.
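Conceptually, the tool-invoking step parses the tool calls an LLM emits and dispatches each one to the matching function. Here is a plain-Python sketch of that idea (the `invoke_tools` function and the reply format are simplified stand-ins for illustration, not Haystack's actual API):

```python
# Sketch of what a tool invoker does: parse an LLM's tool calls and
# execute the matching tools with the generated arguments.

def add(a: int, b: int) -> int:
    """Example tool."""
    return a + b

tools = {"add": add}

# Simplified shape of a tool-calling reply an LLM might produce.
llm_reply = {"tool_calls": [{"name": "add", "arguments": {"a": 2, "b": 3}}]}

def invoke_tools(reply: dict, registry: dict) -> list[dict]:
    results = []
    for call in reply["tool_calls"]:
        fn = registry[call["name"]]           # look up the tool by name
        output = fn(**call["arguments"])      # call it with the LLM's args
        results.append({"role": "tool", "name": call["name"],
                        "content": str(output)})
    return results

print(invoke_tools(llm_reply, tools))
```

The resulting tool messages are then fed back to the LLM so it can produce a final answer, as the pipeline example below demonstrates.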

There are three ways of creating a tool in Haystack, plus a container for grouping them:

- [`Tool`](../tools/tool.mdx) class – Creates a tool representation for a consistent tool-calling experience across all Generators. It allows the most customization, as you define the tool's name and description yourself.
- [`ComponentTool`](../tools/componenttool.mdx) class – Wraps a Haystack component as a callable tool.
- [`@tool`](../tools/tool.mdx#tool-decorator) decorator – Creates tools from Python functions and automatically uses their function name and docstring.
- [Toolset](../tools/toolset.mdx) – A container for grouping multiple tools that can be passed directly to Agents or Generators.
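To illustrate why the decorator approach can pick up a tool's name and description automatically, here is a plain-Python sketch of the pattern (`SimpleTool` and `as_tool` are invented stand-ins for illustration, not Haystack APIs):

```python
import inspect
from dataclasses import dataclass
from typing import Callable

# Simplified stand-in for a tool abstraction; for illustration only.
@dataclass
class SimpleTool:
    name: str
    description: str
    function: Callable

    def invoke(self, **kwargs):
        return self.function(**kwargs)

def as_tool(func: Callable) -> SimpleTool:
    """Mimic a @tool-style decorator: derive the tool's name and
    description from the function's name and docstring."""
    return SimpleTool(
        name=func.__name__,
        description=inspect.getdoc(func) or "",
        function=func,
    )

@as_tool
def get_weather(city: str) -> str:
    """Return the current weather for a city (mocked)."""
    return f"Sunny, 22°C in {city}"

print(get_weather.name)                   # get_weather
print(get_weather.invoke(city="Berlin"))  # Sunny, 22°C in Berlin
```

A `Tool`-class-style construction would instead pass the name and description explicitly, which is why it allows the most customization.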

## Example Agents

### Tool-Calling Agent

You can create a tool-calling agent with the `Agent` component:

```python
from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.websearch import SerperDevWebSearch
from haystack.dataclasses import Document, ChatMessage
from haystack.tools.component_tool import ComponentTool

from typing import List

# Create the web search component
web_search = SerperDevWebSearch(top_k=3)

# Wrap the web search component in a ComponentTool
web_tool = ComponentTool(
    component=web_search,
    name="web_search",
    description="Search the web for current information like weather, news, or facts."
)

# Create the agent with the web tool
tool_calling_agent = Agent(
    chat_generator=OpenAIChatGenerator(model="gpt-4o-mini"),
    system_prompt="""You're a helpful agent. When asked about current information like weather, news, or facts,
                     use the web_search tool to find the information and then summarize the findings.
                     When you get web search results, extract the relevant information and present it in a clear,
                     concise manner.""",
    tools=[web_tool]
)

# Run the agent with the user message
user_message = ChatMessage.from_user("How is the weather in Berlin?")
result = tool_calling_agent.run(messages=[user_message])

# Print the final response
print(result["messages"][-1].text)
```

Resulting in:

```
- **Morning**: 49°F
- **Afternoon**: 57°F
- **Evening**: 47°F
- **Overnight**: 39°F

For more details, you can check the full forecasts on [AccuWeather](https://www.accuweather.com/en/de/berlin/10178/current-weather/178087) or [Weather.com](https://weather.com/weather/today/l/5ca23443513a0fdc1d37ae2ffaf5586162c6fe592a66acc9320a0d0536be1bb9).
```

### Pipeline With Tools

Here’s an example of how you would build a tool-calling agent with the help of `ToolInvoker`.

This is what’s happening in this code example:

1. `OpenAIChatGenerator` uses an LLM to analyze the user's message and determines whether to provide an assistant response or initiate a tool call.
2. `ConditionalRouter` directs the output from the `OpenAIChatGenerator` to the `there_are_tool_calls` branch if the reply contains tool calls, or to `final_replies` to return the answer to the user directly.
3. `ToolInvoker` executes the tool call generated by the LLM. `ComponentTool` wraps the `SerperDevWebSearch` component that fetches real-time search results, making it accessible for `ToolInvoker` to execute it as a tool.
4. After the tool provides its output, the `ToolInvoker` sends this information back to the `OpenAIChatGenerator`, along with the original user question stored by the `MessageCollector`.

```python
from haystack import component, Pipeline
from haystack.components.tools import ToolInvoker
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.routers import ConditionalRouter
from haystack.components.websearch import SerperDevWebSearch
from haystack.core.component.types import Variadic
from haystack.dataclasses import ChatMessage
from haystack.tools import ComponentTool

from typing import Any, Dict, List

# Helper component to temporarily store the last user query before the tool call
@component
class MessageCollector:
    def __init__(self):
        self._messages = []

    @component.output_types(messages=List[ChatMessage])
    def run(self, messages: Variadic[List[ChatMessage]]) -> Dict[str, Any]:
        self._messages.extend([msg for inner in messages for msg in inner])
        return {"messages": self._messages}

    def clear(self):
        self._messages = []

# Create a tool from a component
web_tool = ComponentTool(
    component=SerperDevWebSearch(top_k=3)
)

# Define routing conditions
routes = [
    {
        "condition": "{{replies[0].tool_calls | length > 0}}",
        "output": "{{replies}}",
        "output_name": "there_are_tool_calls",
        "output_type": List[ChatMessage],
    },
    {
        "condition": "{{replies[0].tool_calls | length == 0}}",
        "output": "{{replies}}",
        "output_name": "final_replies",
        "output_type": List[ChatMessage],
    },
]

# Create the pipeline
tool_agent = Pipeline()
tool_agent.add_component("message_collector", MessageCollector())
tool_agent.add_component("generator", OpenAIChatGenerator(model="gpt-4o-mini", tools=[web_tool]))
tool_agent.add_component("router", ConditionalRouter(routes, unsafe=True))
tool_agent.add_component("tool_invoker", ToolInvoker(tools=[web_tool]))

tool_agent.connect("generator.replies", "router")
tool_agent.connect("router.there_are_tool_calls", "tool_invoker")
tool_agent.connect("router.there_are_tool_calls", "message_collector")
tool_agent.connect("tool_invoker.tool_messages", "message_collector")
tool_agent.connect("message_collector", "generator.messages")

messages = [
    ChatMessage.from_system("You're a helpful agent choosing the right tool when necessary"),
    ChatMessage.from_user("How is the weather in Berlin?")]
result = tool_agent.run({"messages": messages})

print(result["router"]["final_replies"][0].text)
```

Resulting in:

```
For more detailed weather updates, you can check the following links:
- [AccuWeather](https://www.accuweather.com/en/de/berlin/10178/weather-forecast/178087)
- [Weather.com](https://weather.com/weather/today/l/5ca23443513a0fdc1d37ae2ffaf5586162c6fe592a66acc9320a0d0536be1bb9)
```
