---
title: Tool Calling Agent (LangGraph)
emoji: ⚒️
colorFrom: indigo
colorTo: green
sdk: docker
app_file: main.py
pinned: false
---
# LangGraph Tool-Calling Agent
A research assistant powered by LangGraph and Chainlit that can search the web and query arXiv papers to answer questions.
## Features
- Multi-tool agent with web search capabilities (Tavily and DuckDuckGo); see the sketch after this list
- Academic research with arXiv integration
- Interactive Chainlit web interface
- Streaming responses with real-time tool usage visibility
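
As a rough sketch of how these features could fit together, the snippet below wires the Tavily, DuckDuckGo, and arXiv tools into a LangGraph tool-calling agent. The `build_graph` name, the model choice, and the tool parameters are illustrative assumptions rather than the actual contents of `tools.py` and `graph.py`; it assumes the `langchain-community`, `langchain-openai`, and `langgraph` packages plus `OPENAI_API_KEY` and `TAVILY_API_KEY` in the environment.

```python
# Illustrative sketch: three research tools bound to a LangGraph ReAct-style agent.
# Assumes OPENAI_API_KEY and TAVILY_API_KEY are set in the environment.
from langchain_community.tools import DuckDuckGoSearchRun
from langchain_community.tools.arxiv.tool import ArxivQueryRun
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


def build_graph():
    """Assemble the web-search and arXiv tools into a tool-calling agent."""
    tools = [
        TavilySearchResults(max_results=5),  # Tavily web search
        DuckDuckGoSearchRun(),               # DuckDuckGo web search (no API key needed)
        ArxivQueryRun(),                     # arXiv paper lookup
    ]
    model = ChatOpenAI(model="gpt-4o-mini", temperature=0)
    # create_react_agent returns a compiled graph that alternates between the
    # model and the tools until the model stops requesting tool calls.
    return create_react_agent(model, tools)


if __name__ == "__main__":
    graph = build_graph()
    result = graph.invoke(
        {"messages": [("user", "Find recent arXiv papers on tool-calling agents")]}
    )
    print(result["messages"][-1].content)
```

Because the graph loops model → tools → model, each intermediate tool call is emitted as a separate step, which is what the Chainlit UI can surface as real-time tool usage.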
## Local Development
- Clone the repository
- Create a virtual environment:
  ```bash
  python -m venv .venv
  ```
- Activate the virtual environment:
  - Windows:
    ```bash
    .venv\Scripts\activate
    ```
  - macOS/Linux:
    ```bash
    source .venv/bin/activate
    ```
- Install dependencies:
  ```bash
  pip install -r requirements.txt
  ```
- Create a `.env` file with your API keys:
  ```
  OPENAI_API_KEY=your_openai_key_here
  TAVILY_API_KEY=your_tavily_key_here
  LANGCHAIN_API_KEY=your_langchain_key_here
  LANGCHAIN_TRACING_V2=true
  LANGCHAIN_PROJECT=tool-calling-agent
  ```
- Run the application:
  ```bash
  chainlit run main.py
  ```
- Open your browser to http://localhost:8000 (Chainlit's default port)
## Deploying to Hugging Face Spaces
- Create a new Space on Hugging Face (https://huggingface.co/new-space)
- Choose "Docker" as the Space SDK
- Clone your Space repository
- Copy your project files to the cloned repository
- Add your API keys as repository secrets in the Space settings
- Push your changes to Hugging Face
- Your app will build and deploy automatically
## Environment Variables for Hugging Face
Make sure to add these environment variables in your Hugging Face Space settings (one way they might be loaded is sketched after this list):
- `OPENAI_API_KEY`
- `TAVILY_API_KEY`
- `LANGCHAIN_API_KEY`
- `LANGCHAIN_TRACING_V2`
- `LANGCHAIN_PROJECT`
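
How the app reads these variables is determined by `config.py`; the snippet below is only one plausible approach, using `python-dotenv` locally and plain environment variables on Spaces, where secrets are injected automatically. Apart from the variable names listed above, everything here (the `Settings` dataclass, its field names) is an illustrative assumption.

```python
# Hypothetical config.py-style settings loader: reads the variables listed
# above from the environment (or from a local .env file during development).
import os
from dataclasses import dataclass

from dotenv import load_dotenv

load_dotenv()  # harmless on Spaces, where secrets already arrive as env vars


@dataclass(frozen=True)
class Settings:
    openai_api_key: str = os.getenv("OPENAI_API_KEY", "")
    tavily_api_key: str = os.getenv("TAVILY_API_KEY", "")
    langchain_api_key: str = os.getenv("LANGCHAIN_API_KEY", "")
    langchain_tracing_v2: bool = os.getenv("LANGCHAIN_TRACING_V2", "false").lower() == "true"
    langchain_project: str = os.getenv("LANGCHAIN_PROJECT", "tool-calling-agent")


settings = Settings()
```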
## Project Structure
- `main.py` - Entry point and Chainlit handler (see the sketch below)
- `config.py` - Configuration management
- `tools.py` - Tool definitions and setup
- `graph.py` - LangGraph agent implementation
- `chainlit.yaml` - Chainlit configuration
- `Dockerfile` - Container definition for deployment
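
To give a feel for the entry point, here is a minimal sketch of a Chainlit handler that streams tokens from the agent as they arrive. The `build_graph` import refers to the illustrative factory sketched under Features and may not match the real `graph.py` interface.

```python
# Hypothetical main.py: Chainlit entry point that streams the agent's answer.
import chainlit as cl

from graph import build_graph  # illustrative factory; the actual graph.py may differ


@cl.on_chat_start
async def on_chat_start():
    # Build one agent graph per chat session.
    cl.user_session.set("graph", build_graph())


@cl.on_message
async def on_message(message: cl.Message):
    graph = cl.user_session.get("graph")
    reply = cl.Message(content="")

    # Stream LangGraph events and forward model tokens to the UI as they arrive.
    async for event in graph.astream_events(
        {"messages": [("user", message.content)]}, version="v2"
    ):
        if event["event"] == "on_chat_model_stream":
            token = event["data"]["chunk"].content
            if token:
                await reply.stream_token(token)

    await reply.send()
```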