---
title: Long-term Memory
description: Persistent knowledge storage for cross-session learning
---

Long-term memory provides persistent knowledge storage using LangGraph's Store system. It enables agents to remember important information across sessions through **memory tools** that the AI can proactively call to save and retrieve information.

## Quick Start

### Basic Configuration

```python
from langcrew import Crew, Agent
from langcrew.memory import MemoryConfig, LongTermMemoryConfig

# Enable long-term memory
memory_config = MemoryConfig(
    provider="sqlite",
    connection_string="sqlite:///long_term.db",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app"  # RECOMMENDED for production
    )
)

crew = Crew(agents=[agent], memory=memory_config)
```

### Configuration Parameters

| Parameter              | Type                  | Description                                           | Default  |
| ---------------------- | --------------------- | ----------------------------------------------------- | -------- |
| `enabled`              | bool                  | Enable long-term memory                               | False    |
| `provider`             | str \| None           | Storage provider override (inherits if None)          | None     |
| `connection_string`    | str \| None           | Connection string override (inherits if None)         | None     |
| `app_id`               | str \| None           | Application identifier (RECOMMENDED for production)   | None     |
| `index`                | IndexConfig \| None   | Vector search configuration                           | None     |
| `user_memory`          | MemoryScopeConfig     | User-specific memory configuration                    | enabled  |
| `app_memory`           | MemoryScopeConfig     | Application-wide memory (⚠️ experimental)             | disabled |
| `search_response_format` | str                 | Search result format ("content" or "content_and_artifact") | "content" |

## Memory Scopes

Long-term memory operates in two scopes, each with dedicated **memory tools** that agents can call:

### User Memory (Default: Enabled)

Stores personal user preferences, information, and context.

**Automatically created tools:**
- `manage_user_memory`: Save, update, or delete user memories
- `search_user_memory`: Search and retrieve user memories

**How it works:**
```python
from langcrew.memory import MemoryScopeConfig

memory_config = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://localhost/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app-prod",
        user_memory=MemoryScopeConfig(
            enabled=True,
            manage_instructions="""Save memories when user:
            1. Expresses preferences (I like/love/prefer)
            2. Shares personal info (job, location, hobbies)
            3. Explicitly asks you to remember something
            """,
            search_instructions="""Search when you need to:
            1. Recall user preferences or context
            2. Provide personalized recommendations
            """
        )
    )
)

crew = Crew(agents=[agent], memory=memory_config)

# When user says "I'm vegetarian and I love Italian food"
# -> AI automatically calls manage_user_memory tool
crew.kickoff(
    inputs={"user_input": "I'm vegetarian and I love Italian food"},
    thread_id="user_alice",
    config={"configurable": {"user_id": "alice"}}
)

# Later session - AI calls search_user_memory when needed
crew.kickoff(
    inputs={"user_input": "Recommend a restaurant"},
    thread_id="user_alice_new_session",
    config={"configurable": {"user_id": "alice"}}
)
```

**Memory namespace isolation:**
- With `app_id`: `("user_memories", "my-app-prod", "alice")`
- Without `app_id`: `("user_memories", "alice")`

### App Memory (Default: Disabled, ⚠️ Experimental)

Stores application-wide insights shared across all users.

**Automatically created tools:**
- `manage_app_memory`: Save application-level insights
- `search_app_memory`: Search shared insights

```python
memory_config = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://localhost/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="saas-app-v1",
        app_memory=MemoryScopeConfig(
            enabled=True,  # ⚠️ Experimental
            manage_instructions="Store application-wide patterns and insights only",
            search_instructions="Search for common patterns to improve assistance"
        )
    )
)
```

**Memory namespace isolation:**
- With `app_id`: `("app_memories", "saas-app-v1")`
- Without `app_id`: `("app_memories",)`

**⚠️ Important**: App memory is experimental and should be carefully monitored to ensure it only stores aggregated insights, not personal user data.

## How Memory Tools Work

Long-term memory uses **LangMem tools** that are automatically added to your agents:

### Automatic Tool Creation

When you enable long-term memory, these tools are created and added to each agent:

```python
# When you configure:
memory_config = MemoryConfig(
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app",
        user_memory=MemoryScopeConfig(enabled=True)
    )
)

# Behind the scenes, agents get these tools:
# - manage_user_memory: For saving/updating/deleting memories
# - search_user_memory: For retrieving memories
```

### Tool Invocation

The AI automatically calls these tools based on the `manage_instructions` and `search_instructions` you configure:

**When to save (manage_instructions):**
```python
manage_instructions="""Call this tool when user:
1. Expresses preferences (I like/love/prefer)
2. Shares personal information
3. Explicitly asks you to remember something"""
```

**When to search (search_instructions):**
```python
search_instructions="""Call this tool when:
1. User asks about their preferences
2. You need to personalize responses
3. User asks 'What do you know about me?'"""
```

### Runtime Flow

```
User: "I love pizza"
  ↓
AI detects preference expression
  ↓
AI calls manage_user_memory(content="User loves pizza")
  ↓
Memory saved to Store with namespace ("user_memories", "my-app", "{user_id}")
  ↓
AI responds: "Got it! I'll remember you love pizza."

---

User: "What food do I like?"
  ↓
AI detects need to recall preference
  ↓
AI calls search_user_memory(query="food preferences")
  ↓
Store returns: "User loves pizza"
  ↓
AI responds: "You mentioned you love pizza!"
```
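The flow above can be mimicked with a toy stand-in for the Store. This is not langcrew code: `ToyStore` is hypothetical, and it uses naive keyword matching where a real store with an `index` configured would use semantic search.

```python
class ToyStore:
    """Hypothetical stand-in for LangGraph's Store, for illustration only."""

    def __init__(self) -> None:
        self._data: dict[tuple, list[str]] = {}

    def put(self, namespace: tuple, content: str) -> None:
        # A manage_user_memory call would land here.
        self._data.setdefault(namespace, []).append(content)

    def search(self, namespace: tuple, query: str) -> list[str]:
        # A search_user_memory call would land here.
        words = query.lower().split()
        return [m for m in self._data.get(namespace, [])
                if any(w in m.lower() for w in words)]

ns = ("user_memories", "my-app", "alice")
store = ToyStore()
store.put(ns, "User loves pizza")                  # triggered by "I love pizza"
hits = store.search(ns, "food preferences pizza")  # triggered by "What food do I like?"
print(hits)  # ['User loves pizza']
```

Note that a different user's namespace (e.g. ending in `"bob"`) returns nothing, which is exactly the isolation the namespace tuples provide.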

## Vector Search Integration

Enable semantic search for better memory retrieval:

```python
from langgraph.store.base import IndexConfig

memory_config = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://localhost/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app",
        index=IndexConfig(
            dims=1536,
            embed="openai:text-embedding-3-small"
        )
    )
)
```

**Supported Embedding Models:**
- `openai:text-embedding-3-small` (1536 dims)
- `openai:text-embedding-3-large` (3072 dims)
- `openai:text-embedding-ada-002` (1536 dims)
- Custom models (see [LangGraph IndexConfig docs](https://langchain-ai.github.io/langgraph/how-tos/persistence_postgres/#semantic-search-using-embedding-similarity))
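A mismatch between `dims` and the embedding model's output size is an easy misconfiguration. A quick sanity check against the defaults listed above (`dims_match` is a hypothetical helper, not part of langcrew):

```python
# Default output dimensions for the models listed above.
EMBEDDING_DIMS = {
    "openai:text-embedding-3-small": 1536,
    "openai:text-embedding-3-large": 3072,
    "openai:text-embedding-ada-002": 1536,
}

def dims_match(embed: str, dims: int) -> bool:
    """Return True when IndexConfig's dims matches the model's default dimension."""
    return EMBEDDING_DIMS.get(embed) == dims

print(dims_match("openai:text-embedding-3-small", 1536))  # True
```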

## Application Isolation with app_id

The `app_id` parameter provides namespace isolation when multiple applications share the same database:

```python
# App 1
memory_config_app1 = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://shared-db/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="chatbot-v1"  # Isolated namespace
    )
)

# App 2 (same database, different namespace)
memory_config_app2 = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://shared-db/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="assistant-v1"  # Different isolated namespace
    )
)
```

**Data Isolation:**
- **With app_id**: User "alice" in App 1 is completely separate from user "alice" in App 2
- **Without app_id**: All applications share the same user namespace (not recommended for production)

## Use Cases

### Personalized Assistant

```python
memory_config = MemoryConfig(
    provider="sqlite",
    connection_string="sqlite:///assistant.db",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="personal-assistant"
    )
)

assistant = Agent(
    role="Personal Assistant",
    goal="Provide personalized assistance based on user preferences",
    backstory="You remember user preferences and provide tailored recommendations"
)

crew = Crew(agents=[assistant], memory=memory_config)

# First interaction
crew.kickoff(
    inputs={"user_input": "I prefer morning meetings and hate Mondays"},
    thread_id="user_123_session_1",
    config={"configurable": {"user_id": "user_123"}}
)

# Later - agent remembers preferences
crew.kickoff(
    inputs={"user_input": "Schedule a team meeting"},
    thread_id="user_123_session_2",
    config={"configurable": {"user_id": "user_123"}}
)
# Agent will suggest Tuesday-Friday mornings
```

### Customer Relationship Management

```python
from langgraph.store.base import IndexConfig

memory_config = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://localhost/crm",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="crm-system",
        index=IndexConfig(
            dims=1536,
            embed="openai:text-embedding-3-small"
        )
    )
)

crm_agent = Agent(
    role="Customer Success Manager",
    goal="Build lasting customer relationships",
    backstory="You remember customer history, preferences, and past interactions"
)

crew = Crew(agents=[crm_agent], memory=memory_config)
```

## Storage Providers

Long-term memory supports all LangCrew storage providers:

### SQLite (Development/Single-user)

```python
memory_config = MemoryConfig(
    provider="sqlite",
    connection_string="sqlite:///long_term.db",
    long_term=LongTermMemoryConfig(enabled=True, app_id="my-app")
)
```

### PostgreSQL (Production)

```python
from langgraph.store.base import IndexConfig

memory_config = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://user:pass@localhost:5432/memory_db",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app-prod",
        index=IndexConfig(dims=1536, embed="openai:text-embedding-3-small")
    )
)
```

### MySQL (Production)

```python
memory_config = MemoryConfig(
    provider="mysql",
    connection_string="mysql://user:pass@localhost:3306/memory_db",
    long_term=LongTermMemoryConfig(enabled=True, app_id="my-app")
)
```

## Troubleshooting

### Memories Not Persisting

- Confirm `LongTermMemoryConfig(enabled=True)` is set in your `MemoryConfig`
- Check database connection and permissions
- Ensure store is properly configured

### Search Not Finding Memories

- Verify `index` configuration is set for semantic search
- Check embedding model is accessible
- Ensure memories contain relevant content

### App Isolation Issues

- Always set `app_id` in production environments
- Use unique `app_id` for each application
- Verify `app_id` is consistent across sessions

## Next Steps

- **[Short-term Memory](/guides/memory/short-term)** - Session-based context
- **[Storage Configuration](/guides/memory/storage)** - Configure storage backends
- **[Memory Concepts](/concepts/memory)** - Understand memory architecture
