Overview

The CrewAI framework provides a sophisticated memory system designed to significantly enhance AI agent capabilities. CrewAI offers three distinct memory approaches that serve different use cases:

  1. Basic Memory System - Built-in short-term, long-term, and entity memory
  2. User Memory - User-specific memory with Mem0 integration (legacy approach)
  3. External Memory - Standalone external memory providers (new approach)

Memory System Components

| Component | Description |
|---|---|
| **Short-Term Memory** | Temporarily stores recent interactions and outcomes using RAG, enabling agents to recall and utilize information relevant to their current context during the current execution. |
| **Long-Term Memory** | Preserves valuable insights and learnings from past executions, allowing agents to build and refine their knowledge over time. |
| **Entity Memory** | Captures and organizes information about entities (people, places, concepts) encountered during tasks, facilitating deeper understanding and relationship mapping. Uses RAG for storing entity information. |
| **Contextual Memory** | Maintains the context of interactions by combining ShortTermMemory, LongTermMemory, and EntityMemory, aiding the coherence and relevance of agent responses over a sequence of tasks or a conversation. |

1. Basic Memory System (Recommended)

The simplest and most commonly used approach. Enable memory for your crew with a single parameter:

Quick Start

from crewai import Crew, Agent, Task, Process

# Enable basic memory system
crew = Crew(
    agents=[...],
    tasks=[...],
    process=Process.sequential,
    memory=True,  # Enables short-term, long-term, and entity memory
    verbose=True
)

How It Works

  • Short-Term Memory: Uses ChromaDB with RAG for current context
  • Long-Term Memory: Uses SQLite3 to store task results across sessions
  • Entity Memory: Uses RAG to track entities (people, places, concepts)
  • Storage Location: Platform-specific location via appdirs package
  • Custom Storage Directory: Set CREWAI_STORAGE_DIR environment variable
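For example, a minimal sketch of pointing storage at a project-local directory (the directory name here is arbitrary; set the variable before constructing the Crew):

```python
import os

# Override the platform-specific default chosen via appdirs with a
# project-local directory. CrewAI reads this variable at startup.
os.environ["CREWAI_STORAGE_DIR"] = os.path.abspath("./crew_storage")
os.makedirs(os.environ["CREWAI_STORAGE_DIR"], exist_ok=True)
print(os.environ["CREWAI_STORAGE_DIR"])
```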

Custom Embedder Configuration

crew = Crew(
    agents=[...],
    tasks=[...],
    memory=True,
    embedder={
        "provider": "openai",
        "config": {
            "model": "text-embedding-3-small"
        }
    }
)

Custom Storage Paths

import os
from crewai import Crew
from crewai.memory import LongTermMemory
from crewai.memory.storage.ltm_sqlite_storage import LTMSQLiteStorage

# Configure custom storage location
crew = Crew(
    memory=True,
    long_term_memory=LongTermMemory(
        storage=LTMSQLiteStorage(
            db_path=os.path.join(os.getenv("CREWAI_STORAGE_DIR", "./storage"), "memory.db")
        )
    )
)

2. User Memory with Mem0 (Legacy)

Legacy Approach: While fully functional, this approach is considered legacy. For new projects requiring user-specific memory, consider using External Memory instead.

User Memory integrates with Mem0 to provide user-specific memory that persists across sessions and integrates with the crew’s contextual memory system.

Prerequisites

pip install mem0ai

Mem0 Cloud Configuration

import os
from crewai import Crew, Process

# Set your Mem0 API key
os.environ["MEM0_API_KEY"] = "m0-your-api-key"

crew = Crew(
    agents=[...],
    tasks=[...],
    memory=True,  # Required for contextual memory integration
    memory_config={
        "provider": "mem0",
        "config": {"user_id": "john"},
        "user_memory": {}  # Required - triggers user memory initialization
    },
    process=Process.sequential,
    verbose=True
)

Advanced Mem0 Configuration

crew = Crew(
    agents=[...],
    tasks=[...],
    memory=True,
    memory_config={
        "provider": "mem0",
        "config": {
            "user_id": "john",
            "org_id": "my_org_id",        # Optional
            "project_id": "my_project_id", # Optional
            "api_key": "custom-api-key"    # Optional - overrides env var
        },
        "user_memory": {}
    }
)

Local Mem0 Configuration

crew = Crew(
    agents=[...],
    tasks=[...],
    memory=True,
    memory_config={
        "provider": "mem0",
        "config": {
            "user_id": "john",
            "local_mem0_config": {
                "vector_store": {
                    "provider": "qdrant",
                    "config": {"host": "localhost", "port": 6333}
                },
                "llm": {
                    "provider": "openai",
                    "config": {"api_key": "your-api-key", "model": "gpt-4"}
                },
                "embedder": {
                    "provider": "openai",
                    "config": {"api_key": "your-api-key", "model": "text-embedding-3-small"}
                }
            }
        },
        "user_memory": {}
    }
)

3. External Memory (New Approach)

External Memory provides a standalone memory system that operates independently from the crew’s built-in memory. This is ideal for specialized memory providers or cross-application memory sharing.

Basic External Memory with Mem0

import os
from crewai import Agent, Crew, Process, Task
from crewai.memory.external.external_memory import ExternalMemory

os.environ["MEM0_API_KEY"] = "your-api-key"

# Create external memory instance
external_memory = ExternalMemory(
    embedder_config={
        "provider": "mem0", 
        "config": {"user_id": "U-123"}
    }
)

crew = Crew(
    agents=[...],
    tasks=[...],
    external_memory=external_memory,  # Separate from basic memory
    process=Process.sequential,
    verbose=True
)

Custom Storage Implementation

from crewai.memory.external.external_memory import ExternalMemory
from crewai.memory.storage.interface import Storage

class CustomStorage(Storage):
    def __init__(self):
        self.memories = []

    def save(self, value, metadata=None, agent=None):
        self.memories.append({
            "value": value, 
            "metadata": metadata, 
            "agent": agent
        })

    def search(self, query, limit=10, score_threshold=0.5):
        # Implement your search logic here
        return [m for m in self.memories if query.lower() in str(m["value"]).lower()]

    def reset(self):
        self.memories = []

# Use custom storage
external_memory = ExternalMemory(storage=CustomStorage())

crew = Crew(
    agents=[...],
    tasks=[...],
    external_memory=external_memory
)

Memory System Comparison

| Feature | Basic Memory | User Memory (Legacy) | External Memory |
|---|---|---|---|
| Setup Complexity | Simple | Medium | Medium |
| Integration | Built-in contextual | Contextual + User-specific | Standalone |
| Storage | Local files | Mem0 Cloud/Local | Custom/Mem0 |
| Cross-session | ✅ | ✅ | ✅ |
| User-specific | ❌ | ✅ | ✅ |
| Custom providers | Limited | Mem0 only | Any provider |
| Recommended for | Most use cases | Legacy projects | Specialized needs |

Supported Embedding Providers

OpenAI (Default)

crew = Crew(
    memory=True,
    embedder={
        "provider": "openai",
        "config": {"model": "text-embedding-3-small"}
    }
)

Ollama

crew = Crew(
    memory=True,
    embedder={
        "provider": "ollama",
        "config": {"model": "mxbai-embed-large"}
    }
)

Google AI

crew = Crew(
    memory=True,
    embedder={
        "provider": "google",
        "config": {
            "api_key": "your-api-key",
            "model": "text-embedding-004"
        }
    }
)

Azure OpenAI

crew = Crew(
    memory=True,
    embedder={
        "provider": "openai",
        "config": {
            "api_key": "your-api-key",
            "api_base": "https://your-resource.openai.azure.com/",
            "api_version": "2023-05-15",
            "model_name": "text-embedding-3-small"
        }
    }
)

Vertex AI

crew = Crew(
    memory=True,
    embedder={
        "provider": "vertexai",
        "config": {
            "project_id": "your-project-id",
            "region": "your-region",
            "api_key": "your-api-key",
            "model_name": "textembedding-gecko"
        }
    }
)

Security Best Practices

Environment Variables

import os
from crewai import Crew

# Store sensitive data in environment variables
crew = Crew(
    memory=True,
    embedder={
        "provider": "openai",
        "config": {
            "api_key": os.getenv("OPENAI_API_KEY"),
            "model": "text-embedding-3-small"
        }
    }
)

Storage Security

import os
from crewai import Crew
from crewai.memory import LongTermMemory
from crewai.memory.storage.ltm_sqlite_storage import LTMSQLiteStorage

# Use secure storage paths
storage_path = os.getenv("CREWAI_STORAGE_DIR", "./storage")
os.makedirs(storage_path, mode=0o700, exist_ok=True)  # Restricted permissions

crew = Crew(
    memory=True,
    long_term_memory=LongTermMemory(
        storage=LTMSQLiteStorage(
            db_path=f"{storage_path}/memory.db"
        )
    )
)

Troubleshooting

Common Issues

Memory not persisting between sessions?

  • Check CREWAI_STORAGE_DIR environment variable
  • Ensure write permissions to storage directory
  • Verify memory is enabled with memory=True
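A quick, illustrative way to check the second point, assuming the `./storage` fallback when `CREWAI_STORAGE_DIR` is unset:

```python
import os
import tempfile

storage_dir = os.environ.get("CREWAI_STORAGE_DIR", "./storage")
os.makedirs(storage_dir, exist_ok=True)

# Probe with a throwaway file to confirm the process can write memory files here
try:
    with tempfile.NamedTemporaryFile(dir=storage_dir) as probe:
        probe.write(b"ok")
    result = f"{storage_dir} is writable"
except OSError as exc:
    result = f"cannot write to {storage_dir}: {exc}"
print(result)
```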

Mem0 authentication errors?

  • Verify MEM0_API_KEY environment variable is set
  • Check API key permissions on Mem0 dashboard
  • Ensure mem0ai package is installed

High memory usage with large datasets?

  • Consider using External Memory with custom storage
  • Implement pagination in custom storage search methods
  • Use smaller embedding models for reduced memory footprint
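Pagination can be sketched by extending the custom storage pattern shown earlier with an `offset` parameter (`offset` is an illustrative addition, not part of the `Storage` interface):

```python
class PaginatedStorage:
    """In-memory store whose search returns one page of matches at a time."""

    def __init__(self):
        self.memories = []

    def save(self, value, metadata=None, agent=None):
        self.memories.append({"value": value, "metadata": metadata, "agent": agent})

    def search(self, query, limit=10, offset=0, score_threshold=0.5):
        # Filter first, then slice a single page so callers never load everything
        matches = [m for m in self.memories
                   if query.lower() in str(m["value"]).lower()]
        return matches[offset:offset + limit]


storage = PaginatedStorage()
for i in range(25):
    storage.save(f"note {i} about pricing")

page_two = storage.search("pricing", limit=10, offset=10)
print(len(page_two))  # 10
```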

Performance Tips

  • Use memory=True for most use cases (simplest and fastest)
  • Only use User Memory if you need user-specific persistence
  • Consider External Memory for high-scale or specialized requirements
  • Choose smaller embedding models for faster processing
  • Set appropriate search limits to control memory retrieval size

Benefits of Using CrewAI’s Memory System

  • 🦾 Adaptive Learning: Crews become more efficient over time, adapting to new information and refining their approach to tasks.
  • 🫡 Enhanced Personalization: Memory enables agents to remember user preferences and historical interactions, leading to personalized experiences.
  • 🧠 Improved Problem Solving: Access to a rich memory store aids agents in making more informed decisions, drawing on past learnings and contextual insights.

Conclusion

Integrating CrewAI’s memory system into your projects is straightforward. By leveraging the provided memory components and configurations, you can quickly empower your agents with the ability to remember, reason, and learn from their interactions, unlocking new levels of intelligence and capability.