LlamaIndexTool

Description

The LlamaIndexTool is a general wrapper around LlamaIndex tools and query engines, enabling you to use LlamaIndex resources for RAG and agentic pipelines as tools that plug into CrewAI agents. This tool allows you to seamlessly integrate LlamaIndex’s powerful data processing and retrieval capabilities into your CrewAI workflows.

Installation

To use this tool, you need to install LlamaIndex:

uv add llama-index

Steps to Get Started

To effectively use the LlamaIndexTool, follow these steps:

  1. Install LlamaIndex: Install the LlamaIndex package using the command above.
  2. Set Up LlamaIndex: Follow the LlamaIndex documentation to set up a RAG/agent pipeline.
  3. Create a Tool or Query Engine: Create a LlamaIndex tool or query engine that you want to use with CrewAI.

Example

The following examples demonstrate how to initialize the tool from different LlamaIndex components:

From a LlamaIndex Tool

Code
from crewai_tools import LlamaIndexTool
from crewai import Agent
from crewai.project import agent
from llama_index.core.tools import FunctionTool

# Example 1: Initialize from FunctionTool
def search_data(query: str) -> str:
    """Search for information in the data."""
    # Your implementation here
    return f"Results for: {query}"

# Create a LlamaIndex FunctionTool
og_tool = FunctionTool.from_defaults(
    search_data, 
    name="DataSearchTool",
    description="Search for information in the data"
)

# Wrap it with LlamaIndexTool
tool = LlamaIndexTool.from_tool(og_tool)

# Define an agent that uses the tool
# (the @agent decorator is used inside a crew class decorated with @CrewBase)
@agent
def researcher(self) -> Agent:
    '''
    This agent uses the LlamaIndexTool to search for information.
    '''
    return Agent(
        config=self.agents_config["researcher"],
        tools=[tool]
    )
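
If you are not using the decorator-based crew project structure, the wrapped tool can also be passed straight to an Agent and run inside a Crew. The following is a minimal sketch, not part of the original example: the role, goal, backstory, and task strings are illustrative placeholders.

Code
from crewai import Agent, Crew, Task

# Reuses the `tool` created from `og_tool` above
researcher = Agent(
    role="Researcher",  # placeholder role
    goal="Answer questions using the wrapped data search tool",
    backstory="An analyst who relies on the LlamaIndex-backed tool.",
    tools=[tool],
)

research_task = Task(
    description="Find information about topic X in the data.",  # placeholder task
    expected_output="A short summary of the findings.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[research_task])
result = crew.kickoff()
print(result)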

From LlamaHub Tools

Code
from crewai_tools import LlamaIndexTool
from llama_index.tools.wolfram_alpha import WolframAlphaToolSpec

# Initialize from LlamaHub Tools
wolfram_spec = WolframAlphaToolSpec(app_id="your_app_id")
wolfram_tools = wolfram_spec.to_tool_list()
tools = [LlamaIndexTool.from_tool(t) for t in wolfram_tools]
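
The wrapped Wolfram Alpha tools can then be attached to a CrewAI agent like any other tool. The snippet below is a minimal sketch: the agent fields are illustrative placeholders, and a valid Wolfram Alpha app ID is assumed.

Code
from crewai import Agent

math_agent = Agent(
    role="Quantitative Analyst",  # placeholder role
    goal="Answer math and data questions using Wolfram Alpha",
    backstory="Relies on the wrapped LlamaHub Wolfram Alpha tools.",
    tools=tools,  # the LlamaIndexTool-wrapped tools from above
)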

From a LlamaIndex Query Engine

Code
from crewai_tools import LlamaIndexTool
from llama_index.core import VectorStoreIndex
from llama_index.core.readers import SimpleDirectoryReader

# Load documents
documents = SimpleDirectoryReader("./data").load_data()

# Create an index
index = VectorStoreIndex.from_documents(documents)

# Create a query engine
query_engine = index.as_query_engine()

# Create a LlamaIndexTool from the query engine
query_tool = LlamaIndexTool.from_query_engine(
    query_engine,
    name="Company Data Query Tool",
    description="Use this tool to look up information in company documents"
)
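
The resulting query_tool can then be given to an agent and exercised through a task. A minimal sketch follows; the agent fields and the question are placeholders, and ./data is assumed to contain your company documents.

Code
from crewai import Agent, Task

analyst = Agent(
    role="Company Data Analyst",  # placeholder role
    goal="Answer questions from company documents",
    backstory="Uses the wrapped query engine to look up internal data.",
    tools=[query_tool],
)

qa_task = Task(
    description="What does the onboarding document say about security training?",  # placeholder question
    expected_output="A concise answer with supporting details from the documents.",
    agent=analyst,
)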

Class Methods

The LlamaIndexTool provides two main class methods for creating instances:

from_tool

Creates a LlamaIndexTool from a LlamaIndex tool.

Code
@classmethod
def from_tool(cls, tool: Any, **kwargs: Any) -> "LlamaIndexTool":
    # Implementation details

from_query_engine

Creates a LlamaIndexTool from a LlamaIndex query engine.

Code
@classmethod
def from_query_engine(
    cls,
    query_engine: Any,
    name: Optional[str] = None,
    description: Optional[str] = None,
    return_direct: bool = False,
    **kwargs: Any,
) -> "LlamaIndexTool":
    # Implementation details

Parameters

The from_query_engine method accepts the following parameters:

  • query_engine: Required. The LlamaIndex query engine to wrap.
  • name: Optional. The name of the tool.
  • description: Optional. The description of the tool.
  • return_direct: Optional. Whether to return the response directly. Default is False.
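
For illustration, here is a minimal sketch of passing the optional parameters, reusing the query_engine from the example above; the name and description values are placeholders.

Code
direct_tool = LlamaIndexTool.from_query_engine(
    query_engine,
    name="Policy Lookup Tool",  # placeholder name
    description="Look up company policy details",  # placeholder description
    return_direct=True,  # return the query engine's response directly
)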

Conclusion

The LlamaIndexTool provides a powerful way to integrate LlamaIndex’s capabilities into CrewAI agents. By wrapping LlamaIndex tools and query engines, it enables agents to leverage sophisticated data retrieval and processing functionalities, enhancing their ability to work with complex information sources.