Research Assistant

A lightweight, multi-agent system built with LangGraph that customizes the research process using AI analysts and experts to generate comprehensive reports.

Architecture diagram: research_assistant.jpg

Overview

The Research Assistant leverages multiple AI agents to conduct in-depth research on complex topics. The system generates specialized AI analysts, conducts parallel interviews with expert agents, and synthesizes findings into comprehensive reports.

Key Features

  • Dynamic Analyst Generation: Creates specialized AI analysts based on your research topic
  • Human-in-the-Loop: Allows refinement of analysts before research begins
  • Parallel Processing: Conducts multiple expert interviews simultaneously using map-reduce patterns
  • Multi-Source Research: Integrates web search (Tavily) and Wikipedia for comprehensive coverage
  • Structured Output: Generates well-formatted reports with introduction, content, and conclusion
  • LangGraph Studio Compatible: Includes multiple graph implementations for different use cases

Architecture

The system consists of several key components:

  1. Analyst Creation: Generates AI personas focused on different aspects of your topic
  2. Interview Orchestration: Conducts structured interviews between analysts and experts
  3. Parallel Research: Multiple interviews run simultaneously for efficiency
  4. Report Synthesis: Combines all insights into a coherent final report
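
A minimal sketch of how these components could be wired together as a LangGraph StateGraph is shown below. Node names, state keys, and the placeholder bodies are illustrative assumptions, not code copied from research_assistant.py.

from typing import List
from typing_extensions import TypedDict
from langgraph.graph import StateGraph, START, END

class ResearchState(TypedDict):
    topic: str
    max_analysts: int
    analysts: List[dict]    # analyst personas produced by the first node
    sections: List[str]     # one section per completed interview
    final_report: str

def create_analysts(state: ResearchState):
    # Placeholder: generate analyst personas for state["topic"]
    return {"analysts": [{"name": "Placeholder Analyst"}]}

def conduct_interviews(state: ResearchState):
    # Placeholder: run the analyst/expert interviews (parallel sub-graphs in the real workflow)
    return {"sections": ["(interview summary)"]}

def write_report(state: ResearchState):
    # Placeholder: synthesize the interview sections into a final report
    return {"final_report": "\n\n".join(state["sections"])}

builder = StateGraph(ResearchState)
builder.add_node("create_analysts", create_analysts)
builder.add_node("conduct_interviews", conduct_interviews)
builder.add_node("write_report", write_report)
builder.add_edge(START, "create_analysts")
builder.add_edge("create_analysts", "conduct_interviews")
builder.add_edge("conduct_interviews", "write_report")
builder.add_edge("write_report", END)
graph = builder.compile()

In the actual implementation the interview step is fanned out per analyst rather than run as a single node; see Parallel Processing under Key Components.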

Available Graphs

  • research_assistant.py: Main research workflow
  • parallelization.py: Demonstrates parallel execution patterns
  • sub_graphs.py: Shows sub-graph implementations
  • map_reduce.py: Implements map-reduce patterns for scaling

Quick Start

Prerequisites

  • Python 3.11+
  • OpenAI API key
  • Tavily API key (for web search)
  • LangSmith API key (optional, for tracing)

Installation

  1. Clone the repository:
git clone https://github.com/hadeelbkh/research-assistant
cd research-assistant
  2. Create and activate a virtual environment:
python -m venv ra-env
# On Windows:
ra-env\Scripts\activate
# On macOS/Linux:
source ra-env/bin/activate
  3. Install dependencies:
pip install -r requirements.txt

Environment Setup

Set up your API keys:

export OPENAI_API_KEY="your-openai-key"
export TAVILY_API_KEY="your-tavily-key"
export LANGSMITH_API_KEY="your-langsmith-key"  # Optional
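
Alternatively, the keys can live in a .env file. A minimal sketch, assuming the python-dotenv package (not part of the dependency list above) is installed:

from dotenv import load_dotenv
load_dotenv()  # loads OPENAI_API_KEY, TAVILY_API_KEY, and LANGSMITH_API_KEY from a local .env file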

Running the Research Assistant

Option 1: Jupyter Notebook (Recommended for Learning)

Open and run research-assistant.ipynb for an interactive experience with detailed explanations.

Option 2: LangGraph Studio

  1. Navigate to the studio directory:
cd studio
  2. Start the local development server:
langgraph dev
  3. Access the web interface to interact with the different graph implementations.

Usage Example

from studio.research_assistant import graph
from langgraph.checkpoint.memory import MemorySaver

# In-memory checkpointer for persisting graph state between steps
# (only takes effect if the graph is compiled with it; see research_assistant.py)
memory = MemorySaver()

# Thread config identifying this run for the checkpointer
thread = {"configurable": {"thread_id": "1"}}

# Define research parameters
topic = "The impact of AI on healthcare"
max_analysts = 3

# Stream state updates and print the final report once it appears
for event in graph.stream({
    "topic": topic,
    "max_analysts": max_analysts
}, thread, stream_mode="values"):
    if "final_report" in event:
        print(event["final_report"])
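
Because the workflow supports human-in-the-loop refinement, the graph may pause after generating the analysts if it was compiled with an interrupt before the feedback step. In that case you would review the proposed analysts, optionally supply feedback, and then resume. The node and state-key names below are assumptions; check research_assistant.py for the actual ones.

# Hypothetical resume step, assuming an interrupt before a node named "human_feedback"
graph.update_state(thread, {"human_analyst_feedback": None}, as_node="human_feedback")
for event in graph.stream(None, thread, stream_mode="values"):
    if "final_report" in event:
        print(event["final_report"])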

Configuration

Analyst Customization

Modify the analyst_instructions in research_assistant.py to customize how analysts are generated:

analyst_instructions = """You are tasked with creating a set of AI analyst personas..."""

Interview Depth

Adjust interview depth by modifying max_num_turns in the interview state:

interview_state = {
    "max_num_turns": 5,  # Adjust conversation depth
    # ... other parameters
}

Research Sources

The system supports multiple research sources:

  • Web Search: Via Tavily API
  • Wikipedia: For encyclopedic information
  • Custom Sources: Extend by adding new search functions
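
As a sketch of the last point, a custom search node only needs to fetch documents for the analyst's current question and return them in the same shape the built-in search nodes use. Everything below is an assumption for illustration: the arXiv source (which needs the arxiv and pymupdf packages), the messages/context state keys, and the document formatting should mirror whatever research_assistant.py actually does.

from langchain_community.document_loaders import ArxivLoader

def search_arxiv(state):
    # Assume the analyst's latest question is the last message in the interview state
    query = state["messages"][-1].content
    docs = ArxivLoader(query=query, load_max_docs=2).load()
    formatted = "\n\n---\n\n".join(
        f'<Document source="{d.metadata.get("Title", "arXiv")}"/>\n{d.page_content}\n</Document>'
        for d in docs
    )
    # Assume "context" is an accumulating list of retrieved documents in the interview state
    return {"context": [formatted]}

Register the new function as a node in the interview graph and route the generated question to it alongside the existing Tavily and Wikipedia searches.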

Project Structure

research-assistant/
├── research-assistant.ipynb       # Interactive tutorial notebook
├── studio/                        # LangGraph Studio files
│   ├── research_assistant.py      # Main research graph
│   ├── parallelization.py         # Parallel execution examples
│   ├── sub_graphs.py              # Sub-graph patterns
│   ├── map_reduce.py              # Map-reduce implementation
│   ├── langgraph.json             # Studio configuration
│   └── requirements.txt           # Studio-specific dependencies
├── requirements.txt               # Main project dependencies
├── research_assistant.jpg         # Architecture diagram
└── README.md                      # This file

Key Components

Analyst Generation

  • Creates specialized AI personas based on research topics
  • Supports human feedback for refinement
  • Generates diverse perspectives for comprehensive coverage
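
Because the personas are structured objects, they pair naturally with Pydantic models and the LLM's structured-output mode. The field names below are illustrative rather than the repository's exact schema:

from typing import List
from pydantic import BaseModel, Field

class Analyst(BaseModel):
    name: str = Field(description="Name of the analyst persona")
    affiliation: str = Field(description="Primary affiliation of the analyst")
    role: str = Field(description="Role of the analyst in the context of the topic")
    description: str = Field(description="Focus, concerns, and motives of the analyst")

class Perspectives(BaseModel):
    analysts: List[Analyst] = Field(description="Analysts generated for the research topic")

A chat model wrapped with with_structured_output(Perspectives) can then return the whole set of analysts in a single call.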

Interview System

  • Structured conversations between analysts and experts
  • Multi-turn dialogues for deep insights
  • Automatic question generation and follow-up
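
A common way to bound these dialogues is a routing function that counts the expert's answers and ends the interview once the configured limit is reached. The node and field names here are assumptions, not the repository's:

from langchain_core.messages import AIMessage

def route_messages(state, expert_name: str = "expert"):
    # Count how many times the expert has answered so far
    expert_answers = [
        m for m in state["messages"]
        if isinstance(m, AIMessage) and getattr(m, "name", None) == expert_name
    ]
    if len(expert_answers) >= state.get("max_num_turns", 2):
        return "save_interview"   # hypothetical terminal node
    return "ask_question"         # hypothetical node that generates the follow-up question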

Parallel Processing

  • Multiple interviews run simultaneously
  • Map-reduce patterns for scalability
  • Efficient resource utilization
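
LangGraph's Send API is one way to implement the map step: the router below returns one Send per analyst, so the interview sub-graph is invoked once per analyst in parallel, and a later report-writing node reduces the collected sections. Node and key names are illustrative:

from langgraph.constants import Send

def initiate_all_interviews(state):
    # Fan out: one interview sub-graph invocation per analyst
    return [
        Send("conduct_interview", {"analyst": analyst,
                                   "messages": [],
                                   "max_num_turns": 2})
        for analyst in state["analysts"]
    ]

# Wired in with a conditional edge, for example:
# builder.add_conditional_edges("create_analysts", initiate_all_interviews, ["conduct_interview"])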

Report Generation

  • Structured output with clear sections
  • Source attribution and citations
  • Professional formatting

Dependencies

Core Libraries

  • LangGraph: Multi-agent orchestration
  • LangChain: LLM integration and tooling
  • OpenAI: Language model provider
  • Tavily: Web search API
  • Wikipedia: Knowledge base access

Development Tools

  • Jupyter: Interactive development
  • LangSmith: Observability and tracing
  • Pydantic: Data validation and models

Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature-name
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

Troubleshooting

Common Issues

  1. API Key Errors: Ensure all required API keys are set in environment variables
  2. Memory Issues: Reduce max_analysts for complex topics
  3. Rate Limiting: Add delays between API calls if hitting rate limits
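
For the rate-limiting case, a small wrapper with exponential backoff is usually enough. This is a generic sketch, not a helper shipped with the repository:

import time

def call_with_backoff(fn, retries=3):
    # Retry a callable, sleeping 1s, 2s, ... between attempts (e.g. when the provider rate-limits)
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(2 ** attempt)

# Hypothetical usage: call_with_backoff(lambda: llm.invoke(question))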

Getting Help

  • Check the Jupyter notebook for detailed examples
  • Review LangGraph documentation for advanced patterns
  • Open an issue for bug reports or feature requests
