A web research assistant built with Node.js that uses an Ollama LLM to generate intelligent search queries and fetch comprehensive results from popular search APIs (Tavily, Perplexity, or DuckDuckGo).

Based on Ollama Deep Researcher.
- **Intelligent Query Generation** - Uses an Ollama LLM to create optimized search queries
- **Multiple Search APIs** - Supports Tavily, Perplexity, and DuckDuckGo
- **Iterative Research** - Refines searches and builds a comprehensive summary
- **Simple CLI Interface** - Easy to use from the command line
- **Modular Architecture** - Easily extensible for adding more features
- Install Node.js (v18+): nodejs.org
- Install Ollama: ollama.com
- Pull the required model: `ollama pull deepseek-r1:8b`
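To confirm the prerequisites are in place, you can check your Node.js version and the pulled model:

```bash
node --version   # should report v18 or later
ollama list      # should include deepseek-r1:8b after the pull
```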
```bash
git clone https://github.com/noubre/ollama-deep-researcher.git
cd ollama-deep-researcher
npm install
cp .env.example .env
```
Edit `.env` with your settings:

- Choose a search API
- Add API keys if using Tavily or Perplexity
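For example, a minimal `.env` for Tavily might look like this (the API key value below is a placeholder):

```env
SEARCH_API=tavily
TAVILY_API_KEY=your-tavily-api-key
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=deepseek-r1:8b
```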
```bash
npm start "Your research topic"
```

Or directly:

```bash
node src/cli.js "Your research topic"
```
| Environment Variable | Description | Options |
|---|---|---|
| `SEARCH_API` | The search API to use | `tavily`, `perplexity`, `duckduckgo` |
| `TAVILY_API_KEY` | API key for Tavily | Required if using Tavily |
| `PERPLEXITY_API_KEY` | API key for Perplexity | Required if using Perplexity |
| `OLLAMA_BASE_URL` | URL for the Ollama API | Default: `http://localhost:11434` |
| `OLLAMA_MODEL` | Ollama model to use | Default: `deepseek-r1:8b` |
The application follows a modular architecture:
- `src/cli.js`: Command-line interface
- `src/assistant/`: Core assistant functionality
  - `index.js`: Main entry point for the assistant
  - `graph.js`: Research workflow built with LangGraph
  - `configuration.js`: Environment configuration
  - `prompts.js`: LLM prompt templates
  - `utils.js`: Helper functions for web searches and result processing
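Because the assistant is decoupled from the CLI, it can also be used programmatically. The sketch below is hypothetical: the export name `runResearch` and its signature are assumptions, so check `src/assistant/index.js` for the actual API.

```js
// Hypothetical usage sketch; the real export from src/assistant/index.js may differ.
import { runResearch } from './src/assistant/index.js';

const summary = await runResearch('Your research topic');
console.log(summary);
```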
- The research process is organized as a graph of states and actions
- The initial topic is used to generate a specific search query via Ollama
- The search query is used to fetch results from the configured search API
- The results are processed and summarized
- The process iterates, refining the research with follow-up questions
- A final comprehensive summary is produced (a simplified sketch of this loop follows below)
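The JavaScript below sketches that loop in miniature, assuming only a local Ollama server. The `webSearch` stub stands in for the real Tavily/Perplexity/DuckDuckGo clients in `src/assistant/utils.js`; none of this is the project's actual code.

```js
// Simplified sketch of the iterative research loop (not the project's actual code).
const OLLAMA_URL = process.env.OLLAMA_BASE_URL ?? 'http://localhost:11434';
const MODEL = process.env.OLLAMA_MODEL ?? 'deepseek-r1:8b';

// Send a prompt to Ollama's /api/generate endpoint and return the completion text.
async function llm(prompt) {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: MODEL, prompt, stream: false }),
  });
  return (await res.json()).response;
}

// Placeholder: the real project dispatches to the configured search API here.
async function webSearch(query) {
  return `(results for: ${query})`;
}

async function research(topic, maxIterations = 3) {
  // Turn the topic into a specific search query.
  let query = await llm(`Write one concise web search query for: ${topic}`);
  let summary = '';
  for (let i = 0; i < maxIterations; i++) {
    // Fetch results and fold them into the running summary.
    const results = await webSearch(query);
    summary = await llm(
      `Topic: ${topic}\nCurrent summary: ${summary}\nNew results: ${results}\nRewrite the summary to include the new information.`
    );
    // Ask a follow-up question to refine the next search.
    query = await llm(`Given this summary, write one follow-up search query:\n${summary}`);
  }
  return summary; // Final comprehensive summary.
}

console.log(await research('Your research topic'));
```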
You can use different Ollama models by changing the `OLLAMA_MODEL` variable in your `.env` file. For example:

```env
OLLAMA_MODEL=llama3:8b
```
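Make sure the model exists locally before switching to it; `ollama pull` will download it if it is not already present:

```bash
ollama pull llama3:8b
```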
For debugging issues:

```bash
node debug.js "Your research topic"
```

This will provide more verbose output to help troubleshoot any problems.
- Basic research functionality
- Support for Tavily, Perplexity, and DuckDuckGo APIs
- Command-line interface
- LangGraph-based research workflow
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch: `git checkout -b feature/amazing-feature`
- Commit your changes: `git commit -m 'Add some amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
- Follow existing code style
- Add tests for new features when possible
- Update documentation to reflect changes
This project is licensed under the MIT License - see the LICENSE file for details.