Multi-Source Research Agent

This project implements a research agent built with LangGraph and powered by Google Gemini. The agent runs parallel searches across multiple sources (Google, Bing, Reddit), analyzes the results from each, and synthesizes a comprehensive answer.
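
The overall flow can be pictured as a small LangGraph graph: the three search/analysis nodes fan out in parallel from the start node and fan back in to a synthesis node. The sketch below is illustrative only; the state keys, node names, and placeholder functions are assumptions, not the repository's actual identifiers.

    # Illustrative fan-out/fan-in graph sketch (placeholder names, not the repo's code).
    from typing import TypedDict

    from langgraph.graph import StateGraph, START, END

    class ResearchState(TypedDict, total=False):
        question: str
        google_analysis: str
        bing_analysis: str
        reddit_analysis: str
        final_answer: str

    def google_search(state: ResearchState) -> dict:
        # Placeholder: search Google and analyze the results with the LLM.
        return {"google_analysis": f"(google findings for: {state['question']})"}

    def bing_search(state: ResearchState) -> dict:
        # Placeholder: search Bing and analyze the results with the LLM.
        return {"bing_analysis": f"(bing findings for: {state['question']})"}

    def reddit_search(state: ResearchState) -> dict:
        # Placeholder: fetch Reddit posts/comments and analyze community sentiment.
        return {"reddit_analysis": f"(reddit findings for: {state['question']})"}

    def synthesize(state: ResearchState) -> dict:
        # Placeholder: merge the three analyses into one final answer.
        merged = "\n\n".join(
            state.get(k, "")
            for k in ("google_analysis", "bing_analysis", "reddit_analysis")
        )
        return {"final_answer": merged}

    builder = StateGraph(ResearchState)
    for name, fn in [
        ("google_search", google_search),
        ("bing_search", bing_search),
        ("reddit_search", reddit_search),
        ("synthesize", synthesize),
    ]:
        builder.add_node(name, fn)

    for source in ("google_search", "bing_search", "reddit_search"):
        builder.add_edge(START, source)         # fan out: the searches run in parallel
        builder.add_edge(source, "synthesize")  # fan in: synthesis waits for all three

    builder.add_edge("synthesize", END)
    graph = builder.compile()

    print(graph.invoke({"question": "Current state of AI agents"})["final_answer"])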

Features

  • Multi-Source Research: Queries Google, Bing, and Reddit simultaneously for diverse perspectives.
  • Intelligent Analysis: Uses LLMs to analyze search results from each source independently.
  • Reddit Deep Dive: Fetches and analyzes Reddit posts and comments for community sentiment and detailed discussion.
  • Synthesis: Combines insights from all sources into a final, well-structured answer.
  • File Output: Saves the final synthesized response to a Markdown file (final_response.md).

Prerequisites

  • Python 3.10+
  • API Keys for:
    • Google Gemini (Google GenAI)
    • Bright Data API

Installation

  1. Clone the repository:

    git clone https://github.com/Luckman-Khan/Multi_Source_Research_Agent.git
    cd Multi_Source_Research_Agent
  2. Install dependencies: It is recommended to use a virtual environment.

    # using uv or pip
    uv sync 
    # OR
    pip install -r requirements.txt

    (Note: Ensure that langgraph, langchain, langchain-google-genai, praw, python-dotenv, and the other required packages are installed.)

  3. Set up Environment Variables: Create a .env file in the root directory and add your API keys:

    GOOGLE_API_KEY=your_google_api_key
    BRIGHTDATA_API_KEY=your_bright_data_api_key
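
    A minimal sketch of loading these keys at startup, assuming python-dotenv is used (it appears in the dependency note above); langchain-google-genai typically picks up GOOGLE_API_KEY from the environment automatically:

    # Minimal sketch: read the two keys from .env into the process environment.
    import os

    from dotenv import load_dotenv

    load_dotenv()  # loads .env from the project root

    GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")
    BRIGHTDATA_API_KEY = os.getenv("BRIGHTDATA_API_KEY")

    if not GOOGLE_API_KEY or not BRIGHTDATA_API_KEY:
        raise RuntimeError("Missing GOOGLE_API_KEY or BRIGHTDATA_API_KEY in .env")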

Usage

  1. Run the Agent: Open the provided Jupyter notebook, main.ipynb, and run all cells. The last cell calls the run_chatbot() function (a rough sketch of this loop appears after this list).

    run_chatbot()
  2. Interact:

    • The agent will prompt you to "Ask me anything".
    • Type your research question (e.g., "Current state of AI agents").
    • Type exit to quit.
  3. Output:

    • The agent prints the progress of searches and analyses to the console.
    • The final synthesized answer is printed to the console.
    • The final answer is also saved to final_response.md in the project directory.
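
For orientation, here is a hypothetical sketch of what the run_chatbot() loop does; the actual implementation in main.ipynb may differ. It reads a question, invokes the compiled graph, prints the answer, and writes final_response.md.

    # Hypothetical sketch of the chatbot loop; the notebook's run_chatbot() may differ.
    def run_chatbot() -> None:
        while True:
            question = input("Ask me anything: ").strip()
            if question.lower() == "exit":
                break

            result = graph.invoke({"question": question})  # `graph` from the sketch above
            answer = result["final_answer"]

            print(answer)
            with open("final_response.md", "w", encoding="utf-8") as f:
                f.write(answer)

    run_chatbot()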

Project Structure

  • main.ipynb: The main entry point containing the LangGraph definition and the chatbot loop.
  • web_operations.py: Contains functions for performing Google/Bing searches and Reddit API interactions.
  • prompts.py: Stores the prompt templates used for analysis and synthesis (a sketch of how such a template might be wired to Gemini follows this list).
  • snapshot_operations.py: Helper functions for managing Bright Data snapshots (e.g., polling for completion and downloading results), where applicable.
  • final_response.md: The output file where the latest answer is saved.
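
As an illustration of how a prompt template from prompts.py might be wired to Gemini via langchain-google-genai: the template text, function name, and model name below are examples, not the repository's actual code.

    # Illustrative sketch: analyze one source's raw search results with Gemini.
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_google_genai import ChatGoogleGenerativeAI

    # GOOGLE_API_KEY is read from the environment loaded earlier.
    llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash")  # example model name

    ANALYSIS_PROMPT = ChatPromptTemplate.from_messages([
        ("system", "You analyze raw search results and extract the key findings."),
        ("human", "Question: {question}\n\nResults from {source}:\n{results}"),
    ])

    def analyze_source(question: str, source: str, results: str) -> str:
        # Run one source's results through the analysis prompt and return the text.
        chain = ANALYSIS_PROMPT | llm
        return chain.invoke(
            {"question": question, "source": source, "results": results}
        ).content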

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
