Truth Seeker

An AI-powered tool for verifying everyday claims. It combines Gemini 3 Pro with Semantic Scholar academic search to help users debunk myths and find the truth.

Features

  • πŸ” Claim Verification: Input common life claims (health, psychology, etc.), and AI will judge their veracity.
  • πŸ“š Academic Support: Automatically searches for relevant academic papers to provide evidence-based scientific explanations.
  • πŸ–ΌοΈ Multimodal Support: Upload images and documents; AI will extract claims from them.
  • πŸ’¬ Interactive Chat: Supports follow-up questions and challenges, with history stored locally in the browser.
  • πŸ“± Mobile Friendly: Responsive design for a smooth experience on mobile devices.

Tech Stack

Backend

  • FastAPI - Python Web Framework
  • google-genai - Gemini API SDK
  • Semantic Scholar API - Academic paper search
  • asyncio - Asynchronous request queue (respects the 1 RPS rate limit)

Frontend

  • Next.js 15 - React Framework
  • TypeScript - Type safety
  • Tailwind CSS - Styling
  • react-markdown - Markdown rendering

Project Structure

Truth Seeker/
β”œβ”€β”€ backend/
β”‚   β”œβ”€β”€ main.py              # FastAPI entry point
β”‚   β”œβ”€β”€ requirements.txt     # Python dependencies
β”‚   β”œβ”€β”€ .env.example         # Environment variables example
β”‚   β”œβ”€β”€ Dockerfile           # Backend Dockerfile
β”‚   β”œβ”€β”€ prompts/
β”‚   β”‚   └── system_prompt.py # System prompts
β”‚   β”œβ”€β”€ routers/
β”‚   β”‚   └── chat.py          # Chat API routes
β”‚   β”œβ”€β”€ services/
β”‚   β”‚   β”œβ”€β”€ gemini_service.py      # Gemini API service
β”‚   β”‚   └── semantic_scholar.py    # Academic search service
β”‚   └── utils/
β”‚       └── queue_manager.py # Request queue management
β”œβ”€β”€ frontend/
β”‚   β”œβ”€β”€ app/
β”‚   β”‚   β”œβ”€β”€ layout.tsx       # Root layout
β”‚   β”‚   β”œβ”€β”€ page.tsx         # Main page
β”‚   β”‚   └── globals.css      # Global styles
β”‚   β”œβ”€β”€ components/
β”‚   β”‚   β”œβ”€β”€ ChatMessage.tsx  # Message components
β”‚   β”‚   β”œβ”€β”€ ChatInput.tsx    # Input components
β”‚   β”‚   └── ConversationSidebar.tsx  # Sidebar
β”‚   β”œβ”€β”€ hooks/
β”‚   β”‚   └── useConversationHistory.ts  # History hook
β”‚   β”œβ”€β”€ lib/
β”‚   β”‚   └── api.ts           # API client
β”‚   β”œβ”€β”€ Dockerfile           # Frontend Dockerfile
β”‚   └── package.json
β”œβ”€β”€ nginx/
β”‚   └── nginx.conf           # Nginx config
β”œβ”€β”€ docker-compose.yml       # Docker compose
└── README.md

Quick Start

1. Clone the project

git clone https://github.com/feixukeji/truth-seeker.git
cd truth-seeker

2. Backend Setup

cd backend

# Create virtual environment
python -m venv venv
.\venv\Scripts\activate  # Windows
# source venv/bin/activate  # Linux/Mac

# Install dependencies
pip install -r requirements.txt

# Configure environment variables
copy .env.example .env   # Windows
# cp .env.example .env   # Linux/Mac
# Edit .env and fill in your API keys:
# GOOGLE_API_KEY=Your Gemini API Key
# SEMANTIC_SCHOLAR_API_KEY=Your Semantic Scholar API Key (Optional)

3. Frontend Setup

cd frontend

# Install dependencies
npm install

4. Run Services

Start Backend (Port 8000):

cd backend
.\venv\Scripts\activate
python main.py
# Or use: uvicorn main:app --reload

Start Frontend (Port 3000):

cd frontend
npm run dev

5. Access Application

Open your browser and visit http://localhost:3000

Production Deployment (Docker)

This project supports one-command containerized deployment with Docker Compose, covering the frontend, the backend, and an Nginx reverse proxy.

1. Prerequisites

Ensure your server has Docker and Docker Compose installed.

2. Configure Environment Variables

Create a .env file in the backend directory and fill in your API keys:

cd backend
# Windows
copy .env.example .env
# Linux/Mac
# cp .env.example .env

# Edit .env and fill in real keys

3. Start Services

Run the following in the project root:

docker-compose up -d --build

4. Access Services

Once started, Nginx will listen on port 80.

  • Access via server IP or domain: http://localhost or http://your-server-ip
  • API endpoints are at: http://your-server-ip/api
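
The routing above amounts to something like the following. This is a minimal sketch, assuming the backend container listens on port 8000 and the frontend on 3000; the project's actual nginx/nginx.conf is authoritative:

```nginx
server {
    listen 80;

    # Forward API traffic to the FastAPI backend container
    location /api/ {
        proxy_pass http://backend:8000/;
        proxy_set_header Host $host;
    }

    # Everything else goes to the Next.js frontend container
    location / {
        proxy_pass http://frontend:3000;
        proxy_set_header Host $host;
    }
}
```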

5. Stop Services

docker-compose down

Getting API Keys

Gemini API

  1. Visit Google AI Studio
  2. Sign in with (or create) your Google account
  3. Click "Get API Key" to obtain your key

Semantic Scholar API (Optional but Recommended)

  1. Visit Semantic Scholar API
  2. Register for an API key (initial limit: 1 request/second)
  3. The API works without a key, but you will share the public rate limit

Usage

  1. Input Claim: Enter a claim you want to verify, e.g., "Do eggs increase cholesterol?"

  2. Upload Files: You can upload images or PDF files; AI will automatically extract claims for verification.

  3. View Results: AI will provide a judgment (Correct / Partially Correct / Incorrect / Insufficient Evidence) along with:

    • A brief explanation
    • Detailed scientific analysis
    • Citations of relevant academic papers
  4. Continue Conversation: Ask for details or challenge the judgment.

Notes

  • Semantic Scholar API is limited to 1 request/second; you may need to wait in a queue during high traffic.
  • Gemini 3 Pro uses thinking capabilities; the first response may take some time.
  • Conversation history is stored in the browser's localStorage; clearing browser data will lose history.

License

This project is licensed under the PolyForm Noncommercial License.