An AI-powered tool for verifying everyday claims. It combines Gemini 3 Pro with Semantic Scholar academic search to help users debunk myths and find the truth.
- 🔍 Claim Verification: Input common life claims (health, psychology, etc.) and the AI will judge their veracity.
- 📚 Academic Support: Automatically searches for relevant academic papers to provide evidence-based scientific explanations.
- 🖼️ Multimodal Support: Upload images and documents; the AI will extract claims from them.
- 💬 Interactive Chat: Supports follow-up questions and challenges, with history stored locally in the browser.
- 📱 Mobile Friendly: Responsive design for a smooth experience on mobile devices.
- FastAPI - Python Web Framework
- google-genai - Gemini API SDK
- Semantic Scholar API - Academic paper search
- asyncio - Asynchronous task queue (respecting 1 RPS limit)
- Next.js 15 - React Framework
- TypeScript - Type safety
- Tailwind CSS - Styling
- react-markdown - Markdown rendering
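The 1 RPS constraint mentioned in the backend stack can be enforced with a small asyncio gate that spaces out request starts. The sketch below is illustrative only; the project's real logic lives in `backend/utils/queue_manager.py` and the class and method names here are made up:

```python
import asyncio
import time


class OneRpsQueue:
    """Space out request starts so at most one begins per `interval` seconds.

    Illustrative sketch of a rate-limiting queue; not the project's
    actual queue_manager implementation.
    """

    def __init__(self, interval: float = 1.0) -> None:
        self._interval = interval
        self._lock = asyncio.Lock()
        # Earliest monotonic time at which the next request may start.
        self._next_start = 0.0

    async def run(self, coro_factory):
        """Await `coro_factory()` once a rate-limit slot is available."""
        async with self._lock:
            delay = self._next_start - time.monotonic()
            if delay > 0:
                # A request started less than `interval` ago: wait it out.
                await asyncio.sleep(delay)
            self._next_start = time.monotonic() + self._interval
        # The lock is released before awaiting the request itself, so only
        # the *starts* are serialized, not the responses.
        return await coro_factory()
```

Callers would wrap each Semantic Scholar request in `queue.run(...)`; concurrent users then queue up transparently instead of tripping the API's rate limit.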
```
Truth Seeker/
├── backend/
│   ├── main.py                        # FastAPI entry point
│   ├── requirements.txt               # Python dependencies
│   ├── .env.example                   # Environment variables example
│   ├── Dockerfile                     # Backend Dockerfile
│   ├── prompts/
│   │   └── system_prompt.py           # System prompts
│   ├── routers/
│   │   └── chat.py                    # Chat API routes
│   ├── services/
│   │   ├── gemini_service.py          # Gemini API service
│   │   └── semantic_scholar.py        # Academic search service
│   └── utils/
│       └── queue_manager.py           # Request queue management
├── frontend/
│   ├── app/
│   │   ├── layout.tsx                 # Root layout
│   │   ├── page.tsx                   # Main page
│   │   └── globals.css                # Global styles
│   ├── components/
│   │   ├── ChatMessage.tsx            # Message components
│   │   ├── ChatInput.tsx              # Input components
│   │   └── ConversationSidebar.tsx    # Sidebar
│   ├── hooks/
│   │   └── useConversationHistory.ts  # History hook
│   ├── lib/
│   │   └── api.ts                     # API client
│   ├── Dockerfile                     # Frontend Dockerfile
│   └── package.json
├── nginx/
│   └── nginx.conf                     # Nginx config
├── docker-compose.yml                 # Docker compose
└── README.md
```
```bash
git clone https://github.com/feixukeji/truth-seeker.git
cd truth-seeker
```

Backend setup:

```bash
cd backend

# Create a virtual environment
python -m venv venv
.\venv\Scripts\activate    # Windows
# source venv/bin/activate # Linux/Mac

# Install dependencies
pip install -r requirements.txt

# Configure environment variables
copy .env.example .env
# Edit .env and fill in your API keys:
# GOOGLE_API_KEY=Your Gemini API Key
# SEMANTIC_SCHOLAR_API_KEY=Your Semantic Scholar API Key (optional)
```

Frontend setup:

```bash
cd frontend

# Install dependencies
npm install
```

Start the backend (port 8000):

```bash
cd backend
.\venv\Scripts\activate
python main.py
# Or use: uvicorn main:app --reload
```

Start the frontend (port 3000):

```bash
cd frontend
npm run dev
```

Open your browser and visit http://localhost:3000.
This project supports one-click containerized deployment using Docker Compose, including frontend, backend, and Nginx reverse proxy.
Ensure your server has Docker and Docker Compose installed.
Create a .env file in the backend directory and fill in your API keys:

```bash
cd backend
# Windows
copy .env.example .env
# Linux/Mac
# cp .env.example .env

# Edit .env and fill in real keys
```

Then run the following in the project root:

```bash
docker-compose up -d --build
```

Once started, Nginx will listen on port 80.
- Access via server IP or domain: http://localhost or http://your-server-ip
- API endpoints are at: http://your-server-ip/api
Stop the services with:

```bash
docker-compose down
```

Gemini API key:

- Visit Google AI Studio
- Create or log in to your Google account
- Click "Get API Key" to obtain your key
Semantic Scholar API key:

- Visit Semantic Scholar API
- Register for an API Key (initial limit 1 RPS)
- You can use it without a key, but you will share the public rate limit
- Input Claim: Enter a claim you want to verify, e.g., "Do eggs increase cholesterol?"
- Upload Files: You can upload images or PDF files; the AI will automatically extract claims for verification.
- View Results: The AI will provide a judgment (Correct / Partially Correct / Incorrect / Insufficient Evidence) along with:
  - A brief explanation
  - A detailed scientific analysis
  - Citations of relevant academic papers
- Continue the Conversation: Ask for details or challenge the judgment.
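The project's real prompts live in `backend/prompts/system_prompt.py`; as a hypothetical sketch only, a verification prompt covering the verdicts and sections listed above might be assembled like this:

```python
# Verdict labels match the judgment categories described in the usage notes.
VERDICTS = ("Correct", "Partially Correct", "Incorrect", "Insufficient Evidence")


def build_verification_prompt(claim: str, papers: list[dict]) -> str:
    """Assemble a claim-verification prompt; structure is illustrative only."""
    refs = "\n".join(
        f"- {p.get('title', 'Untitled')} ({p.get('year', 'n.d.')})" for p in papers
    )
    return (
        f"Claim to verify: {claim}\n\n"
        f"Relevant papers:\n{refs or '- none found'}\n\n"
        "Respond with one verdict from "
        f"{', '.join(VERDICTS)}, a brief explanation, a detailed "
        "scientific analysis, and citations of the papers above."
    )
```

The resulting string would then be sent to Gemini via the google-genai client (for example through `generate_content`), with the paper list coming from the Semantic Scholar search service.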
- The Semantic Scholar API is limited to 1 request/second; you may need to wait in a queue during high-traffic periods.
- Gemini 3 Pro performs extended reasoning ("thinking") before answering, so the first response may take some time.
- Conversation history is stored in the browser's localStorage; clearing browser data will delete it.
This project is licensed under the PolyForm Noncommercial License.