Today's apps and AI split what should be a single, fluid process: AI reasoning happens in isolated chat threads, while your information is scattered across tabs and windows.
This split prevents knowledge from compounding. Each conversation starts from scratch: insights don't connect to your existing work, you can't build on past thinking, and you find yourself explaining the same context repeatedly. Valuable findings get buried in chat history or disappear into logs you never revisit.
ThinkEx solves this by making context explicit, organized, and persistent.
ThinkEx is a visual thinking environment where notes, media, and AI conversations compound into lasting knowledge.
Think of a large desk where you spread out textbooks, notes, and papers to work. You look back and forth, connecting dots, comparing sources, and asking questions. ThinkEx brings that desk to your browser, where AI can help alongside you.
- See Everything: Bring PDFs, videos, and notes onto a visual canvas. Organize them spatially to make sense of the information.
- Compare Sources: Look across your sources side-by-side. Spot patterns and contradictions that only emerge when everything is visible.
- Targeted Reasoning: Select specific items on your desk for the AI to analyze. Point to a note and a paragraph and ask for the connection.
- Capture Insights: Extract findings into structured knowledge that become part of your permanent workspace.
- User-Controlled Context: Manually select exact cards, notes, or document sections for the AI. No opaque retrieval mechanisms.
- Spatial Canvas: Arrange notes, PDFs, videos, and chat side-by-side.
- First-Class Media: Native PDF viewing with highlights; YouTube videos with transcript-backed context.
- Persistent Knowledge: Saved cards (notes, flashcards, references) remain in your workspace.
- Multi-Model: Switch AI models per task without locking into a single provider.
- Sharing: Share or export workspaces with others.
| Approach | Examples | What It Loses |
|---|---|---|
| Chat-First | ChatGPT, Gemini, Claude | Insights vanish into an endless scroll, and context resets with every conversation. |
| Notes-First | Notion, Obsidian | AI is bolted on and isolated from your info. |
| Retrieval-First | NotebookLM | Sources are trapped behind the interface where you can't see or work with them. |
Nothing disappears into a black box. You see what AI sees and control what it works with. And it's open source, so you get full transparency, no model lock-in, and a product driven by the community.
- Framework: Next.js
- Styling: Tailwind CSS, Shadcn UI
- Database: PostgreSQL with Drizzle ORM
- State: TanStack Query, Zustand
- Auth: Better Auth
ThinkEx can be self-hosted for local development. The setup uses Docker for PostgreSQL (recommended) while running the Next.js app locally.
- Node.js (v20+)
- pnpm (will be installed automatically if missing)
- Docker (recommended for PostgreSQL) OR PostgreSQL (v12+) installed locally
- Required API Keys:
  - Google AI: API key from Google AI Studio (`GOOGLE_GENERATIVE_AI_API_KEY`)
  - Assistant UI: API key and base URL from Assistant Cloud (`NEXT_PUBLIC_ASSISTANT_BASE_URL`, `ASSISTANT_API_KEY`)
- Optional API Keys:
  - Google OAuth: Credentials from Google Cloud Console, for OAuth login (`GOOGLE_CLIENT_ID`, `GOOGLE_CLIENT_SECRET`)
  - Supabase: Project URL and keys from Supabase, for file storage as an alternative to local storage (`NEXT_PUBLIC_SUPABASE_URL`, `NEXT_PUBLIC_SUPABASE_PUBLISHABLE_OR_ANON_KEY`, `SUPABASE_SERVICE_ROLE_KEY`)
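For reference, once you have the keys, the corresponding `.env` entries look roughly like the sketch below. The variable names come from the list above; all values are placeholders, and `DATABASE_URL` / `BETTER_AUTH_SECRET` are covered in the setup steps that follow.

```bash
# Required
GOOGLE_GENERATIVE_AI_API_KEY=<your-google-ai-studio-key>
NEXT_PUBLIC_ASSISTANT_BASE_URL=<your-assistant-cloud-base-url>
ASSISTANT_API_KEY=<your-assistant-cloud-api-key>

# Optional: Google OAuth login
GOOGLE_CLIENT_ID=<your-google-client-id>
GOOGLE_CLIENT_SECRET=<your-google-client-secret>

# Optional: Supabase file storage
NEXT_PUBLIC_SUPABASE_URL=<your-supabase-project-url>
NEXT_PUBLIC_SUPABASE_PUBLISHABLE_OR_ANON_KEY=<your-supabase-anon-key>
SUPABASE_SERVICE_ROLE_KEY=<your-supabase-service-role-key>
```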
Run the interactive setup script:
```bash
git clone https://github.com/ThinkEx-OSS/thinkex.git
cd thinkex
./setup.sh
```

The script will:

- Check prerequisites (Node.js, pnpm, Docker)
- Create a `.env` file from the template
- Generate `BETTER_AUTH_SECRET` automatically
- Start PostgreSQL in Docker (or use local PostgreSQL if Docker is not available)
- Configure the database connection
- Install dependencies
- Initialize the database schema
- Start the development server automatically
Access ThinkEx at http://localhost:3000
PostgreSQL Docker Commands:
- Stop PostgreSQL: `docker-compose down`
- Start PostgreSQL: `docker-compose up -d`
- View logs: `docker-compose logs -f postgres`
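These commands assume the repository's docker-compose.yml defines a `postgres` service. A minimal sketch of such a service, matching the Docker connection string used in the manual setup below, might look like the following (the actual file shipped with the repo may differ):

```yaml
services:
  postgres:
    image: postgres:16  # assumed image tag; the repo may pin a different version
    environment:
      POSTGRES_USER: thinkex
      POSTGRES_PASSWORD: thinkex_password_change_me
      POSTGRES_DB: thinkex
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  postgres_data:
```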
Alternatively, set up manually:

1. Clone the repository

   ```bash
   git clone https://github.com/ThinkEx-OSS/thinkex.git
   cd thinkex
   ```

2. Start PostgreSQL (Docker)

   ```bash
   docker-compose up -d postgres
   ```

   Or use your local PostgreSQL installation.

3. Install dependencies

   ```bash
   pnpm install
   ```

4. Configure environment variables

   ```bash
   cp .env.example .env
   ```

   Edit `.env` and configure:

   - Database: Set `DATABASE_URL` to your PostgreSQL connection string

     ```bash
     # For Docker PostgreSQL:
     DATABASE_URL=postgresql://thinkex:thinkex_password_change_me@localhost:5432/thinkex
     # For local PostgreSQL:
     DATABASE_URL=postgresql://user:password@localhost:5432/thinkex
     ```

   - Better Auth: Generate `BETTER_AUTH_SECRET` with `openssl rand -base64 32`
   - Google OAuth: Get credentials from Google Cloud Console
   - Supabase: Your Supabase project URL and keys (for file storage, if using Supabase storage)
   - Google AI: API key from Google AI Studio

5. Initialize the database

   ```bash
   pnpm db:push
   ```

6. Start the development server

   ```bash
   pnpm dev
   ```

7. Access the application: Open http://localhost:3000 in your browser.
ThinkEx supports two storage backends for file uploads:
Option 1: Local File Storage (Recommended for Self-Hosting)
- Set `STORAGE_TYPE=local` in your `.env` file
- Files are stored in the `./uploads` directory
- No external dependencies required
- Simple setup with full control over your data
Option 2: Supabase Storage (Cloud-based)
- Set `STORAGE_TYPE=supabase` in your `.env` file
- Configure Supabase credentials:
  - `NEXT_PUBLIC_SUPABASE_URL`: Your Supabase project URL
  - `NEXT_PUBLIC_SUPABASE_PUBLISHABLE_OR_ANON_KEY`: Anon key from Supabase
  - `SUPABASE_SERVICE_ROLE_KEY`: Service role key from Supabase
- Create a storage bucket named `file-upload` and set it to Public
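As a concrete sketch, the corresponding `.env` lines for the two options look like this (pick one; the Supabase values are placeholders):

```bash
# Option 1: local file storage (files are written to ./uploads)
STORAGE_TYPE=local

# Option 2: Supabase storage
# STORAGE_TYPE=supabase
# NEXT_PUBLIC_SUPABASE_URL=<your-supabase-project-url>
# NEXT_PUBLIC_SUPABASE_PUBLISHABLE_OR_ANON_KEY=<your-supabase-anon-key>
# SUPABASE_SERVICE_ROLE_KEY=<your-supabase-service-role-key>
```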
We welcome contributions.
- Fork the repository.
- Create a feature branch: `git checkout -b feature/new-feature`
- Commit changes: `git commit -m 'Add new feature'`
- Push to the branch: `git push origin feature/new-feature`
- Open a Pull Request.
See CONTRIBUTING.md for details.
This project is licensed under the AGPL-3.0 License.