A high-performance MCP (Model Context Protocol) vector database implemented in Ada SPARK, designed to leverage Ada's strong typing, formal verification capabilities, and built-in concurrency features for safe and reliable vector operations.
- Ada SPARK Architecture: Built with formal verification and strong typing for maximum safety
- Task-Based Concurrency: Ada's protected objects and tasks for safe concurrent operations
- OAuth 2.1 Authentication: Secure authentication with client credentials and refresh token flows
- REST API: Full REST API alongside MCP protocol for web integration
- Vector Compression: Multiple compression algorithms to reduce storage requirements
- MCP Protocol Support: Full Model Context Protocol implementation for AI/ML integration
- Persistent Storage: File-based persistence with in-memory caching for fast operations
- Backup & Recovery: Full backup/restore capabilities with JSON export/import
- Concurrent Vector Operations: Leverages Ada's task system for parallel operations
- Multiple Distance Metrics: Cosine similarity, Euclidean distance, and Manhattan distance (see the sketch after this list)
- Dynamic Store Management: Create and manage multiple vector stores at runtime
- Memory Safety: Ada's strong typing and runtime checks prevent buffer overflows and memory corruption
- Formal Verification: SPARK contracts ensure correctness of critical operations
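For reference, here is how the three distance metrics compute. The sketch below is standalone and illustrative only: it uses a local Vector type rather than the project's Spark_VectorDB types, and it assumes both vectors share the same index range (and are non-zero for cosine similarity).

--  distance_sketch.adb: illustrative metrics over a local Vector type.
with Ada.Text_IO;                        use Ada.Text_IO;
with Ada.Numerics.Elementary_Functions;  use Ada.Numerics.Elementary_Functions;

procedure Distance_Sketch is
   type Vector is array (Positive range <>) of Float;

   A : constant Vector := (1.0, 2.0, 3.0);
   B : constant Vector := (1.1, 2.1, 3.1);

   --  Assumes X and Y have identical bounds and are non-zero.
   function Cosine_Similarity (X, Y : Vector) return Float is
      Dot, NX, NY : Float := 0.0;
   begin
      for I in X'Range loop
         Dot := Dot + X (I) * Y (I);
         NX  := NX + X (I) * X (I);
         NY  := NY + Y (I) * Y (I);
      end loop;
      return Dot / (Sqrt (NX) * Sqrt (NY));
   end Cosine_Similarity;

   --  Straight-line (L2) distance between X and Y.
   function Euclidean (X, Y : Vector) return Float is
      S : Float := 0.0;
   begin
      for I in X'Range loop
         S := S + (X (I) - Y (I)) ** 2;
      end loop;
      return Sqrt (S);
   end Euclidean;

   --  Sum of absolute coordinate differences (L1).
   function Manhattan (X, Y : Vector) return Float is
      S : Float := 0.0;
   begin
      for I in X'Range loop
         S := S + abs (X (I) - Y (I));
      end loop;
      return S;
   end Manhattan;
begin
   Put_Line ("Cosine:    " & Float'Image (Cosine_Similarity (A, B)));
   Put_Line ("Euclidean: " & Float'Image (Euclidean (A, B)));
   Put_Line ("Manhattan: " & Float'Image (Manhattan (A, B)));
end Distance_Sketch;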
Ada SPARK VectorDB Application
├── Cluster_Manager (Distributed Management)
├── OAuth_Server (OAuth 2.1 Authentication Server)
├── Rest_Api_Server (REST API Server)
├── Vector_Store_Supervisor (Protected Object for Store Management)
│ ├── Vector_Store (Protected Object per store)
│ │ └── Vector_Persistence (File I/O + Compression)
│ └── Vector_Store (Protected Object per store)
│ └── Vector_Persistence (File I/O + Compression)
├── MCP_Server (MCP Protocol Handler with OAuth)
└── Vector_Index_Manager (Index Management)
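Each Vector_Store in the tree above is a protected object, so mutating operations get exclusive access while protected functions allow concurrent readers. A minimal sketch of that shape, with hypothetical names and a fixed capacity (the project's actual Vector_Store is richer):

--  Hypothetical per-store protected object; illustrative only.
package Store_Sketch is
   Max_Vectors : constant := 1_000;
   type Vector_3     is array (1 .. 3) of Float;
   type Vector_Table is array (1 .. Max_Vectors) of Vector_3;

   protected type Vector_Store is
      --  Protected procedure: callers get exclusive write access.
      procedure Insert (V : Vector_3; Ok : out Boolean);
      --  Protected function: multiple readers may run concurrently.
      function Count return Natural;
   private
      Data : Vector_Table;
      Used : Natural := 0;
   end Vector_Store;
end Store_Sketch;

package body Store_Sketch is
   protected body Vector_Store is
      procedure Insert (V : Vector_3; Ok : out Boolean) is
      begin
         if Used < Max_Vectors then
            Used        := Used + 1;
            Data (Used) := V;
            Ok := True;
         else
            Ok := False;  --  Store is full; reject the insert.
         end if;
      end Insert;

      function Count return Natural is
      begin
         return Used;
      end Count;
   end Vector_Store;
end Store_Sketch;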
# If you have Alire installed
alr get spark_vectordb
cd spark_vectordb*
alr build
alr run
# Or clone and build
git clone <repository-url>
cd ada-spark-vectordb
alr build --profiles=development
alr run

# Or build with Make
git clone <repository-url>
cd ada-spark-vectordb
make build
make run

Option 1: Alire (Recommended)
- Alire 2.0+ (Ada package manager)
- Download from: https://alire.ada.dev/
Option 2: Traditional GNAT
- GNAT Community 2023+ or GNAT Pro
- Ada SPARK 2014+
- GPRbuild
Alire automatically manages the Ada toolchain and dependencies:
# Download and install from https://alire.ada.dev/
# Or use package managers:
# macOS (with Homebrew)
brew install alire
# Arch Linux
yay -S alire
# Or download binary releases from GitHub

AdaCore Community Edition:
- Download from: https://www.adacore.com/download
- Follow the installation instructions for your platform
Package Managers:
# Ubuntu/Debian
sudo apt-get install gnat gprbuild
# macOS (with Homebrew)
brew install gnat
# Fedora
sudo dnf install gcc-gnat gprbuild
# Arch Linux
sudo pacman -S gcc-ada gprbuild

Using Alire (Recommended):
# Development build (default)
alr build
# Or specify build profile
alr build --profiles=development
alr build --profiles=validation # Full validation with contracts
alr build --profiles=release # Optimized release build
# Using Make with Alire
make build # Uses Alire if available
make BUILD_MODE=release build  # Release build

Using Make:
make build # Auto-detects Alire or gprbuild
make examples # Build example programs
make BUILD_MODE=validation build  # Full validation build

Using GPRbuild directly:
gprbuild -P spark_vectordb.gpr -XBUILD_MODE=development
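The BUILD_MODE scenario variable is typically declared inside the project file and switched on to select compiler flags. A hypothetical excerpt of what spark_vectordb.gpr might contain (the real file may differ):

--  Hypothetical excerpt; the actual spark_vectordb.gpr may differ.
project Spark_VectorDB is
   type Mode is ("development", "validation", "release");
   Build_Mode : Mode := external ("BUILD_MODE", "development");

   package Compiler is
      case Build_Mode is
         when "development" =>
            --  Debug info plus assertions/contracts.
            for Default_Switches ("Ada") use ("-g", "-gnata");
         when "validation" =>
            --  Contracts plus full validity checking.
            for Default_Switches ("Ada") use ("-g", "-gnata", "-gnatVa");
         when "release" =>
            --  Optimized, minimal checks.
            for Default_Switches ("Ada") use ("-O2");
      end case;
   end Compiler;
end Spark_VectorDB;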
Using the build script:
./build.sh  # Auto-detects the best build method

Using Alire:
alr run  # Run main application

Using Make:
make run # Run main application
make run-examples  # Run example programs

Direct execution:
./bin/spark_vectordb_main # Main application
./bin/examples/basic_usage  # Example program

With Alire:
# Initial setup (if starting from scratch)
alr init --bin spark_vectordb
# Regular development
alr build # Build
alr run # Run
alr clean # Clean
# SPARK verification
alr exec -- gnatprove -P spark_vectordb.gpr --level=2
# Install locally
alr install

Build Profiles:
- validation: Full contracts, assertions, and runtime checks
- development: Debug info with contracts (default)
- release: Optimized with minimal checks
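The validation profile enables the SPARK contracts that gnatprove discharges (see the alr exec -- gnatprove command above). A minimal, hypothetical example of such a contract, unrelated to the project's actual API:

--  Illustrative only: a SPARK contract gnatprove can prove statically.
package Clamp_Sketch with SPARK_Mode is
   function Clamp (X, Lo, Hi : Integer) return Integer with
     Pre  => Lo <= Hi,
     Post => Clamp'Result in Lo .. Hi
       and then (if X in Lo .. Hi then Clamp'Result = X);
end Clamp_Sketch;

package body Clamp_Sketch with SPARK_Mode is
   function Clamp (X, Lo, Hi : Integer) return Integer is
   begin
      if X < Lo then
         return Lo;
      elsif X > Hi then
         return Hi;
      else
         return X;
      end if;
   end Clamp;
end Clamp_Sketch;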
with Ada.Text_IO;    use Ada.Text_IO;
with Spark_VectorDB; use Spark_VectorDB;

procedure Example is
   Token_Response : OAuth_Token_Map;
   Vector_1       : Vector_Array := (1.0, 2.0, 3.0);
   Query_Vector   : Vector_Array := (1.1, 2.1, 3.1);
   Results        : Search_Results_Array;
begin
   --  Start the application
   Start_Application;

   --  Register an OAuth client (optional - a default admin client is created)
   Register_OAuth_Client ("my_client", "my_secret",
                          (Scopes => ("read", "write")));

   --  Get an OAuth access token
   Token_Response := Get_OAuth_Token ("admin", "admin_secret_2024",
                                      ("read", "write"));

   --  Create a regular vector store
   Create_Store ("my_store");

   --  Insert vectors (with automatic compression if enabled)
   Insert_Vector ("my_store", "doc1", Vector_1,
                  (Title => "Document 1"));

   --  Search for similar vectors
   Results := Search_Vectors ("my_store", Query_Vector, 5);

   --  Get store statistics
   declare
      Stats : Store_Statistics := Get_Stats ("my_store");
   begin
      Put_Line ("Store count: " & Stats.Count'Image);
   end;
end Example;

This project is licensed under the GNU General Public License v3.0 - see the LICENSE file for details.