224 changes: 224 additions & 0 deletions OPERATIONS.md
@@ -0,0 +1,224 @@
# RemitWise Operations Guide

This document covers operational procedures for RemitWise, including database backup, restore, and production deployment considerations.

---

## Database

### Current Setup

RemitWise uses **SQLite** (via Prisma) as its database. The database file is a single portable file on disk:

| Environment | Default path | Configured via |
|-------------|------------------------|--------------------|
| Development | `./prisma/dev.db` | `DATABASE_URL` env |
| Test | In-memory / temp file | `TEST_DATABASE_URL` |
| Production | Path you set | `DATABASE_URL` env |

The schema lives in [prisma/schema.prisma](prisma/schema.prisma). It currently stores:

- **User** — Stellar wallet address + timestamps
- **UserPreference** — currency, language, notification settings (1-to-1 with User)

> **Production recommendation:** For hosted deployments, migrate to a managed PostgreSQL service (Supabase, Neon, Vercel Postgres, Railway, etc.) to get point-in-time recovery, managed backups, and connection pooling. See [Migrating to PostgreSQL](#migrating-to-postgresql) below.

---

## Backup Strategy

### SQLite (Development / Self-hosted)

SQLite databases are a single file. The safest way to back them up without corruption is to use SQLite's built-in `.backup` command or simply copy the file while no writes are in flight.

**Retention policy:** Keep at least **7 daily backups** before rotating. For production data, extend to 30 days.
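
The rotation rule maps directly onto `find -mtime`, which is also what the bundled script uses. A dry-run sketch (the directory name is illustrative):

```shell
# Dry run: list (but do not yet delete) backups older than 7 days.
# BACKUP_DIR here is an example path, not a hard requirement.
BACKUP_DIR="${BACKUP_DIR:-./backups}"
mkdir -p "$BACKUP_DIR"
find "$BACKUP_DIR" -maxdepth 1 -name "remitwise_db_*.db" -mtime +7 -print
# Append -delete after -print once the listed files look right.
```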

#### Manual backup

```bash
# Using SQLite's online backup (safe while DB is live)
sqlite3 /path/to/dev.db ".backup '/path/to/backups/dev.db.$(date +%Y%m%d_%H%M%S)'"

# Or a simple file copy (only safe when the app is stopped; in WAL mode a
# plain copy can miss recent writes still sitting in the -wal file)
cp /path/to/dev.db /path/to/backups/dev.db.$(date +%Y%m%d_%H%M%S)
```

#### Automated backup script

The repository ships a backup script at [scripts/backup-db.sh](scripts/backup-db.sh).

```bash
# Run once manually
bash scripts/backup-db.sh

# Schedule daily at 02:00 via cron (edit with `crontab -e`)
0 2 * * * /bin/bash /absolute/path/to/remitwise/scripts/backup-db.sh >> /var/log/remitwise-backup.log 2>&1
```

The script:
1. Creates a timestamped `.db` copy in `./backups/`
2. Optionally uploads to S3 if `BACKUP_S3_BUCKET` is set
3. Deletes local copies older than `BACKUP_RETENTION_DAYS` (default: 7)
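
A backup is only useful if it restores cleanly, so it is worth spot-checking the latest copy with SQLite's `PRAGMA integrity_check`. A sketch (paths are illustrative):

```shell
# Sketch: verify that the most recent local backup opens cleanly.
# Assumes backups live in ./backups, as created by the script above.
LATEST="$(ls -t backups/remitwise_db_*.db 2>/dev/null | head -n1)"
if [[ -n "$LATEST" ]]; then
  sqlite3 "$LATEST" "PRAGMA integrity_check;"   # prints "ok" for a healthy file
fi
```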

#### Uploading to S3 (optional)

Set these environment variables (`.env.local` or your deployment secrets):

```bash
BACKUP_S3_BUCKET=your-bucket-name # Required to enable S3 upload
BACKUP_S3_PREFIX=remitwise/db # Optional prefix, default: remitwise/db
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
AWS_DEFAULT_REGION=us-east-1
```

Requires the [AWS CLI](https://aws.amazon.com/cli/) to be installed on the host.

---

### Managed PostgreSQL (Production)

If you switch to a managed PostgreSQL provider, built-in automated backups handle most of this for you:

| Provider | Automatic backups | Retention | Point-in-time recovery | Notes |
|------------------|-------------------|-------------|------------------------|--------------------------------------------|
| Supabase | Yes (daily) | 7 days (free), 30 days (Pro) | Pro plan | Enable in Project Settings → Database |
| Neon | Yes (continuous) | 7 days | Yes (all plans) | Branching also acts as a snapshot |
| Vercel Postgres | Yes (daily) | 7 days | Paid plans | Managed via Vercel dashboard |
| Railway | Yes (daily) | 7 days | Paid plans | Configure in service settings |
| AWS RDS | Yes (automated) | 1–35 days | Yes | Set `BackupRetentionPeriod` in console |

For these providers, **no additional scripting is needed** — verify that automated backups are enabled in the provider dashboard and set a retention period of at least 7 days.

#### Manual `pg_dump` snapshot (any PostgreSQL)

Even with managed backups, taking a manual snapshot before schema migrations is good practice:

```bash
# Full dump (custom format — compressed, supports selective restore)
pg_dump --format=custom \
  --no-acl --no-owner \
  --dbname="$DATABASE_URL" \
  --file="remitwise_$(date +%Y%m%d_%H%M%S).pgdump"

# Restore a specific dump
pg_restore --clean --no-acl --no-owner \
  --dbname="$DATABASE_URL" \
  remitwise_20260101_020000.pgdump
```

---

## Restore Steps

### Restoring a SQLite backup

1. **Stop the application** (or ensure no active write connections).

2. **Replace the database file:**
   ```bash
   # Back up the current (possibly corrupted) file first
   cp prisma/dev.db prisma/dev.db.broken

   # Restore from a known-good backup
   cp backups/dev.db.20260101_020000 prisma/dev.db
   ```

3. **Verify the restored database:**
   ```bash
   sqlite3 prisma/dev.db "SELECT count(*) FROM User;"
   ```

4. **Run any pending Prisma migrations** to ensure the schema is current:
   ```bash
   npx prisma migrate deploy
   ```

5. **Restart the application.**
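
Steps 1 through 3 above can be wrapped in a small helper. A sketch (the function name and default paths are hypothetical):

```shell
# restore_sqlite: hypothetical helper for the replace-and-verify steps above.
# Usage: restore_sqlite backups/dev.db.20260101_020000 [prisma/dev.db]
restore_sqlite() {
  local backup="$1" target="${2:-prisma/dev.db}"
  if [[ ! -f "$backup" ]]; then
    echo "backup not found: $backup" >&2
    return 1
  fi
  # Keep the current (possibly corrupted) file for post-mortem analysis
  if [[ -f "$target" ]]; then
    cp "$target" "$target.broken"
  fi
  cp "$backup" "$target"
  # Fail if the restored file is not a healthy SQLite database
  [[ "$(sqlite3 "$target" 'PRAGMA integrity_check;')" == "ok" ]]
}
```

Run `npx prisma migrate deploy` and restart the application afterwards, as in steps 4 and 5.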

### Restoring from a managed PostgreSQL backup

Follow your provider's restore flow:

- **Supabase:** Dashboard → Database → Backups → Restore
- **Neon:** Dashboard → Branches → Create branch from point-in-time
- **Vercel Postgres:** Dashboard → Storage → your DB → Backups tab
- **Railway:** Dashboard → your database service → Backups

After restoring at the provider level:

```bash
# Apply any migrations that were added after the backup point
npx prisma migrate deploy
```

### Restoring a `pg_dump` file

```bash
# 1. Create a clean target database (optional — pg_restore --clean handles this)
createdb remitwise_restored

# 2. Restore
pg_restore --clean --no-acl --no-owner \
--dbname="postgresql://user:pass@host/remitwise_restored" \
remitwise_20260101_020000.pgdump

# 3. Run pending migrations
DATABASE_URL="postgresql://user:pass@host/remitwise_restored" \
npx prisma migrate deploy
```

---

## Migrating to PostgreSQL

To switch from SQLite to PostgreSQL:

1. **Update `prisma/schema.prisma`:**
   ```diff
    datasource db {
   -  provider = "sqlite"
   +  provider = "postgresql"
      url      = env("DATABASE_URL")
    }
   ```

2. **Set `DATABASE_URL`** to your PostgreSQL connection string:
   ```bash
   DATABASE_URL="postgresql://user:password@host:5432/remitwise?schema=public"
   ```

3. **Create a new migration** (or reset for a fresh database):
   ```bash
   # For a fresh database
   npx prisma migrate dev --name init

   # In production, apply the committed migration files
   npx prisma migrate deploy
   ```

4. **Migrate existing data** from SQLite (if needed):
   ```bash
   # Export SQLite data as SQL
   sqlite3 prisma/dev.db .dump > sqlite_export.sql

   # Data will need manual adjustment for PostgreSQL syntax differences
   # (boolean literals, AUTOINCREMENT → SERIAL, etc.)
   ```

For a smoother migration, use a tool like [pgloader](https://pgloader.io/):
```bash
pgloader sqlite:///absolute/path/prisma/dev.db \
  postgresql://user:password@host/remitwise
```

---

## Checklist — Before a Production Release

- [ ] Confirm automated backups are enabled in the database provider dashboard
- [ ] Verify backup retention is set to at least 7 days
- [ ] Take a manual snapshot / `pg_dump` before running schema migrations
- [ ] Test restore procedure on a staging environment at least once
- [ ] Confirm `DATABASE_URL` points to the correct production database
- [ ] Run `npx prisma migrate deploy` (not `migrate dev`) in production
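
The database-related items in this checklist can be partly automated with a preflight check. A sketch (the function name is hypothetical; it only verifies tools and the URL shape, then leaves the snapshot and `migrate deploy` to you):

```shell
# preflight_db: hypothetical pre-release sanity check.
preflight_db() {
  local fail=0 tool
  # Both tools are needed for the snapshot + migration steps
  for tool in npx pg_dump; do
    command -v "$tool" >/dev/null 2>&1 || { echo "missing tool: $tool" >&2; fail=1; }
  done
  # Catch a DATABASE_URL that is unset or still pointing at SQLite
  case "${DATABASE_URL:-}" in
    postgresql://*|postgres://*) ;;
    "") echo "DATABASE_URL is not set" >&2; fail=1 ;;
    *)  echo "DATABASE_URL does not look like a PostgreSQL URL" >&2; fail=1 ;;
  esac
  return "$fail"
}
```

If the check passes, take the `pg_dump` snapshot described earlier and then run `npx prisma migrate deploy`.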
10 changes: 10 additions & 0 deletions README.md
@@ -553,6 +553,16 @@ Response shape for all paginated endpoints:
- Dark mode
- Mobile app (React Native)

## Operations

For database backup, restore, and production deployment procedures see [OPERATIONS.md](./OPERATIONS.md).

**Quick reference:**

- **Backup (SQLite):** `bash scripts/backup-db.sh` — creates a timestamped copy in `./backups/`, optionally uploads to S3, and rotates files older than 7 days.
- **Restore:** Stop the app → replace `prisma/dev.db` → `npx prisma migrate deploy` → restart.
- **Production:** Switch `prisma/schema.prisma` provider to `postgresql`, set `DATABASE_URL`, and use a managed provider (Supabase, Neon, Vercel Postgres, etc.) with built-in daily backups and 7-day retention.

## License

MIT
93 changes: 93 additions & 0 deletions scripts/backup-db.sh
@@ -0,0 +1,93 @@
#!/usr/bin/env bash
# =============================================================================
# RemitWise — SQLite database backup script
#
# Usage:
# bash scripts/backup-db.sh
#
# Environment variables (can be set in .env.local or the shell):
#   DATABASE_PATH          Path to the SQLite .db file (default: prisma/dev.db)
#   BACKUP_DIR             Directory for local backups (default: ./backups)
#   BACKUP_RETENTION_DAYS  Days to keep local backups (default: 7)
#   BACKUP_S3_BUCKET       S3 bucket name; leave unset to skip S3 upload
#   BACKUP_S3_PREFIX       S3 key prefix (default: remitwise/db)
#   AWS_ACCESS_KEY_ID      \
#   AWS_SECRET_ACCESS_KEY   > Required when BACKUP_S3_BUCKET is set
#   AWS_DEFAULT_REGION     /
# =============================================================================
set -euo pipefail

# ---------------------------------------------------------------------------
# Load .env.local if present (key=value pairs only; no export needed)
# ---------------------------------------------------------------------------
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(dirname "$SCRIPT_DIR")"

if [[ -f "$REPO_ROOT/.env.local" ]]; then
  while IFS= read -r line; do
    [[ "$line" =~ ^#.*$ || -z "$line" ]] && continue
    key="${line%%=*}"
    val="${line#*=}"
    # Only export if not already set in the environment
    [[ -v "$key" ]] || export "$key"="$val"
  done < "$REPO_ROOT/.env.local"
fi

# ---------------------------------------------------------------------------
# Configuration
# ---------------------------------------------------------------------------
DATABASE_PATH="${DATABASE_PATH:-$REPO_ROOT/prisma/dev.db}"
BACKUP_DIR="${BACKUP_DIR:-$REPO_ROOT/backups}"
BACKUP_RETENTION_DAYS="${BACKUP_RETENTION_DAYS:-7}"
BACKUP_S3_BUCKET="${BACKUP_S3_BUCKET:-}"
BACKUP_S3_PREFIX="${BACKUP_S3_PREFIX:-remitwise/db}"

TIMESTAMP="$(date +%Y%m%d_%H%M%S)"
BACKUP_FILENAME="remitwise_db_${TIMESTAMP}.db"
BACKUP_PATH="$BACKUP_DIR/$BACKUP_FILENAME"

# ---------------------------------------------------------------------------
# Preflight checks
# ---------------------------------------------------------------------------
if [[ ! -f "$DATABASE_PATH" ]]; then
  echo "[backup] ERROR: database file not found: $DATABASE_PATH" >&2
  exit 1
fi

if ! command -v sqlite3 &>/dev/null; then
  echo "[backup] ERROR: sqlite3 is not installed or not on PATH" >&2
  exit 1
fi

mkdir -p "$BACKUP_DIR"

# ---------------------------------------------------------------------------
# Create backup using SQLite's online backup API (safe during live writes)
# ---------------------------------------------------------------------------
echo "[backup] Starting backup: $DATABASE_PATH → $BACKUP_PATH"
sqlite3 "$DATABASE_PATH" ".backup '$BACKUP_PATH'"
echo "[backup] Backup complete: $BACKUP_PATH ($(du -sh "$BACKUP_PATH" | cut -f1))"

# ---------------------------------------------------------------------------
# Optional: upload to S3
# ---------------------------------------------------------------------------
if [[ -n "$BACKUP_S3_BUCKET" ]]; then
  if ! command -v aws &>/dev/null; then
    echo "[backup] WARNING: BACKUP_S3_BUCKET is set but 'aws' CLI not found — skipping S3 upload" >&2
  else
    S3_KEY="${BACKUP_S3_PREFIX}/${BACKUP_FILENAME}"
    echo "[backup] Uploading to s3://${BACKUP_S3_BUCKET}/${S3_KEY} ..."
    aws s3 cp "$BACKUP_PATH" "s3://${BACKUP_S3_BUCKET}/${S3_KEY}" \
      --storage-class STANDARD_IA
    echo "[backup] S3 upload complete."
  fi
fi

# ---------------------------------------------------------------------------
# Rotate old local backups
# ---------------------------------------------------------------------------
echo "[backup] Removing local backups older than ${BACKUP_RETENTION_DAYS} days ..."
find "$BACKUP_DIR" -maxdepth 1 -name "remitwise_db_*.db" \
  -mtime "+${BACKUP_RETENTION_DAYS}" -print -delete

echo "[backup] Done."