
Conversation

@davassi
Owner

@davassi davassi commented Nov 28, 2025

No description provided.

@davassi davassi requested a review from Copilot November 28, 2025 08:01
@davassi davassi merged commit 6d78041 into master Nov 28, 2025
6 checks passed

Copilot AI left a comment


Pull request overview

This PR adds SQLite database persistence for order execution history and introduces history/analytics API endpoints to query stored order data. The implementation enables tracking of successful orders, failures, and retry attempts.

Key changes:

  • New OrderDatabase class providing SQLite persistence for orders and failures
  • Integration of database logging into the webhook order placement flow
  • Four new REST API endpoints for querying order history, failures, and statistics

Reviewed changes

Copilot reviewed 4 out of 5 changed files in this pull request and generated 11 comments.

  • hypertrade/database.py: New SQLite database layer with tables for orders and failures, providing CRUD operations and statistics queries
  • hypertrade/routes/webhooks.py: Integrated database logging into the order placement flow; added history_router with 4 endpoints for querying order data
  • hypertrade/daemon.py: Added database initialization at app startup, with error handling and a configurable enable/disable flag
  • hypertrade/config.py: Added db_path and db_enabled configuration fields with defaults
  • hypertrade.db: Binary SQLite database file committed to the repository (should be excluded)
Comments suppressed due to low confidence (1)

hypertrade/database.py:7

  • Import of 'Path' is not used.
from pathlib import Path


Comment on lines +337 to +376
    def get_statistics(self) -> Dict[str, Any]:
        """Get summary statistics about orders and failures.

        Returns:
            Dictionary with stats
        """
        conn = self._get_connection()
        cursor = conn.cursor()

        cursor.execute("SELECT COUNT(*) as total_orders FROM orders")
        total_orders = cursor.fetchone()["total_orders"]

        cursor.execute("SELECT COUNT(*) as failed_orders FROM orders WHERE status IN ('FAILED', 'REJECTED')")
        failed_orders = cursor.fetchone()["failed_orders"]

        cursor.execute("SELECT COUNT(*) as total_failures FROM failures")
        total_failures = cursor.fetchone()["total_failures"]

        cursor.execute("""
            SELECT symbol, COUNT(*) as count FROM orders
            GROUP BY symbol ORDER BY count DESC LIMIT 5
        """)
        top_symbols = [dict(row) for row in cursor.fetchall()]

        cursor.execute("""
            SELECT error_type, COUNT(*) as count FROM failures
            GROUP BY error_type ORDER BY count DESC LIMIT 5
        """)
        top_errors = [dict(row) for row in cursor.fetchall()]

        conn.close()

        return {
            "total_orders": total_orders,
            "failed_orders": failed_orders,
            "success_rate": (total_orders - failed_orders) / total_orders * 100 if total_orders > 0 else 0,
            "total_failures": total_failures,
            "top_symbols": top_symbols,
            "top_errors": top_errors,
        }

Copilot AI Nov 28, 2025


[nitpick] Performance: the get_statistics() method executes five separate queries sequentially, which could become slow as the database grows. Consider combining the aggregate counts into a single query using Common Table Expressions (CTEs) or subqueries for better performance:

cursor.execute("""
    WITH order_stats AS (
        SELECT 
            COUNT(*) as total,
            SUM(CASE WHEN status IN ('FAILED', 'REJECTED') THEN 1 ELSE 0 END) as failed
        FROM orders
    ),
    failure_stats AS (
        SELECT COUNT(*) as total FROM failures
    )
    SELECT * FROM order_stats, failure_stats
""")

Copilot uses AI. Check for mistakes.
Comment on lines +202 to +270
        if db and req_id:
            db.log_order(
                request_id=req_id,
                symbol=symbol,
                side=side.value,
                signal=signal.value,
                quantity=contracts,
                price=price,
                leverage=leverage,
                subaccount=vault_address,
                status="REJECTED",
                execution_ms=(time.perf_counter() - start_time) * 1000,
            )
            db.log_failure(
                request_id=req_id,
                error_type=e.__class__.__name__,
                error_message=str(e),
                attempt=1,
                retry_count=0,
            )
        raise HTTPException(status_code=400, detail=f"Invalid order: {e}") from e
    except HyperliquidNetworkError as e:
        log.error("Network error placing order (after retries): %s", e)
        if db and req_id:
            db.log_order(
                request_id=req_id,
                symbol=symbol,
                side=side.value,
                signal=signal.value,
                quantity=contracts,
                price=price,
                leverage=leverage,
                subaccount=vault_address,
                status="FAILED",
                execution_ms=(time.perf_counter() - start_time) * 1000,
            )
            db.log_failure(
                request_id=req_id,
                error_type=e.__class__.__name__,
                error_message=str(e),
                attempt=3,
                retry_count=2,
            )
        raise HTTPException(
            status_code=503,
            detail="Temporary service unavailable - order may have been placed, check manually"
        ) from e
    except HyperliquidAPIError as e:
        log.error("API error placing order (after retries): %s", e)
        if db and req_id:
            db.log_order(
                request_id=req_id,
                symbol=symbol,
                side=side.value,
                signal=signal.value,
                quantity=contracts,
                price=price,
                leverage=leverage,
                subaccount=vault_address,
                status="FAILED",
                execution_ms=(time.perf_counter() - start_time) * 1000,
            )
            db.log_failure(
                request_id=req_id,
                error_type=e.__class__.__name__,
                error_message=str(e),
                attempt=3,
                retry_count=2,
            )

Copilot AI Nov 28, 2025


Significant code duplication in database logging logic. The same db.log_order() and db.log_failure() calls are repeated in three exception handlers (lines 202-221, 225-244, 251-270) with only minor differences in status and attempt values. Consider extracting this into a helper function like _log_order_failure(db, req_id, order_params, error, attempt, retry_count, status) to reduce duplication and improve maintainability.

Comment on lines +242 to +269
                attempt=3,
                retry_count=2,
            )
        raise HTTPException(
            status_code=503,
            detail="Temporary service unavailable - order may have been placed, check manually"
        ) from e
    except HyperliquidAPIError as e:
        log.error("API error placing order (after retries): %s", e)
        if db and req_id:
            db.log_order(
                request_id=req_id,
                symbol=symbol,
                side=side.value,
                signal=signal.value,
                quantity=contracts,
                price=price,
                leverage=leverage,
                subaccount=vault_address,
                status="FAILED",
                execution_ms=(time.perf_counter() - start_time) * 1000,
            )
            db.log_failure(
                request_id=req_id,
                error_type=e.__class__.__name__,
                error_message=str(e),
                attempt=3,
                retry_count=2,

Copilot AI Nov 28, 2025


Hardcoded attempt and retry_count values don't reflect actual retry logic. In the HyperliquidNetworkError and HyperliquidAPIError handlers, the values attempt=3 and retry_count=2 are hardcoded (lines 242-243, 268-269), but these should match the actual retry attempts from _place_order_with_retry(). If max_retries changes or if errors occur at different stages, these hardcoded values will be incorrect. Consider tracking actual attempt numbers in the retry function and passing them through, or deriving them from the exception context.
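One way to propagate real attempt counts, sketched under the assumption that the retry wrapper can attach metadata to the exception it re-raises. The function name and the `attempts`/`retries` attributes are hypothetical, not the PR's `_place_order_with_retry` API:

```python
import time


def place_order_with_retry(place_fn, max_retries: int = 2, backoff_s: float = 0.0):
    """Call place_fn(); on failure, retry up to max_retries additional times.

    On final failure, attach the actual attempt count to the exception before
    re-raising, so the caller can log accurate values instead of hardcoding
    attempt=3 / retry_count=2.
    """
    attempt = 0
    while True:
        attempt += 1
        try:
            return place_fn()
        except Exception as e:
            if attempt > max_retries:
                e.attempts = attempt        # total calls made
                e.retries = attempt - 1     # retries after the first try
                raise
            time.sleep(backoff_s)
```

The exception handlers could then read `getattr(e, "attempts", 1)` and `getattr(e, "retries", 0)` when calling `db.log_failure()`, staying correct even if `max_retries` changes.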

Comment on lines +1 to +376
"""SQLite database layer for persisting order operations and failures."""

import os
import sqlite3
import logging
from datetime import datetime, timezone
from pathlib import Path
from typing import Optional, List, Dict, Any

log = logging.getLogger("uvicorn.error")


class OrderDatabase:
    """SQLite database for storing order operations and failures."""

    def __init__(self, db_path: str = "./hypertrade.db"):
        """Initialize database connection and create tables if needed.

        Args:
            db_path: Path to SQLite database file
        """
        self.db_path = db_path
        self._ensure_db_exists()

    def _get_connection(self) -> sqlite3.Connection:
        """Get a database connection with row factory."""
        conn = sqlite3.connect(self.db_path)
        conn.row_factory = sqlite3.Row
        return conn

    def _ensure_db_exists(self) -> None:
        """Create database tables if they don't exist."""
        conn = self._get_connection()
        cursor = conn.cursor()

        # Orders table: tracks all executed orders (successful and failed)
        cursor.execute("""
            CREATE TABLE IF NOT EXISTS orders (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                request_id TEXT UNIQUE,
                timestamp TEXT NOT NULL,
                symbol TEXT NOT NULL,
                side TEXT NOT NULL,
                signal TEXT NOT NULL,
                quantity REAL NOT NULL,
                price REAL NOT NULL,
                leverage INTEGER,
                subaccount TEXT,
                status TEXT NOT NULL,
                order_id TEXT,
                avg_price REAL,
                total_size REAL,
                response_json TEXT,
                execution_ms REAL
            )
        """)

        # Failures table: detailed failure information
        cursor.execute("""
            CREATE TABLE IF NOT EXISTS failures (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                order_id INTEGER,
                request_id TEXT,
                timestamp TEXT NOT NULL,
                error_type TEXT NOT NULL,
                error_message TEXT NOT NULL,
                attempt INTEGER NOT NULL,
                retry_count INTEGER DEFAULT 0,
                FOREIGN KEY (order_id) REFERENCES orders(id)
            )
        """)

        # Create indices for faster queries
        cursor.execute("CREATE INDEX IF NOT EXISTS idx_orders_timestamp ON orders(timestamp)")
        cursor.execute("CREATE INDEX IF NOT EXISTS idx_orders_symbol ON orders(symbol)")
        cursor.execute("CREATE INDEX IF NOT EXISTS idx_orders_status ON orders(status)")
        cursor.execute("CREATE INDEX IF NOT EXISTS idx_orders_request_id ON orders(request_id)")
        cursor.execute("CREATE INDEX IF NOT EXISTS idx_failures_order_id ON failures(order_id)")
        cursor.execute("CREATE INDEX IF NOT EXISTS idx_failures_timestamp ON failures(timestamp)")

        conn.commit()
        conn.close()

        # Restrict database file permissions to owner only (rw-------)
        try:
            os.chmod(self.db_path, 0o600)
            log.debug("Database file permissions restricted to owner only")
        except OSError as e:
            log.warning("Could not restrict database file permissions: %s", e)

        log.info("Database initialized: %s", self.db_path)

    def log_order(
        self,
        request_id: str,
        symbol: str,
        side: str,
        signal: str,
        quantity: float,
        price: float,
        status: str,
        leverage: Optional[int] = None,
        subaccount: Optional[str] = None,
        order_id: Optional[str] = None,
        avg_price: Optional[float] = None,
        total_size: Optional[float] = None,
        response_json: Optional[str] = None,
        execution_ms: Optional[float] = None,
    ) -> int:
        """Log an order execution to the database.

        Args:
            request_id: Unique request identifier
            symbol: Trading symbol (e.g., 'ETHUSDT')
            side: Order side (BUY/SELL)
            signal: Signal type (OPEN_LONG, CLOSE_LONG, etc.)
            quantity: Order quantity
            price: Order price
            status: Status (PLACED, FILLED, FAILED, REJECTED)
            leverage: Order leverage multiplier
            subaccount: Subaccount address if trading on subaccount
            order_id: Exchange order ID
            avg_price: Average execution price
            total_size: Total size executed
            response_json: Full response JSON from exchange
            execution_ms: Execution time in milliseconds

        Returns:
            Order ID in database
        """
        conn = self._get_connection()
        cursor = conn.cursor()

        try:
            cursor.execute("""
                INSERT INTO orders (
                    request_id, timestamp, symbol, side, signal, quantity, price,
                    leverage, subaccount, status, order_id, avg_price, total_size,
                    response_json, execution_ms
                ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
            """, (
                request_id,
                datetime.now(timezone.utc).isoformat(),
                symbol,
                side,
                signal,
                quantity,
                price,
                leverage,
                subaccount,
                status,
                order_id,
                avg_price,
                total_size,
                response_json,
                execution_ms,
            ))
            conn.commit()
            order_pk = cursor.lastrowid
            log.debug(
                "Order logged: id=%d request_id=%s symbol=%s side=%s status=%s",
                order_pk, request_id, symbol, side, status
            )
            return order_pk
        except sqlite3.IntegrityError:
            log.warning("Duplicate request_id: %s", request_id)
            raise
        finally:
            conn.close()

    def log_failure(
        self,
        request_id: str,
        error_type: str,
        error_message: str,
        attempt: int = 1,
        retry_count: int = 0,
        order_id: Optional[int] = None,
    ) -> int:
        """Log an order failure or error.

        Args:
            request_id: Unique request identifier
            error_type: Error class name (HyperliquidValidationError, etc.)
            error_message: Human-readable error message
            attempt: Current attempt number
            retry_count: Number of retries performed
            order_id: Foreign key to orders table (if available)

        Returns:
            Failure log ID in database
        """
        conn = self._get_connection()
        cursor = conn.cursor()

        try:
            cursor.execute("""
                INSERT INTO failures (
                    order_id, request_id, timestamp, error_type, error_message,
                    attempt, retry_count
                ) VALUES (?, ?, ?, ?, ?, ?, ?)
            """, (
                order_id,
                request_id,
                datetime.now(timezone.utc).isoformat(),
                error_type,
                error_message,
                attempt,
                retry_count,
            ))
            conn.commit()
            failure_id = cursor.lastrowid
            log.debug(
                "Failure logged: id=%d request_id=%s error_type=%s attempt=%d",
                failure_id, request_id, error_type, attempt
            )
            return failure_id
        finally:
            conn.close()

    def get_orders(
        self,
        limit: int = 100,
        offset: int = 0,
        symbol: Optional[str] = None,
        status: Optional[str] = None,
        side: Optional[str] = None,
    ) -> List[Dict[str, Any]]:
        """Query orders from database.

        Args:
            limit: Maximum number of records to return
            offset: Number of records to skip
            symbol: Filter by symbol
            status: Filter by status
            side: Filter by side (BUY/SELL)

        Returns:
            List of order dictionaries
        """
        conn = self._get_connection()
        cursor = conn.cursor()

        query = "SELECT * FROM orders WHERE 1=1"
        params: List[Any] = []

        if symbol:
            query += " AND symbol = ?"
            params.append(symbol)
        if status:
            query += " AND status = ?"
            params.append(status)
        if side:
            query += " AND side = ?"
            params.append(side)

        query += " ORDER BY timestamp DESC LIMIT ? OFFSET ?"
        params.extend([limit, offset])

        cursor.execute(query, params)
        rows = cursor.fetchall()
        conn.close()

        return [dict(row) for row in rows]

    def get_failures(
        self,
        limit: int = 100,
        offset: int = 0,
        error_type: Optional[str] = None,
    ) -> List[Dict[str, Any]]:
        """Query failures from database.

        Args:
            limit: Maximum number of records to return
            offset: Number of records to skip
            error_type: Filter by error type

        Returns:
            List of failure dictionaries
        """
        conn = self._get_connection()
        cursor = conn.cursor()

        query = "SELECT * FROM failures WHERE 1=1"
        params: List[Any] = []

        if error_type:
            query += " AND error_type = ?"
            params.append(error_type)

        query += " ORDER BY timestamp DESC LIMIT ? OFFSET ?"
        params.extend([limit, offset])

        cursor.execute(query, params)
        rows = cursor.fetchall()
        conn.close()

        return [dict(row) for row in rows]

    def get_order_by_request_id(self, request_id: str) -> Optional[Dict[str, Any]]:
        """Get a single order by request ID.

        Args:
            request_id: Request identifier

        Returns:
            Order dictionary or None if not found
        """
        conn = self._get_connection()
        cursor = conn.cursor()

        cursor.execute("SELECT * FROM orders WHERE request_id = ?", (request_id,))
        row = cursor.fetchone()
        conn.close()

        return dict(row) if row else None

    def get_failures_by_order_id(self, order_id: int) -> List[Dict[str, Any]]:
        """Get all failures for a specific order.

        Args:
            order_id: Order ID in database

        Returns:
            List of failure dictionaries
        """
        conn = self._get_connection()
        cursor = conn.cursor()

        cursor.execute("SELECT * FROM failures WHERE order_id = ? ORDER BY timestamp", (order_id,))
        rows = cursor.fetchall()
        conn.close()

        return [dict(row) for row in rows]

    def get_statistics(self) -> Dict[str, Any]:
        """Get summary statistics about orders and failures.

        Returns:
            Dictionary with stats
        """
        conn = self._get_connection()
        cursor = conn.cursor()

        cursor.execute("SELECT COUNT(*) as total_orders FROM orders")
        total_orders = cursor.fetchone()["total_orders"]

        cursor.execute("SELECT COUNT(*) as failed_orders FROM orders WHERE status IN ('FAILED', 'REJECTED')")
        failed_orders = cursor.fetchone()["failed_orders"]

        cursor.execute("SELECT COUNT(*) as total_failures FROM failures")
        total_failures = cursor.fetchone()["total_failures"]

        cursor.execute("""
            SELECT symbol, COUNT(*) as count FROM orders
            GROUP BY symbol ORDER BY count DESC LIMIT 5
        """)
        top_symbols = [dict(row) for row in cursor.fetchall()]

        cursor.execute("""
            SELECT error_type, COUNT(*) as count FROM failures
            GROUP BY error_type ORDER BY count DESC LIMIT 5
        """)
        top_errors = [dict(row) for row in cursor.fetchall()]

        conn.close()

        return {
            "total_orders": total_orders,
            "failed_orders": failed_orders,
            "success_rate": (total_orders - failed_orders) / total_orders * 100 if total_orders > 0 else 0,
            "total_failures": total_failures,
            "top_symbols": top_symbols,
            "top_errors": top_errors,
        }

Copilot AI Nov 28, 2025


Missing test coverage for the new OrderDatabase class. The repository has comprehensive test coverage for other modules (test_webhook.py, test_health.py, test_subaccount.py), but the new database functionality lacks tests. Consider adding tests for:

  • Database initialization and table creation
  • Order logging with various status values
  • Failure logging
  • Query methods (get_orders, get_failures, etc.)
  • Statistics calculation
  • Edge cases like duplicate request_ids and empty result sets
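For example, the duplicate-request_id edge case can be tested standalone against the schema from this PR. The sketch below recreates a trimmed-down orders table in an in-memory database rather than importing hypertrade.database, so the helper names (make_db, insert_order) are illustrative:

```python
import sqlite3


def make_db() -> sqlite3.Connection:
    """Create an in-memory DB with the UNIQUE(request_id) constraint from the PR."""
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE orders (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            request_id TEXT UNIQUE,
            timestamp TEXT NOT NULL,
            symbol TEXT NOT NULL,
            side TEXT NOT NULL,
            status TEXT NOT NULL
        )
    """)
    return conn


def insert_order(conn, request_id, symbol="ETHUSDT", side="BUY", status="PLACED"):
    conn.execute(
        "INSERT INTO orders (request_id, timestamp, symbol, side, status) "
        "VALUES (?, datetime('now'), ?, ?, ?)",
        (request_id, symbol, side, status),
    )


def test_duplicate_request_id_rejected():
    conn = make_db()
    insert_order(conn, "req-1")
    try:
        insert_order(conn, "req-1")
        raised = False
    except sqlite3.IntegrityError:
        raised = True
    assert raised, "UNIQUE(request_id) should reject duplicates"
```

The same pattern (a fresh `:memory:` or tmp-path database per test) extends naturally to the other bullets: logging with each status value, query filters, and statistics on an empty database.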

Comment on lines +538 to +667
@history_router.get(
    "/history/orders",
    summary="Get order execution history",
)
async def get_orders_history(
    request: Request,
    limit: int = 100,
    offset: int = 0,
    symbol: Optional[str] = None,
    status: Optional[str] = None,
    side: Optional[str] = None,
) -> dict:
    """Retrieve order execution history from database.

    Args:
        limit: Maximum number of orders to return (default 100, max 1000)
        offset: Number of orders to skip for pagination
        symbol: Filter by trading symbol (e.g., 'ETHUSDT')
        status: Filter by status (PLACED, FAILED, REJECTED)
        side: Filter by side (BUY, SELL)

    Returns:
        Dictionary with orders list and metadata
    """
    db = getattr(request.app.state, "db", None)
    if not db:
        raise HTTPException(status_code=503, detail="Database not available")

    limit = min(max(1, limit), 1000)
    offset = max(0, offset)

    orders = db.get_orders(limit=limit, offset=offset, symbol=symbol, status=status, side=side)
    return {
        "status": "ok",
        "count": len(orders),
        "limit": limit,
        "offset": offset,
        "orders": orders,
    }


@history_router.get(
    "/history/failures",
    summary="Get order failure logs",
)
async def get_failures_history(
    request: Request,
    limit: int = 100,
    offset: int = 0,
    error_type: Optional[str] = None,
) -> dict:
    """Retrieve order failure logs from database.

    Args:
        limit: Maximum number of failures to return (default 100, max 1000)
        offset: Number of failures to skip for pagination
        error_type: Filter by error type

    Returns:
        Dictionary with failures list and metadata
    """
    db = getattr(request.app.state, "db", None)
    if not db:
        raise HTTPException(status_code=503, detail="Database not available")

    limit = min(max(1, limit), 1000)
    offset = max(0, offset)

    failures = db.get_failures(limit=limit, offset=offset, error_type=error_type)
    return {
        "status": "ok",
        "count": len(failures),
        "limit": limit,
        "offset": offset,
        "failures": failures,
    }


@history_router.get(
    "/history/order/{request_id}",
    summary="Get order details by request ID",
)
async def get_order_details(
    request: Request,
    request_id: str,
) -> dict:
    """Get detailed order information and associated failures.

    Args:
        request_id: Unique request identifier

    Returns:
        Dictionary with order details and failure logs
    """
    db = getattr(request.app.state, "db", None)
    if not db:
        raise HTTPException(status_code=503, detail="Database not available")

    order = db.get_order_by_request_id(request_id)
    if not order:
        raise HTTPException(status_code=404, detail=f"Order not found: {request_id}")

    failures = db.get_failures_by_order_id(order["id"]) if order.get("id") else []

    return {
        "status": "ok",
        "order": order,
        "failures": failures,
    }


@history_router.get(
    "/history/stats",
    summary="Get order statistics",
)
async def get_statistics(request: Request) -> dict:
    """Get summary statistics about orders and failures.

    Returns:
        Dictionary with various statistics
    """
    db = getattr(request.app.state, "db", None)
    if not db:
        raise HTTPException(status_code=503, detail="Database not available")

    stats = db.get_statistics()
    return {
        "status": "ok",
        "statistics": stats,
    }

Copilot AI Nov 28, 2025


Missing test coverage for the new history endpoints. The repository has comprehensive test coverage for webhook endpoints, but the new history API endpoints (/history/orders, /history/failures, /history/order/{request_id}, /history/stats) lack tests. Consider adding tests to verify:

  • Pagination parameters (limit, offset)
  • Filtering functionality (symbol, status, side, error_type)
  • Parameter validation and bounds checking
  • Database unavailability handling
  • 404 responses for non-existent request_ids
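The bounds-checking bullet, for instance, becomes trivial to test if the clamping is extracted into a pure function. `clamp_page` is a hypothetical name mirroring the handlers' `limit = min(max(1, limit), 1000)` and `offset = max(0, offset)` logic:

```python
def clamp_page(limit: int, offset: int, max_limit: int = 1000) -> tuple[int, int]:
    """Mirror the handlers' pagination clamping: 1 <= limit <= max_limit, offset >= 0."""
    return min(max(1, limit), max_limit), max(0, offset)


def test_clamp_page():
    assert clamp_page(100, 0) == (100, 0)      # in-range values pass through
    assert clamp_page(0, -5) == (1, 0)         # floored at 1 and 0
    assert clamp_page(5000, 10) == (1000, 10)  # capped at max_limit
```

The endpoint behavior itself (503 when the database is absent, 404 for unknown request_ids) would then be covered with FastAPI's TestClient, following the pattern already used in test_webhook.py.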

Comment on lines +538 to +667

Copilot AI Nov 28, 2025


Security issue: Missing authentication/authorization on history endpoints. The new history API endpoints (/history/orders, /history/failures, /history/order/{request_id}, /history/stats) expose sensitive trading information but lack authentication or IP whitelisting. Unlike the webhook endpoint which has dependencies=[Depends(require_ip_whitelisted(None))], these endpoints are publicly accessible. Consider:

  1. Adding authentication (API key, JWT, etc.)
  2. Adding IP whitelisting similar to the webhook endpoint
  3. Implementing rate limiting specific to these endpoints

Without protection, anyone who can reach the server can view complete order history, failure logs, and trading statistics.
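A framework-independent sketch of option 1: a constant-time API-key check. The `HYPERTRADE_API_KEY` variable and function names are assumptions, not part of the PR; in the app this would be wired in as a dependency, e.g. `APIRouter(dependencies=[Depends(require_api_key)])`, analogous to the webhook's IP whitelist:

```python
import hmac
import os
from typing import Optional

# Hypothetical env var holding the shared secret for the history endpoints
EXPECTED_KEY = os.environ.get("HYPERTRADE_API_KEY", "")


def api_key_is_valid(presented: str, expected: Optional[str] = None) -> bool:
    """Check a presented API key using a constant-time comparison.

    hmac.compare_digest avoids leaking key prefixes via timing differences.
    Fails closed (returns False) when no key is configured.
    """
    expected = EXPECTED_KEY if expected is None else expected
    if not expected:
        return False
    return hmac.compare_digest(presented.encode(), expected.encode())
```

A FastAPI dependency would read the key from a header (e.g. `X-API-Key`), call this check, and raise `HTTPException(status_code=401)` on failure.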

)

router = APIRouter(tags=["webhooks"])
history_router = APIRouter(tags=["history"])

Copilot AI Nov 28, 2025


[nitpick] Consider adding API versioning prefix to history endpoints. The history endpoints are added without any API versioning (e.g., /api/v1/history/orders). As this is new functionality, consider adding a version prefix to make future API evolution easier without breaking changes. This is especially important for endpoints that may be consumed by external monitoring or analytics tools.

Suggested change:

- history_router = APIRouter(tags=["history"])
+ history_router = APIRouter(prefix="/api/v1/history", tags=["history"])

Comment on lines +202 to +292
        if db and req_id:
            db.log_order(
                request_id=req_id,
                symbol=symbol,
                side=side.value,
                signal=signal.value,
                quantity=contracts,
                price=price,
                leverage=leverage,
                subaccount=vault_address,
                status="REJECTED",
                execution_ms=(time.perf_counter() - start_time) * 1000,
            )
            db.log_failure(
                request_id=req_id,
                error_type=e.__class__.__name__,
                error_message=str(e),
                attempt=1,
                retry_count=0,
            )
        raise HTTPException(status_code=400, detail=f"Invalid order: {e}") from e
    except HyperliquidNetworkError as e:
        log.error("Network error placing order (after retries): %s", e)
        if db and req_id:
            db.log_order(
                request_id=req_id,
                symbol=symbol,
                side=side.value,
                signal=signal.value,
                quantity=contracts,
                price=price,
                leverage=leverage,
                subaccount=vault_address,
                status="FAILED",
                execution_ms=(time.perf_counter() - start_time) * 1000,
            )
            db.log_failure(
                request_id=req_id,
                error_type=e.__class__.__name__,
                error_message=str(e),
                attempt=3,
                retry_count=2,
            )
        raise HTTPException(
            status_code=503,
            detail="Temporary service unavailable - order may have been placed, check manually"
        ) from e
    except HyperliquidAPIError as e:
        log.error("API error placing order (after retries): %s", e)
        if db and req_id:
            db.log_order(
                request_id=req_id,
                symbol=symbol,
                side=side.value,
                signal=signal.value,
                quantity=contracts,
                price=price,
                leverage=leverage,
                subaccount=vault_address,
                status="FAILED",
                execution_ms=(time.perf_counter() - start_time) * 1000,
            )
            db.log_failure(
                request_id=req_id,
                error_type=e.__class__.__name__,
                error_message=str(e),
                attempt=3,
                retry_count=2,
            )
        raise HTTPException(status_code=502, detail=f"Exchange error: {e}") from e

    log.info("Order placed successfully: %s", result)

    # Log successful order
    if db and req_id:
        db.log_order(
            request_id=req_id,
            symbol=symbol,
            side=side.value,
            signal=signal.value,
            quantity=contracts,
            price=price,
            leverage=leverage,
            subaccount=vault_address,
            status="PLACED",
            order_id=result.get("orderId"),
            avg_price=result.get("avgPx"),
            total_size=result.get("totalSz"),
            response_json=json.dumps(result) if result else None,
            execution_ms=(time.perf_counter() - start_time) * 1000,
        )

Copilot AI Nov 28, 2025


Database logging failures can disrupt order execution flow. If db.log_order() or db.log_failure() raise exceptions (e.g., disk full, permission errors, or SQLite integrity errors), they will propagate and override the actual exception being handled. This could cause:

  1. Validation errors to appear as database errors to the caller
  2. Loss of the actual error context
  3. Failure to raise the appropriate HTTPException

Consider wrapping database logging calls in try-except blocks that log the database error but allow the original exception handling to proceed:

try:
    db.log_order(...)
    db.log_failure(...)
except Exception as db_error:
    log.error("Failed to log to database: %s", db_error)

Comment on lines +75 to +77
# Database persistence
db_path: str = "./hypertrade.db"
db_enabled: bool = True

Copilot AI Nov 28, 2025


Missing documentation for new database configuration. The new db_path and db_enabled settings should be documented in .env.example to inform users how to configure database persistence. Consider adding:

# Database persistence for order history
HYPERTRADE_DB_PATH=./hypertrade.db
HYPERTRADE_DB_ENABLED=true

import sqlite3
import logging
from datetime import datetime, timezone
from pathlib import Path

Copilot AI Nov 28, 2025


Unused import: Path from pathlib is imported but never used in the code. Remove this import to keep the codebase clean.

Suggested change:

- from pathlib import Path
