Merged
8 changes: 4 additions & 4 deletions docs/aws-store.md
@@ -19,16 +19,16 @@ The coding agent supports an optional AWS-backed state store using **DynamoDB**
└──────────┘ └───────────┘ └────────────┘
```

-- **DynamoDB** stores checkpoint metadata (thread ID, checkpoint ID, parent, serialization type, S3 key pointer) and intermediate writes (small channel values)
-- **S3** stores the serialized checkpoint blob (channel_values, channel_versions, etc.)
-- Single-table DynamoDB design with composite sort keys (`cp#...` for checkpoints, `wr#...` for writes)
+- **DynamoDB** stores checkpoint metadata, intermediate writes, and small checkpoint payloads.
+- **S3** stores larger checkpoint payloads (typically >350KB) that exceed DynamoDB's efficient item size limits.
+- Unified storage design managed by `langgraph-checkpoint-aws.DynamoDBSaver`.
⚠️ Potential issue | 🟡 Minor

Repository: pokryfka/coding-agent

Use the Python module path in this class reference.

Line 24 currently shows `langgraph-checkpoint-aws.DynamoDBSaver`, but the Python import path is `langgraph_checkpoint_aws.DynamoDBSaver` (with underscores, not hyphens).

Proposed fix
-- Unified storage design managed by `langgraph-checkpoint-aws.DynamoDBSaver`.
+- Unified storage design managed by `langgraph_checkpoint_aws.DynamoDBSaver`.
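
The fix turns on a general Python rule: distribution names on PyPI may contain hyphens, but import paths cannot, because a hyphen is not valid in an identifier. A quick illustration:

```python
# A hyphenated distribution name can never double as an import path,
# since hyphens are not legal in Python identifiers.
print("langgraph-checkpoint-aws".isidentifier())  # False
print("langgraph_checkpoint_aws".isidentifier())  # True
```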


## Prerequisites

- AWS account with permissions to create DynamoDB tables, S3 buckets, and IAM policies
- [Terraform](https://www.terraform.io/) >= 1.5 (for resource provisioning)
- AWS credentials configured (env vars, IAM role, or AWS profile)
- Python `aioboto3` package: `pip install coding-agent[aws]`
- Python `langgraph-checkpoint-aws` package: `pip install coding-agent[aws]`

## Setup

3 changes: 2 additions & 1 deletion pyproject.toml
@@ -23,14 +23,15 @@ dependencies = [

[project.optional-dependencies]
aws = [
"aioboto3>=13.0",
"langgraph-checkpoint-aws>=1.0",
]
dev = [
"pytest>=8.0",
"pytest-asyncio>=0.24",
"ruff>=0.9",
"moto[dynamodb,s3]>=5.0",
"interrogate>=1.7.0",
"aioboto3>=13.0",
]

[project.scripts]
133 changes: 133 additions & 0 deletions scripts/clear_aws_store.py
@@ -0,0 +1,133 @@
#!/usr/bin/env uv run python3
"""Script to empty DynamoDB table and S3 bucket using aioboto3."""

from __future__ import annotations

import asyncio
import os
import sys
from pathlib import Path

try:
    import aioboto3
except ImportError:
    print("Error: aioboto3 not found. Please install it with 'pip install aioboto3'.")
    sys.exit(1)

def load_env():
    """Load .env file from the project root."""
    env_path = Path(".env")
    if env_path.exists():
        print(f"Loading environment from {env_path}")
        with open(env_path) as f:
Comment on lines +18 to +22
⚠️ Potential issue | 🟡 Minor

The `load_env` docstring says project root, but Line 17 reads `.env` from the current working directory.

If this script is launched outside repository root, .env lookup will miss the intended file.

Proposed fix
 def load_env():
     """Load .env file from the project root."""
-    env_path = Path(".env")
+    env_path = Path(__file__).resolve().parents[1] / ".env"
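
Why `parents[1]` lands on the repository root can be seen with a hypothetical absolute script path (`/repo` is illustrative only, not the actual checkout location):

```python
from pathlib import PurePosixPath

# Hypothetical checkout location, for illustration.
script = PurePosixPath("/repo/scripts/clear_aws_store.py")

# parents[0] is scripts/, parents[1] is the repository root.
env_path = script.parents[1] / ".env"
print(env_path)  # /repo/.env
```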

            for line in f:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, value = line.split("=", 1)
                    key = key.strip()
                    value = value.strip()
                    if len(value) >= 2 and (
                        (value.startswith('"') and value.endswith('"')) or
                        (value.startswith("'") and value.endswith("'"))
                    ):
                        value = value[1:-1]
                    os.environ.setdefault(key, value)
    else:
        print("No .env file found in current directory.")

async def empty_dynamodb(table_name, region, endpoint_url):
    """Scan and delete all items from a DynamoDB table."""
    session = aioboto3.Session()
    async with session.resource("dynamodb", region_name=region, endpoint_url=endpoint_url) as dynamodb:
        table = await dynamodb.Table(table_name)

        try:
            # Getting key schema directly from describe_table to avoid property coroutine issues
            desc = await table.meta.client.describe_table(TableName=table_name)
            key_names = [k['AttributeName'] for k in desc['Table']['KeySchema']]
        except Exception as e:
            print(f"Error accessing table {table_name}: {e}")
            return

        print(f"Scanning table {table_name}...")

        count = 0
        scan_params = {}

        while True:
            response = await table.scan(**scan_params)
            items = response.get('Items', [])

            if items:
                async with table.batch_writer() as batch:
                    for item in items:
                        await batch.delete_item(Key={k: item[k] for k in key_names})
                        count += 1

            if 'LastEvaluatedKey' not in response:
                break
            scan_params['ExclusiveStartKey'] = response['LastEvaluatedKey']

        print(f"Deleted {count} items from DynamoDB table {table_name}.")

async def empty_s3(bucket_name, region, endpoint_url):
    """Delete all objects and versions from an S3 bucket."""
    session = aioboto3.Session()
    async with session.resource("s3", region_name=region, endpoint_url=endpoint_url) as s3:
        try:
            bucket = await s3.Bucket(bucket_name)
            print(f"Deleting all objects and versions from bucket {bucket_name}...")

            deleted_count = 0
            delete_batch = []

            # Delete all versions (including current ones and delete markers)
            async for version in bucket.object_versions.all():
                # In aioboto3 resource collections, meta.data is usually pre-populated.
                # Accessing raw attributes from meta.data avoids coroutine property issues.
                key = version.meta.data.get('Key')
                version_id = version.meta.data.get('VersionId')
                delete_batch.append({'Key': key, 'VersionId': version_id})

                if len(delete_batch) >= 1000:
                    await bucket.delete_objects(Delete={'Objects': delete_batch})
                    deleted_count += len(delete_batch)
                    delete_batch = []

            # Final batch
            if delete_batch:
                await bucket.delete_objects(Delete={'Objects': delete_batch})
                deleted_count += len(delete_batch)

            print(f"Deleted {deleted_count} object(s)/version(s) from S3 bucket {bucket_name}.")
        except Exception as e:
            print(f"Error emptying bucket {bucket_name}: {e}")
            import traceback
            traceback.print_exc()

async def main():
    load_env()

    table_name = os.environ.get("DYNAMODB_TABLE_NAME")
    bucket_name = os.environ.get("S3_CHECKPOINT_BUCKET")
    region = os.environ.get("AWS_REGION") or os.environ.get("AWS_DEFAULT_REGION") or "us-east-1"
    endpoint_url = os.environ.get("AWS_ENDPOINT_URL")

    if not table_name and not bucket_name:
        print("Error: DYNAMODB_TABLE_NAME or S3_CHECKPOINT_BUCKET not found in .env or environment.")
        sys.exit(1)

    tasks = []
    if table_name:
        print(f"--- DynamoDB ({region}) ---")
        tasks.append(empty_dynamodb(table_name, region, endpoint_url))

    if bucket_name:
        print(f"\n--- S3 ({region}) ---")
        tasks.append(empty_s3(bucket_name, region, endpoint_url))

    if tasks:
        await asyncio.gather(*tasks)

if __name__ == "__main__":
    asyncio.run(main())
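
The `.env` quote-stripping logic above can be restated as a standalone helper for quick testing; this sketch mirrors the script's parsing rules and is not part of the PR:

```python
def parse_env_line(line: str):
    """Parse one KEY=VALUE line the way load_env above does."""
    line = line.strip()
    if not line or line.startswith("#") or "=" not in line:
        return None
    key, value = line.split("=", 1)
    key, value = key.strip(), value.strip()
    # Strip one layer of matching single or double quotes.
    if len(value) >= 2 and value[0] == value[-1] and value[0] in ("'", '"'):
        value = value[1:-1]
    return key, value

print(parse_env_line('AWS_REGION="eu-west-1"'))  # ('AWS_REGION', 'eu-west-1')
print(parse_env_line("# comment"))               # None
```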
2 changes: 2 additions & 0 deletions src/main.py
@@ -26,6 +26,8 @@ def main() -> None:
         level=logging.DEBUG if args.verbose else logging.INFO,
         format="%(asctime)s %(levelname)-8s %(name)s: %(message)s",
     )
+    # Reduce noise from langgraph_checkpoint_aws
+    logging.getLogger("langgraph_checkpoint_aws").setLevel(logging.WARNING)

     settings = load_settings(args.config)
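
The added suppression works because Python loggers inherit their effective level from the nearest ancestor with an explicit level, so one `setLevel` on the package logger quiets every `langgraph_checkpoint_aws.*` child logger. A minimal sketch:

```python
import logging

logging.getLogger().setLevel(logging.DEBUG)
logging.getLogger("langgraph_checkpoint_aws").setLevel(logging.WARNING)

# Child loggers fall back to the package logger's WARNING level.
child = logging.getLogger("langgraph_checkpoint_aws.saver")
print(child.isEnabledFor(logging.DEBUG))  # False

# Unrelated loggers still inherit DEBUG from the root.
print(logging.getLogger("other").isEnabledFor(logging.DEBUG))  # True
```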
