Multi-pipeline system for archaeological remote sensing analysis with cost-effective cloud processing.
ArchaeoVLM provides specialized pipelines for archaeological analysis:
- LiDAR Processing: RunPod GPU + Google Cloud Storage (~70% cost savings versus standard on-demand cloud GPU instances)
- Data Ingestion: Multi-source archaeological data integration
- Model Training: Custom archaeological feature detection
- Results Analysis: Visualization and reporting tools
Repository layout:

- lidar_visualization/ - Complete LiDAR processing and analysis system
  - pipelines/lidar_processing/ - LiDAR semantic segmentation with PTv3
    - runpod_gcs_archaeovlm.py - Main processing engine
    - setup_runpod_gcs.sh - One-click environment setup
    - example_usage.py - Usage examples and scenarios
  - pipelines/data_ingestion/ - Multi-source data integration
  - pipelines/model_training/ - Custom model training workflows
  - pipelines/results_analysis/ - Visualization and analysis tools
  - shared/ - Shared components across pipelines
    - models/ - Point Transformer V3 and other models
    - utils/ - Common utilities and helpers
    - configs/ - Shared configuration templates
- docs/ - Complete project documentation
  - index.html - Main project visualization interface
  - lidar_visualization/ - LiDAR system documentation
    - RUNPOD_GCS_GUIDE.md - Quick start guide (30 min setup)
    - GCP_RUNPOD_DEPLOYMENT_GUIDE.md - Comprehensive deployment guide
- data/ - Local data directory
Quick start:

```bash
# 1. Set up Google Cloud Storage
gcloud projects create archaeovlm-project
gsutil mb -p archaeovlm-project gs://archaeovlm-lidar-data
gsutil mb -p archaeovlm-project gs://archaeovlm-results

# 2. Launch a RunPod RTX 3090 instance ($0.34/hr)

# 3. Set up the environment
cd lidar_visualization/pipelines/lidar_processing/
wget https://raw.githubusercontent.com/your-org/your-repo/main/lidar_visualization/pipelines/lidar_processing/setup_runpod_gcs.sh
chmod +x setup_runpod_gcs.sh
./setup_runpod_gcs.sh

# 4. Upload GCP credentials
scp gcp-key.json root@[runpod-ip]:/workspace/gcp-credentials.json

# 5. Start processing
python3 runpod_gcs_archaeovlm.py gs://archaeovlm-lidar-data/
```

Typical processing scenarios:

| Scenario | Files | Budget | Use Case |
|---|---|---|---|
| Pilot Study | 10 files | $2-3 | Quick validation |
| Site Survey | 25 files | $5-8 | Single site analysis |
| Regional Study | 50 files | $10-15 | Multi-site comparison |
| Research Project | 100 files | $20-30 | Publication dataset |
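For back-of-envelope budgeting, here is a minimal cost-estimate sketch. The $0.34/hr GPU rate and $0.02/GB storage rate come from this README; the per-file processing time and output size are illustrative assumptions chosen so the totals roughly match the table above, not measured values:

```python
# cost_estimate.py - rough run-cost sketch (not the pipeline's own estimator).
GPU_RATE_PER_HR = 0.34      # RunPod RTX 3090 rate ($/hr, from this README)
STORAGE_RATE_PER_GB = 0.02  # GCS storage rate ($/GB, from this README)

def estimate_cost(num_files: int,
                  minutes_per_file: float = 45.0,  # assumed; picked to match the table
                  gb_per_file: float = 0.5) -> dict:  # assumed output size per file
    """Return a rough GPU + storage cost estimate for a run."""
    gpu_hours = num_files * minutes_per_file / 60.0
    gpu_cost = gpu_hours * GPU_RATE_PER_HR
    storage_cost = num_files * gb_per_file * STORAGE_RATE_PER_GB
    return {
        "gpu_hours": round(gpu_hours, 2),
        "gpu_cost": round(gpu_cost, 2),
        "storage_cost": round(storage_cost, 2),
        "total": round(gpu_cost + storage_cost, 2),
    }

if __name__ == "__main__":
    for scenario, files in [("Pilot Study", 10), ("Site Survey", 25),
                            ("Regional Study", 50), ("Research Project", 100)]:
        print(scenario, estimate_cost(files))
```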
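Step 5 of the quick start assumes your LiDAR tiles are already in the input bucket. gsutil cp handles that; a minimal Python sketch of the same upload (bucket name from step 1, the local data/ directory from the layout above; requires the google-cloud-storage package) might look like this:

```python
# upload_tiles.py - copy local LiDAR tiles to the input bucket (like `gsutil cp`).
from pathlib import Path
from google.cloud import storage

def upload_tiles(local_dir: str = "data",
                 bucket_name: str = "archaeovlm-lidar-data") -> None:
    """Upload every .las/.laz file in local_dir to the input bucket."""
    bucket = storage.Client().bucket(bucket_name)
    for path in sorted(Path(local_dir).glob("*.la[sz]")):  # .las and .laz tiles
        bucket.blob(path.name).upload_from_filename(str(path))
        print(f"uploaded {path.name} -> gs://{bucket_name}/{path.name}")

if __name__ == "__main__":
    upload_tiles()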
Edit lidar_visualization/pipelines/lidar_processing/archaeovlm_runpod_gcs_config.json (a loading sketch follows the feature list below):
```json
{
  "processing": {
    "max_files": 25,
    "preferred_regions": ["TAP", "FN2", "ANT"],
    "file_selection": "size_based",
    "date_range": ["2010", "2020"]
  },
  "storage": {
    "project_id": "your-gcp-project",
    "bucket": "your-results-bucket",
    "input_bucket": "your-lidar-data"
  }
}
```

Key features:

- Smart file selection by archaeological region and date
- Automatic cost estimation before processing
- Cloud backup of all results
- Auto-shutdown to prevent idle costs
- Real-time monitoring with cost tracking
- Compression to reduce storage costs by 70%
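Before launching a pod, the configuration can be sanity-checked locally. A minimal loading sketch, assuming only the schema shown above (the required-key check is illustrative, not the pipeline's own validation):

```python
# check_config.py - load the run configuration and fail early on missing keys.
import json

REQUIRED = {
    "processing": ["max_files", "preferred_regions", "file_selection", "date_range"],
    "storage": ["project_id", "bucket", "input_bucket"],
}

def load_config(path: str = "archaeovlm_runpod_gcs_config.json") -> dict:
    """Parse the JSON config and verify the sections shown in this README."""
    with open(path) as f:
        config = json.load(f)
    for section, keys in REQUIRED.items():
        missing = [k for k in keys if k not in config.get(section, {})]
        if missing:
            raise KeyError(f"config section '{section}' is missing: {missing}")
    return config

if __name__ == "__main__":
    cfg = load_config()
    print(f"Processing up to {cfg['processing']['max_files']} files "
          f"from gs://{cfg['storage']['input_bucket']}/")
```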
Outputs:

- Classified point clouds with archaeological labels
- Processing reports with statistics
- Cost tracking and quality metrics
- Ready for QGIS analysis and publication
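Assuming the classified point clouds come back as LAS/LAZ files with per-point classification codes (an assumption; this README does not pin down the exact output format or label scheme), a quick inspection sketch with laspy:

```python
# inspect_results.py - summarize class labels in a downloaded point cloud.
# Reading .laz requires a laspy compression backend (pip install laspy[lazrs]).
import laspy
import numpy as np

las = laspy.read("classified_tile.laz")  # hypothetical output file name
labels, counts = np.unique(np.asarray(las.classification), return_counts=True)
for label, count in zip(labels, counts):
    print(f"class {label}: {count} points")
```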
Testing and monitoring:

```bash
cd lidar_visualization/pipelines/lidar_processing/

# Test setup
python3 test_runpod_gcs_setup.py

# Monitor processing
python3 monitor_runpod_gcs.py

# Check logs
tail -f /workspace/logs/archaeovlm.log
```

A results-bucket polling sketch follows the documentation links below.

Documentation:

- Quick Start: docs/lidar_visualization/RUNPOD_GCS_GUIDE.md (30 min setup)
- Complete Guide: docs/lidar_visualization/GCP_RUNPOD_DEPLOYMENT_GUIDE.md (detailed)
- Examples: lidar_visualization/pipelines/lidar_processing/example_usage.py (different scenarios)
- Web Interface: docs/index.html (main project dashboard)
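Beyond monitor_runpod_gcs.py, which runs on the pod itself, the results bucket can be polled from any workstation. A minimal sketch using google-cloud-storage (bucket name is the example from the setup steps; credentials come from the usual GOOGLE_APPLICATION_CREDENTIALS environment variable):

```python
# poll_results.py - list result objects and a running size total.
from google.cloud import storage

def list_results(bucket_name: str = "archaeovlm-results", prefix: str = "") -> None:
    """Print each result object with its upload time and size."""
    client = storage.Client()  # reads GOOGLE_APPLICATION_CREDENTIALS
    total_bytes = 0
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        size = blob.size or 0
        total_bytes += size
        print(f"{blob.updated:%Y-%m-%d %H:%M}  {size:>12,}  {blob.name}")
    print(f"total: {total_bytes / 1e9:.2f} GB")

if __name__ == "__main__":
    list_results()
```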
Pipeline status:

- Data Ingestion: Coming soon
- Model Training: Coming soon
- Results Analysis: Coming soon
Current Focus: LiDAR Processing Pipeline
Platform: RunPod GPU + Google Cloud Storage
Cost: $0.34/hr + $0.02/GB storage
Perfect for: Budget-conscious archaeological research