Professional web scraping solution that monitors your websites automatically and generates detailed reports every week.
✅ Monitors Multiple Websites - Track unlimited URLs automatically
✅ Weekly Reports - Get CSV data every Saturday at 10 AM Pakistan Time
✅ Zero Maintenance - Runs completely hands-free on GitHub's servers
✅ Easy Management - Add/remove websites by editing a simple text file
✅ Professional Data - Clean, structured CSV reports with timestamps
✅ Error Handling - Gracefully handles website downtime or issues
Each weekly report contains:
- ✅ Website URL and response status
- ✅ Content length and loading speed
- ✅ Timestamp of when data was collected
- ✅ Error details if a site was unreachable
- ✅ Historical tracking of all monitored sites
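The report fields above can be sketched in a few lines of Python. This is a minimal illustration using only the standard library, not the exact contents of `scraper.py` (the real script may use a third-party HTTP client from `requirements.txt`); function and column names here are assumptions:

```python
import csv
import time
import urllib.request
import urllib.error
from datetime import datetime, timezone

def check_url(url: str) -> dict:
    """Fetch one URL and return the fields recorded in a weekly report row."""
    row = {
        "url": url,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "status": "",
        "content_length": "",
        "load_seconds": "",
        "error": "",
    }
    try:
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read()
            row["status"] = resp.status
            row["content_length"] = len(body)
            row["load_seconds"] = round(time.monotonic() - start, 3)
    except (urllib.error.URLError, ValueError) as exc:
        # Site unreachable or URL malformed: record the error instead of failing
        row["error"] = str(exc)
    return row

def write_report(rows: list, path: str) -> None:
    """Write collected rows to a UTF-8 CSV file with a header line."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```

Because errors are captured per row, one unreachable site never blocks the rest of the report.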
- Open the `urls.txt` file in this repository
- Add your website URLs (one per line)
- Save and commit the changes
- The system will automatically monitor these URLs
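For example, a `urls.txt` with three entries might look like this (the domains below are placeholders, not real monitored sites):

```
https://example.com
https://example.org
https://example.net
```

Blank lines are simply extra entries to avoid; keep exactly one URL per line.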
- Reports are automatically generated as `output_YYYY-MM-DD.csv`
- Download them directly from this repository
- Each report contains all your monitored websites
- Go to Actions tab above
- Click "Weekly Stock Scraper"
- Click "Run workflow" → "Run workflow"
- Wait 1-2 minutes for completion
📦 Your Project
├── 📄 scraper.py # Main automation code
├── 📄 urls.txt # Your websites to monitor
├── 📄 requirements.txt    # Python dependencies
├── 📂 .github/workflows/ # Automation configuration
├── 📊 output_*.csv # Your weekly reports
└── 📖 README.md # This guide
| Feature | Details |
|---|---|
| Schedule | Every Saturday 5:00 AM UTC (10:00 AM PKT) |
| Platform | GitHub Actions (Free) |
| Language | Python 3.10 |
| Output | CSV format with UTF-8 encoding |
| Monitoring | Any number of websites (within GitHub Actions job limits) |
| Reliability | Runs on GitHub's hosted infrastructure |
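The schedule in the table corresponds to a cron trigger in the workflow file under `.github/workflows/`. The sketch below shows the general shape of such a workflow; the step details are illustrative, not a copy of the actual configuration:

```yaml
name: Weekly Stock Scraper
on:
  schedule:
    - cron: "0 5 * * 6"   # every Saturday 05:00 UTC (10:00 AM PKT)
  workflow_dispatch:       # enables the manual "Run workflow" button
jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - run: pip install -r requirements.txt
      - run: python scraper.py
```

Note that GitHub Actions cron schedules are always specified in UTC, so `0 5 * * 6` is 10:00 AM Pakistan Standard Time (UTC+5).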
- Automated weekly scheduling
- Multiple URL monitoring
- CSV data export
- Error handling & reporting
- Manual trigger option
- Historical data tracking
- Email Notifications - Get notified when reports are ready
- Custom Data Extraction - Extract specific content (prices, titles, etc.)
- Multiple Schedules - Daily, hourly, or custom timing
- API Integration - Connect to your existing systems
- Dashboard View - Visual charts and graphs
- Slack/Discord Alerts - Team notifications
✅ Setup Support - Complete handover and training included
✅ Documentation - Step-by-step guides for all features
✅ Monitoring - First month of monitoring included
✅ Updates - Bug fixes and minor improvements
- No sensitive data stored
- All processing happens on GitHub's secure servers
- Code is transparent and auditable
🏆 Developed by: Amna Bibi
📧 Contact: Amnab9373@gmail.com
🌐 Platform: Upwork Professional Service
⭐ Rating: 5.0★ Web Scraping Specialist