Amnabibi5/Stock_scraper

🤖 Automated Web Monitoring System

Professional web scraping solution that monitors your websites automatically and generates detailed reports every week.

🎯 What This System Does

Monitors Multiple Websites - Track unlimited URLs automatically
Weekly Reports - Get CSV data every Saturday at 10 AM Pakistan Time
Zero Maintenance - Runs completely hands-free on GitHub's servers
Easy Management - Add/remove websites by editing a simple text file
Professional Data - Clean, structured CSV reports with timestamps
Error Handling - Gracefully handles website downtime or issues

📊 Sample Output

Each weekly report contains:

  • ✅ Website URL and response status
  • ✅ Content length and loading speed
  • ✅ Timestamp of when data was collected
  • ✅ Error details if a site was unreachable
  • ✅ Historical tracking of all monitored sites
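Collecting those fields for one site might look like the following sketch (the `scrape_url` helper, its field names, and the use of the standard-library `urllib` are illustrative assumptions, not the repository's actual `scraper.py`):

```python
import time
import urllib.request
from datetime import datetime, timezone

def scrape_url(url: str) -> dict:
    """Fetch one URL and return a CSV-ready row; failures are recorded, not raised."""
    row = {
        "url": url,
        "status": "",
        "content_length": "",
        "elapsed_seconds": "",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "error": "",
    }
    try:
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=30) as resp:
            body = resp.read()
            row["status"] = resp.status
        row["content_length"] = len(body)
        row["elapsed_seconds"] = round(time.monotonic() - start, 3)
    except Exception as exc:  # unreachable site, DNS error, timeout, malformed URL, ...
        row["error"] = str(exc)
    return row
```

Because errors are stored in the row instead of raised, one dead site never aborts the whole weekly run.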

🚀 Quick Start Guide

1. Adding Websites to Monitor

  1. Open the urls.txt file in this repository
  2. Add your website URLs (one per line)
  3. Save and commit the changes
  4. The system will automatically monitor these URLs
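Reading that file could be as simple as the sketch below (skipping blank lines and `#` comments is an assumption about how `scraper.py` parses `urls.txt`):

```python
from pathlib import Path

def load_urls(path: str = "urls.txt") -> list[str]:
    """Read one URL per line, skipping blank lines and '#' comments."""
    lines = Path(path).read_text(encoding="utf-8").splitlines()
    return [
        line.strip()
        for line in lines
        if line.strip() and not line.strip().startswith("#")
    ]
```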

2. Getting Your Reports

  • Reports are automatically generated as output_YYYY-MM-DD.csv
  • Download them directly from this repository
  • Each report contains all your monitored websites

3. Manual Report Generation

  • Go to Actions tab above
  • Click "Weekly Stock Scraper"
  • Click "Run workflow"
  • Wait 1-2 minutes for completion
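The Saturday schedule and the manual "Run workflow" button correspond to a GitHub Actions workflow roughly like the sketch below (a hedged reconstruction, not the repository's actual workflow file; the job name and step details are assumptions):

```yaml
name: Weekly Stock Scraper
on:
  schedule:
    - cron: "0 5 * * 6"   # every Saturday at 05:00 UTC (10:00 AM PKT)
  workflow_dispatch:       # enables the manual "Run workflow" button
jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - run: pip install -r requirements.txt
      - run: python scraper.py
```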

📁 File Structure

📦 Your Project
├── 📄 scraper.py           # Main automation code
├── 📄 urls.txt            # Your websites to monitor
├── 📄 requirements.txt    # System dependencies  
├── 📂 .github/workflows/  # Automation configuration
├── 📊 output_*.csv        # Your weekly reports
└── 📖 README.md           # This guide

⚙️ System Specifications

  • Schedule: Every Saturday 5:00 AM UTC (10:00 AM PKT)
  • Platform: GitHub Actions (free tier)
  • Language: Python 3.10
  • Output: CSV format with UTF-8 encoding
  • Monitoring: No hard limit on the number of websites
  • Reliability: Runs on GitHub-hosted infrastructure

🛠️ Advanced Features Available

Current Features ✅

  • Automated weekly scheduling
  • Multiple URL monitoring
  • CSV data export
  • Error handling & reporting
  • Manual trigger option
  • Historical data tracking

Available Upgrades 🚀

  • Email Notifications - Get notified when reports are ready
  • Custom Data Extraction - Extract specific content (prices, titles, etc.)
  • Multiple Schedules - Daily, hourly, or custom timing
  • API Integration - Connect to your existing systems
  • Dashboard View - Visual charts and graphs
  • Slack/Discord Alerts - Team notifications

📞 Support & Maintenance

Setup Support - Complete handover and training included
Documentation - Step-by-step guides for all features
Monitoring - First month of monitoring included
Updates - Bug fixes and minor improvements

🔒 Security & Privacy

  • No sensitive data stored
  • All processing happens on GitHub's secure servers
  • Code is transparent and auditable

🏆 Developed by: Amna Bibi
📧 Contact: Amnab9373@gmail.com
🌐 Platform: Upwork Professional Service
⭐ Rating: 5.0★ Web Scraping Specialist
