# EcoGuardian-AI

AI-assisted waste sorting for mobile devices, built with Expo and React Native.
## Table of Contents

- Vision & Goals
- Solution Highlights
- Architecture Overview
- Project Structure
- Getting Started
- Development Workflow
- Testing & Quality
- AI Model Integration
- Roadmap
- Contributing
- License
- Team
- Acknowledgments
## Vision & Goals

Improper waste sorting keeps an estimated 91% of plastic out of recycling streams. EcoGuardian makes sustainable behaviour easy by putting AI-powered waste identification in your pocket. Point your camera at an item and receive instant guidance on whether it belongs in recycling, compost, or landfill, alongside contextual sustainability tips.

> "Empowering communities to fight climate change, one correct disposal at a time."
## Solution Highlights

- AI-powered guidance – On-device TensorFlow Lite inference classifies waste materials in real time, even when offline.
- Engaging UX – Camera-based scanning with guided viewfinder, fast feedback, impact statistics, and educational tips.
- Motivated communities – EcoPoints rewards, streaks, and leaderboards to make sustainable choices stick.
- Scalable foundation – Modular Expo Router navigation and Appwrite backend hooks for authentication, storage, and telemetry.
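The scan-to-guidance flow above boils down to mapping the model's output scores to a disposal recommendation. A minimal sketch in TypeScript — `topPrediction`, `BIN_FOR_LABEL`, and the sample labels are illustrative assumptions, not the app's actual API:

```typescript
// Hypothetical shape of a classification result shown on the scan screen.
type Bin = "recycling" | "compost" | "landfill";

interface ScanResult {
  label: string;
  confidence: number;
  bin: Bin;
}

// Illustrative mapping from model labels to disposal bins; the real
// label set lives in app/model/labels.txt.
const BIN_FOR_LABEL: Record<string, Bin> = {
  plastic: "recycling",
  cardboard: "recycling",
  food: "compost",
  trash: "landfill",
};

// Pick the highest-scoring class from the model's output vector.
function topPrediction(scores: number[], labels: string[]): ScanResult {
  let best = 0;
  for (let i = 1; i < scores.length; i++) {
    if (scores[i] > scores[best]) best = i;
  }
  const label = labels[best];
  return {
    label,
    confidence: scores[best],
    // Unknown labels fall back to landfill as the safe default.
    bin: BIN_FOR_LABEL[label] ?? "landfill",
  };
}
```

Keeping this mapping pure (no camera or model dependencies) also makes it trivial to unit test.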
## Architecture Overview

| Layer | Purpose | Key Technologies |
|---|---|---|
| Mobile client | User-facing experience for scanning, reviewing results, and engaging with gamified features. | Expo (React Native), NativeWind, Expo Camera, Expo Router |
| AI inference | Local TensorFlow Lite model that classifies captured images into material classes with corresponding labels. | TensorFlow Lite runtime (`app/model/model.tflite`, `app/model/labels.txt`) |
| Backend services | Cloud services for authentication, persistence, leaderboards, and analytics. | Appwrite (REST + Web SDK) |
| Observability | Crash and performance tracking (planned). | Expo Application Services, Appwrite Functions |
For a deeper look at modules and navigation flows, see `docs/architecture.md`.
## Project Structure

```text
EcoGuardian-AI/
├── app/
│   ├── (root)/(tabs)/     # Bottom tabs: Home, Scan, Stats, Leaderboard, Profile
│   ├── components/        # Camera + preview UI building blocks
│   ├── constants/         # Shared icon and imagery definitions
│   ├── model/             # TensorFlow Lite model + labels used by the client
│   ├── scan-result.tsx    # Result screen shown after classification
│   └── sign-in.tsx        # Authentication entry point (Appwrite-ready)
├── assets/                # Fonts, icons, marketing imagery
├── docs/                  # Supplemental documentation (architecture, AI, etc.)
├── tailwind.config.js     # NativeWind design tokens
├── metro.config.js        # Metro bundler overrides
└── package.json           # Scripts, dependencies, workspace metadata
```
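The `(root)/(tabs)/` group maps to an Expo Router layout file that declares the bottom tabs. A sketch of what that declarative routing could look like — screen file names and titles are assumptions inferred from the tab list above, not the project's actual code:

```typescript
// app/(root)/(tabs)/_layout.tsx — illustrative only; screen names are
// assumptions based on the tabs listed in the project structure.
import { Tabs } from "expo-router";

export default function TabsLayout() {
  return (
    <Tabs screenOptions={{ headerShown: false }}>
      <Tabs.Screen name="index" options={{ title: "Home" }} />
      <Tabs.Screen name="scan" options={{ title: "Scan" }} />
      <Tabs.Screen name="stats" options={{ title: "Stats" }} />
      <Tabs.Screen name="leaderboard" options={{ title: "Leaderboard" }} />
      <Tabs.Screen name="profile" options={{ title: "Profile" }} />
    </Tabs>
  );
}
```

With Expo Router, adding a screen file to the group directory registers the route; the layout only controls presentation.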
## Getting Started

### Prerequisites

- Node.js ≥ 18 (LTS recommended)
- npm ≥ 9 or yarn ≥ 1.22
- Expo CLI (`npm install -g expo-cli`)
- Xcode/iOS Simulator or Android Studio/Emulator for running locally (optional but recommended)
- Access to an Appwrite project for authentication and storage features
### Installation

```bash
# Clone the repository
git clone https://github.com/<your-org>/EcoGuardian-AI.git
cd EcoGuardian-AI

# Install dependencies
npm install

# Start the Expo development server
npm run start
```

From the Expo CLI:

- Press `i` to launch the iOS Simulator.
- Press `a` to launch the Android Emulator.
- Scan the QR code with the Expo Go app to run on a physical device.
Appwrite and other service credentials can be provided via Expo config plugins or runtime environment variables. See `docs/architecture.md` and `docs/ai-model.md` for integration pointers.
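For instance, an `app.config.ts` could surface Appwrite credentials through the `extra` field. This is a sketch assuming Expo's `EXPO_PUBLIC_`-prefixed environment variables; the variable names are placeholders, not the project's real configuration:

```typescript
// app.config.ts — illustrative; variable names are placeholders.
import type { ExpoConfig } from "expo/config";

const config: ExpoConfig = {
  name: "EcoGuardian-AI",
  slug: "ecoguardian-ai",
  extra: {
    // Read at build time; access at runtime via Constants.expoConfig?.extra
    appwriteEndpoint: process.env.EXPO_PUBLIC_APPWRITE_ENDPOINT,
    appwriteProjectId: process.env.EXPO_PUBLIC_APPWRITE_PROJECT_ID,
  },
};

export default config;
```

Keep real project IDs and API keys out of version control; only non-secret, client-safe values belong in `extra`.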
## Development Workflow

| Command | Description |
|---|---|
| `npm run start` | Boot the Expo development server with fast refresh. |
| `npm run android` / `npm run ios` / `npm run web` | Launch the platform-specific bundler targets directly. |
| `npm run lint` | Run Expo's ESLint configuration. |
| `npm run test` | Execute Jest unit tests (watch mode). |
| `npm run typecheck` | Validate TypeScript types without emitting JS. |
| `npm run reset-project` | Clear caches and re-initialise the Expo project (handy when Metro misbehaves). |
## Testing & Quality

Automated tests are powered by Jest with the `jest-expo` preset. Run `npm run test` regularly to exercise component logic, use `npm run typecheck` for TypeScript safety, and enforce style consistency with `npm run lint`.

We recommend integrating Expo's EAS Update and Appwrite Functions for runtime monitoring once the backend endpoints are live.
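Component logic is easiest to test when extracted into pure helpers. A minimal sketch of the kind of unit Jest would cover — `ecoPointsFor` and its point values are hypothetical, not the app's real scoring rules; a Jest spec would wrap these checks in `expect(...)` matchers:

```typescript
// Hypothetical scoring helper: correct diversion earns more EcoPoints
// than landfill disposal. Values here are illustrative.
type Bin = "recycling" | "compost" | "landfill";

function ecoPointsFor(bin: Bin): number {
  return bin === "landfill" ? 1 : 5;
}

// In a Jest spec this would read: expect(ecoPointsFor("compost")).toBe(5);
console.assert(ecoPointsFor("compost") === 5);
console.assert(ecoPointsFor("landfill") === 1);
```

Because the helper has no React Native or camera dependencies, it runs under the `jest-expo` preset without any mocking.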
## AI Model Integration

The default TensorFlow Lite model lives in `app/model/model.tflite`, with class labels in `app/model/labels.txt`. Replace these assets with your trained model when you are ready to deploy production classifiers.

Need guidance? Read `docs/ai-model.md` for dataset preparation, training, and optimisation tips, including quantisation advice for mobile deployments.
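When swapping in a new model, the labels file typically holds one class name per line, with index `i` matching model output `i` (a common TensorFlow Lite convention — verify against your own export). A small parsing sketch:

```typescript
// Parse a labels.txt-style payload (one class name per line) into an
// ordered label list, dropping blank lines and stray whitespace.
function parseLabels(raw: string): string[] {
  return raw
    .split(/\r?\n/)
    .map((line) => line.trim())
    .filter((line) => line.length > 0);
}
```

Parsing defensively like this avoids off-by-one label mismatches when the file ends with a trailing newline.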
## Roadmap

- UI/UX implementation with NativeWind styling
- Camera integration and preview flow
- Basic Expo Router navigation
- Initial TensorFlow Lite model training
- Model inference pipeline wiring and result visualisation
- Waste classification heuristics + copy updates
- Appwrite-backed authentication and profiles
- Gamification (EcoPoints, streaks, leaderboards)
- Beta testing and performance optimisation
## Contributing

1. Fork the repo and create a feature branch (`git checkout -b feature/amazing`).
2. Make your changes and ensure `npm run lint && npm run typecheck` pass.
3. Commit using descriptive messages and open a Pull Request that references any relevant issues.

For larger workstreams or architecture decisions, please open a discussion thread first. Contribution guidelines and decision records live in `docs/architecture.md`.
## License

This project is licensed under the MIT License.
## Team

- AI Engineer – Model training and optimisation
- Frontend Developer – React Native implementation
- Backend Developer – Appwrite integration
- UX Designer – User research, flows, and experience design
## Acknowledgments

- TrashNet dataset for initial model training
- The Expo team for the best React Native developer experience
- Mentors, hackathon organisers, and the sustainability community that inspired the project
Built with 💚 for a regenerative future.