- Overview
- Architecture
- Features
- Tech Stack
- Quick Start
- API Documentation
- Machine Learning Models
- Deployment
- Development
- Contributing
- License
The Bandit Games Analytics Microservice is a specialized service within the larger Bandit Games ecosystem, providing real-time analytics, statistics, and machine learning predictions for gaming platforms. It processes gaming events, analyzes player behavior, and delivers actionable insights through RESTful APIs. The live, deployed functionality of this ML analytics microservice can be viewed in the frontend (see its README section) of the integrated project found here.
This microservice operates as part of the Bandit Games Platform alongside:
- Core Platform (Java Spring Boot) - User management, game lobbies, achievements
- Frontend (React/Vue) - Player and admin interfaces
- Analytics Microservice (Python) - This service
```mermaid
graph TB
    subgraph "Bandit Games Ecosystem"
        Frontend[Frontend<br/>React/Vue]
        Core[Core Platform<br/>Java Spring Boot]
        Analytics[Analytics Microservice<br/>Python FastAPI]
    end

    subgraph "Analytics Microservice Components"
        StatsAPI[Statistics API<br/>:8001]
        PredictionAPI[Prediction API<br/>:8002]
        Consumer[Analytics Consumer<br/>Event Processor]
    end

    subgraph "Data Layer"
        MySQL[(MySQL Database<br/>platform_analytics)]
        RabbitMQ[RabbitMQ<br/>Event Streaming]
    end

    Frontend --> StatsAPI
    Frontend --> PredictionAPI
    Core --> RabbitMQ
    RabbitMQ --> Consumer
    Consumer --> MySQL
    StatsAPI --> MySQL
    PredictionAPI --> MySQL
```
- Event Ingestion: Core platform sends game events via RabbitMQ
- Real-time Processing: Analytics Consumer processes events and updates database
- Data Storage: MySQL stores player statistics and game data
- API Services: Statistics and Prediction APIs serve data to frontend
- ML Predictions: Machine learning models provide insights and forecasts
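The event-to-statistics step in this flow can be sketched in plain Python. The event fields below (`player_id`, `result`, `duration_minutes`) are illustrative assumptions, not the service's actual event schema:

```python
from collections import defaultdict

def process_events(events):
    """Fold raw game events into per-player statistics, mirroring
    what the Analytics Consumer persists to MySQL."""
    stats = defaultdict(lambda: {"games": 0, "wins": 0, "minutes": 0})
    for event in events:
        player = stats[event["player_id"]]
        player["games"] += 1
        player["wins"] += 1 if event["result"] == "win" else 0
        player["minutes"] += event["duration_minutes"]
    return dict(stats)

events = [
    {"player_id": "p1", "result": "win", "duration_minutes": 12},
    {"player_id": "p1", "result": "loss", "duration_minutes": 9},
    {"player_id": "p2", "result": "win", "duration_minutes": 20},
]
print(process_events(events)["p1"])  # → {'games': 2, 'wins': 1, 'minutes': 21}
```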
- Player Statistics: Comprehensive player performance metrics
- Game Analytics: Popular games, top players, engagement trends
- Live Dashboards: Real-time data visualization for admins
- Player Churn Prediction: Identify at-risk players
- Win Probability: Predict likelihood of winning next game
- Engagement Forecasting: Predict future player engagement levels
- Skill Classification: Categorize players (novice, intermediate, expert)
- Real-time Event Streaming: Process game events via RabbitMQ
- Automatic Statistics Updates: Database triggers maintain current stats
- Error Handling: Robust message processing with retry mechanisms
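The retry behaviour described above can be sketched generically; the handler and retry policy here are illustrative assumptions, not the consumer's actual implementation:

```python
import time

def handle_with_retry(handler, message, max_retries=3, backoff_seconds=0.0):
    """Invoke a message handler, retrying on failure before giving up
    (at which point the caller could nack or dead-letter the message)."""
    for attempt in range(1, max_retries + 1):
        try:
            return handler(message)
        except Exception:
            if attempt == max_retries:
                raise  # retries exhausted; let the caller decide
            time.sleep(backoff_seconds * attempt)  # linear backoff

# A flaky handler that succeeds on its second attempt:
calls = {"n": 0}
def flaky(msg):
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient failure")
    return f"processed {msg}"

print(handle_with_retry(flaky, "game_finished"))  # → processed game_finished
```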
- Containerized Services: Docker-based deployment
- Horizontal Scaling: Independent scaling of each service
- Cloud Integration: Azure App Service deployment
- CI/CD Pipeline: Automated build, test, and deployment
- Python 3.9 - Core programming language
- FastAPI - Modern, fast web framework for APIs
- SQLAlchemy - Database ORM and connection management
- Pika - RabbitMQ client for Python
- Scikit-learn - Machine learning algorithms
- Pandas - Data manipulation and analysis
- NumPy - Numerical computing
- Jupyter Notebooks - Model development and experimentation
- MySQL 8.0 - Primary database
- RabbitMQ - Message broker for event streaming
- Redis - Caching layer (optional)
- Docker - Containerization
- Docker Compose - Multi-container orchestration
- Azure App Service - Cloud hosting
- GitLab CI/CD - Continuous integration and deployment
- Docker and Docker Compose
- Python 3.9+ (for local development)
- Git
```bash
# Clone the repository
git clone https://github.com/HopeyCodeDS/bandit-games-ml-analytics.git
cd bandit-games-ml-analytics

# Start all services
docker-compose up -d

# Check service status
docker-compose ps
```

```bash
# Clone and navigate
git clone https://github.com/HopeyCodeDS/bandit-games-ml-analytics.git
cd bandit-games-ml-analytics

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Start services individually
# Terminal 1: Statistics API
cd GameAnalytics && uvicorn player_statistics_API_new:app --host 0.0.0.0 --port 8001

# Terminal 2: Prediction API
cd PredictionSystem && uvicorn unified_prediction_api:app --host 0.0.0.0 --port 8002

# Terminal 3: Analytics Consumer
cd communication && python analytics_consumer.py
```

| Service | URL | Description |
|---|---|---|
| Statistics API | http://localhost:8001 | Player and game statistics |
| Prediction API | http://localhost:8002 | ML predictions and insights |
| API Documentation | http://localhost:8001/docs | Interactive API docs |
- `GET /api/stats/players?skip=0&limit=20`
- `GET /api/stats/player/{player_id}`
- `GET /api/stats/game/{game_id}`
- `GET /api/stats/summary`

```http
POST /api/predictions
Content-Type: application/json

{
  "total_games_played": 50,
  "total_moves": 1250,
  "total_wins": 30,
  "total_losses": 20,
  "total_time_played_minutes": 1200,
  "gender": "Male",
  "country": "USA",
  "game_name": "Chess",
  "age": 25
}
```

- `POST /predict/churn` - Churn prediction
- `POST /predict/win_probability` - Win probability
- `POST /predict/engagement` - Engagement prediction
- `POST /predict/classification` - Skill classification
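The request body above can be reduced to the kind of derived features a churn or win-probability model might consume; the feature names below are illustrative assumptions, not the API's actual feature engineering:

```python
def derive_features(payload):
    """Compute derived ratios from the raw prediction payload."""
    games = payload["total_games_played"]
    if games == 0:
        return {"win_ratio": 0.0, "loss_ratio": 0.0,
                "avg_minutes_per_game": 0.0, "moves_per_game": 0.0}
    return {
        "win_ratio": payload["total_wins"] / games,
        "loss_ratio": payload["total_losses"] / games,
        "avg_minutes_per_game": payload["total_time_played_minutes"] / games,
        "moves_per_game": payload["total_moves"] / games,
    }

payload = {
    "total_games_played": 50, "total_moves": 1250, "total_wins": 30,
    "total_losses": 20, "total_time_played_minutes": 1200,
    "gender": "Male", "country": "USA", "game_name": "Chess", "age": 25,
}
print(derive_features(payload))
# → {'win_ratio': 0.6, 'loss_ratio': 0.4, 'avg_minutes_per_game': 24.0, 'moves_per_game': 25.0}
```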
| Model | Accuracy | Purpose | Features |
|---|---|---|---|
| Churn Prediction | 95% | Identify at-risk players | Games played, win ratio, demographics |
| Win Probability | 88% | Predict next game outcome | Historical performance, player level |
| Engagement | 92% | Forecast engagement levels | Activity patterns, demographics |
| Classification | 94% | Skill level categorization | Performance metrics, playtime |
Models are trained using Jupyter notebooks in the `Database_Based_PredictionSystem/` directory:

- `playerChurn/` - Churn prediction models
- `winProbability/` - Win probability models
- `gameEngagement/` - Engagement prediction models
- `playerClassification/` - Skill classification models
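A minimal sketch of how one of these models could be trained with scikit-learn, using synthetic data; the feature set and labelling rule here are illustrative assumptions, not the notebooks' actual pipelines:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
# Synthetic features: total games played, win ratio, minutes played
X = np.column_stack([
    rng.integers(1, 200, n),     # total games played
    rng.random(n),               # win ratio
    rng.integers(10, 5000, n),   # minutes played
])
# Illustrative labelling rule: low activity + low win ratio → churn
y = ((X[:, 0] < 30) & (X[:, 1] < 0.4)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```

The real notebooks would of course train on player data from `platform_analytics` rather than synthetic arrays, and the serialized models end up under `PredictionSystem/models/`.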
The service is automatically deployed to Azure using GitLab CI/CD (GitLab was the primary platform during this project's development):
```yaml
# .gitlab-ci.yml stages
stages:
  - build
  - test
  - deploy
```

```bash
# Database Configuration
DB_USER=azureuser
DB_PASSWORD=your_password
DB_HOST=your-mysql-server.mysql.database.azure.com
DB_PORT=3306
DB_NAME=platform_analytics

# RabbitMQ Configuration
RABBITMQ_HOST=your-rabbitmq-host
RABBITMQ_PORT=5672
RABBITMQ_USERNAME=your_username
RABBITMQ_PASSWORD=your_password

# CORS Configuration
ALLOWED_ORIGINS=https://your-frontend-domain.com
```

Pre-built images are available on Docker Hub:

```bash
docker pull opeyemimomodu/statistics-api:latest
docker pull opeyemimomodu/prediction-api:latest
docker pull opeyemimomodu/analytics-consumer:latest
```

```
analytics/
├── communication/                    # Event processing service
│   ├── analytics_consumer.py
│   ├── Dockerfile
│   └── requirements.txt
├── GameAnalytics/                    # Statistics API
│   ├── player_statistics_API_new.py
│   ├── Dockerfile
│   └── requirements.txt
├── PredictionSystem/                 # ML Prediction API
│   ├── unified_prediction_api.py
│   ├── models/                       # Trained ML models
│   ├── notebooks/                    # Jupyter notebooks
│   ├── Dockerfile
│   └── requirements.txt
├── Database_Based_PredictionSystem/  # Model development
├── dataCreation/                     # Database schemas
├── docker-compose.yml                # Local development
└── .gitlab-ci.yml                    # CI/CD pipeline
```
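The `DB_*` variables from the environment configuration above are typically assembled into a SQLAlchemy connection URL at service startup. A stdlib-only sketch, where the `mysql+pymysql` scheme and the default values are assumptions:

```python
import os
from urllib.parse import quote_plus

def mysql_url_from_env():
    """Build a SQLAlchemy-style MySQL URL from the environment,
    percent-encoding the password in case it contains special characters."""
    user = os.environ.get("DB_USER", "azureuser")
    password = quote_plus(os.environ.get("DB_PASSWORD", "your_password"))
    host = os.environ.get("DB_HOST", "localhost")
    port = os.environ.get("DB_PORT", "3306")
    name = os.environ.get("DB_NAME", "platform_analytics")
    return f"mysql+pymysql://{user}:{password}@{host}:{port}/{name}"

print(mysql_url_from_env())
```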
- New ML Model: Add to `PredictionSystem/notebooks/`
- New API Endpoint: Add to the respective FastAPI service
- New Event Type: Update `communication/analytics_consumer.py`
- Database Changes: Update schemas in `dataCreation/`
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.