# ADAPTIQ

A full-stack analytics platform featuring edge data ingestion (MQTT), cloud processing, and comprehensive Descriptive → Predictive → Prescriptive analytics, with web and mobile interfaces.
## 📑 Table of Contents

- Project Overview
- Architecture
- Features Implemented
- Technology Stack
- Prerequisites
- Installation & Setup
- Running Locally
- Running with Docker
- Testing Guide
- API Documentation
- Troubleshooting
- Project Structure
---

## 📖 Project Overview

ADAPTIQ is an enterprise-grade IoT analytics platform that combines:

- **Edge Data Ingestion**: Real-time MQTT telemetry + CSV/Excel uploads
- **Descriptive Analytics**: Statistical analysis, grouping, profiling with Redis caching
- **Predictive Analytics**: Time-series forecasting using Prophet
- **Prescriptive Analytics**: Linear programming optimization for resource allocation
- **Security**: OIDC authentication, MFA support, RBAC, Row-Level Security
- **Mobile App**: React Native/Expo KPI dashboard with offline support
- **Monitoring**: Prometheus metrics + Grafana dashboards
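Descriptive queries are served cache-aside from Redis and invalidated whenever new data is ingested. As a minimal in-memory sketch of that flow (hypothetical names; the real API uses Redis, not a `Map`):

```javascript
// Cache-aside sketch: analytics results are cached per query key,
// and the whole namespace is invalidated after each ingest.
// (Illustrative only; not the actual api/ code.)
class AnalyticsCache {
  constructor() { this.store = new Map(); }
  key(query) { return JSON.stringify(query); }
  get(query) { return this.store.get(this.key(query)); }
  set(query, result) { this.store.set(this.key(query), result); }
  invalidateAll() { this.store.clear(); } // called after each upload/ingest
}

function descriptive(cache, query, computeFn) {
  const hit = cache.get(query);
  if (hit !== undefined) return { data: hit, cached: true };
  const data = computeFn(query); // e.g. the SQL aggregation
  cache.set(query, data);
  return { data, cached: false };
}
```

The second identical query is served from the cache (`cached: true` in the API response), and an upload flips `cacheInvalidated` by clearing the namespace.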
---

## 🏗️ Architecture

```
┌─────────────────────────────────────────────────────────────┐
│                        Client Layer                         │
│     ┌──────────────┐           ┌──────────────┐             │
│     │ Next.js Web  │           │ React Native │             │
│     │ (Port 3002)  │           │    Mobile    │             │
│     └──────────────┘           └──────────────┘             │
└─────────────────────────────────────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────┐
│                          API Layer                          │
│    ┌──────────────────────────────────────────────────┐     │
│    │        Node.js/Express API (Port 3000)           │     │
│    │        • Auth Middleware (OIDC/JWT)              │     │
│    │        • RBAC Enforcement                        │     │
│    │        • Rate Limiting                           │     │
│    └──────────────────────────────────────────────────┘     │
└─────────────────────────────────────────────────────────────┘
                              │
            ┌─────────────────┼─────────────────┐
            ▼                 ▼                 ▼
    ┌──────────────┐  ┌──────────────┐  ┌──────────────┐
    │  PostgreSQL  │  │    Redis     │  │    MinIO     │
    │ (Port 5432)  │  │ (Port 6379)  │  │ (Port 9000)  │
    │ • RLS        │  │ • Caching    │  │ • S3 Store   │
    │ • Multi-     │  │              │  │ • Models     │
    │   tenant     │  │              │  │              │
    └──────────────┘  └──────────────┘  └──────────────┘
            ▲
            │
    ┌──────────────┐  ┌──────────────┐
    │ MQTT Broker  │  │  ML Service  │
    │ (Port 1883)  │  │ (Port 5000)  │
    │ • Mosquitto  │  │ • Prophet    │
    └──────────────┘  └──────────────┘
```
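The API layer chains authentication, RBAC, and rate limiting before a request reaches a controller. A framework-free sketch of that pipeline (a hypothetical simplification; the real project uses Express middleware in `api/middleware/`):

```javascript
// Each middleware either returns an error response (short-circuit)
// or undefined to pass the request along. Simplified stand-in for
// Express's next() chain.
function runPipeline(middlewares, req) {
  for (const mw of middlewares) {
    const result = mw(req);
    if (result) return result; // stop at the first rejection
  }
  return { status: 200 }; // request reaches the controller
}

// Hypothetical checks mirroring the diagram's API layer:
const auth = (req) =>
  req.token ? undefined : { status: 401, error: "missing token" };

const rbac = (req) =>
  ["viewer", "analyst", "operator", "admin"].includes(req.role)
    ? undefined
    : { status: 403, error: "forbidden" };

let hits = 0; // naive global counter; real rate limiting is per-client
const rateLimit = () =>
  ++hits > 100 ? { status: 429, error: "too many requests" } : undefined;
```

The ordering matters: authentication failures return 401 before any role check, and rate limiting only counts requests that passed both.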
---

## ✅ Features Implemented

### Core Platform & Descriptive Analytics

- Docker Compose orchestration (7 services)
- PostgreSQL with RLS and multi-tenancy
- MQTT subscriber for real-time telemetry
- CSV/Excel file upload with validation
- MinIO object storage integration
- Statistical analysis API (mean, count, min/max, date ranges)
- Redis caching with cache invalidation
- 6+ dashboard widgets (KPI cards, charts, tables)
- Dashboard load time < 2s (Week 3 DOD met)
### Security & Access Control

- Auth0 OIDC integration
- JWT validation middleware
- Role-Based Access Control (viewer, analyst, operator, admin)
- PostgreSQL Row-Level Security (RLS)
- Tenant isolation
- MFA stub (ready for enforcement)
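The four roles are naturally ordered by privilege; one way to enforce that is a rank comparison. A minimal sketch (illustrative; the actual checks live in `api/middleware/auth.js`, and a strict role hierarchy is an assumption):

```javascript
// Hierarchical RBAC sketch: a role satisfies a requirement if it
// ranks at or above it. Unknown roles are always rejected.
const ROLE_RANK = { viewer: 0, analyst: 1, operator: 2, admin: 3 };

function hasRole(userRole, requiredRole) {
  const user = ROLE_RANK[userRole];
  const required = ROLE_RANK[requiredRole];
  if (user === undefined || required === undefined) return false;
  return user >= required;
}
```

With this shape, `admin` can do anything a `viewer` can, while an unrecognized role fails closed.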
### Predictive Analytics

- Prophet-based time-series forecasting
- Model training endpoint
- Forecast with confidence intervals
- Model registry (MinIO storage)
- Forecast dashboard widget
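The forecast API returns point predictions with confidence bounds (`yhat`, `yhat_lower`, `yhat_upper`). As a toy stand-in that only illustrates that output shape (a naive mean forecast with a ±2-sigma band; Prophet itself models trend and seasonality and is far more capable):

```javascript
// Toy forecast: constant mean prediction with a symmetric
// two-standard-deviation band. Mirrors the response fields only,
// not the real model.
function naiveForecast(history, periods) {
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  const variance =
    history.reduce((a, b) => a + (b - mean) ** 2, 0) / history.length;
  const sigma = Math.sqrt(variance);
  return Array.from({ length: periods }, (_, i) => ({
    step: i + 1,
    yhat: mean,
    yhat_lower: mean - 2 * sigma,
    yhat_upper: mean + 2 * sigma,
  }));
}
```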
### Prescriptive Analytics

- Linear Programming optimizer
- Multi-scenario comparison (optimal vs conservative)
- Constraint handling (capacity, demand, cost)
- Dashboard optimization panel
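The optimizer's core model is a small linear program: minimize `cost1*x1 + cost2*x2` subject to `x1 + x2 = demand`, `x1 <= cap1`, `x2 <= cap2`. With one demand constraint and two capped sources, the optimum is simply to fill the cheaper source first. A sketch (the real service uses a proper LP solver; parameter names mirror the `/analysis/prescriptive/optimize` payload):

```javascript
// Greedy solve of the two-source allocation LP described above.
// Optimal here because cost is linear and the only coupling
// constraint is total demand.
function allocate({ demand, cap1, cap2, cost1, cost2 }) {
  if (demand > cap1 + cap2) return { feasible: false };
  // Order sources by unit cost, cheapest first.
  const sources = [
    { name: "x1", cap: cap1, cost: cost1 },
    { name: "x2", cap: cap2, cost: cost2 },
  ].sort((a, b) => a.cost - b.cost);

  let remaining = demand;
  const result = { feasible: true, cost: 0, x1: 0, x2: 0 };
  for (const s of sources) {
    const take = Math.min(s.cap, remaining);
    result[s.name] = take;      // units drawn from this source
    result.cost += take * s.cost;
    remaining -= take;
  }
  return result;
}
```

With the sample inputs from the API documentation (demand 100, caps 60/50, costs 10/15) this yields `x1 = 60`, `x2 = 40`, total cost 1200, matching the documented response.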
### Mobile App

- React Native/Expo mobile app
- KPI dashboard screen
- Auth0 mobile authentication
- Offline caching with AsyncStorage
- Push notification stub
### Testing, Monitoring & Launch Readiness

- Playwright E2E tests (20+ test cases)
- Load testing with k6
- Prometheus metrics integration
- Grafana SLO dashboards
- Production runbook
- Go/No-Go checklist
---

## 🛠️ Technology Stack

### Backend

- Runtime: Node.js 18+
- Framework: Express.js
- Database: PostgreSQL 15 (with RLS)
- Cache: Redis 7
- Storage: MinIO (S3-compatible)
- Message Broker: Eclipse Mosquitto (MQTT)
- ML Engine: Python Flask + Prophet

### Frontend

- Web: Next.js 15 (App Router)
- Mobile: React Native + Expo
- Styling: Tailwind CSS
- Auth: NextAuth.js + Auth0
- Charts: Recharts, Chart.js

### DevOps & Tooling

- Containerization: Docker + Docker Compose
- E2E Testing: Playwright
- Load Testing: k6
- Monitoring: Prometheus + Grafana
- CI/CD: GitHub Actions (configured)
---

## 📦 Prerequisites

```bash
# Check if you have these installed:
node --version            # v18+ required
npm --version             # v9+ required
docker --version          # v24+ required
docker-compose --version  # v2+ required

# Optional for local development:
psql --version            # v15+ if running without Docker
redis-cli --version       # v7+ if running without Docker
```
---
## 🚀 Installation & Setup
### Step 1: Clone the Repository

```bash
git clone https://github.com/your-org/adaptiq-analytics.git
cd adaptiq-analytics
```

### Step 2: Configure Environment Variables

Create environment files:

```bash
# Root directory
cp .env.example .env

# API directory
cp api/.env.example api/.env

# Web directory
cp web/.env.example web/.env
```

Edit the `.env` files with your credentials:

```bash
# Root .env
AUTH0_DOMAIN=your-tenant.us.auth0.com
AUTH0_CLIENT_ID=your_web_client_id
AUTH0_CLIENT_SECRET=your_web_client_secret
AUTH0_ISSUER=https://your-tenant.us.auth0.com/
NEXT_PUBLIC_AUTH0_AUDIENCE=https://api.adaptiq.com
NEXTAUTH_SECRET=generate_with_openssl_rand_base64_32

# Generate NEXTAUTH_SECRET:
openssl rand -base64 32
```

### Step 3: Install Dependencies

```bash
# API dependencies
cd api
npm install

# Web dependencies
cd ../web
npm install

# Mobile dependencies (optional)
cd ../mobile
npm install
```

---

## 🖥️ Running Locally

Prerequisites for running without Docker:

- PostgreSQL 15 running locally
- Redis running locally
- MinIO running locally
- Mosquitto MQTT broker running locally

```bash
# Terminal 1: Start PostgreSQL (if not running as a service)
pg_ctl start -D /usr/local/var/postgres

# Terminal 2: Start Redis
redis-server

# Terminal 3: Start MinIO
minio server /data --console-address ":9001"

# Terminal 4: Start Mosquitto
cd "C:\path\to\mosquitto"
mosquitto -v -c /path/to/mosquitto.conf
```

Test the broker with a sample publish:

```bash
# Adjust -p to match the listener port in your mosquitto.conf
mosquitto_pub -h localhost -p 1885 -u username -P password -t adaptIq/demo/topic \
  -m '{"asset_id":1,"signal":"temp","value":27.8,"quality":"good"}'
```

```bash
# Terminal 5: Start ML Service
cd ml
python ml_service.py

# Terminal 6: Start API
cd api
npm run dev

# Terminal 7: Start Web App
cd web
npm run dev
```

#### Initialize the Database

```bash
# Seed database with sample data
cd api
npm run seed
```

Access:

- Web App: http://localhost:3002
- API: http://localhost:3000
- MinIO Console: http://localhost:9001
---

## 🐳 Running with Docker

### Start the Stack

```bash
# From project root
docker-compose up -d

# Check all services are healthy
docker-compose ps
```

Expected output:

```
NAME            STATUS        PORTS
adaptiq-db      Up (healthy)  5432->5432
adaptiq-redis   Up (healthy)  6379->6379
adaptiq-minio   Up (healthy)  9000->9000, 9001->9001
adaptiq-mqtt    Up (healthy)  1884->1883
adaptiq-ml      Up (healthy)  5000->5000
adaptiq-api     Up (healthy)  3000->3000
adaptiq-web     Up (healthy)  3001->3002
```

### Seed the Database

```bash
# Seed database with sample data
docker exec -it adaptiq-api node scripts/seed-data.js
```

Expected output:

```
🌱 Starting data seeding...
✅ Ensured tenant 'tenant_demo' exists
✅ Ensured demo user exists
📊 Generating temperature data...
📊 Generating pressure data...
...
✅ Seeding complete!
📈 Total records inserted: 10800
```

### Verify

```bash
# Check API health
curl http://localhost:3000/health
# Expected: {"status":"OK","timestamp":"2025-11-09T..."}

# Check web app
curl -I http://localhost:3002
# Expected: HTTP/1.1 200 OK
```

Service URLs:

- Web Dashboard: http://localhost:3002
- API Health: http://localhost:3000/health
- MinIO Console: http://localhost:9001 (admin/admin)
- Prometheus: http://localhost:9090 (if monitoring enabled)
- Grafana: http://localhost:3001 (admin/admin, if monitoring enabled)
---

## 🧪 Testing Guide

### Test MQTT Ingestion

```bash
# Install mosquitto client tools
# macOS:   brew install mosquitto
# Ubuntu:  sudo apt-get install mosquitto-clients
# Windows: Download from https://mosquitto.org/download/

# Subscribe to test topic
mosquitto_sub -h localhost -p 1883 -t "adaptIq/demo/topic" -u username -P password

# In another terminal, publish test message
mosquitto_pub -h localhost -p 1883 -t "adaptIq/demo/topic" \
  -u username -P password \
  -m '{
    "timestamp": "2025-11-09T12:00:00Z",
    "asset_id": 1,
    "signal": "temperature",
    "value": 25.5,
    "quality": "good",
    "tenant_id": "tenant_demo"
  }'
```

Note: with the Docker stack the broker is published on host port 1884 (mapped to 1883 in the container), so use `-p 1884` when testing against Docker.

```bash
# Check API logs
docker logs adaptiq-api --tail=50
# Expected: "Telemetry inserted into DB"

# Query database
docker exec -it adaptiq-db psql -U adaptIq_user -d adaptIq_db -c "
SET app.current_tenant = 'tenant_demo';
SELECT * FROM telemetry ORDER BY ts DESC LIMIT 5;
"
```

### Test File Upload

```bash
# Create test CSV
cat > test-data.csv << EOF
ts,asset_id,signal,value,quality
2025-11-09 10:00:00,1,temperature,22.5,good
2025-11-09 11:00:00,1,temperature,23.1,good
2025-11-09 12:00:00,1,temperature,22.8,good
EOF

# Upload via API
curl -X POST http://localhost:3000/ingestion/upload \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -F "file=@test-data.csv"
```

### Verify MinIO Storage

```bash
# Access MinIO Console: http://localhost:9001
# Login: minioadmin / minioadmin
# Navigate to: adaptiq-bucket
# You should see uploaded files

# Or via CLI:
docker exec adaptiq-minio mc ls adaptiq-bucket/
```

### Run E2E Tests (Playwright)

```bash
cd web

# Set test credentials
export TEST_USER_EMAIL="[email protected]"
export TEST_USER_PASSWORD="your-password"

# Start dev server (if not running)
npm run dev &

# Run tests
npm run test:e2e

# View report
npx playwright show-report
```

### Run Load Tests (k6)

```bash
cd api/tests/load

# Get authentication token
export AUTH_TOKEN="your-token-here"

# Run load tests
./run-load-tests.sh

# View results
cat test-results/*/ingestion-summary.json
```

### API Smoke Tests

```bash
# Create test script
cat > test-api.sh << 'EOF'
#!/bin/bash
echo "Testing ADAPTIQ API..."

# Health check
echo "1. Health Check:"
curl -s http://localhost:3000/health | jq

# Database connectivity (via analytics endpoint)
echo -e "\n2. Database Connectivity:"
curl -s -X POST http://localhost:3000/analysis/descriptive \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -d '{"groupBy":["signal"]}' | jq '.status'

# Redis check (via metrics)
echo -e "\n3. Redis Connectivity:"
curl -s http://localhost:3000/metrics | grep cache_hits_total

# MinIO check
echo -e "\n4. MinIO Connectivity:"
curl -s http://localhost:9000/minio/health/live | jq

echo -e "\n✅ All tests completed!"
EOF

chmod +x test-api.sh
./test-api.sh
```

### End-to-End Workflow Test

```bash
# Complete workflow test script
cat > test-workflow.sh << 'EOF'
#!/bin/bash
set -e

echo "🧪 ADAPTIQ End-to-End Workflow Test"
echo "===================================="

# 1. Upload data
echo -e "\n1️⃣ Uploading test data..."
curl -X POST http://localhost:3000/ingestion/upload \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -F "file=@test-data.csv" \
  -o /dev/null -w "Status: %{http_code}\n"

# 2. Query descriptive analytics
echo -e "\n2️⃣ Fetching descriptive analytics..."
curl -s -X POST http://localhost:3000/analysis/descriptive \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -d '{"groupBy":["signal"]}' | jq '.status'

# 3. Generate forecast
echo -e "\n3️⃣ Generating forecast..."
curl -s -X POST http://localhost:3000/analysis/predictive/forecast \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -d '{"signal":"temperature","periods":7}' | jq '.status'

# 4. Run optimization
echo -e "\n4️⃣ Running optimization..."
curl -s -X POST http://localhost:3000/analysis/prescriptive/optimize \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -d '{"scenario":"optimal","demand":100}' | jq '.data[0].cost'

echo -e "\n✅ Workflow test completed successfully!"
EOF

chmod +x test-workflow.sh
./test-workflow.sh
```
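Telemetry messages follow the schema used throughout the testing guide (`asset_id`, `signal`, `value`, `quality`). A sketch validator (illustrative only; the accepted quality values beyond `"good"` are an assumption here, and the real validation lives in `api/middleware/validate.js` and the MQTT subscriber):

```javascript
// Validate a parsed telemetry message, collecting all problems
// rather than stopping at the first one.
// Assumed quality enum; the docs only demonstrate "good".
const QUALITIES = new Set(["good", "bad", "uncertain"]);

function validateTelemetry(msg) {
  const errors = [];
  if (!Number.isInteger(msg.asset_id)) errors.push("asset_id must be an integer");
  if (typeof msg.signal !== "string" || msg.signal.length === 0)
    errors.push("signal must be a non-empty string");
  if (typeof msg.value !== "number" || Number.isNaN(msg.value))
    errors.push("value must be a number");
  if (!QUALITIES.has(msg.quality))
    errors.push("quality must be one of good/bad/uncertain");
  return { valid: errors.length === 0, errors };
}
```

Rejecting malformed messages at the subscriber keeps bad rows out of the `telemetry` table before RLS-scoped inserts.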
---

## 📚 API Documentation

All API endpoints (except `/health`) require JWT authentication:

```
Authorization: Bearer <your-jwt-token>
```

### Ingestion

```
# Upload CSV/Excel
POST /ingestion/upload
Content-Type: multipart/form-data
Body: file=<your-file.csv>
```

Response:

```
{
  "schema": {...},
  "preview": [[...]],
  "inserted": 1000,
  "cacheInvalidated": true
}
```

### Descriptive Analytics

```
# Get statistics
POST /analysis/descriptive
Content-Type: application/json

{
  "groupBy": ["signal"],
  "filters": {"signal": "temperature"}
}
```

Response:

```
{
  "status": "OK",
  "data": [{
    "signal": "temperature",
    "avg_value": 25.3,
    "count": "1000",
    "min_ts": "2025-01-01T00:00:00Z",
    "max_ts": "2025-01-30T23:00:00Z"
  }],
  "cached": false
}
```

### Predictive Analytics

```
# Generate forecast
POST /analysis/predictive/forecast
Content-Type: application/json

{
  "signal": "temperature",
  "periods": 7
}
```

Response:

```
{
  "status": "OK",
  "data": [{
    "ds": "2025-11-10T00:00:00Z",
    "yhat": 25.5,
    "yhat_lower": 23.1,
    "yhat_upper": 27.9
  }],
  "metrics": {"MAE": 1.2, "MAPE": 0.05}
}
```

### Prescriptive Analytics

```
# Optimize resource allocation
POST /analysis/prescriptive/optimize
Content-Type: application/json

{
  "scenario": "optimal",
  "demand": 100,
  "cap1": 60,
  "cap2": 50,
  "cost1": 10,
  "cost2": 15
}
```

Response:

```
{
  "data": [{
    "scenario": "optimal",
    "x1": 60,
    "x2": 40,
    "cost": 1200,
    "feasible": true
  }],
  "totalCost": 1200
}
```

---

## 🔧 Troubleshooting

### Services Won't Start

```bash
# Check port conflicts (Windows)
netstat -ano | findstr :3000
netstat -ano | findstr :5432

# Kill conflicting processes
taskkill /PID <pid> /F

# Restart Docker
docker-compose down
docker-compose up -d
```

### Database Connection Errors

```bash
# Check if PostgreSQL is running
docker ps | grep adaptiq-db

# Check logs
docker logs adaptiq-db

# Restart database
docker-compose restart db

# Wait for healthy status
docker-compose ps
```

### Web App Can't Reach the API

```bash
# Verify API is running
curl http://localhost:3000/health

# Check CORS settings in api/server.js
# Ensure http://localhost:3002 is in allowed origins

# Check environment variables
cat web/.env | grep NEXT_PUBLIC_API_URL
```

### Authentication Failures

```bash
# Verify Auth0 configuration
echo $AUTH0_CLIENT_ID
echo $AUTH0_DOMAIN

# Check redirect URIs in Auth0 dashboard:
# - http://localhost:3002/api/auth/callback/auth0

# Check NextAuth configuration
cat web/.env | grep NEXTAUTH
```

### E2E Test Failures

```bash
# Ensure services are running
docker-compose ps

# Set test credentials
export TEST_USER_EMAIL="[email protected]"
export TEST_USER_PASSWORD="password"

# Clear test artifacts
rm -rf web/test-results web/playwright-report

# Run with debug
npx playwright test --debug
```

---

## 📁 Project Structure

```
adaptiq-analytics/
├── api/                          # Backend API
│   ├── config/                   # Configuration files
│   │   ├── db.js                 # PostgreSQL connection
│   │   ├── redis.js              # Redis connection
│   │   ├── minio.js              # MinIO S3 client
│   │   ├── mqtt.js               # MQTT client
│   │   └── logger.js             # Winston logger
│   ├── controllers/              # Route controllers
│   │   ├── analysis.js           # Analytics endpoints
│   │   ├── ingestion.js          # Data upload
│   │   ├── datasets.js           # Dataset management
│   │   ├── tenants.js            # Tenant management
│   │   └── users.js              # User management
│   ├── middleware/               # Express middleware
│   │   ├── auth.js               # JWT validation + RBAC
│   │   ├── error.js              # Error handler
│   │   ├── mfa.js                # MFA enforcement
│   │   └── validate.js           # Input validation
│   ├── routes/                   # API routes
│   ├── services/                 # Background services
│   │   └── mqttSubscriber.js     # MQTT message handler
│   ├── utils/                    # Utility functions
│   ├── tests/                    # API tests
│   │   └── load/                 # k6 load tests
│   ├── scripts/                  # Utility scripts
│   │   └── seed-data.js          # Database seeding
│   ├── server.js                 # Entry point
│   ├── package.json
│   └── .env
├── web/                          # Next.js web app
│   ├── src/
│   │   ├── app/                  # App router pages
│   │   │   ├── (auth)/           # Auth pages
│   │   │   │   ├── login/
│   │   │   │   └── select-tenant/
│   │   │   ├── (dashboard)/      # Protected pages
│   │   │   │   ├── dashboard/
│   │   │   │   ├── datasets/
│   │   │   │   ├── reports/
│   │   │   │   └── settings/
│   │   │   ├── api/auth/[...nextauth]/  # NextAuth
│   │   │   ├── layout.tsx
│   │   │   └── page.tsx
│   │   ├── components/           # React components
│   │   │   ├── dashboard/        # Dashboard widgets
│   │   │   ├── layout/           # Layout components
│   │   │   └── error/            # Error boundaries
│   │   ├── lib/                  # Utilities
│   │   └── types/                # TypeScript types
│   ├── tests/                    # E2E tests
│   │   ├── e2e/                  # Playwright tests
│   │   │   ├── fixtures/
│   │   │   ├── 01-auth-flow.spec.ts
│   │   │   ├── 02-dashboard.spec.ts
│   │   │   ├── 03-data-upload.spec.ts
│   │   │   ├── 04-reports.spec.ts
│   │   │   └── 05-rbac.spec.ts
│   │   └── fixtures/             # Test data
│   ├── playwright.config.ts
│   ├── next.config.js
│   ├── tailwind.config.ts
│   ├── package.json
│   └── .env
├── mobile/                       # React Native app
│   ├── app/                      # Expo Router
│   │   ├── (tabs)/
│   │   │   └── KpiScreen.tsx
│   │   ├── LandingScreen.tsx
│   │   ├── LoginScreen.tsx
│   │   └── _layout.tsx
│   ├── components/
│   │   └── new/
│   │       └── AlertStub.tsx
│   ├── constants/
│   ├── assets/
│   ├── app.json
│   ├── package.json
│   └── .env
├── ml/                           # ML service
│   ├── ml_service.py             # Flask API
│   ├── requirements.txt
│   └── Dockerfile
├── db/                           # Database
│   └── init.sql                  # Schema + seed
├── mqtt/                         # MQTT config
│   └── config/
│       └── mosquitto.conf
├── monitoring/                   # Observability (Week 8)
│   ├── prometheus.yml
│   ├── alerts.yml
│   └── grafana/
│       ├── provisioning/
│       └── dashboards/
├── docs/                         # Documentation
│   ├── RUNBOOK.md                # Operations guide
│   ├── GO-NOGO-CHECKLIST.md      # Launch checklist
│   └── ARCHITECTURE.md
├── .github/                      # CI/CD
│   └── workflows/
│       └── e2e-tests.yml
├── docker-compose.yml
├── docker-compose.monitoring.yml
├── .env.example
└── README.md                     # This file
```

---

## 🚦 Launch Readiness Summary

- Load Testing: k6 scripts for ingestion and analytics (50-100 concurrent users)
- E2E Tests: 20+ Playwright tests covering auth, dashboard, upload, reports, RBAC
- Monitoring: Prometheus metrics + Grafana dashboards tracking 4 key SLOs
- Runbook: Complete operational guide (deploy, backup, rollback, incident response)
- Go/No-Go: Assessment checklist with all 8 weeks validated
- CI/CD: GitHub Actions workflow (configured, ready to run)
- Documentation: Comprehensive README with setup, testing, troubleshooting