A full-stack analytics platform featuring edge data ingestion (MQTT), cloud processing, and comprehensive Descriptive → Predictive → Prescriptive analytics with web and mobile interfaces.
## Table of Contents

- Project Overview
- Architecture
- Features Implemented
- Technology Stack
- Prerequisites
- Installation & Setup
- Running Locally
- Running with Docker
- Testing Guide
- API Documentation
- Troubleshooting
- Project Structure
## Project Overview

ADAPTIQ is an enterprise-grade IoT analytics platform that combines:
- Edge Data Ingestion: Real-time MQTT telemetry + CSV/Excel uploads
- Descriptive Analytics: Statistical analysis, grouping, profiling with Redis caching
- Predictive Analytics: Time-series forecasting using Prophet
- Prescriptive Analytics: Linear programming optimization for resource allocation
- Security: OIDC authentication, MFA support, RBAC, Row-Level Security
- Mobile App: React Native/Expo KPI dashboard with offline support
- Monitoring: Prometheus metrics + Grafana dashboards
## Architecture

```
┌──────────────────────────────────────────────────────────────┐
│                         Client Layer                         │
│     ┌──────────────┐              ┌──────────────┐           │
│     │ Next.js Web  │              │ React Native │           │
│     │ (Port 3002)  │              │    Mobile    │           │
│     └──────────────┘              └──────────────┘           │
└──────────────────────────────────────────────────────────────┘
                               │
                               ▼
┌──────────────────────────────────────────────────────────────┐
│                          API Layer                           │
│     ┌────────────────────────────────────────────────┐       │
│     │        Node.js/Express API (Port 3000)         │       │
│     │  • Auth Middleware (OIDC/JWT)                  │       │
│     │  • RBAC Enforcement                            │       │
│     │  • Rate Limiting                               │       │
│     └────────────────────────────────────────────────┘       │
└──────────────────────────────────────────────────────────────┘
                               │
             ┌─────────────────┼─────────────────┐
             ▼                 ▼                 ▼
     ┌──────────────┐  ┌──────────────┐  ┌──────────────┐
     │  PostgreSQL  │  │    Redis     │  │    MinIO     │
     │ (Port 5432)  │  │ (Port 6379)  │  │ (Port 9000)  │
     │  • RLS       │  │  • Caching   │  │  • S3 Store  │
     │  • Multi-    │  │              │  │  • Models    │
     │    tenant    │  │              │  │              │
     └──────────────┘  └──────────────┘  └──────────────┘
             ▲
             │
     ┌──────────────┐  ┌──────────────┐
     │ MQTT Broker  │  │  ML Service  │
     │ (Port 1883)  │  │ (Port 5000)  │
     │ • Mosquitto  │  │ • Prophet    │
     └──────────────┘  └──────────────┘
```
## Features Implemented

- Docker Compose orchestration (7 services)
- PostgreSQL with RLS and multi-tenancy
- MQTT subscriber for real-time telemetry
- CSV/Excel file upload with validation
- MinIO object storage integration
- Statistical analysis API (mean, count, min/max, date ranges)
- Redis caching with cache invalidation
- 6+ dashboard widgets (KPI cards, charts, tables)
- Dashboard load time < 2s (Week 3 DOD met)
- Auth0 OIDC integration
- JWT validation middleware
- Role-Based Access Control (viewer, analyst, operator, admin)
- PostgreSQL Row-Level Security (RLS)
- Tenant isolation
- MFA stub (ready for enforcement)
- Prophet-based time-series forecasting
- Model training endpoint
- Forecast with confidence intervals
- Model registry (MinIO storage)
- Forecast dashboard widget
- Linear Programming optimizer
- Multi-scenario comparison (optimal vs conservative)
- Constraint handling (capacity, demand, cost)
- Dashboard optimization panel
- React Native/Expo mobile app
- KPI dashboard screen
- Auth0 mobile authentication
- Offline caching with AsyncStorage
- Push notification stub
- Playwright E2E tests (20+ test cases)
- Load testing with k6
- Prometheus metrics integration
- Grafana SLO dashboards
- Production runbook
- Go/No-Go checklist
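The prescriptive bullets above describe allocating demand across two capacity-constrained sources at different unit costs. The service presumably uses a real LP solver; for the two-variable case, though, the optimum reduces to a greedy cheapest-first fill. A minimal Python sketch (function name and return shape are illustrative, not the actual API):

```python
def optimize_allocation(demand, cap1, cap2, cost1, cost2):
    """Greedy sketch of the two-source allocation problem:
    minimize cost1*x1 + cost2*x2 subject to x1 + x2 = demand,
    0 <= x1 <= cap1, 0 <= x2 <= cap2. With only two sources,
    filling the cheaper one to capacity first is optimal."""
    sources = sorted([("x1", cost1, cap1), ("x2", cost2, cap2)],
                     key=lambda s: s[1])
    remaining, alloc = demand, {}
    for name, _cost, cap in sources:
        alloc[name] = min(cap, remaining)  # take as much cheap supply as possible
        remaining -= alloc[name]
    total = alloc["x1"] * cost1 + alloc["x2"] * cost2
    return {"x1": alloc["x1"], "x2": alloc["x2"],
            "cost": total, "feasible": remaining == 0}
```

With the example inputs used later in the API docs (demand=100, cap1=60, cap2=50, cost1=10, cost2=15) this yields x1=60, x2=40, cost=1200, matching the documented response.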
## Technology Stack

### Backend

- Runtime: Node.js 18+
- Framework: Express.js
- Database: PostgreSQL 15 (with RLS)
- Cache: Redis 7
- Storage: MinIO (S3-compatible)
- Message Broker: Eclipse Mosquitto (MQTT)
- ML Engine: Python Flask + Prophet
### Frontend

- Web: Next.js 15 (App Router)
- Mobile: React Native + Expo
- Styling: Tailwind CSS
- Auth: NextAuth.js + Auth0
- Charts: Recharts, Chart.js
### DevOps

- Containerization: Docker + Docker Compose
- E2E Testing: Playwright
- Load Testing: k6
- Monitoring: Prometheus + Grafana
- CI/CD: GitHub Actions (configured)
## Prerequisites

```bash
# Check if you have these installed:
node --version            # v18+ required
npm --version             # v9+ required
docker --version          # v24+ required
docker-compose --version  # v2+ required

# Optional for local development:
psql --version            # PostgreSQL 15+ if running without Docker
redis-cli --version       # v7+ if running without Docker
```
---
## Installation & Setup
### Step 1: Clone the Repository
```bash
git clone https://github.com/your-org/adaptiq-analytics.git
cd adaptiq-analytics
```

### Step 2: Configure Environment

Create the environment files:

```bash
# Root directory
cp .env.example .env

# API directory
cp api/.env.example api/.env

# Web directory
cp web/.env.example web/.env
```

Edit the `.env` files with your credentials:

```bash
# Root .env
AUTH0_DOMAIN=your-tenant.us.auth0.com
AUTH0_CLIENT_ID=your_web_client_id
AUTH0_CLIENT_SECRET=your_web_client_secret
AUTH0_ISSUER=https://your-tenant.us.auth0.com/
NEXT_PUBLIC_AUTH0_AUDIENCE=https://api.adaptiq.com
NEXTAUTH_SECRET=generate_with_openssl_rand_base64_32

# Generate NEXTAUTH_SECRET:
openssl rand -base64 32
```

### Step 3: Install Dependencies

```bash
# API dependencies
cd api
npm install

# Web dependencies
cd ../web
npm install

# Mobile dependencies (optional)
cd ../mobile
npm install
```

## Running Locally

Running without Docker requires these services locally:

- PostgreSQL 15
- Redis
- MinIO
- Mosquitto MQTT broker
```bash
# Terminal 1: Start PostgreSQL (if not running as a service)
pg_ctl start -D /usr/local/var/postgres

# Terminal 2: Start Redis
redis-server

# Terminal 3: Start MinIO
minio server /data --console-address ":9001"

# Terminal 4: Start Mosquitto
cd "C:\path\to\mosquitto"
mosquitto -v -c /path/to/mosquitto.conf
```

### Test

```bash
mosquitto_pub -h localhost -p 1885 -u username -P password -t adaptIq/demo/topic \
  -m '{"asset_id":1,"signal":"temp","value":27.8,"quality":"good"}'
```

```bash
# Terminal 5: Start ML Service
cd ml
python ml_service.py

# Terminal 6: Start API
cd api
npm run dev

# Terminal 7: Start Web App
cd web
npm run dev
```

### Initialize the Database

```bash
# Seed database with sample data
cd api
npm run seed
```

Access:

- Web App: http://localhost:3002
- API: http://localhost:3000
- MinIO Console: http://localhost:9001
## Running with Docker

```bash
# From project root
docker-compose up -d

# Check all services are healthy
docker-compose ps
```

Expected output:

```
NAME            STATUS        PORTS
adaptiq-db      Up (healthy)  5432->5432
adaptiq-redis   Up (healthy)  6379->6379
adaptiq-minio   Up (healthy)  9000->9000, 9001->9001
adaptiq-mqtt    Up (healthy)  1884->1883
adaptiq-ml      Up (healthy)  5000->5000
adaptiq-api     Up (healthy)  3000->3000
adaptiq-web     Up (healthy)  3001->3002
```

```bash
# Seed database with sample data
docker exec -it adaptiq-api node scripts/seed-data.js
```

Expected output:

```
🌱 Starting data seeding...
✅ Ensured tenant 'tenant_demo' exists
✅ Ensured demo user exists
📊 Generating temperature data...
📊 Generating pressure data...
...
✅ Seeding complete!
📊 Total records inserted: 10800
```

```bash
# Check API health
curl http://localhost:3000/health
# Expected: {"status":"OK","timestamp":"2025-11-09T..."}

# Check web app
curl -I http://localhost:3002
# Expected: HTTP/1.1 200 OK
```

Service URLs:

- Web Dashboard: http://localhost:3002
- API Health: http://localhost:3000/health
- MinIO Console: http://localhost:9001 (admin/admin)
- Prometheus: http://localhost:9090 (if monitoring enabled)
- Grafana: http://localhost:3001 (admin/admin, if monitoring enabled)
## Testing Guide

### Test MQTT Ingestion

```bash
# Install mosquitto client tools
# macOS:   brew install mosquitto
# Ubuntu:  sudo apt-get install mosquitto-clients
# Windows: download from https://mosquitto.org/download/

# Subscribe to the test topic
mosquitto_sub -h localhost -p 1883 -t "adaptIq/demo/topic" -u username -P password

# In another terminal, publish a test message
mosquitto_pub -h localhost -p 1883 -t "adaptIq/demo/topic" \
  -u username -P password \
  -m '{
    "timestamp": "2025-11-09T12:00:00Z",
    "asset_id": 1,
    "signal": "temperature",
    "value": 25.5,
    "quality": "good",
    "tenant_id": "tenant_demo"
  }'

# Check API logs
docker logs adaptiq-api --tail=50
# Expected: "Telemetry inserted into DB"

# Query the database
docker exec -it adaptiq-db psql -U adaptIq_user -d adaptIq_db -c "
  SET app.current_tenant = 'tenant_demo';
  SELECT * FROM telemetry ORDER BY ts DESC LIMIT 5;
"
```
### Test CSV Upload

```bash
# Create test CSV
cat > test-data.csv << EOF
ts,asset_id,signal,value,quality
2025-11-09 10:00:00,1,temperature,22.5,good
2025-11-09 11:00:00,1,temperature,23.1,good
2025-11-09 12:00:00,1,temperature,22.8,good
EOF

# Upload via API
curl -X POST http://localhost:3000/ingestion/upload \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -F "[email protected]"
```

### Verify MinIO Storage

```bash
# Access MinIO Console: http://localhost:9001
# Login: minioadmin / minioadmin
# Navigate to: adaptiq-bucket
# You should see the uploaded files

# Or via CLI:
docker exec adaptiq-minio mc ls adaptiq-bucket/
```

### Run E2E Tests (Playwright)

```bash
cd web

# Set test credentials
export TEST_USER_EMAIL="[email protected]"
export TEST_USER_PASSWORD="your-password"

# Start dev server (if not running)
npm run dev &

# Run tests
npm run test:e2e

# View report
npx playwright show-report
```

### Run Load Tests (k6)

```bash
cd api/tests/load

# Get authentication token
export AUTH_TOKEN="your-token-here"

# Run load tests
./run-load-tests.sh

# View results
cat test-results/*/ingestion-summary.json
```
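The upload endpoint documented later returns a `schema` and a `preview` of the parsed file. As a rough Python sketch of that parse-and-preview step (assuming the column set of the test CSV above; the real validation lives in `api/middleware/validate.js` and the ingestion controller):

```python
import csv
import io

EXPECTED_COLUMNS = ["ts", "asset_id", "signal", "value", "quality"]

def preview_csv(text: str, rows: int = 3):
    """Validate the header row and return up to `rows` preview rows."""
    reader = csv.reader(io.StringIO(text))
    header = next(reader)
    if header != EXPECTED_COLUMNS:
        raise ValueError(f"unexpected columns: {header}")
    return [row for _, row in zip(range(rows), reader)]
```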
### API Smoke Test

```bash
# Create test script
cat > test-api.sh << 'EOF'
#!/bin/bash
echo "Testing ADAPTIQ API..."

# Health check
echo "1. Health Check:"
curl -s http://localhost:3000/health | jq

# Database connectivity (via analytics endpoint)
echo -e "\n2. Database Connectivity:"
curl -s -X POST http://localhost:3000/analysis/descriptive \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -d '{"groupBy":["signal"]}' | jq '.status'

# Redis check (via metrics)
echo -e "\n3. Redis Connectivity:"
curl -s http://localhost:3000/metrics | grep cache_hits_total

# MinIO check
echo -e "\n4. MinIO Connectivity:"
curl -s http://localhost:9000/minio/health/live | jq

echo -e "\n✅ All tests completed!"
EOF

chmod +x test-api.sh
./test-api.sh
```
### End-to-End Workflow Test

```bash
# Complete workflow test script
cat > test-workflow.sh << 'EOF'
#!/bin/bash
set -e
echo "🧪 ADAPTIQ End-to-End Workflow Test"
echo "===================================="

# 1. Upload data
echo -e "\n1. Uploading test data..."
curl -X POST http://localhost:3000/ingestion/upload \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -F "[email protected]" \
  -o /dev/null -w "Status: %{http_code}\n"

# 2. Query descriptive analytics
echo -e "\n2. Fetching descriptive analytics..."
curl -s -X POST http://localhost:3000/analysis/descriptive \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -d '{"groupBy":["signal"]}' | jq '.status'

# 3. Generate forecast
echo -e "\n3. Generating forecast..."
curl -s -X POST http://localhost:3000/analysis/predictive/forecast \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -d '{"signal":"temperature","periods":7}' | jq '.status'

# 4. Run optimization
echo -e "\n4. Running optimization..."
curl -s -X POST http://localhost:3000/analysis/prescriptive/optimize \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -d '{"scenario":"optimal","demand":100}' | jq '.data[0].cost'

echo -e "\n✅ Workflow test completed successfully!"
EOF

chmod +x test-workflow.sh
./test-workflow.sh
```

## API Documentation

All API endpoints (except `/health`) require JWT authentication:
```
Authorization: Bearer <your-jwt-token>
```

### Ingestion

```
POST /ingestion/upload
Content-Type: multipart/form-data
Body: file=<your-file.csv>
```

Response:

```
{
  "schema": {...},
  "preview": [[...]],
  "inserted": 1000,
  "cacheInvalidated": true
}
```
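When an endpoint answers 401, inspecting the token's claims often pinpoints the problem (wrong `aud`, expired `exp`, missing role). A small debugging helper, sketched in Python — it decodes without verifying, since signature verification happens server-side against Auth0:

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode the payload segment of a JWT WITHOUT verifying the signature.
    Debugging aid only — never use this for authorization decisions."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))
```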
### Descriptive Analytics

```
POST /analysis/descriptive
Content-Type: application/json
```

Request body:

```json
{
  "groupBy": ["signal"],
  "filters": {"signal": "temperature"}
}
```

Response:

```json
{
  "status": "OK",
  "data": [{
    "signal": "temperature",
    "avg_value": 25.3,
    "count": "1000",
    "min_ts": "2025-01-01T00:00:00Z",
    "max_ts": "2025-01-30T23:00:00Z"
  }],
  "cached": false
}
```
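The `cached` flag indicates whether the response came from Redis. The API's actual key scheme isn't shown in this README; a common pattern, sketched here under that assumption, is a deterministic hash over the normalized query plus the tenant, so identical queries map to the same Redis key (and ingestion can invalidate by tenant prefix):

```python
import hashlib
import json

def descriptive_cache_key(tenant_id: str, group_by, filters) -> str:
    """Deterministic Redis key: same tenant + same query -> same key."""
    normalized = json.dumps(
        {"groupBy": sorted(group_by), "filters": filters}, sort_keys=True
    )
    digest = hashlib.sha256(normalized.encode()).hexdigest()[:16]
    return f"descriptive:{tenant_id}:{digest}"
```

Keeping the tenant in the key prefix also means cache invalidation after an upload can be a single prefix scan rather than a full flush.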
### Predictive Analytics

```
POST /analysis/predictive/forecast
Content-Type: application/json
```

Request body:

```json
{
  "signal": "temperature",
  "periods": 7
}
```

Response:

```json
{
  "status": "OK",
  "data": [{
    "ds": "2025-11-10T00:00:00Z",
    "yhat": 25.5,
    "yhat_lower": 23.1,
    "yhat_upper": 27.9
  }],
  "metrics": {"MAE": 1.2, "MAPE": 0.05}
}
```
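The `metrics` object reports forecast error on held-out data, presumably computed inside the ML service. For reference, the standard definitions, sketched in Python (note MAPE is returned here as a fraction, matching the `0.05` in the example response):

```python
def mae(actual, predicted):
    """Mean Absolute Error: average of |actual - predicted|."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean Absolute Percentage Error as a fraction (0.05 == 5%).
    Undefined when any actual value is zero."""
    return sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)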
### Prescriptive Analytics

```
POST /analysis/prescriptive/optimize
Content-Type: application/json
```

Request body:

```json
{
  "scenario": "optimal",
  "demand": 100,
  "cap1": 60,
  "cap2": 50,
  "cost1": 10,
  "cost2": 15
}
```

Response:

```json
{
  "data": [{
    "scenario": "optimal",
    "x1": 60,
    "x2": 40,
    "cost": 1200,
    "feasible": true
  }],
  "totalCost": 1200
}
```

## Troubleshooting

### Services Won't Start
```bash
# Check port conflicts (Windows)
netstat -ano | findstr :3000
netstat -ano | findstr :5432

# Kill conflicting processes
taskkill /PID <pid> /F

# Restart Docker
docker-compose down
docker-compose up -d
```
### Database Connection Errors

```bash
# Check if PostgreSQL is running
docker ps | grep adaptiq-db

# Check logs
docker logs adaptiq-db

# Restart database
docker-compose restart db

# Wait for healthy status
docker-compose ps
```
### Web App Can't Reach the API

```bash
# Verify API is running
curl http://localhost:3000/health

# Check CORS settings in api/server.js
# Ensure http://localhost:3002 is in the allowed origins

# Check environment variables
cat web/.env | grep NEXT_PUBLIC_API_URL
```
### Auth0 Login Fails

```bash
# Verify Auth0 configuration
echo $AUTH0_CLIENT_ID
echo $AUTH0_DOMAIN

# Check redirect URIs in the Auth0 dashboard:
# - http://localhost:3002/api/auth/callback/auth0

# Check NextAuth configuration
cat web/.env | grep NEXTAUTH
```
### E2E Tests Fail

```bash
# Ensure services are running
docker-compose ps

# Set test credentials
export TEST_USER_EMAIL="[email protected]"
export TEST_USER_PASSWORD="password"

# Clear test artifacts
rm -rf web/test-results web/playwright-report

# Run with debug
npx playwright test --debug
```

## Project Structure
```
adaptiq-analytics/
├── api/                          # Backend API
│   ├── config/                   # Configuration files
│   │   ├── db.js                 # PostgreSQL connection
│   │   ├── redis.js              # Redis connection
│   │   ├── minio.js              # MinIO S3 client
│   │   ├── mqtt.js               # MQTT client
│   │   └── logger.js             # Winston logger
│   ├── controllers/              # Route controllers
│   │   ├── analysis.js           # Analytics endpoints
│   │   ├── ingestion.js          # Data upload
│   │   ├── datasets.js           # Dataset management
│   │   ├── tenants.js            # Tenant management
│   │   └── users.js              # User management
│   ├── middleware/               # Express middleware
│   │   ├── auth.js               # JWT validation + RBAC
│   │   ├── error.js              # Error handler
│   │   ├── mfa.js                # MFA enforcement
│   │   └── validate.js           # Input validation
│   ├── routes/                   # API routes
│   ├── services/                 # Background services
│   │   └── mqttSubscriber.js     # MQTT message handler
│   ├── utils/                    # Utility functions
│   ├── tests/                    # API tests
│   │   └── load/                 # k6 load tests
│   ├── scripts/                  # Utility scripts
│   │   └── seed-data.js          # Database seeding
│   ├── server.js                 # Entry point
│   ├── package.json
│   └── .env
├── web/                          # Next.js web app
│   ├── src/
│   │   ├── app/                  # App router pages
│   │   │   ├── (auth)/           # Auth pages
│   │   │   │   ├── login/
│   │   │   │   └── select-tenant/
│   │   │   ├── (dashboard)/      # Protected pages
│   │   │   │   ├── dashboard/
│   │   │   │   ├── datasets/
│   │   │   │   ├── reports/
│   │   │   │   └── settings/
│   │   │   ├── api/auth/[...nextauth]/  # NextAuth
│   │   │   ├── layout.tsx
│   │   │   └── page.tsx
│   │   ├── components/           # React components
│   │   │   ├── dashboard/        # Dashboard widgets
│   │   │   ├── layout/           # Layout components
│   │   │   └── error/            # Error boundaries
│   │   ├── lib/                  # Utilities
│   │   └── types/                # TypeScript types
│   ├── tests/                    # E2E tests
│   │   ├── e2e/                  # Playwright tests
│   │   │   ├── fixtures/
│   │   │   ├── 01-auth-flow.spec.ts
│   │   │   ├── 02-dashboard.spec.ts
│   │   │   ├── 03-data-upload.spec.ts
│   │   │   ├── 04-reports.spec.ts
│   │   │   └── 05-rbac.spec.ts
│   │   └── fixtures/             # Test data
│   ├── playwright.config.ts
│   ├── next.config.js
│   ├── tailwind.config.ts
│   ├── package.json
│   └── .env
├── mobile/                       # React Native app
│   ├── app/                      # Expo Router
│   │   ├── (tabs)/
│   │   │   └── KpiScreen.tsx
│   │   ├── LandingScreen.tsx
│   │   ├── LoginScreen.tsx
│   │   └── _layout.tsx
│   ├── components/
│   │   └── new/
│   │       └── AlertStub.tsx
│   ├── constants/
│   ├── assets/
│   ├── app.json
│   ├── package.json
│   └── .env
├── ml/                           # ML service
│   ├── ml_service.py             # Flask API
│   ├── requirements.txt
│   └── Dockerfile
├── db/                           # Database
│   └── init.sql                  # Schema + seed
├── mqtt/                         # MQTT config
│   └── config/
│       └── mosquitto.conf
├── monitoring/                   # Observability (Week 8)
│   ├── prometheus.yml
│   ├── alerts.yml
│   └── grafana/
│       └── provisioning/
│           └── dashboards/
├── docs/                         # Documentation
│   ├── RUNBOOK.md                # Operations guide
│   ├── GO-NOGO-CHECKLIST.md      # Launch checklist
│   └── ARCHITECTURE.md
├── .github/                      # CI/CD
│   └── workflows/
│       └── e2e-tests.yml
├── docker-compose.yml
├── docker-compose.monitoring.yml
├── .env.example
└── README.md                     # This file
```
## Production Readiness

- Load Testing: k6 scripts for ingestion and analytics (50-100 concurrent users)
- E2E Tests: 20+ Playwright tests covering auth, dashboard, upload, reports, RBAC
- Monitoring: Prometheus metrics + Grafana dashboards tracking 4 key SLOs
- Runbook: Complete operational guide (deploy, backup, rollback, incident response)
- Go/No-Go: Assessment checklist with all 8 weeks validated
- CI/CD: GitHub Actions workflow (configured, ready to run)
- Documentation: Comprehensive README with setup, testing, troubleshooting