NutriHelp Backend API

This is the backend API for the NutriHelp project. It exposes the REST endpoints used by the frontend, integrates with Supabase, serves OpenAPI documentation, and supports optional Python-based AI features used by some endpoints.

Quick Start

If you want the fastest setup path for local development:

git clone https://github.com/Gopher-Industries/Nutrihelp-api.git
cd Nutrihelp-api
npm install
pip install -r requirements.txt

Request the shared .env file from a project maintainer and place it in the project root, then start the backend:

npm start

The backend will be available at:

  • http://localhost:80
  • http://localhost:80/api-docs

If you prefer Docker, jump to Docker Setup.

Recommended Project Structure

To run the full NutriHelp system locally, keep the frontend and backend repositories under the same parent folder:

NutriHelp/
├── Nutrihelp-web
└── Nutrihelp-api

Example:

mkdir NutriHelp
cd NutriHelp
git clone https://github.com/Gopher-Industries/Nutrihelp-web.git
git clone https://github.com/Gopher-Industries/Nutrihelp-api.git

Local Setup

1. Enter the backend repository

cd Nutrihelp-api

2. Install backend dependencies

Install Node.js dependencies:

npm install

Install Python dependencies:

pip install -r requirements.txt

Python dependencies are optional for some basic API flows, but recommended if you want the full backend runtime, including AI and image-classification features.

3. Configure environment variables

Request the shared .env file from a project leader or maintainer, then place it here:

Nutrihelp-api/.env

If you receive a file named env, rename it to .env.

If needed, create it manually:

touch .env
nano .env

The current backend expects these required values:

  • JWT_SECRET
  • SUPABASE_URL
  • SUPABASE_ANON_KEY
  • SUPABASE_SERVICE_ROLE_KEY
  • PORT

Common optional values:

  • SENDGRID_API_KEY
  • FROM_EMAIL
  • NODE_ENV
  • CORS_ORIGIN
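The required and optional variables above can be checked programmatically before startup. A minimal sketch, assuming the key names from the lists above (the helper itself is hypothetical; the repository's own check lives in scripts/validateEnv.js):

```javascript
// The key list mirrors the "required values" above; adjust it to match
// the actual scripts/validateEnv.js in the repository.
const REQUIRED_KEYS = [
  'JWT_SECRET',
  'SUPABASE_URL',
  'SUPABASE_ANON_KEY',
  'SUPABASE_SERVICE_ROLE_KEY',
  'PORT',
];

// missingEnvKeys: return the required keys that are absent or blank
// in the given environment object.
function missingEnvKeys(env) {
  return REQUIRED_KEYS.filter((key) => !env[key] || env[key].trim() === '');
}

// Example: fail fast when something is missing.
const missing = missingEnvKeys(process.env);
if (missing.length > 0) {
  console.error(`Missing required env vars: ${missing.join(', ')}`);
}
```

Running a guard like this at startup surfaces a misconfigured .env immediately instead of as a confusing runtime failure later.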

4. Start the backend

npm start

Expected local URLs:

  • API base URL: http://localhost:80
  • API docs: http://localhost:80/api-docs

Frontend Setup

To test the full NutriHelp app locally, run the frontend in a separate terminal:

cd Nutrihelp-web
npm install
npm start

Frontend URL:

  • http://localhost:3000

End-to-End Testing Flow

After both frontend and backend are running, open:

  • http://localhost:3000

Typical manual checks:

  • User registration
  • Login
  • MFA verification

Docker Setup

Docker is supported as a full local development path for this backend. It is useful if you want the runtime dependencies installed inside the container instead of on your host machine.

Docker Compose

From the Nutrihelp-api folder:

docker compose up --build

The backend will be available at:

  • http://localhost:80
  • http://localhost:80/api-docs

Notes:

  • Docker Compose loads environment variables from .env.
  • The default compose service builds the dev target from the Dockerfile.
  • The compose setup mounts the source code, uploads, logs, and node_modules volumes for development use.

Build and run manually

Build the production image:

docker build -t nutrihelp-api --target prod .

Run it:

docker run --rm -p 80:80 --env-file .env nutrihelp-api

Optional build flag

If Python or TensorFlow dependencies are problematic during image build, you can temporarily skip Python package installation for Node-only debugging:

docker build -t nutrihelp-api --target prod --build-arg INSTALL_PY_DEPS=false .

This is for troubleshooting only and is not suitable for validating AI-related features.

Quick Validation

Validate the backend health endpoint

curl http://localhost:80/api/system/health

Validate the AI runtime in Docker

docker compose exec api python -c "import tensorflow as tf; print(tf.__version__)"
docker compose exec api python -c "import numpy, pandas, seaborn, sklearn, matplotlib; print('python-ai-runtime-ok')"

Validate the test suite in Docker

docker compose exec api npm test

Runtime Components

Required runtime components currently used by this repository:

Component        Version / Source                     Notes
Node.js          22-bookworm image pinned by digest   Backend runtime
Python           3.11 via Debian Bookworm packages    Used by AI routes
TensorFlow       2.17.0                               Image classification runtime
numpy            1.26.4                               TensorFlow-compatible numerical runtime
matplotlib       3.9.2                                Required by model/imageClassification.py imports
pandas           2.2.3                                Required by model/imageClassification.py imports
seaborn          0.13.2                               Required by model/imageClassification.py imports
scikit-learn     1.5.2                                Required by model/imageClassification.py imports
Pillow           9.5.0                                Image preprocessing
h5py             3.10.0                               Keras model loading
python-docx      1.1.2                                Document-processing utilities
build-essential  Debian package                       Native build dependency for Python wheels

Optional or troubleshooting-only runtime component:

Component                        Notes
INSTALL_PY_DEPS=false build arg  Lets the image build without Python AI dependencies, for troubleshooting only

Environment Validation

You can validate the environment configuration with:

node scripts/validateEnv.js

This script checks required variables, validates the JWT setup, and attempts a Supabase connection test.

API Documentation

The API contract is defined in index.yaml.

When the server is running, open:

  • http://localhost:80/api-docs

Automated Testing

The current repository uses mocha for automated tests.

Run the full suite:

npm test

Run unit tests only:

npm run test:unit

Useful checks during development:

npm run lint
npm run format:check
npm run openapi:validate

Troubleshooting

  • If port 80 is already in use, stop the conflicting process or change the port mapping in docker-compose.yml.
  • If the AI image build is slow, let the TensorFlow wheel finish downloading. The first build is much slower than rebuilds.
  • If model-related endpoints fail, confirm the model file exists at prediction_models/best_model_class.hdf5.
  • If environment validation fails, confirm that .env exists in the project root and contains the required keys.
  • If Supabase-related requests fail immediately on startup, verify SUPABASE_URL and SUPABASE_SERVICE_ROLE_KEY.

AI Runtime Notes

The AI service is optional for some development flows, but this repository includes AI-related code and runtime dependencies.

Additional Notes

Deploying to Render

This repository includes a render.yaml file that enables one-click deployment on Render.

Steps

  1. Fork or use this repository — make sure the code is in a GitHub/GitLab repository connected to your Render account.

  2. Connect the repository to Render — in the Render dashboard click New > Web Service, select your repository, and Render will auto-detect render.yaml.

  3. Set environment variables — in the Render dashboard under Environment, add every variable listed in .env.example. Variables marked sync: false in render.yaml must be filled in manually:

    Variable              Description
    SUPABASE_URL          Your Supabase project URL
    SUPABASE_ANON_KEY     Supabase anonymous/public key
    JWT_TOKEN             Secret used to sign JWT tokens
    SENDGRID_KEY          SendGrid API key for email
    GMAIL_USER            Gmail address for Nodemailer
    GMAIL_APP_PASSWORD    Gmail app password for Nodemailer
    ALLOWED_ORIGINS       Comma-separated list of allowed frontend origins (e.g. https://nutrihelp.com,https://www.nutrihelp.com)
  4. Deploy — Render will run npm install then npm start automatically.
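Step 3's ALLOWED_ORIGINS value is a comma-separated list of origins. One way to turn it into a CORS check — a sketch only, with function names of our choosing:

```javascript
// parseAllowedOrigins: split the comma-separated ALLOWED_ORIGINS value
// into a trimmed, de-duplicated whitelist.
function parseAllowedOrigins(raw = '') {
  return [...new Set(raw.split(',').map((o) => o.trim()).filter(Boolean))];
}

// isOriginAllowed: exact-match lookup against the whitelist.
function isOriginAllowed(origin, allowed) {
  return allowed.includes(origin);
}

// Example with the value from the description above:
const allowed = parseAllowedOrigins('https://nutrihelp.com, https://www.nutrihelp.com');
// isOriginAllowed('https://nutrihelp.com', allowed) → true
```

Trimming matters because a value pasted with spaces after the commas would otherwise silently reject valid origins.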

Health check

Render confirms a successful deploy by polling:

GET /health

Expected response:

{ "status": "ok", "timestamp": "...", "uptime": 0 }
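A deploy script can validate that payload before cutting traffic over. A minimal sketch, with field names taken from the sample response above (the validator function is an assumption, not repository code):

```javascript
// isHealthy: check that a /health payload has the shape shown above.
function isHealthy(payload) {
  return (
    payload !== null &&
    typeof payload === 'object' &&
    payload.status === 'ok' &&
    typeof payload.timestamp === 'string' &&
    typeof payload.uptime === 'number'
  );
}

// Example with the sample response:
console.log(isHealthy({ status: 'ok', timestamp: '2024-01-01T00:00:00Z', uptime: 0 })); // true
```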

API docs

Once deployed, the interactive API documentation is available at:

https://<your-service>.onrender.com/api-docs

File upload warning

Warning: Render uses an ephemeral filesystem. Any files written to the local uploads/ directory are permanently deleted on every deploy or restart. To persist user-uploaded files, migrate the upload destination to Supabase Storage or another external object store.

About

This repository stores resources for deploying the NutriHelp backend on Render, including server configs, Docker files, environment variables, and deployment scripts. Keeping these deployment resources separate preserves the localhost development workflow while supporting a stable, scalable production deployment and future usability testing.
