Conversation
add optional ENABLE_GRAPH_RETRIEVER to env
We are using the sidecar only for entity extraction, and OpenAI for embeddings so the embedding dimensions match.
NEO4J_URI NEO4J_USERNAME NEO4J_PASSWORD
It is currently working, but the actual retrieval performance is poor. We may need an extra layer of entity extraction (sidecar or LLM) to find entities in the query.
…-functionality removed study buddy agent, added upload functionality
Feature/neo4j kg integration
…ider feat: add provider-aware LLM routing (OpenAI + Ollama) for document Q&A
…-rewrite-template updated legal templates
…zation-prod-ready PDR AI Rename
Cursor Bugbot has reviewed your changes and found 2 potential issues.
Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.
```ts
.from(document)
.where(and(
  eq(document.sourceArchiveName, archiveName),
  eq(document.companyId, String(numericCompanyId))
))
```
Wrong type passed to bigint column comparison
High Severity
The archive document query uses eq(document.companyId, String(numericCompanyId)), passing a String to compare against document.companyId, which is a bigint column (mode "bigint"). Every other usage in the codebase passes BigInt(...) for this column. This type mismatch will likely cause the query to return no results, making the archive search scope silently broken.
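The mismatch Bugbot describes can be shown in a minimal standalone sketch (illustrative values, not the project's Drizzle code): a bigint-mode column holds `BigInt` values, so a `String` comparison value can never match it and the query silently returns no rows.

```typescript
// Illustrative sketch of the bigint-vs-string mismatch (names are hypothetical).
const numericCompanyId = 42;

const columnValue: bigint = BigInt(numericCompanyId); // what a bigint-mode column stores
const buggyParam: string = String(numericCompanyId);  // "42", the value the buggy query passes
const fixedParam: bigint = BigInt(numericCompanyId);  // 42n, what other call sites pass

console.log(typeof buggyParam); // "string"
console.log(typeof fixedParam); // "bigint"
console.log(columnValue === fixedParam); // true: the corrected comparison matches
```

The fix is to pass `BigInt(numericCompanyId)` in the `eq(...)` call, matching every other usage of this column in the codebase.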
```yaml
    GOOGLE_AI_API_KEY: ${GOOGLE_AI_API_KEY:-}
    # Ollama (optional — set OLLAMA_BASE_URL to an Ollama instance)
    OLLAMA_BASE_URL: ${OLLAMA_BASE_URL:-}
    OLLAMA_MODEL: ${OLLAMA_MODEL:-}
```
Docker Compose YAML indentation breaks environment variables
High Severity
The new environment variables (ANTHROPIC_API_KEY, GOOGLE_AI_API_KEY, OLLAMA_BASE_URL, OLLAMA_MODEL) appear to be outdented by 2 spaces compared to the existing environment block entries. In the diff, the existing entries (lines 65–77) are indented with 6 spaces under environment:, but the new lines at 78–83 use only 4 spaces, placing them at the app service level rather than inside environment:. This means Docker Compose will not recognize them as environment variables.
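A sketch of the corrected nesting (the service name and the existing `OPENAI_API_KEY` entry are assumed for illustration): the new keys must use the same 6-space indentation as the existing entries under `environment:`.

```yaml
services:
  app:
    environment:
      # existing entries, indented 6 spaces under environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY:-}
      # new entries must match that 6-space indent; at 4 spaces they become
      # keys of the service itself and are ignored as environment variables
      ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY:-}
      GOOGLE_AI_API_KEY: ${GOOGLE_AI_API_KEY:-}
      OLLAMA_BASE_URL: ${OLLAMA_BASE_URL:-}
      OLLAMA_MODEL: ${OLLAMA_MODEL:-}
```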


Note
Medium Risk
Expands AI execution paths (provider/model selection, new retrieval scope, and output guardrails) and adds new APIs plus a DB migration, which could affect production behavior if misconfigured. Changes are mostly additive but touch core Q&A/predictive-analysis flows and external integrations (Slack, Ollama/Anthropic/Google).
Overview
- Rebrands the project to Launchstack across docs/UI and updates environment/config defaults (e.g., defaulting OpenAI to `gpt-5-mini`, adding optional Anthropic/Google/Ollama/Sidecar/Neo4j vars, a larger middleware body size, plus ignore rules).
- Adds multi-provider chat model selection (OpenAI/Anthropic/Google/Ollama) with provider-specific defaults, env-driven model overrides, and friendlier provider error handling; updates document Q&A endpoints to accept `provider`, run supervisor/guardrails validation (with optional disclaimers), and introduces an archive search scope backed by multi-document RAG.
- Introduces new capabilities/endpoints: `POST /api/marketing-pipeline/publish`, CRUD APIs for `documentNotes`, `GET /api/documents/[id]/text` for extracted-text HTML, and `POST /api/document-generator/legal-generate` for template-based docx/json generation; predictive analysis now emits fire-and-forget Slack alerts on critical findings. Also adds a migration to store `document.mime_type` and improves MIME inference in `fetchDocument`.

Written by Cursor Bugbot for commit 15f69e2. This will update automatically on new commits. Configure here.