diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml new file mode 100644 index 0000000..26c6739 --- /dev/null +++ b/.github/workflows/ci.yml @@ -0,0 +1,29 @@ +name: CI + +on: + push: + pull_request: + +jobs: + verify: + runs-on: ubuntu-latest + steps: + - name: Checkout + uses: actions/checkout@v4 + + - name: Setup pnpm + uses: pnpm/action-setup@v4 + with: + version: 9 + + - name: Setup Node.js + uses: actions/setup-node@v4 + with: + node-version: 20 + cache: pnpm + + - name: Install dependencies + run: pnpm install --frozen-lockfile + + - name: Verify + run: pnpm verify diff --git a/AGENTS.md b/AGENTS.md index 88492a4..acc256f 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -1,43 +1,64 @@ # Agent Guidelines -## Commit & Pull Request Guidelines - -Follow Conventional Commit style: `feat(scope): ...`, `fix(scope): ...`, `refactor(scope): ...`, `perf: ...`. Keep scopes specific (e.g., `renderer`, `library`, `editor`, `canvas`, `chat`, `ai`, `router`). For PRs, include: - -- clear summary and rationale -- linked issue/task -- test evidence (`pnpm test`, `pnpm lint`, `pnpm build`) -- screenshots or short recordings for UI changes - ## General - Do not tell me I am right all the time. Be critical. We are equals. Stay neutral and objective. - Never be sycophantic. If you disagree, say so directly. If my suggestion would make the code worse, push back. - Do not excessively use emojis. - Always read and understand code before proposing changes. Never suggest modifications to code you have not inspected. +- When I ask for a change overview, explain the post-change overall architecture first. Cover every major affected file, its role in that architecture, and for any large, high-responsibility, or newly created component or module, include a concise description of its internal logic. - Do what has been asked; nothing more, nothing less. Do not over-engineer. - For small decisions (naming, local refactors, implementation details), decide on your own and move on. 
Only ask when the choice is irreversible or affects security/architecture. - When refactoring or creating a new module, first propose what you consider best practice and let the user decide, rather than immediately compromising the workspace code. +- When modifying a module, search `docs/tasks` for related unfinished notes first; do not ignore existing task context and duplicate or conflict with in-flight work. +- When updating this file, keep rules short and dense: say when to do something and what anti-pattern to avoid, without extra narration. + +## Code Convention + +- Split logic or components when a unit has more than one real responsibility, repeated behavior, or state/side-effect flow that makes ownership unclear; do not split for tiny one-off paths, prop-forwarding wrappers, or abstractions that only make the file shorter while keeping the same coupling. +- If a module keeps getting patched and review keeps finding bugs in the same area, stop and decide whether a full refactor is safer than another local fix. - Prefer shared helpers in `src/utils/.ts`, with re-exports from `src/utils/index.ts`. -- Do not inline runtime ID generation with `randomUUID`, `Date.now`, or `Math.random`; reuse the shared ID helper instead. -- For canvas insert, duplicate, delete, and upsert flows, reuse the shared collision and selection handling instead of re-implementing ad hoc logic at new call sites. + +## Long Tasks + +- Treat a task as long when it is too large or coupled to finish safely in one session. +- Use the first session for orchestration only: split the work into small slices with clear validation boundaries. +- Persist progress to files, not chat history. 
Keep: + - a markdown task note for scope, architecture decisions, risks, validation, and handoff + - a minimal JSON task list for execution state only +- Name long-task markdown and JSON files consistently by module/topic so they pair cleanly across sessions, and store them in `docs/tasks`; keep markdown for session context and JSON for execution state, not mixed duplicates. +- Keep the JSON terse: stable task statuses such as `pending`, `in_progress`, `blocked`, `done`, `rolled_back`; `passes` as the completion gate; baseline/current task; rollback notes only when not obvious. +- If a slice fails validation and is not fixed immediately, mark it `blocked` or `rolled_back`, record the first actionable failure in the markdown note, and stop claiming progress. + +## Compact Instructions + +- When compressing context or handoff material, preserve information in this order and drop lower-priority material first. +1. Architecture decisions. Do not summarize away the decision, rationale, boundary, or chosen tradeoff. +2. Modified files and critical changes. Keep an explicit file list and the key change in each file. +3. Validation state. Record pass or fail per relevant command. +4. Unresolved TODOs and rollback notes. Keep them explicit. +5. Tool output. Reduce it to pass or fail plus the first actionable error unless the full raw output is needed for debugging. ## Testing - Keep implementation work and test-writing work logically separate. If both are needed, finish implementation first and write tests as a separate step. -- Do not use subagents for implementation or test authoring. Subagents are reserved for review and explore only. +- Prefer keeping implementation and test authoring in the main agent. Use subagents there only when the work has already been decomposed into explicit, low-coupling slices with clear external task state. - Only write unit tests for pure functions. 
- Treat pure functions strictly: deterministic input/output logic with no I/O, shared mutable state, framework lifecycle, network, storage, timer, or rendering side effects. - Do not add unit tests for components, hooks, stores, routes, integration flows, or any side-effectful or non-pure module. - Do not expand existing nonconforming tests. They may be deleted when touched or when cleanup is requested. +- For browser-based validation of local UI flows, use `agent-browser` for interactive smoke tests and lightweight end-to-end functional checks. Do not use it as the default tool for extracting or reading page content. +- Only codify an `agent-browser` flow under `scripts/` and `package.json` when it is stable and expected to be rerun regularly; otherwise a one-off manual smoke pass is enough. ## Subagents -- Use subagents only for review and explore tasks. -- Do not use subagents for implementation, refactoring, test authoring, orchestration, or integration work. +- Prefer using subagents for bounded review and exploration. +- Prefer keeping orchestration, implementation, refactoring, integration, and test authoring in the main agent unless long-task slices and external task state are already explicit. - Explore subagents should answer bounded codebase questions or gather context only. - Review subagents may be used for architecture, performance, and bug or missing-functionality review passes. -- If a task is too large or too coupled to split safely, do not delegate delivery work to subagents; ask the user to narrow scope or help decompose it instead. +- Default to the minimum review surface that matches the change. For complex logic changes, usually start with architecture plus bug/regression; add performance only when hot paths or render/update frequency changed. +- If a task is too large or too coupled to split safely, prefer decomposing it first or narrowing scope instead of delegating delivery work immediately. 
+- In review prompts, state any accepted current behaviors and out-of-scope interactions explicitly so subagents do not keep re-reporting them. ## Code Review @@ -46,6 +67,8 @@ Follow Conventional Commit style: `feat(scope): ...`, `fix(scope): ...`, `refact - After implementation and any relevant tests pass, the main agent must decide how many review subagents to run, and which types to run, based on the scope of the current changes and the results of the previous review round. - Dispatch architecture, performance, and bug or missing-functionality review subagents only when that area could be affected by the current changes, or when a previous review in that area found issues that still need revalidation. - If a review area is clearly unaffected by the current changes and the last pass for that area found no issues, do not rerun that subagent. +- If repeated review findings are symptoms of the same root cause, consolidate them and treat them as one problem to fix, not as an excuse to keep cycling shallow review passes. +- If review keeps surfacing new issues in the same file or state transition logic, prefer one deeper holistic re-review after the structural fix instead of many narrow reruns. - Do not commit an independent module or step until the review subagent passes selected for that step have finished and their findings have been resolved or explicitly accepted. - The main agent remains responsible for dispatching those review passes, consolidating findings, resolving conflicts, and deciding the final changes. @@ -55,3 +78,12 @@ Follow Conventional Commit style: `feat(scope): ...`, `fix(scope): ...`, `refact - Use the gh tool for GitHub-related operations. - Atomic development: when executing a multi-step plan, commit after each independent step completes only after relevant tests pass and the required review subagent passes are complete. Do not accumulate all changes into one final commit. 
+ +## Commit & Pull Request Guidelines + +Follow Conventional Commit style: `feat(scope): ...`, `fix(scope): ...`, `refactor(scope): ...`, `perf: ...`. Keep scopes specific (e.g., `renderer`, `library`, `editor`, `canvas`, `chat`, `ai`, `router`). For PRs, include: + +- clear summary and rationale +- linked issue/task +- test evidence (`pnpm test`, `pnpm lint`, `pnpm build`) +- screenshots or short recordings for UI changes diff --git a/docs/tasks/canvas-active-workbench-boundary.json b/docs/tasks/canvas-active-workbench-boundary.json new file mode 100644 index 0000000..03482bc --- /dev/null +++ b/docs/tasks/canvas-active-workbench-boundary.json @@ -0,0 +1,26 @@ +{ + "baseline": "a3a4cc3", + "currentTaskId": null, + "tasks": [ + { + "id": "1", + "title": "introduce active-workbench selectors and seam", + "status": "done", + "passes": true + }, + { + "id": "2", + "title": "migrate first-wave consumers and remove implicit active store apis", + "status": "done", + "passes": true, + "rollback": "revert the active-workbench boundary slice commit" + }, + { + "id": "3", + "title": "run validation and record follow-up risks", + "status": "done", + "passes": true, + "rollback": "revert validation follow-up changes" + } + ] +} diff --git a/docs/tasks/canvas-active-workbench-boundary.md b/docs/tasks/canvas-active-workbench-boundary.md new file mode 100644 index 0000000..4fa80aa --- /dev/null +++ b/docs/tasks/canvas-active-workbench-boundary.md @@ -0,0 +1,53 @@ +# Canvas Active Workbench Boundary + +- Baseline commit: `a3a4cc3` +- Branch: `feat/canvas-optimize` +- Scope: shrink the canvas active-workbench application boundary by removing implicit active-store mutation APIs, introducing pure selectors, and routing first-wave consumers through a dedicated active-workbench seam + +## Decisions + +- Do not touch export domain semantics or unify the stage snapshot fallback in this slice. +- Keep `useCanvasStore` as the global state container for collection, lifecycle, and UI state. 
+- Remove implicit active-workbench mutation/history convenience methods from `CanvasState`; keep only explicit `...InWorkbench(workbenchId, ...)` store APIs. +- Introduce one `useActiveCanvasWorkbench` seam that binds the current `activeWorkbenchId` to a stable read/write contract with null-safe no-op behavior. +- Tighten the seam contract further after review: if `activeWorkbenchId` is set but the referenced workbench no longer exists, the seam must still collapse to the null/false/no-op contract instead of exposing a stale id-bound facade. +- Keep narrow consumers on pure selectors plus explicit `...InWorkbench` store APIs when the broad seam would cause unnecessary render churn; `useCanvasHistory` is the first carve-out. +- Introduce pure canvas store selectors under `src/features/canvas/store` so active workbench lookup, slice lookup, root count, and undo/redo availability stop being reimplemented across consumers. +- Limit consumer migration in this slice to `CanvasViewport`, `useCanvasInteraction`, `useCanvasLayers`, `useCanvasEngine`, `useCanvasHistory`, `useCanvasPropertiesPanelModel`, `useCanvasWorkbenchActions`, and `useCanvasExport`. +- Keep the recovery planner unchanged, but patch `useCanvasPageModel` so post-navigation recovery explicitly re-aligns `activeWorkbenchId` after fallback/create navigation. This fixes the observed delete-current recovery hole without changing recovery policy. +- Guard post-navigation recovery completion with a token plus committed-route check so an older recovery promise cannot overwrite a newer route selection. +- Back the pending-recovery marker with state as well as refs so clearing or superseding a recovery attempt still triggers a fresh recovery-plan evaluation. + +## Risks + +- `useCanvasStore` remains broadly readable for pure UI state and some non-migrated hooks; this slice narrows active-workbench use cases, not all store access. +- Export still retains the existing stage snapshot preview fallback by design. 
+- `useActiveCanvasWorkbench` still serves broad consumers such as viewport and interaction, so future migrations should prefer selectors for very narrow read models before expanding the seam again. +- Any missed consumer of removed implicit APIs will fail at compile time; this is intentional and should be resolved in-code rather than by reintroducing convenience methods. + +## Validation + +- Passed: `pnpm exec tsc -p tsconfig.app.json --noEmit` +- Passed: focused post-review regression + - `pnpm exec vitest --run src/stores/canvasStore.test.ts src/features/canvas/store/canvasStoreSelectors.test.ts src/features/canvas/hooks/useCanvasExport.test.ts src/features/canvas/canvasPageState.test.ts` +- Passed: focused canvas regression + - `pnpm exec vitest --run src/features/canvas/store/canvasStoreSelectors.test.ts src/features/canvas/canvasPageState.test.ts src/features/canvas/hooks/useCanvasImagePropertyActions.test.ts src/features/canvas/document/commands.test.ts src/features/canvas/document/patches.test.ts src/features/canvas/textSession.test.ts src/features/canvas/renderCanvasDocument.test.ts src/features/canvas/hooks/useCanvasExport.test.ts src/stores/canvasStore.test.ts` +- Passed: `pnpm lint` with 4 pre-existing `react-refresh/only-export-components` warnings outside this slice +- Passed: `pnpm test` +- Passed: `pnpm build:client` +- Passed: browser smoke on local preview for + - create active workbench from header action + - switch workbench via workbench list + - delete current workbench and recover to the remaining route-bound workbench + - open/export dialog and close it again +- Not automated: asset insertion and canvas-stage undo/redo + - current `agent-browser` session can reach shell/panel controls, but the stage interaction path is not stably exposed as an interactive a11y target; regression confidence for those paths comes from the existing automated test suite + +## Files + +- `src/stores/canvasStore.ts` +- 
`src/features/canvas/store/canvasStoreSelectors.ts` +- `src/features/canvas/hooks/useActiveCanvasWorkbench.ts` +- `src/features/canvas/store/canvasStoreSelectors.test.ts` +- first-wave migrated consumers under `src/features/canvas/hooks/*` and `src/features/canvas/CanvasViewport.tsx` +- `docs/tasks/canvas-active-workbench-boundary.json` diff --git a/docs/tasks/canvas-active-workbench-usecase-seams.json b/docs/tasks/canvas-active-workbench-usecase-seams.json new file mode 100644 index 0000000..d68d207 --- /dev/null +++ b/docs/tasks/canvas-active-workbench-usecase-seams.json @@ -0,0 +1,30 @@ +{ + "baseline": "a09ba82", + "currentTaskId": "4", + "tasks": [ + { + "id": "1", + "title": "record task artifacts and add active-workbench read/port helpers", + "status": "done", + "passes": true + }, + { + "id": "2", + "title": "migrate canvas consumers to state, command, structure, and history seams", + "status": "done", + "passes": true + }, + { + "id": "3", + "title": "remove the legacy broad active-workbench facade", + "status": "done", + "passes": true + }, + { + "id": "4", + "title": "run focused and full validation and record residual risks", + "status": "blocked", + "passes": false + } + ] +} diff --git a/docs/tasks/canvas-active-workbench-usecase-seams.md b/docs/tasks/canvas-active-workbench-usecase-seams.md new file mode 100644 index 0000000..116b059 --- /dev/null +++ b/docs/tasks/canvas-active-workbench-usecase-seams.md @@ -0,0 +1,64 @@ +# Canvas Active Workbench Usecase Seams + +- Baseline commit: `a09ba82` +- Branch: `feat/canvas-optimize` +- Scope: replace the broad active-workbench facade with narrower read-state, command, structure, and history seams without changing store/service persistence semantics or export behavior + +## Decisions + +- Do not touch export semantics or stage snapshot fallback in this slice. +- Do not split UI state (`tool`, `zoom`, `viewport`, `activePanel`, `selectedElementIds`) out of `canvasStore` in this slice. 
+- Keep `canvasStore` and `canvasWorkbenchService` as the persistence and lifecycle boundary. +- Remove `useActiveCanvasWorkbench`; replace it with: + - `useCanvasActiveWorkbenchState` + - `useCanvasActiveWorkbenchCommands` + - `useCanvasActiveWorkbenchStructure` + - existing `useCanvasHistory` +- Put the null-safe no-op binding contract in pure helpers under `src/features/canvas/store`, not inside hooks. +- Keep `useCanvasTextSession` on explicit store APIs because it needs cross-workbench persistence semantics. +- Migrate only existing active-workbench consumers in this slice; do not widen the new seams for convenience. + +## Outcome + +- This slice is complete at the canvas-module level: the default active-workbench integration path is now split into read state, command ports, structure ports, and history seams. +- The old broad `useActiveCanvasWorkbench` facade is removed, and new consumers are expected to bind only the seam they need. +- Remaining global `tsc` / `test` / `build:client` failures are outside this slice in `image-lab` and `server` conversation routes. +- After this slice, the only clearly sub-`8.5` canvas architecture hotspot left in the audit is export/render boundary unification. + +## Risks + +- Consumers that were previously getting mixed read/write capabilities from one facade may end up reassembling those responsibilities if the new seams are not kept narrow. +- `CanvasViewport` still legitimately mixes read state with explicit text-session store ports; this slice narrows the default path, not every cross-workbench action. +- `useCanvasHistory` remains a separate seam by design; undo/redo must not drift back into the new command/structure hooks. 
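
As a rough illustration of the null-safe no-op binding contract kept in pure helpers above, a read-state binding might look like the sketch below. All names here (`Workbench`, `bindActiveWorkbenchState`, the `canEdit` flag) are illustrative assumptions, not the real `src/features/canvas/store` APIs:

```typescript
// Hypothetical sketch; the real contract lives in canvasActiveWorkbenchPorts.
type Workbench = { id: string; rootIds: string[] };

type CanvasState = {
  activeWorkbenchId: string | null;
  workbenches: Record<string, Workbench>;
};

type ActiveWorkbenchState = {
  workbench: Workbench | null;
  canEdit: boolean;
};

// Collapses to the null/false contract both when no workbench is active and
// when the active id points at a workbench that no longer exists, so callers
// never see a stale id-bound facade.
function bindActiveWorkbenchState(state: CanvasState): ActiveWorkbenchState {
  const id = state.activeWorkbenchId;
  const workbench = id ? state.workbenches[id] ?? null : null;
  return { workbench, canEdit: workbench !== null };
}
```

The point of keeping this in a pure helper rather than inside hooks is that the collapse behavior stays trivially unit-testable without any React rendering.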
+ +## Validation + +- Passed focused regression: + - `pnpm exec vitest --run src/features/canvas/store/canvasStoreSelectors.test.ts src/features/canvas/store/canvasActiveWorkbenchPorts.test.ts src/stores/canvasStore.test.ts src/features/canvas/hooks/useCanvasImagePropertyActions.test.ts src/features/canvas/textSession.test.ts src/features/canvas/canvasPageState.test.ts src/features/canvas/tools/toolControllers.test.ts` + - latest run: `7` files, `61` tests passed +- Passed broader canvas regression: + - `pnpm exec vitest --run src/features/canvas src/stores/canvasStore.test.ts` + - latest run: `33` files, `174` tests passed +- Passed: + - `pnpm lint` with 5 existing warnings outside this slice +- Passed review: + - architecture subagent: `no issues found` + - bug/regression subagent: `no issues found` + - performance subagent: `no issues found` +- Failed outside this slice: + - `pnpm exec tsc -p tsconfig.app.json --noEmit` + - first actionable failure: `src/features/image-lab/hooks/useImageGeneration.ts` expects `ImageGenerationResponse.assets` / `runs` + - `pnpm test` + - first actionable failure: `server/src/routes/image-conversation.test.ts` returns `500` instead of `200` in three conversation route cases + - `pnpm build:client` + - blocked by the same `image-lab` and `imageConversation` type errors outside the canvas seam + +## Files + +- `src/features/canvas/store/canvasStoreSelectors.ts` +- `src/features/canvas/store/canvasActiveWorkbenchPorts.ts` +- `src/features/canvas/hooks/useCanvasActiveWorkbenchState.ts` +- `src/features/canvas/hooks/useCanvasActiveWorkbenchCommands.ts` +- `src/features/canvas/hooks/useCanvasActiveWorkbenchStructure.ts` +- migrated consumers under `src/features/canvas/hooks/*`, `src/features/canvas/CanvasViewport.tsx`, and `src/features/canvas/hooks/useCanvasExport.ts` +- `docs/tasks/canvas-active-workbench-usecase-seams.json` diff --git a/docs/tasks/canvas-document-v3-rewrite.json b/docs/tasks/canvas-document-v3-rewrite.json new file 
mode 100644 index 0000000..6478e8d --- /dev/null +++ b/docs/tasks/canvas-document-v3-rewrite.json @@ -0,0 +1,47 @@ +{ + "baseline": "bf4e949", + "currentTaskId": null, + "tasks": [ + { + "id": "1", + "title": "record baseline and create task artifacts", + "status": "done", + "passes": true + }, + { + "id": "2", + "title": "introduce v3 persisted schema, migration, and validation", + "status": "done", + "passes": true, + "rollback": "revert schema and migration slice commit" + }, + { + "id": "3", + "title": "rewrite resolve and hierarchy indexing for v3 documents", + "status": "done", + "passes": true, + "rollback": "revert resolve slice commit" + }, + { + "id": "4", + "title": "replace patch diffing with explicit change sets in commands/history", + "status": "done", + "passes": true, + "rollback": "revert command/history slice commit" + }, + { + "id": "5", + "title": "adapt service adapters and runtime write paths to v3 semantics", + "status": "done", + "passes": true, + "rollback": "revert service adapter slice commit" + }, + { + "id": "6", + "title": "run full validation and clean up obsolete v2 helpers", + "status": "done", + "passes": true, + "rollback": "revert final cleanup slice commit" + } + ] +} diff --git a/docs/tasks/canvas-document-v3-rewrite.md b/docs/tasks/canvas-document-v3-rewrite.md new file mode 100644 index 0000000..266565c --- /dev/null +++ b/docs/tasks/canvas-document-v3-rewrite.md @@ -0,0 +1,160 @@ +# Canvas Document V3 Rewrite + +This note serves two purposes: + +- Sections above `Execution Record` capture the current stable canvas document architecture. +- Sections from `Execution Record` downward capture this rewrite task's execution history. + +## Task Scope + +- Baseline commit: `bf4e949` +- Branch: `feat/canvas-optimize` +- Scope: rewrite the canvas document core to a v3 persisted model with canonical hierarchy, + pure resolve projection, and explicit change-set based history. 
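
For orientation, a minimal sketch of what a `rootIds + groupChildren` persisted shape can look like, together with a check of the referential invariant (every referenced id exists in `nodes`). Field and type names here are assumptions; the real `CanvasWorkbenchSnapshot` definition lives in `src/types/canvas.ts`:

```typescript
// Illustrative only; not the real v3 schema.
type NodeId = string;

type PersistedNode = {
  id: NodeId;
  kind: "image" | "text" | "group";
  transform: { x: number; y: number; rotation: number }; // local, not world-space
};

type CanvasWorkbenchSnapshotV3 = {
  version: 3;
  nodes: Record<NodeId, PersistedNode>;
  rootIds: NodeId[];                       // top-level ordering source of truth
  groupChildren: Record<NodeId, NodeId[]>; // per-group child ordering
};

// Referential invariant: every id in rootIds and groupChildren resolves to a node.
function hierarchyIdsResolve(snapshot: CanvasWorkbenchSnapshotV3): boolean {
  const ids = [
    ...snapshot.rootIds,
    ...Object.values(snapshot.groupChildren).flat(),
  ];
  return ids.every((id) => id in snapshot.nodes);
}
```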
+ +## Stable Architecture + +### Core Decisions + +- Persisted canvas documents stay exported as `CanvasWorkbenchSnapshot`, but move to `version: 3`. +- The persisted hierarchy uses `rootIds + groupChildren` as the only ordering/source-of-truth. +- Persisted nodes keep only local semantic data and `transform`; they no longer mirror flat + `x/y/width/height/rotation` fields and no longer store `childIds`. +- Render/runtime nodes continue to expose `parentId`, world-space transforms, bounds, and + effective flags, all derived by `resolve`. +- History moves from snapshot-diff patches to explicit forward/inverse change sets generated + during command execution. +- Document validation and migration happen at the load/normalize boundary; `resolve` no longer + repairs invalid structures. + +### Architecture Surface + +This section captures the current post-rewrite canvas document architecture so later sessions do +not need to reconstruct it from commits. + +#### Persisted Source Of Truth + +- `CanvasWorkbenchSnapshot` is `version: 3`. +- Persisted hierarchy truth is `rootIds + groupChildren`. +- Persisted nodes store local semantic data plus `transform`; they do not persist runtime + `parentId`, world coordinates, bounds, or effective flags. + +#### Runtime Projection + +- `resolveCanvasWorkbench` is a pure projection from persisted snapshot to runtime workbench. +- Runtime nodes expose derived `parentId`, world `x/y/rotation`, `bounds`, + `effectiveLocked`, `effectiveVisible`, and `worldOpacity`. +- Resolve does not repair invalid hierarchy. Validation happens before runtime projection. + +#### Write Path + +- External writes still enter through `CanvasCommand`. +- `executeCanvasCommand` emits explicit forward and inverse `CanvasDocumentChangeSet` + operations instead of snapshot diffs. 
+- The minimum document operations are: + - `patchDocumentMeta` + - `putNode` + - `deleteNode` + - `setRootOrder` + - `setGroupChildren` + +#### History And Replay + +- History entries store `forwardChangeSet` and `inverseChangeSet`. +- Undo and redo replay change sets through `applyCanvasDocumentChangeSet`. +- Replay reapplies persisted changes and then reruns resolve so the runtime projection stays in + sync with persisted truth. + +#### Service Boundary + +- `canvasWorkbenchService` remains the orchestration layer, not the semantic source of truth. +- The service is responsible for: + - loading and normalizing stored workbenches + - persisting snapshots + - adapting editable/renderable nodes into command inputs + - queueing mutations + - duplicate/group/delete convenience flows + - undo/redo integration + +#### Compatibility Boundary + +- Editable `CanvasNode` ingress values still carry `parentId` and optional `childIds` so the UI + surface did not need a full rewrite. +- Those fields are compatibility hints only. Persistence normalizes them into v3 hierarchy + semantics before saving. + +### Module Map + +- `src/types/canvas.ts` + - Defines the v3 persisted schema, runtime renderable types, command protocol, and change-set + contracts. +- `src/features/canvas/document/hierarchy.ts` + - Normalizes legacy hierarchy hints into canonical `rootIds + groupChildren` and validates + runtime hierarchy invariants. +- `src/features/canvas/document/migration.ts` + - Converts legacy documents into v3 snapshots and immediately resolves them into runtime + workbenches. +- `src/features/canvas/document/model.ts` + - Provides snapshot extraction, node normalization, and shared document traversal helpers. +- `src/features/canvas/document/resolve.ts` + - Projects persisted snapshots into runtime nodes with world transforms, bounds, and effective + flags. 
+- `src/features/canvas/document/commands.ts` + - Executes semantic commands and records explicit forward and inverse document change sets. +- `src/features/canvas/document/patches.ts` + - Replays persisted change sets and reruns resolve. +- `src/features/canvas/store/canvasWorkbenchState.ts` + - Maintains history stack semantics for command commits and undo/redo transitions. +- `src/features/canvas/store/canvasWorkbenchService.ts` + - Orchestrates load, normalize, persist, adapter conversion, mutation queueing, and history + replay. + +### Critical Invariants + +- Every persisted node id is unique and appears at most once in the hierarchy. +- Every id in `rootIds` and `groupChildren` must exist in `nodes`. +- Groups own ordering through `groupChildren[groupId]`; persisted group nodes do not store + `childIds`. +- The hierarchy is acyclic. +- The runtime projection preserves current world-space semantics for move, group, ungroup, and + reparent flows. +- Undo/redo must round-trip through forward/inverse change sets without drift. + +## Execution Record + +### Risk Notes + +- This refactor crosses types, migration, resolve, command execution, history, and service + adapters. +- UI/store entry points should remain stable where practical, but any code relying on persisted + `childIds` or flat transform mirror fields will need adaptation. +- Existing tests cover many document semantics, but several store/runtime tests also assume + version 2 document shapes and will need coordinated updates. + +### Validation Notes + +- Per slice: run targeted canvas document tests first. +- Before final handoff: run `pnpm test`, `pnpm lint`, and `pnpm build`. +- Final cleanup also replaced the canvas workbench init-time persisted snapshot comparison with + explicit deep equality instead of `JSON.stringify`. +- Post-implementation review added command guards for `GROUP_NODES` id collisions and invalid + `REORDER_CHILDREN` sibling sets, with targeted regression tests. 
+- Targeted validation passed: + - `pnpm vitest run src/features/canvas/document/commands.test.ts src/features/canvas/document/patches.test.ts src/features/canvas/document/resolve.test.ts src/features/canvas/store/canvasWorkbenchState.test.ts` + - `pnpm vitest run src/stores/canvasStore.test.ts` +- Full validation results: + - `pnpm test`: pass + - `pnpm lint`: pass with pre-existing warnings outside the canvas rewrite scope + - `pnpm build`: pass + - Extra spot-check after review: + - `pnpm exec tsc -p tsconfig.app.json --noEmit`: currently fails outside the canvas rewrite in + `src/lib/ai/imageGeneration.ts` and `src/lib/ai/imageUpscale.ts` because `GeneratedImage` + now requires `assetId` + +### Handoff Notes + +- Keep the JSON tracker minimal and authoritative for execution status only. +- If a slice fails validation and is not fixed immediately, mark it `blocked` and record the + first actionable failure here. +- No blockers remain in the plan-required validation set for the canvas rewrite. The extra app + typecheck failure above is currently outside this rewrite surface. 
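
To make the forward/inverse change-set replay model concrete, here is a minimal sketch under assumed operation shapes; the operation names follow this note's list, but the real `CanvasDocumentChangeSet` contracts are richer, and only `putNode`, `deleteNode`, and `setRootOrder` are modeled:

```typescript
// Assumed shapes, not the real contracts in src/types/canvas.ts.
type Doc = { nodes: Record<string, { id: string }>; rootIds: string[] };

type Op =
  | { type: "putNode"; node: { id: string } }
  | { type: "deleteNode"; id: string }
  | { type: "setRootOrder"; rootIds: string[] };

// Applies one change set in order; undo replays the inverse set the same way.
function applyChangeSet(doc: Doc, ops: Op[]): Doc {
  return ops.reduce<Doc>((d, op) => {
    switch (op.type) {
      case "putNode":
        return { ...d, nodes: { ...d.nodes, [op.node.id]: op.node } };
      case "deleteNode": {
        const { [op.id]: removed, ...rest } = d.nodes;
        return { ...d, nodes: rest };
      }
      case "setRootOrder":
        return { ...d, rootIds: op.rootIds };
    }
  }, doc);
}
```

Under this model, the round-trip invariant above reduces to: applying a forward set and then its inverse must restore the original document exactly, with no resolve-side drift.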
diff --git a/docs/tasks/supabase-image-assets-rewrite.json b/docs/tasks/supabase-image-assets-rewrite.json new file mode 100644 index 0000000..669b5b6 --- /dev/null +++ b/docs/tasks/supabase-image-assets-rewrite.json @@ -0,0 +1,66 @@ +{ + "baseline": "bf4e949", + "currentTaskId": "9", + "tasks": [ + { + "id": "1", + "title": "record baseline and create task artifacts", + "status": "done", + "passes": true + }, + { + "id": "2", + "title": "refactor shared asset and image-generation contracts to canonical asset ids", + "status": "done", + "passes": true, + "rollback": "revert shared contract slice commit" + }, + { + "id": "3", + "title": "implement Fastify asset backend with Postgres tables and storage adapter", + "status": "done", + "passes": true, + "rollback": "revert server asset backend slice commit" + }, + { + "id": "4", + "title": "rewire client asset import and sync to the canonical asset backend", + "status": "done", + "passes": true, + "rollback": "revert client asset sync slice commit" + }, + { + "id": "5", + "title": "update image generation and image-lab flows to asset-only references and results", + "status": "done", + "passes": true, + "rollback": "revert image generation slice commit" + }, + { + "id": "6", + "title": "validate build and remove obsolete remoteAssetId or threadAssetId assumptions", + "status": "done", + "passes": true, + "rollback": "revert final cleanup slice commit" + }, + { + "id": "7", + "title": "materialize generated canonical assets into the runtime asset store", + "status": "done", + "passes": true, + "rollback": "revert generated-result materialization slice commit" + }, + { + "id": "8", + "title": "resolve remaining post-review server and image-lab findings before module commits", + "status": "done", + "passes": true + }, + { + "id": "9", + "title": "commit verified rewrite slices by module boundary", + "status": "done", + "passes": true + } + ] +} diff --git a/docs/tasks/supabase-image-assets-rewrite.md 
b/docs/tasks/supabase-image-assets-rewrite.md new file mode 100644 index 0000000..cb88889 --- /dev/null +++ b/docs/tasks/supabase-image-assets-rewrite.md @@ -0,0 +1,120 @@ +# Supabase-First Image Assets Rewrite + +## Baseline + +- Baseline commit: `bf4e949` +- Branch: `feat/canvas-optimize` +- Scope: rewrite the image asset core around canonical `assetId`, Supabase-style object storage, + and Fastify-owned asset APIs; remove client-side reference image URL flows and generated-image + re-import flows. + +## Architecture Decisions + +- `Asset.id` is the only canonical image identity across upload, generation, conversation results, + and canvas insertion. +- The server owns asset APIs. Frontend code no longer calls `api/assets/*`; asset upload, read, + delete, and generation materialization all live under Fastify. +- Storage is provider-backed and abstracted behind a server asset service. The primary + implementation targets Supabase Storage + Postgres, with an in-memory fallback for tests and + envs without Supabase credentials. +- Uploaded and generated images share the same storage path model and database tables. +- Content-hash dedupe lives at the service boundary, not a unique database index. Stable mutable + `assetId` replacements are allowed to converge to the same bytes as another asset, while new + uploads and generated images serialize dedupe through repository-level content-hash locks. +- Image generation requests are asset-only. Frontend sends `assetRefs`; the server resolves those + refs to provider-readable URLs or buffers when the selected model needs native image input. +- Conversation results point directly at canonical assets. There is no secondary "save generated + result into library" step. +- Canvas continues to consume ordinary `image` elements with `assetId`; AI-specific canvas element + work stays out of scope for this rewrite. + +## Critical Invariants + +- Every persisted image asset has exactly one canonical `assetId`. 
+- `asset_files` rows are keyed by `asset_id + kind`; `original` must exist before an asset is + considered synced. +- `asset_edges` only connect canonical asset ids; no edge may reference transient chat-only ids. +- Generated image responses always include an `assetId`. +- Frontend reference bindings are represented only as `assetRefs`; local `referenceImages.url` + data never reaches providers. +- Asset deletion must remove both storage objects and metadata rows for the same canonical id. + +## Risk Notes + +- This rewrite crosses shared schemas, server persistence, provider input resolution, client asset + sync, image generation UI, and canvas insertion. +- Generated-image history and chat persistence currently assume chat-owned asset rows; the largest + risk is leaving stale assumptions around `threadAssetId`, `generated_images`, or saved-state + projection. +- Browser image rendering cannot rely on Authorization headers for `<img>` tags, so stable asset + URLs must be resolved through the app layer rather than raw private storage URLs. + +## Validation Notes + +- Per slice: run targeted tests first (`image generation`, `chat persistence`, `asset store`). +- Before handoff: run `pnpm lint`, `pnpm test`, and `pnpm build`. +- If a slice fails and is not fixed immediately, record the first actionable failure here and mark + the JSON tracker task as `blocked`. + +### Completed Validation + +- `pnpm --filter server typecheck`: passed +- `pnpm exec tsc --noEmit`: passed +- `pnpm test`: passed +- `pnpm build`: passed +- `pnpm lint`: passed with four pre-existing `react-refresh` warnings in unrelated UI files +- Targeted Vitest runs for image-lab snapshot/reference helpers, route coverage, and generated + asset materialization: passed + +## Recent Slice + +- Fixed the generated-result materialization gap in the image-lab flow. 
Generation success now + injects returned canonical assets into `useAssetStore` immediately, then hydrates them from + `/api/assets/:assetId` in the background so canvas insertion and "use as reference" operate on + real library assets instead of conversation-only result records. +- Added a dedicated `materializeRemoteAssets` store action so remote-only canonical assets can be + represented in runtime asset state without pretending they are locally editable blob-backed + assets. +- Closed the post-review image-lab state bugs. Unsupported-model switches now clear all executable + image-guided inputs, persisted reference asset refs keep `referenceType` and `weight`, and the + reference-panel Clear action consistently removes reference-role state while preserving + edit/variation bindings. +- Hardened the asset backend around canonical metadata and cleanup. Upload sessions now overwrite + stale metadata, completion re-measures stored objects instead of trusting init payloads, dedupe + for new uploads and generated images is serialized through repository content-hash locks, and + late generation failures now clean up newly created assets plus canonical `asset_edges`. + +## Implementation Notes + +- Added a Fastify-owned asset backend under `server/src/assets/*` plus `server/src/routes/assets.ts` + with canonical asset tables, upload sessions, provider asset resolution, and browser-safe asset + URLs. +- Reworked client asset sync to use canonical `assetId` only. `remoteAssetId` has been removed from + client state, IndexedDB types, sync jobs, and sync API calls. +- Import now prepares an upload session up front so locally persisted assets already use their final + canonical `assetId`. +- Image generation responses now surface canonical assets directly. The image-lab flow no longer + re-imports generated images to save or add them to the canvas. +- Reference image UI still exists as a preview layer, but the executable request path is now + `assetRefs` only. 
Provider-native reference URLs are resolved on the server. +- `useImageGeneration.ts`, `assetStore.ts`, `assetSyncApi.ts`, and `currentUser/types.ts` now + include the runtime materialization seam for generated assets so downstream consumers stay + asset-store based. + +## Open Review Findings + +- None. The latest bounded client and server review passes both returned `no issues found`. + +## Commit Readiness + +- Task note and JSON are updated for handoff. +- Validation is green: `pnpm --filter server typecheck`, `pnpm exec tsc --noEmit`, `pnpm test`, + and `pnpm build` all pass. +- `pnpm lint` passes with four pre-existing `react-refresh` warnings in unrelated UI files. +- Module commits are recorded on this branch. + +## Handoff Notes + +- Keep the JSON tracker minimal and authoritative for execution status only. +- Prefer removing dead compatibility code over layering new branches on top of the old + `remoteAssetId/referenceImages/threadAssetId` model. diff --git a/eslint.config.js b/eslint.config.js index d103f3c..87c37b8 100644 --- a/eslint.config.js +++ b/eslint.config.js @@ -1,37 +1,112 @@ import js from "@eslint/js"; -import tseslint from "typescript-eslint"; +import prettier from "eslint-config-prettier"; import reactHooks from "eslint-plugin-react-hooks"; import reactRefresh from "eslint-plugin-react-refresh"; -import prettier from "eslint-config-prettier"; +import tseslint from "typescript-eslint"; export default tseslint.config( - { ignores: ["dist", "node_modules", "src/lib/renderer/shaders/generated"] }, + { + ignores: [ + "dist", + "server/dist", + "node_modules", + "src/lib/renderer/shaders/generated", + ], + }, js.configs.recommended, ...tseslint.configs.recommended, { + files: ["**/*.{ts,tsx}"], + rules: { + "no-undef": "off", + "@typescript-eslint/no-unused-vars": [ + "warn", + { argsIgnorePattern: "^_", varsIgnorePattern: "^_" }, + ], + "@typescript-eslint/no-explicit-any": "off", + "no-control-regex": "off", + }, + }, + { + files: 
["src/**/*.{ts,tsx}"], plugins: { "react-hooks": reactHooks, "react-refresh": reactRefresh, }, rules: { - // React hooks — classic rules only, no compiler rules "react-hooks/rules-of-hooks": "error", "react-hooks/exhaustive-deps": "warn", - // Disable React Compiler rules (not using compiler in this project) "react-hooks/no-access-state-in-setstate": "off", "react-hooks/no-set-state-in-effect-cleanup": "off", "react-hooks/preserve-manual-memoization": "off", "react-hooks/no-set-state-in-passive-effects": "off", - // React refresh "react-refresh/only-export-components": ["warn", { allowConstantExport: true }], - // TypeScript - "@typescript-eslint/no-unused-vars": [ - "warn", - { argsIgnorePattern: "^_", varsIgnorePattern: "^_" }, + }, + }, + { + files: ["src/**/*.{ts,tsx}", "server/src/**/*.ts", "shared/**/*.ts"], + ignores: [ + "src/utils/createId.ts", + "shared/createId.ts", + "**/*.test.ts", + "**/*.test.tsx", + "**/*.spec.ts", + "**/*.spec.tsx", + ], + rules: { + "no-restricted-imports": [ + "error", + { + paths: [ + { + name: "crypto", + importNames: ["randomUUID"], + message: + "Do not inline runtime ID generation with randomUUID(). Reuse shared/createId.ts instead.", + }, + { + name: "node:crypto", + importNames: ["randomUUID"], + message: + "Do not inline runtime ID generation with randomUUID(). Reuse shared/createId.ts instead.", + }, + ], + }, + ], + "no-restricted-syntax": [ + "error", + { + selector: "MemberExpression[object.name='Math'][property.name='random']", + message: + "Do not inline runtime ID generation with Math.random(). Reuse shared/createId.ts instead.", + }, + { + selector: "MemberExpression[property.name='randomUUID']", + message: + "Do not inline runtime ID generation with randomUUID(). Reuse shared/createId.ts instead.", + }, + { + selector: "MemberExpression[computed=true][property.value='randomUUID']", + message: + "Do not inline runtime ID generation with randomUUID(). 
Reuse shared/createId.ts instead.", + }, + { + selector: + "Property[key.name='randomUUID'][parent.type='ObjectPattern']", + message: + "Do not inline runtime ID generation with randomUUID(). Reuse shared/createId.ts instead.", + }, + { + selector: "FunctionDeclaration[id.name='createId']", + message: + "Do not redefine createId in runtime code. Import the shared helper from shared/createId.ts.", + }, + { + selector: "VariableDeclarator[id.name='createId']", + message: + "Do not redefine createId in runtime code. Import the shared helper from shared/createId.ts.", + }, ], - "@typescript-eslint/no-explicit-any": "off", - // Allow control chars in regex (used for binary file detection) - "no-control-regex": "off", }, }, prettier, diff --git a/package.json b/package.json index f2a0db9..4fef45c 100644 --- a/package.json +++ b/package.json @@ -14,12 +14,14 @@ "build:client": "pnpm run generate:shaders && tsc -b && vite build", "build:server": "pnpm --filter server build", "preview": "vite preview", - "lint": "eslint src", - "lint:fix": "eslint src --fix", + "lint": "eslint src server/src shared", + "lint:fix": "eslint src server/src shared --fix", "format": "prettier --write \"src/**/*.{ts,tsx,css}\"", "format:check": "prettier --check \"src/**/*.{ts,tsx,css}\"", "test": "vitest --run", - "test:watch": "vitest" + "test:watch": "vitest", + "verify:prompt": "vitest --run server/src/gateway/prompt server/src/gateway/router server/src/routes/image-generate server/src/chat/persistence/postgres.promptArtifacts.test.ts server/src/chat/persistence/memory.test.ts", + "verify": "pnpm lint && pnpm test && pnpm build" }, "dependencies": { "@ai-sdk/anthropic": "^3.0.46", diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index 2f8601e..5cd571e 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -165,12 +165,18 @@ importers: '@fastify/rate-limit': specifier: ^10.3.0 version: 10.3.0 + '@supabase/supabase-js': + specifier: ^2.100.0 + version: 2.100.0 dotenv: specifier: ^17.2.3 version: 
17.3.1 fastify: specifier: ^5.6.1 version: 5.8.1 + image-size: + specifier: ^2.0.2 + version: 2.0.2 pg: specifier: ^8.20.0 version: 8.20.0 @@ -1208,6 +1214,33 @@ packages: '@standard-schema/spec@1.1.0': resolution: {integrity: sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w==} + '@supabase/auth-js@2.100.0': + resolution: {integrity: sha512-pdT3ye3UVRN1Cg0wom6BmyY+XTtp5DiJaYnPi6j8ht5i8Lq8kfqxJMJz9GI9YDKk3w1nhGOPnh6Qz5qpyYm+1w==} + engines: {node: '>=20.0.0'} + + '@supabase/functions-js@2.100.0': + resolution: {integrity: sha512-keLg79RPwP+uiwHuxFPTFgDRxPV46LM4j/swjyR2GKJgWniTVSsgiBHfbIBDcrQwehLepy09b/9QSHUywtKRWQ==} + engines: {node: '>=20.0.0'} + + '@supabase/phoenix@0.4.0': + resolution: {integrity: sha512-RHSx8bHS02xwfHdAbX5Lpbo6PXbgyf7lTaXTlwtFDPwOIw64NnVRwFAXGojHhjtVYI+PEPNSWwkL90f4agN3bw==} + + '@supabase/postgrest-js@2.100.0': + resolution: {integrity: sha512-xYNvNbBJaXOGcrZ44wxwp5830uo1okMHGS8h8dm3u4f0xcZ39yzbryUsubTJW41MG2gbL/6U57cA4Pi6YMZ9pA==} + engines: {node: '>=20.0.0'} + + '@supabase/realtime-js@2.100.0': + resolution: {integrity: sha512-2AZs00zzEF0HuCKY8grz5eCYlwEfVi5HONLZFoNR6aDfxQivl8zdQYNjyFoqN2MZiVhQHD7u6XV/xHwM8mCEHw==} + engines: {node: '>=20.0.0'} + + '@supabase/storage-js@2.100.0': + resolution: {integrity: sha512-d4EeuK6RNIgYNA2MU9kj8lQrLm5AzZ+WwpWjGkii6SADQNIGTC/uiaTRu02XJ5AmFALQfo8fLl9xuCkO6Xw+iQ==} + engines: {node: '>=20.0.0'} + + '@supabase/supabase-js@2.100.0': + resolution: {integrity: sha512-r0tlcukejJXJ1m/2eG/Ya5eYs4W8AC7oZfShpG3+SIo/eIU9uIt76ZeYI1SoUwUmcmzlAbgch+HDZDR/toVQPQ==} + engines: {node: '>=20.0.0'} + '@tanstack/history@1.154.7': resolution: {integrity: sha512-YBgwS9qG4rs1ZY/ZrhQtjOH8BG9Qa2wf2AsxT/SnZ4HZJ1DcCEqkoiHH0yH6CYvdDit31X5HokOqQrRSsZEwGA==} engines: {node: '>=12'} @@ -1310,6 +1343,9 @@ packages: '@types/unist@3.0.3': resolution: {integrity: sha512-ko/gIFJRv177XgZsZcBwnqJN5x/Gien8qNOn0D5bQU/zAzVf9Zt3BlcUiLqhV9y4ARk0GbT3tnUiPNgnTXzc/Q==} + '@types/ws@8.18.1': + resolution: 
{integrity: sha512-ThVF6DCVhA8kUGy+aazFQ4kXQ7E1Ty7A3ypFOe0IcJV8O/M511G99AW24irKrW56Wt44yG9+ij8FaqoBGkuBXg==} + '@typescript-eslint/eslint-plugin@8.56.0': resolution: {integrity: sha512-lRyPDLzNCuae71A3t9NEINBiTn7swyOhvUj3MyUOxb8x6g6vPEFoOU+ZRmGMusNC3X3YMhqMIX7i8ShqhT74Pw==} engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} @@ -1916,6 +1952,10 @@ packages: html-url-attributes@3.0.1: resolution: {integrity: sha512-ol6UPyBWqsrO6EJySPz2O7ZSr856WDrEzM5zMqp+FJJLGMW35cLYmmZnl0vztAZxRUoNZJFTCohfjuIJ8I4QBQ==} + iceberg-js@0.8.1: + resolution: {integrity: sha512-1dhVQZXhcHje7798IVM+xoo/1ZdVfzOMIc8/rgVSijRK38EDqOJoGula9N/8ZI5RD8QTxNQtK/Gozpr+qUqRRA==} + engines: {node: '>=20.0.0'} + idb@8.0.3: resolution: {integrity: sha512-LtwtVyVYO5BqRvcsKuB2iUMnHwPVByPCXFXOpuU96IZPPoPN6xjOGxZQ74pgSVVLQWtUOYgyeL4GE98BY5D3wg==} @@ -1927,6 +1967,11 @@ packages: resolution: {integrity: sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg==} engines: {node: '>= 4'} + image-size@2.0.2: + resolution: {integrity: sha512-IRqXKlaXwgSMAMtpNzZa1ZAe8m+Sa1770Dhk8VkSsP9LS+iHD62Zd8FQKs8fbPiagBE7BzoFX23cxFnwshpV6w==} + engines: {node: '>=16.x'} + hasBin: true + imurmurhash@0.1.4: resolution: {integrity: sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA==} engines: {node: '>=0.8.19'} @@ -3064,6 +3109,18 @@ packages: resolution: {integrity: sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==} engines: {node: '>=10'} + ws@8.20.0: + resolution: {integrity: sha512-sAt8BhgNbzCtgGbt2OxmpuryO63ZoDk/sqaB/znQm94T4fCEsy/yV+7CdC1kJhOU9lboAEU7R3kquuycDoibVA==} + engines: {node: '>=10.0.0'} + peerDependencies: + bufferutil: ^4.0.1 + utf-8-validate: '>=5.0.2' + peerDependenciesMeta: + bufferutil: + optional: true + utf-8-validate: + optional: true + xtend@4.0.2: resolution: {integrity: sha512-LKYU1iAXJXUgAXn9URjiu+MWhyUXHsvfp7mcuYm9dSUKK0/CjtrUwFAxD82/mCWbtLsGjFIad0wIsod4zrTAEQ==} engines: 
{node: '>=0.4'} @@ -3956,6 +4013,46 @@ snapshots: '@standard-schema/spec@1.1.0': {} + '@supabase/auth-js@2.100.0': + dependencies: + tslib: 2.8.1 + + '@supabase/functions-js@2.100.0': + dependencies: + tslib: 2.8.1 + + '@supabase/phoenix@0.4.0': {} + + '@supabase/postgrest-js@2.100.0': + dependencies: + tslib: 2.8.1 + + '@supabase/realtime-js@2.100.0': + dependencies: + '@supabase/phoenix': 0.4.0 + '@types/ws': 8.18.1 + tslib: 2.8.1 + ws: 8.20.0 + transitivePeerDependencies: + - bufferutil + - utf-8-validate + + '@supabase/storage-js@2.100.0': + dependencies: + iceberg-js: 0.8.1 + tslib: 2.8.1 + + '@supabase/supabase-js@2.100.0': + dependencies: + '@supabase/auth-js': 2.100.0 + '@supabase/functions-js': 2.100.0 + '@supabase/postgrest-js': 2.100.0 + '@supabase/realtime-js': 2.100.0 + '@supabase/storage-js': 2.100.0 + transitivePeerDependencies: + - bufferutil + - utf-8-validate + '@tanstack/history@1.154.7': {} '@tanstack/query-core@5.90.19': {} @@ -4068,6 +4165,10 @@ snapshots: '@types/unist@3.0.3': {} + '@types/ws@8.18.1': + dependencies: + '@types/node': 25.0.10 + '@typescript-eslint/eslint-plugin@8.56.0(@typescript-eslint/parser@8.56.0(eslint@10.0.1(jiti@1.21.7))(typescript@5.9.3))(eslint@10.0.1(jiti@1.21.7))(typescript@5.9.3)': dependencies: '@eslint-community/regexpp': 4.12.2 @@ -4736,12 +4837,16 @@ snapshots: html-url-attributes@3.0.1: {} + iceberg-js@0.8.1: {} + idb@8.0.3: {} ignore@5.3.2: {} ignore@7.0.5: {} + image-size@2.0.2: {} + imurmurhash@0.1.4: {} inline-style-parser@0.2.7: {} @@ -6065,6 +6170,8 @@ snapshots: string-width: 4.2.3 strip-ansi: 6.0.1 + ws@8.20.0: {} + xtend@4.0.2: {} y18n@5.0.8: {} diff --git a/refs/canvas-architecture-audit.md b/refs/canvas-architecture-audit.md new file mode 100644 index 0000000..3d80637 --- /dev/null +++ b/refs/canvas-architecture-audit.md @@ -0,0 +1,611 @@ +# Canvas Module Architecture Audit + +Date: 2026-03-25 +Scope: `src/pages/canvas.tsx`, `src/stores/canvasStore.ts`, `src/features/canvas/runtime/*`, `src/features/canvas/*` +Evaluation criteria: 
suitability for further iteration; UI completeness is not assessed
+
+## Status Update
+
+- Status: the phase-one `canvasStore` seam split has landed; the second `CanvasViewport` seam cut has landed; the first panel seam consolidation, text-session P1, runtime-preview P1, page-entry / route-recovery P1, and image-edit / image-property entry unification P1 have all landed; this round also completed the active-workbench boundary P2, shrinking the default integration surface to "`useCanvasActiveWorkbenchState` + `useCanvasActiveWorkbenchCommands` + `useCanvasActiveWorkbenchStructure` + `useCanvasHistory`". The old `useActiveCanvasWorkbench` do-everything facade has been deleted, and follow-up review fixed the history stale-state, over-wide subscription, and leftover binding-logic issues.
+- Overall score: `8.6/10`
+- Currently strongest layer: the document kernel
+- Only layer currently clearly below `8.5`: rendering and export
+- Most visible structural hotspot: the export entry still keeps a stage-snapshot branch; `CanvasExportDialog` is still not fully consolidated behind a single export seam; the panel model is still not fully extracted into a use-case service
+- Validation baseline: the active-workbench P2 focused regression passed, with all `7` test files and `61` cases green; the wider canvas regression passed, with all `33` test files and `174` cases green; `pnpm lint` passes but keeps `5` pre-existing warnings, all outside this slice; the architecture / bug-regression / performance subagent review rounds all ended at `no issues found`; the current global `pnpm test` / `pnpm build:client` / app `tsc` failures are in `image-lab` and the `server` conversation route, outside the canvas seam change surface.
+- Next-phase priorities:
+  1. Unify the main export path and remove the `stage snapshot` bypass
+  2. Keep protecting the new `runtime preview` / text-session / `CanvasViewport` / `CanvasPage` / active-workbench seams against complexity flowing back in
+  3. Only consider a higher-level panel property use-case service once the export boundary is stable
+
+## Scoring Criteria
+
+Each subsystem is scored on a 10-point scale with these default weights:
+
+- Responsibility clarity 30%
+- Boundaries and coupling 25%
+- State/mutation safety 20%
+- Extensibility 15%
+- Testability 10%
+
+## Overall Judgment
+
+The canvas module is not out of control, but it has clearly split into a "healthy kernel + unbalanced application layer" structure.
+
+- The healthy part is concentrated in the document kernel, i.e. the `CanvasWorkbench` / `CanvasCommand` / snapshot / resolve / patch chain.
+- The unbalanced part is concentrated in the application layer, i.e. `useCanvasStore`, `CanvasViewport`, and several panel components.
+- This means that as features keep landing, the risk is not that the underlying model gets broken, but that more features keep being stuffed into a few large files, making change surfaces ever larger and regressions ever harder to control.
+
+My judgment:
+
+- The current architecture can still be iterated on, but it is no longer suited to "continuously piling on features".
+- Without seam-level splits first, complexity will keep concentrating in `canvasStore.ts`, `CanvasViewport.tsx`, and the panel layer.
+- The document kernel should be protected; it should not be torn down and rebuilt along with the upper layers.
+
+## 1. 
Page and Route Assembly
+
+Files in focus:
+
+- `src/pages/canvas.tsx`
+- `src/features/canvas/hooks/useCanvasPageModel.ts`
+- `src/features/canvas/canvasPageState.ts`
+
+Responsibilities:
+
+- page entry composition shell
+- initialize the canvas store and the ready gate
+- align the route with the active workbench
+- recover or create a workbench when the route workbench is missing
+- maintain the export dialog and the currently selected slice
+- auto-switch to the edit panel after an image is selected
+
+Inputs/outputs:
+
+- Inputs: router params, `useCanvasStore`, local UI state
+- Outputs: props passed to `CanvasViewport`, `CanvasFloatingPanel`, and `CanvasExportDialog`, plus the necessary route/store side effects
+
+Dependency direction:
+
+- `CanvasPage` depends only on the page model and canvas feature components
+- `useCanvasPageModel` depends on the router, the store, and the pure state seam
+- `canvasPageState` depends only on plain data and types, and the lower domain layer never depends back on it
+
+Key state transitions/invariants:
+
+- no recovery logic runs before `init()` completes
+- when the route workbench does not exist, the page must recover to a valid workbench or create a new one
+- `activeWorkbenchId` must ultimately agree with the route
+- the currently selected slice must always fall within the valid slice set
+
+Strengths:
+
+- `CanvasPage` has degenerated into a pure composition shell; it no longer inlines recovery strategy, slice repair, or panel policy
+- `resolveCanvasPageRecoveryPlan` explicitly consolidates the recovery strategy into a pure planner with an independent regression surface
+- `selectedSliceId` validity is now owned at a single point by the page seam; `CanvasStoryPanel` is no longer a second owner
+- `create-and-navigate` still keeps its epoch protection
+- this layer now has pure tests plus a browser smoke test, making it noticeably more trustworthy than before
+
+Issues:
+
+- `useCanvasPageModel` is still a page-level application-service adapter that directly wires router effects, store selectors, and local UI state; it is not yet a fully independent use-case service
+- `selectedSliceId` and `exportOpen` remain page-local state; if URL state, a workspace mode, or multi-tab recovery is introduced later, this seam should be extended rather than pushing logic back into `CanvasPage` or the panels
+- route recovery still relies on imperative `navigate()` and `pendingRecoveryRef` discipline; future changes must keep the "policy pure, effect thin" boundary
+
+Score: `8.6/10`
+
+Refactor priority: `P1 for this round is done; downgraded to P2`
+
+Judgment:
+
+- This layer has dropped from "the clearest next cut" to "a consolidated seam that must be kept from regressing".
+- The future focus should not be piling recovery strategy and panel policy back into `CanvasPage`, but protecting the new page shell + page model + pure state seam.
+
+## 2. 
Document Kernel
+
+Files in focus:
+
+- `src/features/canvas/document/commands.ts`
+- `src/features/canvas/document/resolve.ts`
+- `src/features/canvas/document/model.ts`
+- `src/features/canvas/document/patches.ts`
+- `src/types/canvas.ts`
+
+Responsibilities:
+
+- define the document model and the command boundary
+- resolve snapshots into runtime renderable nodes
+- execute node commands
+- generate forward/inverse patches
+- maintain group/reparent/move geometry semantics
+
+Inputs/outputs:
+
+- Inputs: `CanvasWorkbenchSnapshot`, `CanvasCommand`
+- Outputs: `CanvasWorkbench`, patches, history-replayable results
+
+Dependency direction:
+
+- the upper store and UI depend on it
+- it has essentially no dependency on UI components
+
+Key state transitions/invariants:
+
+- snapshots must resolve to a stable runtime
+- group/ungroup/reparent/move must keep world-coordinate semantics correct
+- patches must be invertible
+- hierarchy sanitization must leave no invalid parent/child relations
+
+Strengths:
+
+- command semantics are centralized with clear boundaries
+- `resolveCanvasWorkbench` and `executeCanvasCommand` have clear responsibilities
+- patch replay and geometry rules have not leaked into the UI
+- best test coverage of any layer; high trust
+
+Issues:
+
+- a few couplings to upper-layer conventions remain, e.g. text normalization depends on the text toolchain
+- node reads still mostly go through array lookups and snapshot cloning, which may become a performance concern as data grows
+- there is no explicit use-case layer wrapping it yet, so upper layers often consume its fine-grained capabilities directly
+
+Score: `8.5/10`
+
+Refactor priority: `P3`
+
+Judgment:
+
+- This is the part most worth preserving.
+- Future refactoring should revolve around protecting this layer, not rewriting it.
+
+## 3. 
Persistence and Commit Queue
+
+Files in focus:
+
+- `src/stores/canvasStore.ts`
+- `src/features/canvas/store/canvasWorkbenchService.ts`
+- `src/features/canvas/store/canvasWorkbenchState.ts`
+- `src/features/canvas/store/canvasStoreTypes.ts`
+
+Responsibilities:
+
+- Zustand state
+- workbench lifecycle
+- initialization and user-switch reset
+- persistence and failure rollback
+- serialized commits per workbench
+- history
+- selection state
+- some node transformation and business policy
+
+Inputs/outputs:
+
+- Inputs: UI intents, document commands, DB read/write results, user reset signals
+- Outputs: subscribable canvas app state, plus every mutation API
+
+Dependency direction:
+
+- almost all canvas UI depends on it directly
+- it depends on the DB, the document kernel, auth, and the event system
+
+Key state transitions/invariants:
+
+- mutations for the same workbench must be serialized
+- lifecycle tasks must be discarded once their epoch is invalidated
+- on persistence failure, in-memory state alone must not change
+- undo/redo may commit to the store only after the patch persists successfully
+- after a current-user reset, results from old tasks must not land back in the new state
+
+Strengths:
+
+- `canvasStore.ts` is visibly thinner, keeping mainly subscribed state, UI setters, and explicit `...InWorkbench` store APIs
+- workbench lifecycle, serialized commits, persistence rollback, undo/redo, epoch invalidation, and compensation logic have moved down into `canvasWorkbenchService`
+- pure state transitions and history changes have moved down into `canvasWorkbenchState`, with much clearer boundaries than before
+- `patchWorkbench` now funnels top-level workbench updates; `CanvasAppBar` / `CanvasStoryPanel` / `CanvasWorkbenchPanel` no longer write back the whole workbench
+- the default active-workbench integration surface has shrunk further from a wide facade into read-only state, a command port, a structure-action port, and an independent history seam; upper layers no longer consume store capabilities through a single omnipotent facade by default
+- the null-safe binding contract has moved down into a pure helper; the hooks themselves keep only selector and adapter duties
+- the real issues caught in this round's review are fixed:
+  - delete/group are now computed inside the queue from the latest workbench snapshot
+  - reset now clears and gates pending compensation, so an old epoch cannot write debt back into a new session
+  - history availability is now subscribed through an independent state seam; stale `canUndo` / `canRedo` no longer appear
+  - the `activeWorkbenchId` subscriptions of commands / structure / history now use narrow selectors and are no longer re-rendered by the whole read model
+
+Issues:
+
+- this layer is still not a "pure UI store" but a "thin store shell + persistence/lifecycle application boundary"; only the default active-workbench integration surface has been narrowed
+- UI state such as `selectedElementIds`, `tool`, `viewport`, and `activePanel` still coexists with the document-mutation facade in the same Zustand store
+- some panel / viewport / text-session paths still depend directly on `useCanvasStore` or explicit `...InWorkbench` APIs, so upper-layer coupling pressure has only been reduced, not removed
+- node-adapter logic such as `toNode` and `cloneNodeTree` still lives inside the service and will keep growing if node use cases multiply
+
+Score: `8.5/10`
+
+Refactor priority: `P2 for this round is done; downgraded to P3`
+
+Judgment:
+
+- This layer has dropped from "largest structural bottleneck" to "meets the current-phase `8.5+` target but still needs back-flow protection".
+- There is no need to keep splitting it deeper right now; the next cut should shift to unifying the export entry.
+
+## 4. 
Runtime Preview Pipeline
+
+Files in focus:
+
+- `src/features/canvas/runtime/CanvasRuntimeProvider.tsx`
+- `src/features/canvas/runtime/canvasRuntimeScope.ts`
+- `src/features/canvas/runtime/canvasPreviewRuntimeController.ts`
+- `src/features/canvas/runtime/canvasPreviewRuntimeState.ts`
+- `src/features/canvas/runtime/canvasRuntimeHooks.ts`
+- `src/features/canvas/boardImageRendering.ts`
+
+Responsibilities:
+
+- draft adjustment state
+- image preview cache
+- interactive/background preview upgrades
+- selection preview
+- preview task queuing and eviction
+
+Inputs/outputs:
+
+- Inputs: element ids, draft adjustments, viewport scale, asset data
+- Outputs: `previewEntries`, selection preview ids
+
+Dependency direction:
+
+- the runtime scope depends on the `workbenchId`, `workbench`, and `viewportScale` passed explicitly into `CanvasRuntimeProvider`, plus the `assets:changed` event emitted by `assetStore`
+- `ImageElement`, `CanvasImageEditPanel`, and the selection hooks depend on scoped runtime hooks, not on a global runtime store
+
+Key state transitions/invariants:
+
+- interactive preview takes priority over background preview
+- after interaction ends, the preview must auto-upgrade to the settled/background version
+- on `workbenchId` switch, page unmount, or `currentUser:reset`, active requests, queued tasks, settle timers, backing canvases, draft adjustments, and selection previews must all be disposed
+- after a scope switch, results from old requests must not land in the new scope
+- cache eviction must not evict rendering/queued tasks
+- releasing a preview source must promptly clear the backing store
+
+Strengths:
+
+- runtime preview is now a scoped service under `CanvasPage`; its lifecycle is finally aligned with the workbench/page
+- `CanvasPreviewRuntimeController` keeps only task queuing, prioritization, persistence, eviction, and stale-result ignoring — much clearer responsibilities than before
+- runtime assets changed to per-asset snapshots + listeners inside the scope, so `ImageElement` no longer scans the whole asset table to read a single asset
+- the selection preview and the image-rendering controller share the scope, but their boundaries are now split apart instead of blending into one wide store
+
+Issues:
+
+- although this layer is now service-ized, it still relies on UI consumers honoring the request/release discipline; new entry points that bypass the hooks could reintroduce leaks
+- the preview runtime covers only edit-time previews, not the main export path; "edit preview" and "final export" remain two adjacent but un-unified pipelines
+- asset changes are now pushed incrementally via `assets:changed`, a better performance boundary, but the event contract must stay narrow and stable to avoid regrowing into a wide broadcast
+
+Score: `8.6/10`
+
+Refactor priority: `P1 for this round is done; downgraded to P2`
+
+Judgment:
+
+- This layer meets the current-phase `8.5+` target and can drop from "primary split target" to "a consolidated seam to defend against regression".
+
+## 5. 
Viewport and Interaction Engine
+
+Files in focus:
+
+- `src/features/canvas/CanvasViewport.tsx`
+- `src/features/canvas/CanvasViewportStageShell.tsx`
+- `src/features/canvas/CanvasViewportOverlayHost.tsx`
+- `src/features/canvas/hooks/useCanvasViewportLifecycle.ts`
+- `src/features/canvas/hooks/useCanvasViewportNavigation.ts`
+- `src/features/canvas/hooks/useCanvasViewportToolOrchestrator.ts`
+- `src/features/canvas/hooks/useCanvasMarqueeSelection.ts`
+- `src/features/canvas/tools/toolControllers.ts`
+- `src/features/canvas/hooks/useCanvasViewportOverlay.ts`
+- `src/features/canvas/hooks/useCanvasInteraction.ts`
+
+Responsibilities:
+
+- compose stage / overlay / controls
+- stage host
+- tool-controller routing and pointer-event orchestration
+- pan / zoom / fit-to-view / space-bar panning
+- marquee interaction and selection preview
+- selection overlay
+- text editor host
+- text toolbar host
+- slice visualization
+- bottom HUD
+
+Inputs/outputs:
+
+- Inputs: active workbench, tool, selection, zoom/viewport, runtime preview, text session state
+- Outputs: Konva stage behavior, overlay DOM, commits to the store
+
+Dependency direction:
+
+- the composition shell still depends directly on several store selectors and several canvas hooks
+- the stage / overlay / tool seams cooperate through explicit props and action ports, no longer through a wide context
+- element components in turn depend back on runtime/store
+
+Key state transitions/invariants:
+
+- marquee preview and the final selection commit must agree
+- panning must not conflict with tool state
+- the text editor overlay must track node transforms and viewport changes
+- fit-to-view initialization must run on first entry into a workbench
+
+Strengths:
+
+- `CanvasViewport` has further degenerated into a composition shell; stage rendering, tool/input orchestration, and the overlay DOM host are now layered apart
+- `CanvasViewportStageShell` gathers the Konva stage, the layer structure, and the slice/guide/marquee visualization into a single standalone seam
+- `useCanvasViewportToolOrchestrator` funnels tool input through a grouped `CanvasToolActionPort`, so `toolControllers.ts` no longer depends on a "can do anything" wide context
+- `CanvasViewportOverlayHost` has taken over the text editor / toolbar / dimensions badge, outside-click commit, and global Escape cancel
+- `useCanvasTextSession` no longer holds DOM refs or document/window listeners
+- `useCanvasViewportLifecycle` has pulled the stage registry, container measurement, first fit-to-view, and space-key state out of navigation control
+- `useCanvasMarqueeSelection` gathers marquee preview / commit / runtime-preview sync into one explicit seam
+- viewport math has been extracted into a pure-function module; `viewportNavigation.test.ts` now locks down 
the key semantics of fit-to-view and zoom anchoring
+- `toolControllers.test.ts` can still directly lock down the key behaviors of select / hand / text / shape without regressing through `CanvasViewport`
+
+Issues:
+
+- `CanvasViewport`, though slimmer, is still an application-layer composition hub that directly pulls quite a lot of store state and cross-seam view models
+- `useCanvasMarqueeSelection`, though now an independent hook, still depends on Konva node `getClientRect()` and the runtime store preview; it is likewise not yet a pure selection policy
+- `useCanvasViewportOverlay` still depends on stage measurement and DOM geometry calculations; overlay layout is not yet a fully swappable pure layout layer
+- `useCanvasTextSession` still carries cross-workbench migration, persist/rollback, and selection-sync responsibilities; the text session has only left the DOM host, not the application-layer state machine
+- `useCanvasViewportNavigation` has been narrowed considerably but still owns all pan / zoom / pointer-transform control logic; if gesture semantics keep piling up, it will fatten again
+
+Score: `8.6/10`
+
+Refactor priority: `P1 for this round is done; downgraded to P2`
+
+Judgment:
+
+- This layer has met this round's target and can drop from "primary split target" to "a consolidated seam to defend against regression".
+- The future focus should not be piling features back into `CanvasViewport`, but keeping it a composition shell and continuing to push higher-level recovery strategy and the text-session boundary outward.
+
+## 6. Text Editing State Machine
+
+Files in focus:
+
+- `src/features/canvas/hooks/useCanvasTextSession.ts`
+- `src/features/canvas/textSession.ts`
+- `src/features/canvas/textSessionState.ts`
+- `src/features/canvas/textSessionRunner.ts`
+- `src/features/canvas/textRuntimeViewModel.ts`
+- `src/features/canvas/textMutationQueue.ts`
+
+Responsibilities:
+
+- text creation
+- text commit/cancel
+- keeping or reclaiming the session on workbench switch
+- ordering control for text drafts and persistence
+- resolving the text toolbar / editor / overlay view models
+
+Inputs/outputs:
+
+- Inputs: active workbench, available workbench ids, selected ids, text input events, persistence-complete callbacks
+- Outputs: text session snapshot, effect intents, commit commands, view models
+
+Dependency direction:
+
+- pure decision logic and state transitions are split into `textSession.ts` + `textSessionState.ts`
+- effect execution is funneled into `textSessionRunner.ts`
+- the actual hook adapter layer still depends on the viewport host and store mutation APIs, but no longer directly on DOM refs or global listeners
+
+Key state transitions/invariants:
+
+- create mode materializes a node only when non-empty content first appears
+- cancel behaves differently on the un-materialized and materialized paths
+- workbench switches must distinguish `noop`, `wait`, `persist-source`, and `reset`
+- text writes must be serialized
+- late source-persist results must not pollute a new session
+
+Strengths:
+
+- the state machine is now explicitly consolidated into a reducer + effect runner; key transitions no longer hang mostly on ref + effect wiring
+- `useCanvasTextSession` has shrunk to a thin adapter exposing a stable `session + actions` upstream
+- session tokens / transition tokens turn "ignore late async results" from an implicit convention into an explicit mechanism
+- the text mutation queue 
still retains its serial-write semantics but no longer owns state decisions
+- `textRuntimeViewModel` now consumes the unified session snapshot; the viewport side no longer assembles loose `editingText*` fields
+- pure tests and an `agent-browser` smoke test were added this round, making the regression surface more controllable than last round
+
+Issues:
+
+- the session seam, though now explicit, still hangs off the viewport composition layer as a hook; it is not a fully independent application service
+- the cross-workbench persist effect port still connects directly to the `useCanvasStore` mutation APIs; it has not yet been narrowed into a use-case service
+- text property edits are still split across two entries, the inline text session and the properties panel; future work must prevent them from re-forking
+- adding rich text, inline styles, or multi-node text editing later must extend interfaces on top of the current reducer/effect seam, rather than stuffing the state machine back into the hook
+
+Score: `8.6/10`
+
+Refactor priority: `P1 for this round is done; downgraded to P2`
+
+Judgment:
+
+- This part meets the current-phase `8.5+` target and can drop from "primary split target" to "a consolidated seam to defend against regression".
+- The future focus is no longer re-splitting the text session itself, but preventing new text-property entries, page recovery strategies, or runtime dependencies from bypassing the existing seam.
+
+## 7. Panel Layer
+
+Files in focus:
+
+- `src/features/canvas/CanvasLayerPanel.tsx`
+- `src/features/canvas/CanvasPropertiesPanel.tsx`
+- `src/features/canvas/CanvasImageEditPanel.tsx`
+- `src/features/canvas/CanvasStoryPanel.tsx`
+- `src/features/canvas/CanvasWorkbenchPanel.tsx`
+- `src/features/canvas/CanvasAppBar.tsx`
+- `src/features/canvas/CanvasExportDialog.tsx`
+- `src/features/canvas/hooks/useCanvasImagePropertyActions.ts`
+- `src/features/canvas/hooks/useCanvasWorkbenchActions.ts`
+- `src/features/canvas/hooks/useCanvasStoryPanelModel.ts`
+- `src/features/canvas/hooks/useCanvasLayerPanelModel.ts`
+- `src/features/canvas/hooks/useCanvasPropertiesPanelModel.ts`
+- `src/features/canvas/imagePropertyState.ts`
+- `src/features/canvas/workbenchPanelState.ts`
+- `src/features/canvas/storyPanelState.ts`
+- `src/features/canvas/layerPanelState.ts`
+- `src/features/canvas/propertyPanelState.ts`
+
+Responsibilities:
+
+- layer-management UI
+- property-editing UI
+- story slices and guide/safe-area UI
+- workbench-management UI
+- translating panel intents into `patchWorkbench`, `executeCommandInWorkbench`, `reorderElements`, `reparentNodes`, `createWorkbench`, `deleteWorkbench`
+
+Inputs/outputs:
+
+- Inputs: store state, the selection model, the active workbench, the `selectedSliceId` passed down from the page layer
+- Outputs: model hooks uniformly call store mutation APIs; pure planners emit patches, commands, and layer drop plans
+
+Dependency direction:
+
+- panel components depend on panel model hooks
+- model 
hooks depend on `useCanvasStore`, selection hooks, and router/page props
+- pure planner/state modules depend only on types, slice helpers, preset helpers, and the document graph
+- the UI no longer assembles domain patch / command details directly
+
+Key state transitions/invariants:
+
+- layer drag-reordering keeps parent / sibling order correct
+- a group cannot be dragged into its own descendants
+- slice changes always keep the selected slice valid
+- property edits may only emit `UPDATE_NODE_PROPS` that are legal for the node type
+
+Strengths:
+
+- the panel layer has shrunk from "view + domain rules" to "view + intent"
+- `CanvasWorkbenchPanel` and `CanvasAppBar` now share the same workbench action seam
+- `CanvasStoryPanel`'s preset / slice / guide / safe-area planning is centralized in `storyPanelState.ts`
+- `useCanvasStoryPanelModel` now only consumes the `selectedSliceId` already normalized by the page layer; it no longer owns a second copy of slice-validity normalization
+- `CanvasLayerPanel`'s reorder / reparent decisions are centralized in `layerPanelState.ts`
+- `CanvasPropertiesPanel` has moved to "field intent -> UPDATE_NODE_PROPS"
+- `CanvasImageEditPanel`'s committed adjustments / film profile now funnel through `useCanvasImagePropertyActions` instead of writing whole nodes via `upsertElement`
+- `imagePropertyState.ts` puts image color grading truly onto the `APPLY_IMAGE_ADJUSTMENTS` command boundary; the Edit dock and the Inspector finally share a single image property seam
+- panel-level rules can now regress independently through 4 pure planner test files
+
+Issues:
+
+- `CanvasExportDialog` has not yet been folded into this seam
+- this layer still depends on `useCanvasStore` directly through its model hooks; it is not a fully independent use-case layer
+- a dual-planner structure of `propertyPanelState.ts` + `imagePropertyState.ts` has formed; the boundary is clearer than before, but it has not yet been unified into a higher-level property use-case service
+- any new image-property entry added on the viewport side must reuse the existing image/property seam, or it will fork again
+
+Score: `8.7/10`
+
+Refactor priority: `P1 for this round is done; downgraded to P2`
+
+Judgment:
+
+- This layer is no longer the most dangerous structural hotspot.
+- The future focus is not deeper panel splits, but preventing new property and export entries from bypassing the existing seams.
+
+## 8. 
Rendering and Export
+
+Files in focus:
+
+- `src/features/canvas/renderCanvasDocument.ts`
+- `src/features/canvas/hooks/useCanvasExport.ts`
+
+Responsibilities:
+
+- export rendering with `CanvasWorkbench` as input
+- final compositing of images, text, and shapes
+- slice cropping
+- the export download flow
+
+Inputs/outputs:
+
+- Inputs: `CanvasWorkbench`, the asset list, export size parameters
+- Outputs: the export canvas, data URLs, slice images
+
+Dependency direction:
+
+- export rendering depends on the editor render pipeline and image post-processing
+- the export UI depends on the export hook
+
+Key state transitions/invariants:
+
+- export must be driven by document data, not by the editing overlay
+- export must correctly skip hidden group descendants
+- slice cropping must use the same scale as the final output
+
+Strengths:
+
+- the main export path is already based on the document model wherever possible, rather than screenshotting the editing view
+- `renderCanvasWorkbenchToCanvas` has fairly focused responsibilities
+- slice-cropping logic is a standalone function
+
+Issues:
+
+- `useCanvasExport` still keeps a stage-based snapshot path
+- edit-state overlay hiding still lives in UI-side export utilities, which shows "document export" and "view export" are not fully unified
+- image export still reaches across into the editor render pipeline; technically reasonable, but a heavy boundary
+
+Score: `7/10`
+
+Refactor priority: `P2`
+
+Judgment:
+
+- This layer is broadly healthy, but the export entry can be unified further.
+
+## Core Evidence
+
+This round's judgment is based mainly on the following code and tests:
+
+- Page assembly: `src/pages/canvas.tsx`, `src/features/canvas/hooks/useCanvasPageModel.ts`, `src/features/canvas/canvasPageState.ts`
+- Document kernel: `src/features/canvas/document/*`
+- Persistence and commit queue: `src/stores/canvasStore.ts`, `src/features/canvas/store/*`
+- Runtime preview: `src/features/canvas/runtime/*`
+- Viewport and interaction: `src/features/canvas/CanvasViewport.tsx`, `src/features/canvas/CanvasViewportStageShell.tsx`, `src/features/canvas/CanvasViewportOverlayHost.tsx`, `src/features/canvas/hooks/useCanvasViewportLifecycle.ts`, `src/features/canvas/hooks/useCanvasViewportNavigation.ts`, `src/features/canvas/hooks/useCanvasViewportToolOrchestrator.ts`, `src/features/canvas/hooks/useCanvasMarqueeSelection.ts`
+- Text session: `src/features/canvas/hooks/useCanvasTextSession.ts`
+- Panel layer: `src/features/canvas/CanvasLayerPanel.tsx`, `CanvasStoryPanel.tsx`, `CanvasPropertiesPanel.tsx`
+- Rendering and export: `src/features/canvas/renderCanvasDocument.ts`, `src/features/canvas/hooks/useCanvasExport.ts`
+
+Key test coverage:
+
+- `src/stores/canvasStore.test.ts`
+- `src/features/canvas/store/canvasWorkbenchState.test.ts`
+- `src/features/canvas/runtime/canvasPreviewRuntimeState.test.ts`
+- `src/features/canvas/document/commands.test.ts`
+- 
`src/features/canvas/document/resolve.test.ts`
+- `src/features/canvas/renderCanvasDocument.test.ts`
+- `src/features/canvas/textSession.test.ts`
+- `src/features/canvas/canvasPageState.test.ts`
+- `src/features/canvas/tools/toolControllers.test.ts`
+- `src/features/canvas/viewportOverlay.test.ts`
+- `src/features/canvas/viewportNavigation.test.ts`
+
+## Refactoring Priority Ranking
+
+### P1
+
+- Unify the main export path, reduce the `stage snapshot` branch, and funnel `CanvasExportDialog` into a clearer export seam
+
+### P2
+
+- Protect the new image property seam of `CanvasImageEditPanel` / `CanvasPropertiesPanel`; do not let new image-property entry points bypass it again
+- Protect the new `CanvasPage` shell + `useCanvasPageModel` + `canvasPageState` seam; do not let new recovery strategies or URL state pile back into the page shell
+- Protect the runtime preview's new scoped service seam; do not let new image-editing or selection logic bypass the provider / narrow hooks again
+- Protect the text session's new reducer / effect runner / snapshot seam; do not let new text capabilities pile back into the hook
+- Protect the active-workbench's new state / command / structure / history seam; do not let new default integration points grow back into an all-purpose facade
+- Protect `CanvasViewport`'s new stage / tool / overlay seam; do not let new interaction strategies pile back into the composition shell
+
+### P3
+
+- Protect the document kernel; no ground-up rewrite
+
+## Recommended Split Order
+
+1. First unify the main export path, reducing the `stage snapshot` bypass and funneling `CanvasExportDialog`
+2. Meanwhile protect the new image property seam of `CanvasImageEditPanel` / `CanvasPropertiesPanel`; do not let new entry points flow back to whole-node writes
+3. Then keep protecting the new page entry / route recovery, runtime preview, text session, and active-workbench seams, so complexity does not flow back
+4. 
Continue protective evolution of the document kernel only; no ground-up rewrite
+
+## Final Verdict
+
+After this round, all three cuts have landed — the page entry / route recovery, the unification of the image-editing / image-property entry points, and the tightening of the active-workbench boundary — and the most urgent structural risk has further contracted to the export entry points.
+
+- If features keep piling on, complexity will concentrate first in the export-related entry points, rather than once again crushing the page entry, runtime preview, text sessions, `CanvasViewport`, the panel layer, the image-property entry points, or the active-workbench seam first
+- The panel layer can now carry the next round of splitting, but it is not yet time to declare "the whole canvas done"
+- The page layer, runtime preview, text sessions, `CanvasViewport`, and active-workbench layers can carry upcoming features, provided new recovery strategies, export bypasses, or tool strategies are not stuffed back into these seams
+- The healthy chain that is the document kernel should keep being protected; subsequent refactoring should still revolve around the upper application boundaries
diff --git a/server/package.json b/server/package.json
index a9e9a86..c9c99b2 100644
--- a/server/package.json
+++ b/server/package.json
@@ -12,8 +12,10 @@
   "dependencies": {
     "@fastify/cors": "^11.0.1",
     "@fastify/rate-limit": "^10.3.0",
+    "@supabase/supabase-js": "^2.100.0",
     "dotenv": "^17.2.3",
     "fastify": "^5.6.1",
+    "image-size": "^2.0.2",
     "pg": "^8.20.0",
     "zod": "^4.3.6"
   },
diff --git a/server/src/assets/capability.ts b/server/src/assets/capability.ts
new file mode 100644
index 0000000..5d140ce
--- /dev/null
+++ b/server/src/assets/capability.ts
@@ -0,0 +1,56 @@
+import { createHmac, timingSafeEqual } from "node:crypto";
+import type { AssetFileKind } from "./types";
+
+const toBase64Url = (value: Buffer | string) =>
+  Buffer.from(value)
+    .toString("base64")
+    .replace(/\+/g, "-")
+    .replace(/\//g, "_")
+    .replace(/=+$/g, "");
+
+const fromBase64Url = (value: string) =>
+  Buffer.from(
+    value
+      .replace(/-/g, "+")
+      .replace(/_/g, "/")
+      .padEnd(Math.ceil(value.length / 4) * 4, "="),
+    "base64"
+  );
+
+const sign = (secret: string, userId: string, assetId: string, kind: AssetFileKind) =>
+  createHmac("sha256", secret).update(`${userId}:${assetId}:${kind}`).digest();
+
+export const createAssetCapabilityToken = (input: {
+  secret: string;
+  userId: string;
+  assetId: string;
+  kind: AssetFileKind;
+}) =>
+  toBase64Url(
+    `${input.userId}:${toBase64Url(sign(input.secret, input.userId, input.assetId, input.kind))}`
+  );
+
+export const verifyAssetCapabilityToken = (input: {
+  secret: string;
+  token: string;
+  assetId: string;
+  
kind: AssetFileKind;
+}) => {
+  try {
+    const decoded = fromBase64Url(input.token).toString("utf8");
+    const separatorIndex = decoded.indexOf(":");
+    if (separatorIndex <= 0) {
+      return null;
+    }
+
+    const userId = decoded.slice(0, separatorIndex).trim();
+    const signature = fromBase64Url(decoded.slice(separatorIndex + 1).trim());
+    const expected = sign(input.secret, userId, input.assetId, input.kind);
+    if (signature.length !== expected.length) {
+      return null;
+    }
+    return timingSafeEqual(signature, expected) ? userId : null;
+  } catch {
+    return null;
+  }
+};
diff --git a/server/src/assets/repository.ts b/server/src/assets/repository.ts
new file mode 100644
index 0000000..f7df9c4
--- /dev/null
+++ b/server/src/assets/repository.ts
@@ -0,0 +1,702 @@
+import { Pool } from "pg";
+import { createId } from "../../../shared/createId";
+import type {
+  AssetChangeRecord,
+  AssetEdgeInsert,
+  AssetFileKind,
+  AssetFileRecord,
+  AssetRecord,
+  AssetRepository,
+  AssetUploadSession,
+} from "./types";
+
+const parseJsonObject = (value: unknown): Record<string, unknown> =>
+  value && typeof value === "object" && !Array.isArray(value)
+    ? (value as Record<string, unknown>)
+    : {};
+
+const parseJsonStringArray = (value: unknown): string[] =>
+  Array.isArray(value)
+    ? value.filter((entry): entry is string => typeof entry === "string" && entry.trim().length > 0)
+    : [];
+
+const rowToFile = (row: Record<string, unknown>): AssetFileRecord => ({
+  assetId: String(row.file_asset_id),
+  kind: String(row.file_kind) as AssetFileKind,
+  bucket: String(row.file_bucket),
+  path: String(row.file_storage_path),
+  mimeType: String(row.file_mime_type),
+  sizeBytes: Number(row.file_size_bytes),
+  width: row.file_width == null ? null : Number(row.file_width),
+  height: row.file_height == null ? 
null : Number(row.file_height),
+  createdAt: String(row.file_created_at),
+  updatedAt: String(row.file_updated_at),
+});
+
+const aggregateAssets = (rows: Array<Record<string, unknown>>): AssetRecord[] => {
+  const assets = new Map<string, AssetRecord>();
+  for (const row of rows) {
+    const assetId = String(row.id);
+    let asset = assets.get(assetId);
+    if (!asset) {
+      asset = {
+        id: assetId,
+        ownerUserId: String(row.owner_user_id),
+        name: String(row.name),
+        mimeType: String(row.mime_type),
+        sizeBytes: Number(row.size_bytes),
+        source: String(row.source) as AssetRecord["source"],
+        origin: String(row.origin) as AssetRecord["origin"],
+        contentHash: String(row.content_hash),
+        tags: parseJsonStringArray(row.tags),
+        metadata: parseJsonObject(row.metadata),
+        createdAt: String(row.created_at),
+        updatedAt: String(row.updated_at),
+        deletedAt: row.deleted_at ? String(row.deleted_at) : null,
+        files: [],
+      };
+      assets.set(assetId, asset);
+    }
+    if (row.file_kind) {
+      asset.files.push(rowToFile(row));
+    }
+  }
+  return [...assets.values()];
+};
+
+class PostgresAssetRepository implements AssetRepository {
+  private initPromise: Promise<void> | null = null;
+
+  constructor(private readonly pool: Pool, private readonly bucket: string) {}
+
+  private async ensureReady() {
+    if (!this.initPromise) {
+      this.initPromise = this.pool.query(`
+        CREATE TABLE IF NOT EXISTS asset_upload_sessions (
+          asset_id TEXT PRIMARY KEY,
+          owner_user_id TEXT NOT NULL,
+          name TEXT NOT NULL,
+          mime_type TEXT NOT NULL,
+          size_bytes BIGINT NOT NULL,
+          source TEXT NOT NULL,
+          origin TEXT NOT NULL,
+          content_hash TEXT NOT NULL,
+          tags JSONB NOT NULL DEFAULT '[]'::jsonb,
+          metadata JSONB NOT NULL DEFAULT '{}'::jsonb,
+          created_at TIMESTAMPTZ NOT NULL,
+          updated_at TIMESTAMPTZ NOT NULL,
+          original_path TEXT NOT NULL,
+          thumbnail_path TEXT NULL,
+          original_uploaded_at TIMESTAMPTZ NULL,
+          thumbnail_uploaded_at TIMESTAMPTZ NULL
+        );
+
+        CREATE TABLE IF NOT EXISTS assets (
+          id TEXT PRIMARY KEY,
+          owner_user_id TEXT NOT NULL,
+          name TEXT NOT NULL,
+          
mime_type TEXT NOT NULL, + size_bytes BIGINT NOT NULL, + source TEXT NOT NULL, + origin TEXT NOT NULL, + content_hash TEXT NOT NULL, + tags JSONB NOT NULL DEFAULT '[]'::jsonb, + metadata JSONB NOT NULL DEFAULT '{}'::jsonb, + created_at TIMESTAMPTZ NOT NULL, + updated_at TIMESTAMPTZ NOT NULL, + deleted_at TIMESTAMPTZ NULL + ); + DROP INDEX IF EXISTS assets_owner_hash_active_idx; + CREATE INDEX IF NOT EXISTS assets_owner_hash_active_idx + ON assets(owner_user_id, content_hash) + WHERE deleted_at IS NULL; + CREATE INDEX IF NOT EXISTS assets_owner_updated_active_idx + ON assets(owner_user_id, updated_at DESC) + WHERE deleted_at IS NULL; + + CREATE TABLE IF NOT EXISTS asset_files ( + id TEXT PRIMARY KEY, + asset_id TEXT NOT NULL REFERENCES assets(id) ON DELETE CASCADE, + kind TEXT NOT NULL, + bucket TEXT NOT NULL, + storage_path TEXT NOT NULL, + mime_type TEXT NOT NULL, + size_bytes BIGINT NOT NULL, + width INTEGER NULL, + height INTEGER NULL, + created_at TIMESTAMPTZ NOT NULL, + updated_at TIMESTAMPTZ NOT NULL, + UNIQUE (asset_id, kind) + ); + CREATE INDEX IF NOT EXISTS asset_files_asset_kind_idx + ON asset_files(asset_id, kind); + + CREATE TABLE IF NOT EXISTS asset_edges ( + id TEXT PRIMARY KEY, + source_asset_id TEXT NOT NULL REFERENCES assets(id) ON DELETE CASCADE, + target_asset_id TEXT NOT NULL REFERENCES assets(id) ON DELETE CASCADE, + edge_type TEXT NOT NULL, + conversation_id TEXT NULL, + run_id TEXT NULL, + created_at TIMESTAMPTZ NOT NULL + ); + CREATE INDEX IF NOT EXISTS asset_edges_source_idx + ON asset_edges(source_asset_id, created_at DESC); + CREATE INDEX IF NOT EXISTS asset_edges_target_idx + ON asset_edges(target_asset_id, created_at DESC); + `).then(() => undefined); + } + await this.initPromise; + } + + private async loadAssets(whereSql: string, params: unknown[]) { + await this.ensureReady(); + const result = await this.pool.query( + ` + SELECT + assets.id, + assets.owner_user_id, + assets.name, + assets.mime_type, + assets.size_bytes, + 
assets.source, + assets.origin, + assets.content_hash, + assets.tags, + assets.metadata, + assets.created_at, + assets.updated_at, + assets.deleted_at, + asset_files.asset_id AS file_asset_id, + asset_files.kind AS file_kind, + asset_files.bucket AS file_bucket, + asset_files.storage_path AS file_storage_path, + asset_files.mime_type AS file_mime_type, + asset_files.size_bytes AS file_size_bytes, + asset_files.width AS file_width, + asset_files.height AS file_height, + asset_files.created_at AS file_created_at, + asset_files.updated_at AS file_updated_at + FROM assets + LEFT JOIN asset_files + ON asset_files.asset_id = assets.id + ${whereSql} + ORDER BY assets.updated_at DESC, asset_files.kind ASC + `, + params + ); + return aggregateAssets(result.rows); + } + + async close() {} + + async findAssetById(userId: string, assetId: string) { + const assets = await this.loadAssets( + "WHERE assets.owner_user_id = $1 AND assets.id = $2 AND assets.deleted_at IS NULL", + [userId, assetId] + ); + return assets[0] ?? null; + } + + async findAssetByContentHash(userId: string, contentHash: string) { + const assets = await this.loadAssets( + "WHERE assets.owner_user_id = $1 AND assets.content_hash = $2 AND assets.deleted_at IS NULL", + [userId, contentHash] + ); + return assets[0] ?? null; + } + + async listAssetChanges(userId: string, since?: string) { + await this.ensureReady(); + const result = await this.pool.query( + ` + SELECT id AS asset_id, content_hash, updated_at, deleted_at + FROM assets + WHERE owner_user_id = $1 + AND ($2::timestamptz IS NULL OR updated_at > $2::timestamptz) + ORDER BY updated_at DESC + `, + [userId, since ?? null] + ); + return result.rows.map( + (row): AssetChangeRecord => ({ + assetId: String(row.asset_id), + contentHash: String(row.content_hash), + updatedAt: String(row.updated_at), + ...(row.deleted_at ? 
{ deletedAt: String(row.deleted_at) } : {}), + }) + ); + } + + async createUploadSession(session: AssetUploadSession) { + await this.ensureReady(); + await this.pool.query( + ` + INSERT INTO asset_upload_sessions ( + asset_id, + owner_user_id, + name, + mime_type, + size_bytes, + source, + origin, + content_hash, + tags, + metadata, + created_at, + updated_at, + original_path, + thumbnail_path, + original_uploaded_at, + thumbnail_uploaded_at + ) VALUES ( + $1, $2, $3, $4, $5, $6, $7, $8, $9::jsonb, $10::jsonb, $11, $12, $13, $14, $15, $16 + ) + ON CONFLICT (asset_id) DO UPDATE SET + owner_user_id = EXCLUDED.owner_user_id, + name = EXCLUDED.name, + mime_type = EXCLUDED.mime_type, + size_bytes = EXCLUDED.size_bytes, + source = EXCLUDED.source, + origin = EXCLUDED.origin, + content_hash = EXCLUDED.content_hash, + tags = EXCLUDED.tags, + metadata = EXCLUDED.metadata, + created_at = EXCLUDED.created_at, + updated_at = EXCLUDED.updated_at, + original_path = EXCLUDED.original_path, + thumbnail_path = EXCLUDED.thumbnail_path, + original_uploaded_at = EXCLUDED.original_uploaded_at, + thumbnail_uploaded_at = EXCLUDED.thumbnail_uploaded_at + `, + [ + session.assetId, + session.ownerUserId, + session.name, + session.mimeType, + session.sizeBytes, + session.source, + session.origin, + session.contentHash, + JSON.stringify(session.tags), + JSON.stringify(session.metadata), + session.createdAt, + session.updatedAt, + session.originalPath, + session.thumbnailPath, + session.originalUploadedAt, + session.thumbnailUploadedAt, + ] + ); + } + + async getUploadSession(userId: string, assetId: string) { + await this.ensureReady(); + const result = await this.pool.query( + "SELECT * FROM asset_upload_sessions WHERE owner_user_id = $1 AND asset_id = $2", + [userId, assetId] + ); + const row = result.rows[0]; + if (!row) { + return null; + } + return { + assetId: String(row.asset_id), + ownerUserId: String(row.owner_user_id), + name: String(row.name), + mimeType: String(row.mime_type), + 
sizeBytes: Number(row.size_bytes), + source: String(row.source) as AssetUploadSession["source"], + origin: String(row.origin) as AssetUploadSession["origin"], + contentHash: String(row.content_hash), + tags: parseJsonStringArray(row.tags), + metadata: parseJsonObject(row.metadata), + createdAt: String(row.created_at), + updatedAt: String(row.updated_at), + originalPath: String(row.original_path), + thumbnailPath: row.thumbnail_path ? String(row.thumbnail_path) : null, + originalUploadedAt: row.original_uploaded_at ? String(row.original_uploaded_at) : null, + thumbnailUploadedAt: row.thumbnail_uploaded_at ? String(row.thumbnail_uploaded_at) : null, + }; + } + + async findUploadSessionByContentHash(userId: string, contentHash: string) { + await this.ensureReady(); + const result = await this.pool.query( + ` + SELECT * + FROM asset_upload_sessions + WHERE owner_user_id = $1 AND content_hash = $2 + ORDER BY updated_at DESC + LIMIT 1 + `, + [userId, contentHash] + ); + const row = result.rows[0]; + if (!row) { + return null; + } + return { + assetId: String(row.asset_id), + ownerUserId: String(row.owner_user_id), + name: String(row.name), + mimeType: String(row.mime_type), + sizeBytes: Number(row.size_bytes), + source: String(row.source) as AssetUploadSession["source"], + origin: String(row.origin) as AssetUploadSession["origin"], + contentHash: String(row.content_hash), + tags: parseJsonStringArray(row.tags), + metadata: parseJsonObject(row.metadata), + createdAt: String(row.created_at), + updatedAt: String(row.updated_at), + originalPath: String(row.original_path), + thumbnailPath: row.thumbnail_path ? String(row.thumbnail_path) : null, + originalUploadedAt: row.original_uploaded_at ? String(row.original_uploaded_at) : null, + thumbnailUploadedAt: row.thumbnail_uploaded_at ? 
String(row.thumbnail_uploaded_at) : null,
+    };
+  }
+
+  async markUploadSessionUploaded(
+    userId: string,
+    assetId: string,
+    kind: AssetFileKind,
+    uploadedAt: string
+  ) {
+    await this.ensureReady();
+    const column = kind === "thumbnail" ? "thumbnail_uploaded_at" : "original_uploaded_at";
+    await this.pool.query(
+      `
+        UPDATE asset_upload_sessions
+        SET ${column} = $3, updated_at = $3
+        WHERE owner_user_id = $1 AND asset_id = $2
+      `,
+      [userId, assetId, uploadedAt]
+    );
+  }
+
+  async deleteUploadSession(userId: string, assetId: string) {
+    await this.ensureReady();
+    await this.pool.query(
+      "DELETE FROM asset_upload_sessions WHERE owner_user_id = $1 AND asset_id = $2",
+      [userId, assetId]
+    );
+  }
+
+  async withContentHashLock<T>(
+    userId: string,
+    contentHash: string,
+    operation: () => Promise<T>
+  ) {
+    await this.ensureReady();
+    const client = await this.pool.connect();
+    try {
+      await client.query(
+        "SELECT pg_advisory_lock(hashtext($1), hashtext($2))",
+        [userId, contentHash]
+      );
+      return await operation();
+    } finally {
+      try {
+        await client.query(
+          "SELECT pg_advisory_unlock(hashtext($1), hashtext($2))",
+          [userId, contentHash]
+        );
+      } finally {
+        client.release();
+      }
+    }
+  }
+
+  async saveAsset(record: AssetRecord) {
+    await this.ensureReady();
+    const client = await this.pool.connect();
+    try {
+      await client.query("BEGIN");
+      await client.query(
+        `
+          INSERT INTO assets (
+            id,
+            owner_user_id,
+            name,
+            mime_type,
+            size_bytes,
+            source,
+            origin,
+            content_hash,
+            tags,
+            metadata,
+            created_at,
+            updated_at,
+            deleted_at
+          ) VALUES (
+            $1, $2, $3, $4, $5, $6, $7, $8, $9::jsonb, $10::jsonb, $11, $12, $13
+          )
+          ON CONFLICT (id) DO UPDATE SET
+            owner_user_id = EXCLUDED.owner_user_id,
+            name = EXCLUDED.name,
+            mime_type = EXCLUDED.mime_type,
+            size_bytes = EXCLUDED.size_bytes,
+            source = EXCLUDED.source,
+            origin = EXCLUDED.origin,
+            content_hash = EXCLUDED.content_hash,
+            tags = EXCLUDED.tags,
+            metadata = EXCLUDED.metadata,
+            updated_at = 
EXCLUDED.updated_at, + deleted_at = EXCLUDED.deleted_at + `, + [ + record.id, + record.ownerUserId, + record.name, + record.mimeType, + record.sizeBytes, + record.source, + record.origin, + record.contentHash, + JSON.stringify(record.tags), + JSON.stringify(record.metadata), + record.createdAt, + record.updatedAt, + record.deletedAt, + ] + ); + + await client.query("DELETE FROM asset_files WHERE asset_id = $1", [record.id]); + + for (const file of record.files) { + await client.query( + ` + INSERT INTO asset_files ( + id, + asset_id, + kind, + bucket, + storage_path, + mime_type, + size_bytes, + width, + height, + created_at, + updated_at + ) VALUES ( + $1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11 + ) + `, + [ + createId("asset-file"), + file.assetId, + file.kind, + file.bucket, + file.path, + file.mimeType, + file.sizeBytes, + file.width, + file.height, + file.createdAt, + file.updatedAt, + ] + ); + } + + await client.query("COMMIT"); + return record; + } catch (error) { + await client.query("ROLLBACK"); + throw error; + } finally { + client.release(); + } + } + + async createAssetEdges(edges: AssetEdgeInsert[]) { + await this.ensureReady(); + if (edges.length === 0) { + return; + } + + const client = await this.pool.connect(); + try { + await client.query("BEGIN"); + for (const edge of edges) { + await client.query( + ` + INSERT INTO asset_edges ( + id, + source_asset_id, + target_asset_id, + edge_type, + conversation_id, + run_id, + created_at + ) VALUES ($1, $2, $3, $4, $5, $6, $7) + `, + [ + edge.id, + edge.sourceAssetId, + edge.targetAssetId, + edge.edgeType, + edge.conversationId ?? null, + edge.runId ?? 
null,
+            edge.createdAt,
+          ]
+        );
+      }
+      await client.query("COMMIT");
+    } catch (error) {
+      await client.query("ROLLBACK");
+      throw error;
+    } finally {
+      client.release();
+    }
+  }
+
+  async deleteAssetEdges(edgeIds: string[]) {
+    await this.ensureReady();
+    if (edgeIds.length === 0) {
+      return;
+    }
+
+    await this.pool.query(
+      `
+        DELETE FROM asset_edges
+        WHERE id = ANY($1::text[])
+      `,
+      [edgeIds]
+    );
+  }
+
+  async softDeleteAsset(userId: string, assetId: string, deletedAt: string) {
+    await this.ensureReady();
+    await this.pool.query(
+      `
+        UPDATE assets
+        SET deleted_at = $3, updated_at = $3
+        WHERE owner_user_id = $1 AND id = $2 AND deleted_at IS NULL
+      `,
+      [userId, assetId, deletedAt]
+    );
+  }
+}
+
+class MemoryAssetRepository implements AssetRepository {
+  private readonly assets = new Map<string, AssetRecord>();
+  private readonly uploadSessions = new Map<string, AssetUploadSession>();
+  private readonly contentHashLocks = new Map<string, Promise<void>>();
+
+  async close() {}
+
+  async findAssetById(userId: string, assetId: string) {
+    const asset = this.assets.get(assetId);
+    return asset && asset.ownerUserId === userId && !asset.deletedAt ? structuredClone(asset) : null;
+  }
+
+  async findAssetByContentHash(userId: string, contentHash: string) {
+    for (const asset of this.assets.values()) {
+      if (asset.ownerUserId === userId && asset.contentHash === contentHash && !asset.deletedAt) {
+        return structuredClone(asset);
+      }
+    }
+    return null;
+  }
+
+  async listAssetChanges(userId: string, since?: string) {
+    return [...this.assets.values()]
+      .filter(
+        (asset) =>
+          asset.ownerUserId === userId &&
+          (!since || Date.parse(asset.updatedAt) > Date.parse(since))
+      )
+      .sort((left, right) => Date.parse(right.updatedAt) - Date.parse(left.updatedAt))
+      .map((asset) => ({
+        assetId: asset.id,
+        contentHash: asset.contentHash,
+        updatedAt: asset.updatedAt,
+        ...(asset.deletedAt ? 
{ deletedAt: asset.deletedAt } : {}),
+      }));
+  }
+
+  async createUploadSession(session: AssetUploadSession) {
+    this.uploadSessions.set(session.assetId, structuredClone(session));
+  }
+
+  async getUploadSession(userId: string, assetId: string) {
+    const session = this.uploadSessions.get(assetId);
+    return session && session.ownerUserId === userId ? structuredClone(session) : null;
+  }
+
+  async findUploadSessionByContentHash(userId: string, contentHash: string) {
+    const matches = [...this.uploadSessions.values()]
+      .filter((session) => session.ownerUserId === userId && session.contentHash === contentHash)
+      .sort((left, right) => Date.parse(right.updatedAt) - Date.parse(left.updatedAt));
+    return matches[0] ? structuredClone(matches[0]) : null;
+  }
+
+  async markUploadSessionUploaded(
+    userId: string,
+    assetId: string,
+    kind: AssetFileKind,
+    uploadedAt: string
+  ) {
+    const session = this.uploadSessions.get(assetId);
+    if (!session || session.ownerUserId !== userId) {
+      return;
+    }
+    if (kind === "thumbnail") {
+      session.thumbnailUploadedAt = uploadedAt;
+    } else {
+      session.originalUploadedAt = uploadedAt;
+    }
+    session.updatedAt = uploadedAt;
+  }
+
+  async deleteUploadSession(userId: string, assetId: string) {
+    const session = this.uploadSessions.get(assetId);
+    if (session?.ownerUserId === userId) {
+      this.uploadSessions.delete(assetId);
+    }
+  }
+
+  async withContentHashLock<T>(
+    userId: string,
+    contentHash: string,
+    operation: () => Promise<T>
+  ) {
+    const key = `${userId}:${contentHash}`;
+    const previous = this.contentHashLocks.get(key) ?? 
Promise.resolve();
+    let release!: () => void;
+    const current = new Promise<void>((resolve) => {
+      release = resolve;
+    });
+    const chained = previous.then(() => current);
+    this.contentHashLocks.set(key, chained);
+    await previous;
+    try {
+      return await operation();
+    } finally {
+      release();
+      if (this.contentHashLocks.get(key) === chained) {
+        this.contentHashLocks.delete(key);
+      }
+    }
+  }
+
+  async saveAsset(record: AssetRecord) {
+    this.assets.set(record.id, structuredClone(record));
+    return structuredClone(record);
+  }
+
+  async createAssetEdges(_edges: AssetEdgeInsert[]) {}
+
+  async deleteAssetEdges(_edgeIds: string[]) {}
+
+  async softDeleteAsset(userId: string, assetId: string, deletedAt: string) {
+    const asset = this.assets.get(assetId);
+    if (asset && asset.ownerUserId === userId && !asset.deletedAt) {
+      asset.deletedAt = deletedAt;
+      asset.updatedAt = deletedAt;
+    }
+  }
+}
+
+export const createAssetRepository = (pool: Pool | null, bucket: string): AssetRepository =>
+  pool ? 
new PostgresAssetRepository(pool, bucket) : new MemoryAssetRepository();
diff --git a/server/src/assets/service.ts b/server/src/assets/service.ts
new file mode 100644
index 0000000..f6b05b7
--- /dev/null
+++ b/server/src/assets/service.ts
@@ -0,0 +1,641 @@
+import { createHash } from "node:crypto";
+import { imageSize } from "image-size";
+import { createId } from "../../../shared/createId";
+import { getUserIdFromAuthorizationHeader } from "../auth/user";
+import { getConfig } from "../config";
+import { createAssetCapabilityToken, verifyAssetCapabilityToken } from "./capability";
+import type {
+  AssetApiRecord,
+  AssetEdgeInsert,
+  AssetFileKind,
+  AssetRecord,
+  AssetRepository,
+  AssetStorage,
+  AssetStorageObject,
+  AssetUploadSession,
+  CreateGeneratedAssetInput,
+  PrepareAssetUploadInput,
+  ResolvedProviderAssetRef,
+} from "./types";
+
+const DEFAULT_SIGNED_READ_TTL_SECONDS = 15 * 60;
+
+const normalizeMetadata = (value: Record<string, unknown> | undefined) =>
+  Object.fromEntries(
+    Object.entries(value ?? {}).filter(([, entry]) => entry !== undefined)
+  );
+
+const toFileExtension = (mimeType: string) => {
+  if (mimeType.includes("png")) return "png";
+  if (mimeType.includes("webp")) return "webp";
+  if (mimeType.includes("jpeg")) return "jpg";
+  if (mimeType.includes("avif")) return "avif";
+  return "bin";
+};
+
+const sha256 = (buffer: Buffer) => createHash("sha256").update(buffer).digest("hex");
+
+const IMAGE_TYPE_TO_MIME: Record<string, string> = {
+  avif: "image/avif",
+  bmp: "image/bmp",
+  gif: "image/gif",
+  heif: "image/heif",
+  heic: "image/heic",
+  ico: "image/x-icon",
+  jpeg: "image/jpeg",
+  jpg: "image/jpeg",
+  png: "image/png",
+  tiff: "image/tiff",
+  webp: "image/webp",
+};
+
+const resolveDetectedMimeType = (mimeType: string, detectedType?: string) => {
+  if (mimeType.startsWith("image/")) {
+    return mimeType;
+  }
+
+  return detectedType ? IMAGE_TYPE_TO_MIME[detectedType] ?? 
mimeType : mimeType;
+};
+
+const measureStoredImageObject = (object: AssetStorageObject) => {
+  let dimensions: ReturnType<typeof imageSize>;
+  try {
+    dimensions = imageSize(object.buffer);
+  } catch {
+    throw new Error("Stored asset object is not a valid image.");
+  }
+
+  return {
+    mimeType: resolveDetectedMimeType(object.mimeType, dimensions.type),
+    sizeBytes: object.buffer.byteLength,
+    width: typeof dimensions.width === "number" ? dimensions.width : null,
+    height: typeof dimensions.height === "number" ? dimensions.height : null,
+    contentHash: sha256(object.buffer),
+  };
+};
+
+const isUniqueConstraintError = (error: unknown): error is { code: string } =>
+  typeof error === "object" &&
+  error !== null &&
+  "code" in error &&
+  (error as { code?: unknown }).code === "23505";
+
+export class AssetService {
+  private readonly config = getConfig();
+
+  constructor(
+    private readonly repository: AssetRepository,
+    private readonly storage: AssetStorage
+  ) {}
+
+  async close() {
+    await this.repository.close();
+  }
+
+  private buildStoragePaths(userId: string, assetId: string, mimeType: string) {
+    const extension = toFileExtension(mimeType);
+    return {
+      originalPath: `users/${userId}/assets/${assetId}/original.${extension}`,
+      thumbnailPath: `users/${userId}/assets/${assetId}/thumbnail.${extension}`,
+    };
+  }
+
+  private buildBrowserUrl(assetId: string, userId: string, kind: AssetFileKind) {
+    const token = createAssetCapabilityToken({
+      secret: this.config.assetUrlSecret ?? 
"filmlab-dev-asset-url-secret",
+      userId,
+      assetId,
+      kind,
+    });
+    return `/api/assets/${encodeURIComponent(assetId)}/${kind}?token=${encodeURIComponent(token)}`;
+  }
+
+  private toApiRecord(record: AssetRecord): AssetApiRecord {
+    const hasThumbnail = record.files.some((file) => file.kind === "thumbnail");
+    return {
+      assetId: record.id,
+      name: record.name,
+      type: record.mimeType,
+      size: record.sizeBytes,
+      source: record.source,
+      origin: record.origin,
+      contentHash: record.contentHash,
+      tags: [...record.tags],
+      metadata:
+        Object.keys(record.metadata).length > 0 ? { ...record.metadata } : undefined,
+      createdAt: record.createdAt,
+      updatedAt: record.updatedAt,
+      objectUrl: this.buildBrowserUrl(record.id, record.ownerUserId, "original"),
+      thumbnailUrl: this.buildBrowserUrl(
+        record.id,
+        record.ownerUserId,
+        hasThumbnail ? "thumbnail" : "original"
+      ),
+    };
+  }
+
+  private async removeObjectsBestEffort(paths: Array<string | null | undefined>) {
+    const nextPaths = paths.filter(
+      (path): path is string => typeof path === "string" && path.trim().length > 0
+    );
+    if (nextPaths.length === 0) {
+      return;
+    }
+
+    await this.storage.removeObjects(nextPaths).catch(() => undefined);
+  }
+
+  private async loadStoredImage(path: string, label: string) {
+    const object = await this.storage.getObject(path);
+    if (!object) {
+      throw new Error(`${label} is missing from storage.`);
+    }
+
+    return {
+      path,
+      ...measureStoredImageObject(object),
+    };
+  }
+
+  private buildPreparedUpload(assetId: string, includeThumbnail: boolean) {
+    return {
+      existing: false as const,
+      assetId,
+      upload: {
+        method: "PUT" as const,
+        url: `/api/assets/upload/${encodeURIComponent(assetId)}/original`,
+      },
+      ...(includeThumbnail
+        ? 
{ + thumbnailUpload: { + method: "PUT" as const, + url: `/api/assets/upload/${encodeURIComponent(assetId)}/thumbnail`, + }, + } + : {}), + }; + } + + async prepareUpload(input: PrepareAssetUploadInput) { + const requestedAssetId = + typeof input.assetId === "string" && input.assetId.trim().length > 0 + ? input.assetId.trim() + : null; + + if (requestedAssetId) { + const existingById = await this.repository.findAssetById(input.userId, requestedAssetId); + if (existingById && existingById.contentHash === input.contentHash) { + return { + existing: true as const, + assetId: existingById.id, + asset: this.toApiRecord(existingById), + }; + } + if (!existingById) { + return this.repository.withContentHashLock( + input.userId, + input.contentHash, + async () => { + const existingByHash = await this.repository.findAssetByContentHash( + input.userId, + input.contentHash + ); + if (existingByHash) { + return { + existing: true as const, + assetId: existingByHash.id, + asset: this.toApiRecord(existingByHash), + }; + } + + const existingSession = await this.repository.findUploadSessionByContentHash( + input.userId, + input.contentHash + ); + if (existingSession) { + return this.buildPreparedUpload( + existingSession.assetId, + Boolean(existingSession.thumbnailPath) + ); + } + + const assetId = requestedAssetId; + const paths = this.buildStoragePaths(input.userId, assetId, input.mimeType); + const now = new Date().toISOString(); + const session: AssetUploadSession = { + assetId, + ownerUserId: input.userId, + name: input.name, + mimeType: input.mimeType, + sizeBytes: input.sizeBytes, + source: input.source, + origin: input.origin, + contentHash: input.contentHash, + tags: [...input.tags], + metadata: normalizeMetadata(input.metadata), + createdAt: input.createdAt, + updatedAt: now, + originalPath: paths.originalPath, + thumbnailPath: input.includeThumbnail ? 
paths.thumbnailPath : null, + originalUploadedAt: null, + thumbnailUploadedAt: null, + }; + await this.repository.createUploadSession(session); + return this.buildPreparedUpload(assetId, input.includeThumbnail ?? false); + } + ); + } + } + + if (!requestedAssetId) { + return this.repository.withContentHashLock(input.userId, input.contentHash, async () => { + const existing = await this.repository.findAssetByContentHash( + input.userId, + input.contentHash + ); + if (existing) { + return { + existing: true as const, + assetId: existing.id, + asset: this.toApiRecord(existing), + }; + } + + const existingSession = await this.repository.findUploadSessionByContentHash( + input.userId, + input.contentHash + ); + if (existingSession) { + return this.buildPreparedUpload( + existingSession.assetId, + Boolean(existingSession.thumbnailPath) + ); + } + + const assetId = createId("asset"); + const paths = this.buildStoragePaths(input.userId, assetId, input.mimeType); + const now = new Date().toISOString(); + const session: AssetUploadSession = { + assetId, + ownerUserId: input.userId, + name: input.name, + mimeType: input.mimeType, + sizeBytes: input.sizeBytes, + source: input.source, + origin: input.origin, + contentHash: input.contentHash, + tags: [...input.tags], + metadata: normalizeMetadata(input.metadata), + createdAt: input.createdAt, + updatedAt: now, + originalPath: paths.originalPath, + thumbnailPath: input.includeThumbnail ? paths.thumbnailPath : null, + originalUploadedAt: null, + thumbnailUploadedAt: null, + }; + await this.repository.createUploadSession(session); + return this.buildPreparedUpload(assetId, input.includeThumbnail ?? false); + }); + } + + const assetId = requestedAssetId ?? 
createId("asset"); + const paths = this.buildStoragePaths(input.userId, assetId, input.mimeType); + const previousSession = await this.repository.getUploadSession(input.userId, assetId); + const now = new Date().toISOString(); + const session: AssetUploadSession = { + assetId, + ownerUserId: input.userId, + name: input.name, + mimeType: input.mimeType, + sizeBytes: input.sizeBytes, + source: input.source, + origin: input.origin, + contentHash: input.contentHash, + tags: [...input.tags], + metadata: normalizeMetadata(input.metadata), + createdAt: input.createdAt, + updatedAt: now, + originalPath: paths.originalPath, + thumbnailPath: input.includeThumbnail ? paths.thumbnailPath : null, + originalUploadedAt: null, + thumbnailUploadedAt: null, + }; + await this.repository.createUploadSession(session); + await this.removeObjectsBestEffort([ + previousSession?.originalPath !== session.originalPath ? previousSession?.originalPath : null, + previousSession?.thumbnailPath !== session.thumbnailPath + ? previousSession?.thumbnailPath + : null, + ]); + + return this.buildPreparedUpload(assetId, input.includeThumbnail ?? false); + } + + async uploadSessionObject(input: { + userId: string; + assetId: string; + kind: AssetFileKind; + buffer: Buffer; + mimeType: string; + }) { + const session = await this.repository.getUploadSession(input.userId, input.assetId); + if (!session) { + throw new Error("Upload session not found."); + } + const path = input.kind === "thumbnail" ? 
+      session.thumbnailPath : session.originalPath;
+    if (!path) {
+      throw new Error("Upload kind is not enabled for this session.");
+    }
+    if (input.kind === "original" && input.mimeType !== session.mimeType) {
+      throw new Error("Original upload MIME type does not match the upload session.");
+    }
+
+    await this.storage.putObject({
+      path,
+      buffer: input.buffer,
+      mimeType: input.mimeType,
+    });
+    await this.repository.markUploadSessionUploaded(
+      input.userId,
+      input.assetId,
+      input.kind,
+      new Date().toISOString()
+    );
+  }
+
+  async completeUpload(userId: string, assetId: string) {
+    const session = await this.repository.getUploadSession(userId, assetId);
+    if (!session) {
+      throw new Error("Upload session not found.");
+    }
+    if (!session.originalUploadedAt) {
+      throw new Error("Original image has not been uploaded.");
+    }
+
+    const existingAsset = await this.repository.findAssetById(userId, assetId);
+    const original = await this.loadStoredImage(session.originalPath, "Uploaded original image");
+    const thumbnail =
+      session.thumbnailPath && session.thumbnailUploadedAt
+        ? await this.loadStoredImage(session.thumbnailPath, "Uploaded thumbnail image")
+        : null;
+    const duplicateAsset = await this.repository.findAssetByContentHash(
+      userId,
+      original.contentHash
+    );
+
+    if (duplicateAsset && duplicateAsset.id !== assetId && !existingAsset) {
+      await this.removeObjectsBestEffort([session.originalPath, session.thumbnailPath]);
+      await this.repository.deleteUploadSession(userId, assetId);
+      return this.toApiRecord(duplicateAsset);
+    }
+
+    const now = new Date().toISOString();
+    const metadata = {
+      ...normalizeMetadata(session.metadata),
+      ...(original.width != null ? { width: original.width } : {}),
+      ...(original.height != null ? { height: original.height } : {}),
+    };
+    const record: AssetRecord = {
+      id: session.assetId,
+      ownerUserId: session.ownerUserId,
+      name: session.name,
+      mimeType: original.mimeType,
+      sizeBytes: original.sizeBytes,
+      source: session.source,
+      origin: session.origin,
+      contentHash: original.contentHash,
+      tags: [...session.tags],
+      metadata,
+      createdAt: existingAsset?.createdAt ?? session.createdAt,
+      updatedAt: now,
+      deletedAt: null,
+      files: [
+        {
+          assetId: session.assetId,
+          kind: "original",
+          bucket: this.config.supabaseStorageBucket ?? "assets",
+          path: session.originalPath,
+          mimeType: original.mimeType,
+          sizeBytes: original.sizeBytes,
+          width: original.width,
+          height: original.height,
+          createdAt: session.originalUploadedAt,
+          updatedAt: now,
+        },
+        ...(thumbnail && session.thumbnailPath && session.thumbnailUploadedAt
+          ? [
+              {
+                assetId: session.assetId,
+                kind: "thumbnail" as const,
+                bucket: this.config.supabaseStorageBucket ?? "assets",
+                path: session.thumbnailPath,
+                mimeType: thumbnail.mimeType,
+                sizeBytes: thumbnail.sizeBytes,
+                width: thumbnail.width,
+                height: thumbnail.height,
+                createdAt: session.thumbnailUploadedAt,
+                updatedAt: now,
+              },
+            ]
+          : []),
+      ],
+    };
+
+    const saved = await this.repository.saveAsset(record);
+    await this.repository.deleteUploadSession(userId, assetId);
+    await this.removeObjectsBestEffort(
+      (existingAsset?.files ?? [])
+        .map((file) =>
+          saved.files.some((nextFile) => nextFile.kind === file.kind && nextFile.path === file.path)
+            ? null
+            : file.path
+        )
+    );
+    return this.toApiRecord(saved);
+  }
+
+  async getAsset(userId: string, assetId: string) {
+    const asset = await this.repository.findAssetById(userId, assetId);
+    return asset ?
+      this.toApiRecord(asset) : null;
+  }
+
+  async listChanges(userId: string, since?: string) {
+    return this.repository.listAssetChanges(userId, since);
+  }
+
+  async resolveBrowserAssetFile(input: {
+    assetId: string;
+    kind: AssetFileKind;
+    token?: string;
+    authorization?: string | string[];
+  }) {
+    const tokenUserId =
+      input.token && input.token.trim()
+        ? verifyAssetCapabilityToken({
+            secret: this.config.assetUrlSecret ?? "filmlab-dev-asset-url-secret",
+            token: input.token.trim(),
+            assetId: input.assetId,
+            kind: input.kind,
+          })
+        : null;
+    const headerUserId = getUserIdFromAuthorizationHeader(input.authorization);
+    const userId = tokenUserId ?? headerUserId;
+    if (!userId) {
+      return null;
+    }
+
+    const record = await this.repository.findAssetById(userId, input.assetId);
+    if (!record) {
+      return null;
+    }
+
+    const selected =
+      record.files.find((file) => file.kind === input.kind) ??
+      record.files.find((file) => file.kind === "original") ??
+      null;
+    if (!selected) {
+      return null;
+    }
+
+    return this.storage.getObject(selected.path);
+  }
+
+  async deleteAsset(userId: string, assetId: string) {
+    const record = await this.repository.findAssetById(userId, assetId);
+    if (!record) {
+      return;
+    }
+    await this.storage.removeObjects(record.files.map((file) => file.path));
+    await this.repository.softDeleteAsset(userId, assetId, new Date().toISOString());
+  }
+
+  async createGeneratedAsset(input: CreateGeneratedAssetInput) {
+    const measured = measureStoredImageObject({
+      buffer: input.buffer,
+      mimeType: input.mimeType,
+    });
+    return this.repository.withContentHashLock(input.userId, measured.contentHash, async () => {
+      const existing = await this.repository.findAssetByContentHash(
+        input.userId,
+        measured.contentHash
+      );
+      if (existing) {
+        return {
+          ...this.toApiRecord(existing),
+          created: false,
+        };
+      }
+
+      const assetId = createId("asset");
+      const paths = this.buildStoragePaths(input.userId, assetId, measured.mimeType);
+      await this.storage.putObject({
+        path: paths.originalPath,
+        buffer: input.buffer,
+        mimeType: measured.mimeType,
+      });
+
+      const metadata = {
+        ...normalizeMetadata(input.metadata),
+        ...(measured.width != null ? { width: measured.width } : {}),
+        ...(measured.height != null ? { height: measured.height } : {}),
+      };
+      const now = new Date().toISOString();
+      const record: AssetRecord = {
+        id: assetId,
+        ownerUserId: input.userId,
+        name: input.name,
+        mimeType: measured.mimeType,
+        sizeBytes: measured.sizeBytes,
+        source: input.source,
+        origin: input.origin,
+        contentHash: measured.contentHash,
+        tags: [...(input.tags ?? [])],
+        metadata,
+        createdAt: input.createdAt,
+        updatedAt: now,
+        deletedAt: null,
+        files: [
+          {
+            assetId,
+            kind: "original",
+            bucket: this.config.supabaseStorageBucket ?? "assets",
+            path: paths.originalPath,
+            mimeType: measured.mimeType,
+            sizeBytes: measured.sizeBytes,
+            width: measured.width,
+            height: measured.height,
+            createdAt: now,
+            updatedAt: now,
+          },
+        ],
+      };
+      try {
+        const saved = await this.repository.saveAsset(record);
+        return {
+          ...this.toApiRecord(saved),
+          created: true,
+        };
+      } catch (error) {
+        if (isUniqueConstraintError(error)) {
+          await this.removeObjectsBestEffort([paths.originalPath]);
+          const duplicate = await this.repository.findAssetByContentHash(
+            input.userId,
+            measured.contentHash
+          );
+          if (duplicate) {
+            return {
+              ...this.toApiRecord(duplicate),
+              created: false,
+            };
+          }
+        }
+        await this.removeObjectsBestEffort([paths.originalPath]);
+        throw error;
+      }
+    });
+  }
+
+  async createAssetEdges(edges: AssetEdgeInsert[]) {
+    await this.repository.createAssetEdges(edges);
+  }
+
+  async deleteAssetEdges(edgeIds: string[]) {
+    await this.repository.deleteAssetEdges(edgeIds);
+  }
+
+  async resolveProviderAssetRefs(
+    userId: string,
+    assetRefs: Array<{
+      assetId: string;
+      role: "reference" | "edit" | "variation";
+      referenceType?: "style" | "content" | "controlnet";
+      weight?: number;
+    }>
+  ): Promise<ResolvedProviderAssetRef[]> {
+    const resolved: ResolvedProviderAssetRef[] = [];
+    for (const assetRef of assetRefs) {
+      const asset = await this.repository.findAssetById(userId, assetRef.assetId);
+      if (!asset) {
+        throw new Error(`Referenced asset ${assetRef.assetId} was not found.`);
+      }
+
+      const original = asset.files.find((file) => file.kind === "original");
+      if (!original) {
+        throw new Error(`Referenced asset ${assetRef.assetId} is missing its original file.`);
+      }
+
+      resolved.push({
+        assetId: assetRef.assetId,
+        role: assetRef.role,
+        referenceType: assetRef.referenceType ?? "content",
+        weight: typeof assetRef.weight === "number" ? assetRef.weight : 1,
+        signedUrl: await this.storage.createSignedReadUrl(
+          original.path,
+          DEFAULT_SIGNED_READ_TTL_SECONDS
+        ),
+        mimeType: original.mimeType,
+      });
+    }
+    return resolved;
+  }
+}
diff --git a/server/src/assets/storage.ts b/server/src/assets/storage.ts
new file mode 100644
index 0000000..d9fcd0c
--- /dev/null
+++ b/server/src/assets/storage.ts
@@ -0,0 +1,103 @@
+import { createClient } from "@supabase/supabase-js";
+import type { AppConfig } from "../config";
+import type { AssetStorage, AssetStorageObject } from "./types";
+
+class MemoryAssetStorage implements AssetStorage {
+  private readonly objects = new Map<string, AssetStorageObject>();
+
+  async putObject(input: { path: string; buffer: Buffer; mimeType: string }) {
+    this.objects.set(input.path, {
+      buffer: Buffer.from(input.buffer),
+      mimeType: input.mimeType,
+    });
+  }
+
+  async getObject(path: string) {
+    const stored = this.objects.get(path);
+    return stored
+      ?
+        {
+          buffer: Buffer.from(stored.buffer),
+          mimeType: stored.mimeType,
+        }
+      : null;
+  }
+
+  async removeObjects(paths: string[]) {
+    paths.forEach((path) => {
+      this.objects.delete(path);
+    });
+  }
+
+  async createSignedReadUrl(path: string) {
+    return `memory://${encodeURIComponent(path)}`;
+  }
+}
+
+class SupabaseAssetStorage implements AssetStorage {
+  private readonly client;
+  private readonly bucket: string;
+
+  constructor(config: AppConfig) {
+    if (!config.supabaseUrl || !config.supabaseServiceRoleKey) {
+      throw new Error("Supabase storage config is incomplete.");
+    }
+
+    this.client = createClient(config.supabaseUrl, config.supabaseServiceRoleKey, {
+      auth: {
+        autoRefreshToken: false,
+        persistSession: false,
+      },
+    });
+    this.bucket = config.supabaseStorageBucket ?? "assets";
+  }
+
+  async putObject(input: { path: string; buffer: Buffer; mimeType: string }) {
+    const { error } = await this.client.storage.from(this.bucket).upload(input.path, input.buffer, {
+      contentType: input.mimeType,
+      upsert: true,
+    });
+    if (error) {
+      throw new Error(`Supabase upload failed: ${error.message}`);
+    }
+  }
+
+  async getObject(path: string) {
+    const { data, error } = await this.client.storage.from(this.bucket).download(path);
+    if (error) {
+      if (error.message.toLowerCase().includes("not found")) {
+        return null;
+      }
+      throw new Error(`Supabase download failed: ${error.message}`);
+    }
+
+    return {
+      buffer: Buffer.from(await data.arrayBuffer()),
+      mimeType: data.type || "application/octet-stream",
+    };
+  }
+
+  async removeObjects(paths: string[]) {
+    if (paths.length === 0) {
+      return;
+    }
+    const { error } = await this.client.storage.from(this.bucket).remove(paths);
+    if (error) {
+      throw new Error(`Supabase delete failed: ${error.message}`);
+    }
+  }
+
+  async createSignedReadUrl(path: string, expiresInSeconds: number) {
+    const { data, error } = await this.client.storage
+      .from(this.bucket)
+      .createSignedUrl(path, expiresInSeconds);
+    if (error || !data?.signedUrl) {
+      throw new Error(`Supabase signed URL failed: ${error?.message ?? "unknown error"}`);
+    }
+    return data.signedUrl;
+  }
+}
+
+export const createAssetStorage = (config: AppConfig): AssetStorage =>
+  config.supabaseUrl && config.supabaseServiceRoleKey
+    ? new SupabaseAssetStorage(config)
+    : new MemoryAssetStorage();
diff --git a/server/src/assets/types.ts b/server/src/assets/types.ts
new file mode 100644
index 0000000..0d01496
--- /dev/null
+++ b/server/src/assets/types.ts
@@ -0,0 +1,182 @@
+import type { PersistedAssetEdgeType } from "../../../shared/chatImageTypes";
+
+export type AssetSource = "imported" | "ai-generated";
+export type AssetFileKind = "original" | "thumbnail";
+export type AssetOrigin = "file" | "url" | "ai";
+
+export interface AssetFileRecord {
+  assetId: string;
+  kind: AssetFileKind;
+  bucket: string;
+  path: string;
+  mimeType: string;
+  sizeBytes: number;
+  width: number | null;
+  height: number | null;
+  createdAt: string;
+  updatedAt: string;
+}
+
+export interface AssetRecord {
+  id: string;
+  ownerUserId: string;
+  name: string;
+  mimeType: string;
+  sizeBytes: number;
+  source: AssetSource;
+  origin: AssetOrigin;
+  contentHash: string;
+  tags: string[];
+  metadata: Record<string, unknown>;
+  createdAt: string;
+  updatedAt: string;
+  deletedAt: string | null;
+  files: AssetFileRecord[];
+}
+
+export interface AssetUploadSession {
+  assetId: string;
+  ownerUserId: string;
+  name: string;
+  mimeType: string;
+  sizeBytes: number;
+  source: AssetSource;
+  origin: AssetOrigin;
+  contentHash: string;
+  tags: string[];
+  metadata: Record<string, unknown>;
+  createdAt: string;
+  updatedAt: string;
+  originalPath: string;
+  thumbnailPath: string | null;
+  originalUploadedAt: string | null;
+  thumbnailUploadedAt: string | null;
+}
+
+export interface AssetApiRecord {
+  assetId: string;
+  name: string;
+  type: string;
+  size: number;
+  source: AssetSource;
+  origin: AssetOrigin;
+  contentHash: string;
+  tags: string[];
+  metadata?: Record<string, unknown>;
+  createdAt: string;
+  updatedAt:
+    string;
+  objectUrl: string;
+  thumbnailUrl: string;
+}
+
+export interface AssetChangeRecord {
+  assetId: string;
+  contentHash: string;
+  updatedAt: string;
+  deletedAt?: string;
+}
+
+export interface AssetUploadTarget {
+  method: "PUT";
+  url: string;
+  headers?: Record<string, string>;
+}
+
+export interface PreparedAssetUpload {
+  existing: boolean;
+  assetId: string;
+  asset?: AssetApiRecord;
+  upload?: AssetUploadTarget;
+  thumbnailUpload?: AssetUploadTarget;
+}
+
+export interface PrepareAssetUploadInput {
+  assetId?: string;
+  userId: string;
+  name: string;
+  mimeType: string;
+  sizeBytes: number;
+  createdAt: string;
+  source: AssetSource;
+  origin: AssetOrigin;
+  contentHash: string;
+  tags: string[];
+  metadata?: Record<string, unknown>;
+  includeThumbnail?: boolean;
+}
+
+export interface CreateGeneratedAssetInput {
+  userId: string;
+  name: string;
+  mimeType: string;
+  buffer: Buffer;
+  createdAt: string;
+  source: "ai-generated";
+  origin: "ai";
+  tags?: string[];
+  metadata?: Record<string, unknown>;
+}
+
+export interface AssetEdgeInsert {
+  id: string;
+  sourceAssetId: string;
+  targetAssetId: string;
+  edgeType: PersistedAssetEdgeType;
+  conversationId?: string | null;
+  runId?: string | null;
+  createdAt: string;
+}
+
+export interface ResolvedProviderAssetRef {
+  assetId: string;
+  role: "reference" | "edit" | "variation";
+  referenceType: "style" | "content" | "controlnet";
+  weight: number;
+  signedUrl: string;
+  mimeType: string;
+}
+
+export interface AssetStorageObject {
+  buffer: Buffer;
+  mimeType: string;
+}
+
+export interface AssetStorage {
+  putObject(input: {
+    path: string;
+    buffer: Buffer;
+    mimeType: string;
+  }): Promise<void>;
+  getObject(path: string): Promise<AssetStorageObject | null>;
+  removeObjects(paths: string[]): Promise<void>;
+  createSignedReadUrl(path: string, expiresInSeconds: number): Promise<string>;
+}
+
+export interface AssetRepository {
+  close(): Promise<void>;
+  findAssetById(userId: string, assetId: string): Promise<AssetRecord | null>;
+  findAssetByContentHash(userId: string, contentHash: string): Promise<AssetRecord | null>;
+  listAssetChanges(userId: string, since?: string): Promise<AssetChangeRecord[]>;
+  createUploadSession(session: AssetUploadSession): Promise<void>;
+  getUploadSession(userId: string, assetId: string): Promise<AssetUploadSession | null>;
+  findUploadSessionByContentHash(
+    userId: string,
+    contentHash: string
+  ): Promise<AssetUploadSession | null>;
+  markUploadSessionUploaded(
+    userId: string,
+    assetId: string,
+    kind: AssetFileKind,
+    uploadedAt: string
+  ): Promise<void>;
+  deleteUploadSession(userId: string, assetId: string): Promise<void>;
+  withContentHashLock<T>(
+    userId: string,
+    contentHash: string,
+    operation: () => Promise<T>
+  ): Promise<T>;
+  saveAsset(record: AssetRecord): Promise<AssetRecord>;
+  createAssetEdges(edges: AssetEdgeInsert[]): Promise<void>;
+  deleteAssetEdges(edgeIds: string[]): Promise<void>;
+  softDeleteAsset(userId: string, assetId: string, deletedAt: string): Promise<void>;
+}
diff --git a/server/src/chat/persistence/memory.test.ts b/server/src/chat/persistence/memory.test.ts
index 852a9f5..fe82e24 100644
--- a/server/src/chat/persistence/memory.test.ts
+++ b/server/src/chat/persistence/memory.test.ts
@@ -99,6 +99,7 @@ const createGenerationInput = (overrides?: {
   createdAt: "2026-03-12T00:00:00.000Z",
   completedAt: null,
   telemetry: {
+    traceId: overrides?.runId ?
+      `trace-${overrides.runId}` : "trace-run-1",
     providerRequestId: null,
     providerTaskId: null,
     latencyMs: null,
@@ -197,6 +198,7 @@ describe("MemoryChatStateRepository", () => {
     assetIds: ["thread-asset-1"],
     referencedAssetIds: [],
     telemetry: {
+      traceId: "trace-run-1",
       providerRequestId: "req-1",
       providerTaskId: null,
       latencyMs: 5000,
@@ -337,6 +339,7 @@ describe("MemoryChatStateRepository", () => {
     assetIds: ["thread-asset-1"],
     referencedAssetIds: [],
     telemetry: {
+      traceId: "trace-run-1",
       providerRequestId: null,
       providerTaskId: null,
       latencyMs: 5000,
@@ -373,6 +376,7 @@ describe("MemoryChatStateRepository", () => {
     id: "prompt-version-1",
     runId: "run-1",
     turnId: "turn-1",
+    traceId: "trace-run-1",
     version: 1,
     stage: "dispatch",
     targetKey: "ark:doubao-seedream-5-0-260128",
@@ -479,6 +483,7 @@ describe("MemoryChatStateRepository", () => {
     id: "artifact-2",
     runId: "run-1",
     turnId: "turn-1",
+    traceId: "trace-run-1",
     version: 2,
     stage: "dispatch",
     targetKey: "dashscope:qwen-image-2.0-pro",
@@ -508,6 +513,7 @@ describe("MemoryChatStateRepository", () => {
     id: "artifact-1",
     runId: "run-1",
     turnId: "turn-1",
+    traceId: "trace-run-1",
     version: 1,
     stage: "rewrite",
     targetKey: null,
@@ -550,6 +556,7 @@ describe("MemoryChatStateRepository", () => {
     id: "artifact-other",
     runId: "run-2",
     turnId: "turn-2",
+    traceId: "trace-run-2",
     version: 1,
     stage: "rewrite",
     targetKey: null,
@@ -654,6 +661,7 @@ describe("MemoryChatStateRepository", () => {
     id: "artifact-compile",
     runId: "run-1",
     turnId: "turn-1",
+    traceId: "trace-run-1",
     version: 1,
     stage: "compile",
     targetKey: "dashscope:qwen-image-2.0-pro",
@@ -691,6 +699,7 @@ describe("MemoryChatStateRepository", () => {
     id: "artifact-dispatch",
     runId: "run-1",
     turnId: "turn-1",
+    traceId: "trace-run-1",
     version: 2,
     stage: "dispatch",
     targetKey: "dashscope:qwen-image-2.0-pro",
@@ -821,6 +830,7 @@ describe("MemoryChatStateRepository", () => {
     assetIds: [],
     referencedAssetIds: [],
     telemetry: {
+      traceId: "trace-run-1",
       providerRequestId: null,
       providerTaskId: null,
       latencyMs: 5000,
diff --git a/server/src/chat/persistence/memory.ts b/server/src/chat/persistence/memory.ts
index dcab81d..80fb07d 100644
--- a/server/src/chat/persistence/memory.ts
+++ b/server/src/chat/persistence/memory.ts
@@ -16,6 +16,7 @@ import {
   createInitialConversationCreativeState,
 } from "../../gateway/prompt/types";
 import type { PromptVersionRecord } from "../../gateway/prompt/types";
+import { createId } from "../../../../shared/createId";
 import {
   ChatConversationNotFoundError,
@@ -124,7 +125,7 @@ export class MemoryChatStateRepository implements ChatStateRepository {
     const createdAt = new Date().toISOString();
 
     const conversation: MemoryConversationRecord = {
-      id: crypto.randomUUID(),
+      id: createId("conversation"),
       userId,
       promptState: createInitialConversationCreativeState(),
       isActive: true,
@@ -531,10 +532,7 @@ export class MemoryChatStateRepository implements ChatStateRepository {
       updatedAt: input.acceptedAt,
     });
 
-    const edgeId =
-      typeof crypto !== "undefined" && "randomUUID" in crypto
-        ?
-          crypto.randomUUID()
-        : `accepted-edge-${Date.now()}-${Math.random().toString(16).slice(2, 8)}`;
+    const edgeId = createId("accepted-edge");
     this.assetEdges.set(edgeId, {
       id: edgeId,
       conversationId: conversation.id,
diff --git a/server/src/chat/persistence/postgres.promptArtifacts.test.ts b/server/src/chat/persistence/postgres.promptArtifacts.test.ts
index 204cfdb..0a638a9 100644
--- a/server/src/chat/persistence/postgres.promptArtifacts.test.ts
+++ b/server/src/chat/persistence/postgres.promptArtifacts.test.ts
@@ -15,6 +15,7 @@ describe("PostgresChatStateRepository#getPromptArtifactsForTurn", () => {
     id: "artifact-1",
     run_id: "run-1",
     turn_id: "turn-1",
+    trace_id: "trace-run-1",
     version: 1,
     stage: "rewrite",
     target_key: null,
@@ -94,6 +95,7 @@ describe("PostgresChatStateRepository#getPromptArtifactsForTurn", () => {
     id: "artifact-2",
     run_id: "run-1",
     turn_id: "turn-1",
+    trace_id: "trace-run-1",
     version: 2,
     stage: "dispatch",
     target_key: "dashscope:qwen-image-2.0-pro",
@@ -256,6 +258,7 @@ describe("PostgresChatStateRepository#getPromptObservabilityForConversation", ()
     id: "artifact-1",
     run_id: "run-1",
     turn_id: "turn-1",
+    trace_id: "trace-run-1",
     version: 1,
     stage: "dispatch",
     target_key: "dashscope:qwen-image-2.0-pro",
@@ -293,6 +296,7 @@ describe("PostgresChatStateRepository#getPromptObservabilityForConversation", ()
     id: "artifact-2",
     run_id: "run-1",
     turn_id: "turn-1",
+    trace_id: "trace-run-1",
     version: 2,
     stage: "dispatch",
     target_key: "dashscope:qwen-image-2.0-pro-fallback",
diff --git a/server/src/chat/persistence/postgres.ts b/server/src/chat/persistence/postgres.ts
index 6fd96a6..2815bac 100644
--- a/server/src/chat/persistence/postgres.ts
+++ b/server/src/chat/persistence/postgres.ts
@@ -17,11 +17,11 @@ import {
   cloneConversationCreativeState,
   createInitialConversationCreativeState,
 } from "../../gateway/prompt/types";
-import type { PromptVersionRecord } from "../../gateway/prompt/types";
 import {
   ChatConversationNotFoundError,
   ChatPromptStateConflictError,
 } from "./types";
+import { createId } from "../../../../shared/createId";
 import type {
   AcceptConversationTurnInput,
   ChatConversationRecord,
@@ -301,6 +301,13 @@ const MIGRATIONS = [
       ON chat_prompt_versions(conversation_id, created_at DESC);
     `,
   },
+  {
+    name: "006_prompt_trace_id",
+    sql: `
+      ALTER TABLE chat_prompt_versions
+      ADD COLUMN IF NOT EXISTS trace_id TEXT NULL;
+    `,
+  },
 ] as const;
 
 interface ChatTurnRow {
@@ -410,6 +417,7 @@ interface ChatPromptArtifactRow {
   id: string;
   run_id: string;
   turn_id: string;
+  trace_id: string | null;
   version: number;
   stage: PersistedPromptArtifactRecord["stage"];
   target_key: string | null;
@@ -470,6 +478,7 @@ const parsePromptSnapshot = (value: unknown): PersistedRunRecord["prompt"] => {
 const parseTelemetry = (value: unknown): PersistedRunRecord["telemetry"] => {
   if (typeof value !== "object" || value === null) {
     return {
+      traceId: null,
       providerRequestId: null,
       providerTaskId: null,
       latencyMs: null,
@@ -478,6 +487,7 @@ const parseTelemetry = (value: unknown): PersistedRunRecord["telemetry"] => {
   const telemetry = value as Record<string, unknown>;
   return {
+    traceId: typeof telemetry.traceId === "string" ? telemetry.traceId : null,
     providerRequestId:
       typeof telemetry.providerRequestId === "string" ? telemetry.providerRequestId : null,
     providerTaskId: typeof telemetry.providerTaskId === "string" ?
+      telemetry.providerTaskId : null,
@@ -702,6 +712,7 @@ const toPromptArtifactRecord = (
   id: row.id,
   runId: row.run_id,
   turnId: row.turn_id,
+  traceId: row.trace_id,
   version: row.version,
   stage: row.stage,
   targetKey: row.target_key,
@@ -818,7 +829,7 @@ export class PostgresChatStateRepository implements ChatStateRepository {
     }
 
     const createdAt = new Date().toISOString();
-    const conversationId = crypto.randomUUID();
+    const conversationId = createId("conversation");
 
     await this.pool.query(
       `
       INSERT INTO chat_conversations (
@@ -1200,6 +1211,7 @@
         compiler_version,
         capability_version,
         original_prompt,
+        trace_id,
         prompt_intent,
         turn_delta,
         committed_state_before,
@@ -1290,6 +1302,7 @@
         compiler_version,
         capability_version,
         original_prompt,
+        trace_id,
         prompt_intent,
         turn_delta,
         committed_state_before,
@@ -1337,7 +1350,7 @@
   async clearActiveConversation(userId: string) {
     await this.ensureReady();
     const createdAt = new Date().toISOString();
-    const conversationId = crypto.randomUUID();
+    const conversationId = createId("conversation");
 
     await this.withTransaction(async (client) => {
       await client.query(
@@ -1803,6 +1816,7 @@
         compiler_version,
         capability_version,
         original_prompt,
+        trace_id,
         prompt_intent,
         turn_delta,
         committed_state_before,
@@ -1817,9 +1831,9 @@
         created_at
       ) VALUES (
-        $1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12::jsonb, $13::jsonb,
-        $14::jsonb, $15::jsonb, $16::jsonb, $17, $18, $19, $20::jsonb, $21::jsonb,
-        $22::jsonb, $23::timestamptz
+        $1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13::jsonb, $14::jsonb,
+        $15::jsonb, $16::jsonb, $17::jsonb, $18, $19, $20, $21::jsonb, $22::jsonb,
+        $23::jsonb, $24::timestamptz
       );
       `,
       [
@@ -1834,6 +1848,7 @@
         version.compilerVersion,
         version.capabilityVersion,
         version.originalPrompt,
+        version.traceId,
         JSON.stringify(version.promptIntent),
         JSON.stringify(version.turnDelta),
         JSON.stringify(version.committedStateBefore),
@@ -2036,7 +2051,7 @@
       VALUES ($1, $2, $3, $4, 'accepted_as_final', $5, $6, $7::timestamptz);
       `,
       [
-        crypto.randomUUID(),
+        createId("accepted-edge"),
         turnRow.conversation_id,
         previousBaseAssetId ?? input.assetId,
         input.assetId,
diff --git a/server/src/chat/persistence/repository.ts b/server/src/chat/persistence/repository.ts
index 41ac46a..e8364f8 100644
--- a/server/src/chat/persistence/repository.ts
+++ b/server/src/chat/persistence/repository.ts
@@ -5,8 +5,13 @@ import { PostgresChatStateRepository } from "./postgres";
 import type { ChatStateRepository } from "./types";
 
 export const createChatStateRepository = (
-  databaseUrl = getConfig().databaseUrl
+  database: Pool | string | undefined = getConfig().databaseUrl
 ): ChatStateRepository => {
+  if (database instanceof Pool) {
+    return new PostgresChatStateRepository(database);
+  }
+
+  const databaseUrl = database;
   if (!databaseUrl) {
     return new MemoryChatStateRepository();
   }
diff --git a/server/src/config.test.ts b/server/src/config.test.ts
index e90e303..50b140e 100644
--- a/server/src/config.test.ts
+++ b/server/src/config.test.ts
@@ -46,4 +46,13 @@ describe("server config startup validation", () => {
     expect(getConfig().allowUnsignedDevAuth).toBe(true);
   });
+
+  it("does not trust proxy request ids by default", async () => {
+    vi.stubEnv("NODE_ENV", "development");
+
+    const { getConfig, resetConfigForTests } = await import("./config");
+    resetConfigForTests();
+
+    expect(getConfig().trustProxyRequestId).toBe(false);
+  });
 });
diff --git a/server/src/config.ts b/server/src/config.ts
index 36ed966..5e95a98 100644
--- a/server/src/config.ts
+++ b/server/src/config.ts
@@ -89,9 +89,14 @@ const envSchema = z.object({
   KLING_SECRET_KEY: optionalTrimmedString(),
   KLING_API_BASE_URL: optionalUrlString().default("https://api-beijing.klingai.com"),
   DATABASE_URL: optionalTrimmedString(),
+  SUPABASE_URL: optionalUrlString(),
+  SUPABASE_SERVICE_ROLE_KEY: optionalTrimmedString(),
+  SUPABASE_STORAGE_BUCKET: optionalTrimmedString(),
+  ASSET_URL_SECRET: optionalTrimmedString(),
   AUTH_JWT_SECRET: optionalTrimmedString(),
   AUTH_JWT_ISSUER: optionalTrimmedString(),
   AUTH_JWT_AUDIENCE: optionalTrimmedString(),
+  TRUST_PROXY_REQUEST_ID: optionalBooleanString(),
   ALLOW_UNSIGNED_DEV_AUTH: optionalBooleanString(),
   DEV_AUTH_ALLOWED_USER_IDS: optionalTrimmedString(),
 });
@@ -125,9 +130,14 @@ export interface AppConfig {
   klingSecretKey?: string;
   klingApiBaseUrl: string;
   databaseUrl?: string;
+  supabaseUrl?: string;
+  supabaseServiceRoleKey?: string;
+  supabaseStorageBucket?: string;
+  assetUrlSecret?: string;
   authJwtSecret?: string;
   authJwtIssuer?: string;
   authJwtAudience?: string;
+  trustProxyRequestId: boolean;
   allowUnsignedDevAuth: boolean;
   devAuthAllowedUserIds: string[];
 }
@@ -187,9 +197,17 @@ export const getConfig = (): AppConfig => {
   klingSecretKey: env.KLING_SECRET_KEY,
   klingApiBaseUrl: env.KLING_API_BASE_URL.replace(/\/+$/, ""),
   databaseUrl: env.DATABASE_URL,
+  supabaseUrl: env.SUPABASE_URL?.replace(/\/+$/, ""),
+  supabaseServiceRoleKey: env.SUPABASE_SERVICE_ROLE_KEY,
+  supabaseStorageBucket: env.SUPABASE_STORAGE_BUCKET ?? "assets",
+  assetUrlSecret:
+    env.ASSET_URL_SECRET ??
+    env.AUTH_JWT_SECRET ??
+    "filmlab-dev-asset-url-secret",
   authJwtSecret: env.AUTH_JWT_SECRET,
   authJwtIssuer: env.AUTH_JWT_ISSUER,
   authJwtAudience: env.AUTH_JWT_AUDIENCE,
+  trustProxyRequestId: env.TRUST_PROXY_REQUEST_ID ?? false,
   allowUnsignedDevAuth:
     env.ALLOW_UNSIGNED_DEV_AUTH ?? ((env.NODE_ENV ?? "development") === "development"),
   devAuthAllowedUserIds: (env.DEV_AUTH_ALLOWED_USER_IDS ??
     "local-user")
     .split(",")
diff --git a/server/src/fastify.d.ts b/server/src/fastify.d.ts
index f534493..e5eb7d8 100644
--- a/server/src/fastify.d.ts
+++ b/server/src/fastify.d.ts
@@ -1,8 +1,10 @@
 import "fastify";
+import type { AssetService } from "./assets/service";
 import type { ChatStateRepository } from "./chat/persistence/types";
 
 declare module "fastify" {
   interface FastifyInstance {
+    assetService: AssetService;
     chatStateRepository: ChatStateRepository;
   }
 }
diff --git a/server/src/gateway/prompt/compiler.test.ts b/server/src/gateway/prompt/compiler.test.ts
index c4dfdcf..98679cc 100644
--- a/server/src/gateway/prompt/compiler.test.ts
+++ b/server/src/gateway/prompt/compiler.test.ts
@@ -120,10 +120,10 @@ describe("prompt compiler", () => {
     expect(promptIr.operation).toBe(expectedOperation);
     expect(promptIr.goal).toBe("Refine the neon skyline with cleaner typography");
     expect(promptIr.sourceAssets).toEqual(
-      assetRefs.filter((entry) => entry.role !== "reference")
+      request.assetRefs.filter((entry) => entry.role !== "reference")
     );
     expect(promptIr.referenceAssets).toEqual(
-      assetRefs.filter((entry) => entry.role === "reference")
+      request.assetRefs.filter((entry) => entry.role === "reference")
     );
   });
diff --git a/server/src/gateway/prompt/evals.test.ts b/server/src/gateway/prompt/evals.test.ts
new file mode 100644
index 0000000..5a1a384
--- /dev/null
+++ b/server/src/gateway/prompt/evals.test.ts
@@ -0,0 +1,283 @@
+import { afterEach, describe, expect, it, vi } from "vitest";
+import type { AppConfig } from "../../config";
+import type { ImageModelPromptCompilerCapabilities } from "../../../../shared/imageModelCatalog";
+import { getImageModelCapabilityFactByModelId } from "../../../../shared/imageModelCapabilityFacts";
+import { imageGenerationRequestSchema } from "../../shared/imageGenerationSchema";
+import {
+  applyTurnDelta,
+  buildPromptIR,
+  compilePromptForTarget,
+  createPromptCompilationContext,
+} from "./compiler";
+import { buildFallbackTurnDelta, rewriteTurn } from "./rewrite";
+import { createInitialConversationCreativeState } from "./types";
+import type { ResolvedRouteTarget } from "../router/types";
+
+const createConfig = (overrides: Partial<AppConfig> = {}): AppConfig => ({
+  nodeEnv: "test",
+  host: "127.0.0.1",
+  port: 3001,
+  corsOrigin: "http://localhost:5173",
+  requestBodyLimitBytes: 12 * 1024 * 1024,
+  providerRequestTimeoutMs: 120_000,
+  rateLimitMax: 20,
+  rateLimitTimeWindowMs: 60_000,
+  imageGenerateRateLimitMax: 20,
+  imageGenerateRateLimitTimeWindowMs: 60_000,
+  imageUpscaleRateLimitMax: 20,
+  imageUpscaleRateLimitTimeWindowMs: 60_000,
+  generatedImageGetRateLimitMax: 120,
+  generatedImageGetRateLimitTimeWindowMs: 60_000,
+  generatedImageDownloadMaxBytes: 32 * 1024 * 1024,
+  referenceImageDownloadMaxBytes: 8 * 1024 * 1024,
+  promptRewriteTimeoutMs: 15_000,
+  arkApiBaseUrl: "https://ark.example.com",
+  dashscopeApiBaseUrl: "https://dashscope.example.com",
+  klingApiBaseUrl: "https://kling.example.com",
+  allowUnsignedDevAuth: false,
+  devAuthAllowedUserIds: ["local-user"],
+  ...overrides,
+  trustProxyRequestId: overrides.trustProxyRequestId ?? false,
+});
+
+const createRequest = (
+  overrides: Partial<Parameters<typeof imageGenerationRequestSchema.parse>[0]> = {}
+) =>
+  imageGenerationRequestSchema.parse({
+    prompt: "Refine the neon skyline",
+    modelId: "qwen-image-2-pro",
+    aspectRatio: "1:1",
+    batchSize: 1,
+    style: "none",
+    referenceImages: [],
+    assetRefs: [],
+    modelParams: {
+      promptExtend: true,
+    },
+    ...overrides,
+  });
+
+const createTarget = (
+  modelId: Parameters<typeof getImageModelCapabilityFactByModelId>[0],
+  promptCompiler?: ImageModelPromptCompilerCapabilities
+): ResolvedRouteTarget => {
+  const fact = getImageModelCapabilityFactByModelId(modelId);
+  if (!fact) {
+    throw new Error(`Missing capability fact for ${modelId}.`);
+  }
+
+  const providerId =
+    fact.modelFamily === "seedream"
+      ? "ark"
+      : fact.modelFamily === "kling"
+        ?
"kling" + : "dashscope"; + + return { + frontendModel: { + id: fact.modelId, + label: fact.modelId, + logicalModel: fact.logicalModel, + modelFamily: fact.modelFamily, + capability: "image.generate", + routingPolicy: "default", + visible: true, + description: `${fact.modelId} eval fixture`, + constraints: fact.constraints, + parameterDefinitions: fact.parameterDefinitions, + defaults: fact.defaults, + promptCompiler: promptCompiler ?? fact.promptCompiler, + supportsUpscale: fact.supportsUpscale, + }, + deployment: { + id: `${fact.modelId}-deployment`, + logicalModel: fact.logicalModel, + provider: providerId, + providerModel: `${fact.modelId}-provider-model`, + capability: "image.generate", + enabled: true, + priority: 1, + }, + provider: { + id: providerId, + name: providerId === "ark" ? "Ark" : providerId === "kling" ? "Kling" : "DashScope", + credentialSlot: providerId, + operations: ["image.generate"], + healthScope: "model_operation", + family: "http", + }, + }; +}; + +const createCandidateState = () => + applyTurnDelta( + createInitialConversationCreativeState(), + { + prompt: "Refine the neon skyline with cleaner typography", + preserve: ["main character silhouette"], + avoid: ["muddy shadows"], + styleDirectives: ["cinematic"], + continuityTargets: ["subject", "text"], + editOps: [{ op: "remove", target: "coffee cup" }], + referenceAssetIds: ["thread-asset-ref-1"], + }, + "turn-1" + ); + +afterEach(() => { + vi.restoreAllMocks(); + vi.unstubAllGlobals(); +}); + +describe("prompt evals", () => { + it("falls back deterministically when rewrite config is unavailable", async () => { + const request = createRequest({ + prompt: " Refine the neon skyline ", + promptIntent: { + preserve: ["Poster text", "poster text"], + avoid: [" watermark ", "Watermark"], + styleDirectives: ["cinematic", " Cinematic "], + continuityTargets: ["text"], + editOps: [{ op: "remove", target: "coffee cup" }], + }, + assetRefs: [{ assetId: "thread-asset-ref-1", role: "reference" }], + 
}); + + const result = await rewriteTurn( + request, + createInitialConversationCreativeState(), + createConfig() + ); + + expect(result).toEqual({ + turnDelta: buildFallbackTurnDelta(request), + degraded: true, + warning: "Prompt rewrite degraded to deterministic fallback.", + }); + }); + + it("normalizes upstream rewrite output into the persisted turn delta shape", async () => { + const fetchMock = vi.fn().mockResolvedValue({ + ok: true, + json: async () => ({ + choices: [ + { + message: { + content: JSON.stringify({ + prompt: " Refine the neon skyline ", + preserve: ["Poster text", "poster text", "Lead subject"], + avoid: [" watermark ", "Watermark"], + styleDirectives: ["cinematic", " Cinematic "], + continuityTargets: ["subject", "text"], + editOps: [{ op: "remove", target: "coffee cup" }], + referenceAssetIds: ["thread-asset-ref-1", " thread-asset-ref-1 "], + }), + }, + }, + ], + }), + }); + vi.stubGlobal("fetch", fetchMock); + + const result = await rewriteTurn( + createRequest(), + createInitialConversationCreativeState(), + createConfig({ + promptRewriteBaseUrl: "https://rewrite.example.com", + promptRewriteApiKey: "rewrite-key", + promptRewriteModel: "gpt-rewrite", + }) + ); + + expect(fetchMock).toHaveBeenCalledOnce(); + expect(result).toEqual({ + turnDelta: { + prompt: "Refine the neon skyline", + preserve: ["Poster text", "Lead subject"], + avoid: ["watermark"], + styleDirectives: ["cinematic"], + continuityTargets: ["subject", "text"], + editOps: [{ op: "remove", target: "coffee cup" }], + referenceAssetIds: ["thread-asset-ref-1"], + }, + degraded: false, + warning: null, + }); + }); + + it.each([ + { + name: "negative prompt merge is emitted as semantic loss on text-only targets", + request: createRequest({ + negativePrompt: "avoid lens flare, avoid watermark", + }), + target: createTarget("seedream-v5", { + ...getImageModelCapabilityFactByModelId("seedream-v5")!.promptCompiler, + negativePromptStrategy: "merge_into_main", + }), + operation: 
"image.generate" as const, + expectedCodes: ["NEGATIVE_PROMPT_DEGRADED_TO_TEXT"], + expectedNegativePrompt: null, + compiledPromptIncludes: ["avoid lens flare", "avoid watermark"], + }, + { + name: "degraded edit flows preserve stable hashes and explicit loss codes", + request: createRequest({ + negativePrompt: "avoid lens flare", + assetRefs: [{ assetId: "thread-asset-source-1", role: "edit" }], + }), + target: createTarget("qwen-image-2-pro", { + acceptedOperations: ["image.generate", "image.edit", "image.variation"], + executableOperations: ["image.generate"], + negativePromptStrategy: "merge_into_main", + sourceImageExecution: "reference_guided", + referenceRoleHandling: { + reference: "native", + edit: "compiled_to_reference", + variation: "compiled_to_reference", + }, + continuityStrength: { + subject: "strong", + style: "strong", + composition: "moderate", + text: "weak", + }, + promptSurface: "natural_language", + }), + operation: "image.edit" as const, + expectedCodes: [ + "OPERATION_DEGRADED_TO_IMAGE_GENERATE", + "APPROXIMATED_AS_REGENERATION", + "ASSET_ROLE_DEGRADED_TO_REFERENCE_GUIDANCE", + "STYLE_REFERENCE_ROLE_COLLAPSED", + "EXACT_TEXT_CONTINUITY_AT_RISK", + "NEGATIVE_PROMPT_DEGRADED_TO_TEXT", + ], + expectedNegativePrompt: null, + compiledPromptIncludes: ["Compiled Operation: image.generate"], + }, + ])("$name", ({ request, target, operation, expectedCodes, expectedNegativePrompt, compiledPromptIncludes }) => { + const state = createCandidateState(); + const promptIr = buildPromptIR(request, state); + const context = createPromptCompilationContext( + state, + "deterministic-fallback", + operation, + "recompile" + ); + + const first = compilePromptForTarget(request, promptIr, state, target, context); + const second = compilePromptForTarget(request, promptIr, state, target, context); + + expect(first.negativePrompt).toBe(expectedNegativePrompt); + expect(first.semanticLosses.map((loss) => loss.code)).toEqual( + expect.arrayContaining(expectedCodes) + 
); + expect(first.prefixHash).toBe(second.prefixHash); + expect(first.payloadHash).toBe(second.payloadHash); + + for (const snippet of compiledPromptIncludes) { + expect(first.compiledPrompt).toContain(snippet); + } + }); +}); diff --git a/server/src/gateway/router/router.ts b/server/src/gateway/router/router.ts index f1060a3..1401fd2 100644 --- a/server/src/gateway/router/router.ts +++ b/server/src/gateway/router/router.ts @@ -109,9 +109,9 @@ export const imageRuntimeRouter = { }); }, async upscale( - request: ParsedImageUpscaleRequest, - payload: { imageBuffer: Buffer; mimeType: string }, - options?: { signal?: AbortSignal; timeoutMs?: number; traceId?: string } + _request: ParsedImageUpscaleRequest, + _payload: { imageBuffer: Buffer; mimeType: string }, + _options?: { signal?: AbortSignal; timeoutMs?: number; traceId?: string } ) { throw new ProviderError("Image upscale is not available in the model registry refactor.", 400); }, diff --git a/server/src/index.test.ts b/server/src/index.test.ts index 4b94542..2fe76a6 100644 --- a/server/src/index.test.ts +++ b/server/src/index.test.ts @@ -48,4 +48,52 @@ describe("buildServer", () => { expect(closeMock).toHaveBeenCalledTimes(1); }); + + it("generates a server-side x-request-id by default", async () => { + const { buildServer } = await import("./index"); + + const app = await buildServer(); + + const response = await app.inject({ + method: "GET", + url: "/health", + headers: { + "x-request-id": "client-trace-1", + }, + }); + + expect(response.headers["x-request-id"]).toEqual(expect.stringMatching(/^req-/)); + expect(response.headers["x-request-id"]).not.toBe("client-trace-1"); + + await app.close(); + }); + + it("reuses x-request-id only when trusted proxy mode is enabled", async () => { + vi.stubEnv("TRUST_PROXY_REQUEST_ID", "true"); + const { resetConfigForTests } = await import("./config"); + resetConfigForTests(); + + const { buildServer } = await import("./index"); + const app = await buildServer(); + + const 
echoed = await app.inject({ + method: "GET", + url: "/health", + headers: { + "x-request-id": "proxy-trace-1", + }, + }); + const invalid = await app.inject({ + method: "GET", + url: "/health", + headers: { + "x-request-id": "bad trace id with spaces", + }, + }); + + expect(echoed.headers["x-request-id"]).toBe("proxy-trace-1"); + expect(invalid.headers["x-request-id"]).toEqual(expect.stringMatching(/^req-/)); + + await app.close(); + }); }); diff --git a/server/src/index.ts b/server/src/index.ts index 01f3b96..92f3b08 100644 --- a/server/src/index.ts +++ b/server/src/index.ts @@ -1,14 +1,20 @@ import Fastify, { type FastifyInstance } from "fastify"; import path from "node:path"; import { fileURLToPath } from "node:url"; +import { Pool } from "pg"; +import { createAssetRepository } from "./assets/repository"; +import { AssetService } from "./assets/service"; +import { createAssetStorage } from "./assets/storage"; import { createChatStateRepository } from "./chat/persistence/repository"; import { assertStartupConfig, getConfig } from "./config"; import { registerCors } from "./plugins/cors"; import { registerRateLimit } from "./plugins/rateLimit"; +import { assetRoute } from "./routes/assets"; import { generatedImageRoute } from "./routes/generated-image"; import { modelCatalogRoute } from "./routes/model-catalog"; import { imageConversationRoute } from "./routes/image-conversation"; import { imageGenerateRoute } from "./routes/image-generate"; +import { attachTraceIdHeader, createRequestTraceId } from "./shared/requestTrace"; export const buildServer = async () => { const config = getConfig(); @@ -22,16 +28,35 @@ export const buildServer = async () => { }, }, bodyLimit: config.requestBodyLimitBytes, + genReqId: (request) => + createRequestTraceId(request.headers, { + trustProxyRequestId: config.trustProxyRequestId, + }), }); - const repository = createChatStateRepository(config.databaseUrl); + const pool = config.databaseUrl + ? 
new Pool({ + connectionString: config.databaseUrl, + }) + : null; + const repository = createChatStateRepository(pool ?? config.databaseUrl); + const assetService = new AssetService( + createAssetRepository(pool, config.supabaseStorageBucket ?? "assets"), + createAssetStorage(config) + ); app.decorate("chatStateRepository", repository); + app.decorate("assetService", assetService); app.addHook("onClose", async () => { await repository.close(); + await assetService.close(); + }); + app.addHook("onRequest", async (request, reply) => { + attachTraceIdHeader(reply, request.id); }); await app.register(registerCors); await app.register(registerRateLimit); + await app.register(assetRoute); await app.register(generatedImageRoute); await app.register(imageConversationRoute); await app.register(imageGenerateRoute); diff --git a/server/src/providers/base/client.ts b/server/src/providers/base/client.ts index 80a7271..88ff194 100644 --- a/server/src/providers/base/client.ts +++ b/server/src/providers/base/client.ts @@ -1,4 +1,6 @@ import { getConfig } from "../../config"; +import { createId } from "../../../../shared/createId"; +import { REQUEST_ID_HEADER } from "../../shared/requestTrace"; import { fetchWithTimeout } from "../../shared/fetchWithTimeout"; import type { ProviderRawResponse, ProviderRequestContext } from "./types"; @@ -7,7 +9,7 @@ export const createProviderRequestContext = ( ): ProviderRequestContext => ({ signal: options?.signal, timeoutMs: options?.timeoutMs ?? getConfig().providerRequestTimeoutMs, - traceId: options?.traceId ?? "provider-request", + traceId: options?.traceId ?? 
createId("provider-request"), }); export const fetchProviderResponse = ( @@ -16,10 +18,22 @@ export const fetchProviderResponse = ( timeoutMessage: string, context: ProviderRequestContext ) => - fetchWithTimeout(input, init, timeoutMessage, { - signal: context.signal, - timeoutMs: context.timeoutMs, - }); + fetchWithTimeout( + input, + { + ...init, + headers: (() => { + const headers = new Headers(init.headers); + headers.set(REQUEST_ID_HEADER, context.traceId); + return headers; + })(), + }, + timeoutMessage, + { + signal: context.signal, + timeoutMs: context.timeoutMs, + } + ); export const toProviderRawResponse = ( response: Response, diff --git a/server/src/providers/base/types.ts b/server/src/providers/base/types.ts index 8878f09..21b1b46 100644 --- a/server/src/providers/base/types.ts +++ b/server/src/providers/base/types.ts @@ -5,6 +5,7 @@ import type { ParsedImageUpscaleRequest } from "../../shared/imageUpscaleSchema" import type { FrontendImageModelId, ImageDeploymentId, LogicalImageModelId } from "../../../../shared/imageModelCatalog"; import type { ResolvedRouteTarget, RuntimeProviderId } from "../../gateway/router/types"; import type { ImageUpscaleScale } from "../../../../shared/imageGeneration"; +import type { ResolvedProviderAssetRef } from "../../assets/types"; export interface ProviderGeneratedImage { imageUrl?: string; @@ -45,7 +46,9 @@ export interface RuntimeProviderCredentials { export interface PlatformProviderGenerateInput { target: ResolvedRouteTarget; - request: ParsedImageGenerationRequest; + request: ParsedImageGenerationRequest & { + resolvedAssetRefs?: ResolvedProviderAssetRef[]; + }; credentials: RuntimeProviderCredentials; options?: { signal?: AbortSignal; timeoutMs?: number; traceId?: string }; } diff --git a/server/src/providers/dashscope/models/qwen.test.ts b/server/src/providers/dashscope/models/qwen.test.ts index 7a5c1ef..7633a48 100644 --- a/server/src/providers/dashscope/models/qwen.test.ts +++ 
b/server/src/providers/dashscope/models/qwen.test.ts @@ -172,16 +172,22 @@ describe("generateDashscopeQwen", () => { const { generateDashscopeQwen } = await import("./qwen"); const input = createInput(); - input.request.referenceImages = [ + input.request.resolvedAssetRefs = [ { - id: "ref-1", - url: "data:image/png;base64,AAA", - type: "content", + assetId: "asset-ref-1", + role: "reference", + referenceType: "content", + weight: 1, + signedUrl: "https://assets.example.com/ref-1.png", + mimeType: "image/png", }, { - id: "ref-2", - url: "data:image/png;base64,BBB", - type: "content", + assetId: "asset-ref-2", + role: "reference", + referenceType: "content", + weight: 1, + signedUrl: "https://assets.example.com/ref-2.png", + mimeType: "image/png", }, ]; @@ -199,8 +205,8 @@ describe("generateDashscopeQwen", () => { }; }; expect(requestPayload.input.messages[0]?.content).toEqual([ - { image: "data:image/png;base64,AAA" }, - { image: "data:image/png;base64,BBB" }, + { image: "https://assets.example.com/ref-1.png" }, + { image: "https://assets.example.com/ref-2.png" }, { text: "Rainy alley" }, ]); }); diff --git a/server/src/providers/dashscope/models/qwen.ts b/server/src/providers/dashscope/models/qwen.ts index e4559c3..3d357c3 100644 --- a/server/src/providers/dashscope/models/qwen.ts +++ b/server/src/providers/dashscope/models/qwen.ts @@ -15,11 +15,11 @@ const getDashScopeGenerationUrl = () => const resolvePromptExtend = (value: unknown) => (typeof value === "boolean" ? value : true); const buildQwenMessageContent = (input: PlatformProviderGenerateInput) => { - const referenceImages = input.request.referenceImages - .filter((referenceImage) => typeof referenceImage.url === "string" && referenceImage.url.trim()) + const referenceImages = (input.request.resolvedAssetRefs ?? 
[]) + .filter((referenceImage) => Boolean(referenceImage.signedUrl.trim())) .slice(0, 3) .map((referenceImage) => ({ - image: referenceImage.url.trim(), + image: referenceImage.signedUrl.trim(), })); if (referenceImages.length === 0) { diff --git a/server/src/routes/assets.ts b/server/src/routes/assets.ts new file mode 100644 index 0000000..caad7ab --- /dev/null +++ b/server/src/routes/assets.ts @@ -0,0 +1,198 @@ +import type { FastifyPluginAsync } from "fastify"; +import { requireAuthenticatedUser } from "../auth/user"; + +const readRawBody = async (request: { raw: AsyncIterable<Buffer | string> }) => { + const chunks: Buffer[] = []; + for await (const chunk of request.raw) { + chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk)); + } + return Buffer.concat(chunks); +}; + +export const assetRoute: FastifyPluginAsync = async (app) => { + app.post("/api/assets/uploads/init", async (request, reply) => { + const userId = requireAuthenticatedUser(request); + if (!userId) { + return reply.code(401).send({ error: "Unauthorized." }); + } + + const body = request.body as Record<string, unknown>; + const name = typeof body.name === "string" ? body.name.trim() : ""; + const mimeType = typeof body.type === "string" ? body.type.trim() : ""; + const sizeBytes = Number(body.size); + const createdAt = typeof body.createdAt === "string" ? body.createdAt : new Date().toISOString(); + const source = body.source === "ai-generated" ? "ai-generated" : "imported"; + const origin = + body.origin === "url" || body.origin === "ai" ? body.origin : "file"; + const contentHash = typeof body.contentHash === "string" ? body.contentHash.trim() : ""; + const tags = Array.isArray(body.tags) + ? body.tags.filter((entry): entry is string => typeof entry === "string") + : []; + const metadata = + body.metadata && typeof body.metadata === "object" && !Array.isArray(body.metadata) + ?
(body.metadata as Record<string, unknown>) + : undefined; + const includeThumbnail = Boolean(body.includeThumbnail); + const assetId = + typeof body.assetId === "string" && body.assetId.trim().length > 0 + ? body.assetId.trim() + : undefined; + + if (!name || !mimeType || !Number.isFinite(sizeBytes) || !contentHash) { + return reply.code(400).send({ error: "Invalid upload init payload." }); + } + + return reply.send( + await app.assetService.prepareUpload({ + assetId, + userId, + name, + mimeType, + sizeBytes, + createdAt, + source, + origin, + contentHash, + tags, + metadata, + includeThumbnail, + }) + ); + }); + + app.put("/api/assets/upload/:assetId/:kind", async (request, reply) => { + const userId = requireAuthenticatedUser(request); + if (!userId) { + return reply.code(401).send({ error: "Unauthorized." }); + } + const params = request.params as { assetId?: string; kind?: string }; + const assetId = params.assetId?.trim(); + const kind = + params.kind === "thumbnail" + ? "thumbnail" + : params.kind === "original" + ? "original" + : null; + if (!assetId || !kind) { + return reply.code(400).send({ error: "Invalid upload target." }); + } + + const mimeType = request.headers["content-type"]; + if (typeof mimeType !== "string" || !mimeType.startsWith("image/")) { + return reply.code(400).send({ error: "Only image uploads are supported." }); + } + + const buffer = await readRawBody(request); + if (buffer.byteLength === 0) { + return reply.code(400).send({ error: "Empty upload body." }); + } + + await app.assetService.uploadSessionObject({ + userId, + assetId, + kind, + buffer, + mimeType, + }); + return reply.code(204).send(); + }); + + app.post("/api/assets/uploads/:assetId/complete", async (request, reply) => { + const userId = requireAuthenticatedUser(request); + if (!userId) { + return reply.code(401).send({ error: "Unauthorized."
}); + } + const params = request.params as { assetId?: string }; + const assetId = params.assetId?.trim(); + if (!assetId) { + return reply.code(400).send({ error: "Asset id is required." }); + } + return reply.send(await app.assetService.completeUpload(userId, assetId)); + }); + + app.get("/api/assets/changes", async (request, reply) => { + const userId = requireAuthenticatedUser(request); + if (!userId) { + return reply.code(401).send({ error: "Unauthorized." }); + } + + const since = + typeof request.query === "object" && + request.query !== null && + "since" in request.query && + typeof (request.query as Record<string, unknown>).since === "string" + ? ((request.query as Record<string, unknown>).since as string).trim() + : undefined; + return reply.send({ + changes: await app.assetService.listChanges(userId, since), + }); + }); + + app.get("/api/assets/:assetId", async (request, reply) => { + const userId = requireAuthenticatedUser(request); + if (!userId) { + return reply.code(401).send({ error: "Unauthorized." }); + } + const params = request.params as { assetId?: string }; + const assetId = params.assetId?.trim(); + if (!assetId) { + return reply.code(400).send({ error: "Asset id is required." }); + } + const asset = await app.assetService.getAsset(userId, assetId); + if (!asset) { + return reply.code(404).send({ error: "Asset not found." }); + } + return reply.send(asset); + }); + + app.get("/api/assets/:assetId/:kind", async (request, reply) => { + const params = request.params as { assetId?: string; kind?: string }; + const assetId = params.assetId?.trim(); + const kind = + params.kind === "thumbnail" + ? "thumbnail" + : params.kind === "original" + ? "original" + : null; + const token = + typeof request.query === "object" && + request.query !== null && + "token" in request.query && + typeof (request.query as Record<string, unknown>).token === "string" + ?
((request.query as Record<string, unknown>).token as string).trim() + : undefined; + + if (!assetId || !kind) { + return reply.code(400).send({ error: "Invalid asset path." }); + } + + const content = await app.assetService.resolveBrowserAssetFile({ + assetId, + kind, + token, + authorization: request.headers.authorization, + }); + if (!content) { + return reply.code(404).send({ error: "Asset file not found." }); + } + + reply.header("Content-Type", content.mimeType); + reply.header("Cache-Control", "private, max-age=60"); + reply.header("X-Content-Type-Options", "nosniff"); + return reply.send(content.buffer); + }); + + app.delete("/api/assets/:assetId", async (request, reply) => { + const userId = requireAuthenticatedUser(request); + if (!userId) { + return reply.code(401).send({ error: "Unauthorized." }); + } + const params = request.params as { assetId?: string }; + const assetId = params.assetId?.trim(); + if (!assetId) { + return reply.code(400).send({ error: "Asset id is required." }); + } + await app.assetService.deleteAsset(userId, assetId); + return reply.code(204).send(); + }); +}; diff --git a/server/src/routes/image-conversation.test.ts b/server/src/routes/image-conversation.test.ts index 87b11a3..53e97da 100644 --- a/server/src/routes/image-conversation.test.ts +++ b/server/src/routes/image-conversation.test.ts @@ -137,6 +137,7 @@ describe("imageConversationRoute", () => { id: "artifact-1", runId: "run-1", turnId: "turn-1", + traceId: "trace-run-1", version: 1, stage: "rewrite", targetKey: null, @@ -166,6 +167,7 @@ describe("imageConversationRoute", () => { id: "artifact-2", runId: "run-1", turnId: "turn-1", + traceId: "trace-run-1", version: 2, stage: "dispatch", targetKey: "dashscope:qwen-image-2.0-pro", diff --git a/server/src/routes/image-generate.eval.test.ts b/server/src/routes/image-generate.eval.test.ts new file mode 100644 index 0000000..3ac8890 --- /dev/null +++ b/server/src/routes/image-generate.eval.test.ts @@ -0,0 +1,691 @@ +import { createHmac } from
"node:crypto"; +import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; +import type { AssetService } from "../assets/service"; + +const generateMock = vi.fn(); +const getRouteTargetsMock = vi.fn(); +const downloadGeneratedImageMock = vi.fn(); + +const createPromptCompilerFixture = (input?: { + negativePromptStrategy?: "native" | "merge_into_main"; + sourceImageExecution?: "native" | "reference_guided" | "unsupported"; + referenceRoleHandling?: { + reference: "native" | "compiled_to_reference" | "compiled_to_text"; + edit: "native" | "compiled_to_reference" | "compiled_to_text"; + variation: "native" | "compiled_to_reference" | "compiled_to_text"; + }; + continuityStrength?: { + subject: "strong" | "moderate" | "weak"; + style: "strong" | "moderate" | "weak"; + composition: "strong" | "moderate" | "weak"; + text: "strong" | "moderate" | "weak"; + }; +}) => ({ + acceptedOperations: ["image.generate", "image.edit", "image.variation"], + executableOperations: ["image.generate"], + negativePromptStrategy: input?.negativePromptStrategy ?? "merge_into_main", + sourceImageExecution: input?.sourceImageExecution ?? "unsupported", + referenceRoleHandling: + input?.referenceRoleHandling ?? { + reference: "compiled_to_text", + edit: "compiled_to_text", + variation: "compiled_to_text", + }, + continuityStrength: + input?.continuityStrength ?? 
{ + subject: "weak", + style: "weak", + composition: "weak", + text: "weak", + }, + promptSurface: "natural_language" as const, +}); + +const repositoryMock = { + close: vi.fn(), + getConversationById: vi.fn(), + getOrCreateActiveConversation: vi.fn(), + getConversationSnapshot: vi.fn(), + getPromptArtifactsForTurn: vi.fn(), + getPromptObservabilityForConversation: vi.fn(), + clearActiveConversation: vi.fn(), + deleteTurn: vi.fn(), + getGeneratedImageByCapability: vi.fn(), + createTurn: vi.fn(), + createGeneration: vi.fn(), + createRun: vi.fn(), + createPromptVersions: vi.fn(), + updateConversationPromptState: vi.fn(), + acceptConversationTurn: vi.fn(), + completeGenerationSuccess: vi.fn(), + completeGenerationFailure: vi.fn(), + turnExists: vi.fn(), +}; + +let generatedAssetCounter = 0; + +const assetServiceMock = { + close: vi.fn(), + resolveProviderAssetRefs: vi.fn( + async ( + _userId: string, + assetRefs: Array<{ + assetId: string; + role: "reference" | "edit" | "variation"; + referenceType?: "style" | "content" | "controlnet"; + weight?: number; + }> + ) => + assetRefs.map((assetRef) => ({ + assetId: assetRef.assetId, + role: assetRef.role, + referenceType: assetRef.referenceType ?? "content", + weight: assetRef.weight ?? 
1, + signedUrl: `https://assets.example.com/${assetRef.assetId}.png`, + mimeType: "image/png", + })) + ), + createGeneratedAsset: vi.fn(async (input: { createdAt: string; mimeType: string; buffer: Buffer }) => { + generatedAssetCounter += 1; + const assetId = `asset-generated-${generatedAssetCounter}`; + return { + created: true, + assetId, + name: `generated-${generatedAssetCounter}.png`, + type: input.mimeType, + size: input.buffer.byteLength, + source: "ai-generated" as const, + origin: "ai" as const, + contentHash: `hash-${generatedAssetCounter}`, + createdAt: input.createdAt, + updatedAt: input.createdAt, + objectUrl: `/api/assets/${assetId}/original?token=test`, + thumbnailUrl: `/api/assets/${assetId}/original?token=test`, + }; + }), + deleteAsset: vi.fn(async () => undefined), + createAssetEdges: vi.fn(async () => undefined), + deleteAssetEdges: vi.fn(async () => undefined), +}; + +const createRouteTargetFixture = (input: { + modelId: "qwen-image-2-pro" | "seedream-v5"; + deploymentId?: string; + providerId?: "dashscope" | "ark" | "kling"; + providerModel?: string; + promptCompiler?: ReturnType<typeof createPromptCompilerFixture>; +}) => { + if (input.modelId === "qwen-image-2-pro") { + return { + frontendModel: { + id: "qwen-image-2-pro", + logicalModel: "image.qwen.v2.pro", + promptCompiler: + input.promptCompiler ?? + createPromptCompilerFixture({ + negativePromptStrategy: "native", + sourceImageExecution: "reference_guided", + referenceRoleHandling: { + reference: "native", + edit: "compiled_to_reference", + variation: "compiled_to_reference", + }, + continuityStrength: { + subject: "strong", + style: "strong", + composition: "strong", + text: "strong", + }, + }), + }, + deployment: { + id: input.deploymentId ?? "dashscope-qwen-image-2-pro-primary", + providerModel: input.providerModel ?? "qwen-image-2.0-pro", + }, + provider: { + id: input.providerId ??
"dashscope", + }, + }; + } + + return { + frontendModel: { + id: "seedream-v5", + logicalModel: "image.seedream.v5", + promptCompiler: input.promptCompiler ?? createPromptCompilerFixture(), + }, + deployment: { + id: input.deploymentId ?? "ark-seedream-v5-primary", + providerModel: input.providerModel ?? "doubao-seedream-5-0-260128", + }, + provider: { + id: input.providerId ?? "ark", + }, + }; +}; + +vi.mock("../gateway/router/router", () => ({ + imageRuntimeRouter: { + getRouteTargets: (...args: unknown[]) => getRouteTargetsMock(...args), + generate: (...args: unknown[]) => generateMock(...args), + }, +})); + +vi.mock("../shared/downloadGeneratedImage", () => ({ + downloadGeneratedImage: (...args: unknown[]) => downloadGeneratedImageMock(...args), +})); + +const encodeBase64Url = (value: string) => + Buffer.from(value, "utf8") + .toString("base64") + .replace(/\+/g, "-") + .replace(/\//g, "_") + .replace(/=+$/g, ""); + +const createBearerToken = (userId: string, secret = "test-secret") => { + const header = encodeBase64Url(JSON.stringify({ alg: "HS256", typ: "JWT" })); + const payload = encodeBase64Url( + JSON.stringify({ + sub: userId, + exp: Math.floor(Date.now() / 1000) + 60, + }) + ); + const signature = createHmac("sha256", secret) + .update(`${header}.${payload}`) + .digest("base64") + .replace(/\+/g, "-") + .replace(/\//g, "_") + .replace(/=+$/g, ""); + return `Bearer ${header}.${payload}.${signature}`; +}; + +const createConversationRecord = () => ({ + id: "conversation-1", + userId: "user-1", + promptState: { + committed: { + prompt: "Original skyline", + preserve: [], + avoid: [], + styleDirectives: [], + continuityTargets: [], + editOps: [], + referenceAssetIds: [], + }, + candidate: null, + baseAssetId: null, + candidateTurnId: null, + revision: 0, + }, + createdAt: "2026-03-12T00:00:00.000Z", + updatedAt: "2026-03-12T00:00:00.000Z", +}); + +const createConversationSnapshot = (input: { + promptState?: (typeof createConversationRecord extends () => 
infer T ? T : never)["promptState"]; + runs: Array<Record<string, unknown>>; + jobs: Array<Record<string, unknown>>; +}) => ({ + id: "conversation-1", + thread: { + id: "conversation-1", + creativeBrief: { + latestPrompt: "Original skyline", + latestModelId: "qwen-image-2-pro", + acceptedAssetId: null, + selectedAssetIds: [], + recentAssetRefIds: [], + }, + promptState: input.promptState ?? createConversationRecord().promptState, + createdAt: "2026-03-12T00:00:00.000Z", + updatedAt: "2026-03-12T00:00:00.000Z", + }, + turns: [], + runs: input.runs, + assets: [], + assetEdges: [], + jobs: input.jobs, + createdAt: "2026-03-12T00:00:00.000Z", + updatedAt: "2026-03-12T00:00:05.000Z", +}); + +const createApp = async () => { + const { default: Fastify } = await import("fastify"); + const { imageGenerateRoute } = await import("./image-generate"); + + const app = Fastify(); + app.decorate("chatStateRepository", repositoryMock); + app.decorate("assetService", assetServiceMock as unknown as AssetService); + await app.register(imageGenerateRoute); + return app; +}; + +describe("imageGenerateRoute evals", () => { + beforeEach(() => { + vi.resetModules(); + vi.stubEnv("AUTH_JWT_SECRET", "test-secret"); + generateMock.mockReset(); + getRouteTargetsMock.mockReset(); + downloadGeneratedImageMock.mockReset(); + generatedAssetCounter = 0; + Object.values(repositoryMock).forEach((mockFn) => { + if ("mockReset" in mockFn) { + mockFn.mockReset(); + } + }); + Object.values(assetServiceMock).forEach((mockFn) => { + if ("mockClear" in mockFn) { + mockFn.mockClear(); + } + }); + + repositoryMock.getOrCreateActiveConversation.mockResolvedValue(createConversationRecord()); + getRouteTargetsMock.mockReturnValue([createRouteTargetFixture({ modelId: "qwen-image-2-pro" })]); + }); + + afterEach(() => { + vi.unstubAllEnvs(); + vi.restoreAllMocks(); + }); + + it("reuses prior dispatch artifacts for exact retries and keeps a single trace id", async () => { + repositoryMock.turnExists.mockResolvedValue(true);
repositoryMock.getConversationSnapshot.mockResolvedValue( + createConversationSnapshot({ + runs: [ + { + id: "run-1", + turnId: "turn-1", + jobId: "job-1", + operation: "image.edit", + status: "completed", + requestedTarget: { + modelId: "qwen-image-2-pro", + logicalModel: "image.qwen.v2.pro", + deploymentId: "dashscope-qwen-image-2-pro-primary", + runtimeProvider: "dashscope", + providerModel: "qwen-image-2.0-pro", + pinned: false, + }, + selectedTarget: { + modelId: "qwen-image-2-pro", + logicalModel: "image.qwen.v2.pro", + deploymentId: "dashscope-qwen-image-2-pro-primary", + runtimeProvider: "dashscope", + providerModel: "qwen-image-2.0-pro", + pinned: false, + }, + executedTarget: { + modelId: "qwen-image-2-pro", + logicalModel: "image.qwen.v2.pro", + deploymentId: "dashscope-qwen-image-2-pro-primary", + runtimeProvider: "dashscope", + providerModel: "qwen-image-2.0-pro", + pinned: false, + }, + prompt: { + originalPrompt: "Remove the coffee cup", + compiledPrompt: "Compiled edit", + dispatchedPrompt: "Dispatched degraded edit", + providerEffectivePrompt: null, + semanticLosses: [ + { + code: "OPERATION_DEGRADED_TO_IMAGE_GENERATE", + severity: "warn", + fieldPath: "promptIR.operation", + degradeMode: "approximated", + userMessage: "This edit request was degraded.", + }, + ], + warnings: ["This edit request was degraded."], + }, + error: null, + warnings: [], + assetIds: [], + referencedAssetIds: ["thread-asset-1"], + createdAt: "2026-03-12T00:00:00.000Z", + completedAt: "2026-03-12T00:00:05.000Z", + telemetry: { + traceId: "trace-previous-run-1", + providerRequestId: null, + providerTaskId: null, + latencyMs: 5000, + }, + }, + ], + jobs: [ + { + id: "job-1", + turnId: "turn-1", + runId: "run-1", + modelId: "qwen-image-2-pro", + logicalModel: "image.qwen.v2.pro", + deploymentId: "dashscope-qwen-image-2-pro-primary", + runtimeProvider: "dashscope", + providerModel: "qwen-image-2.0-pro", + compiledPrompt: "Compiled edit", + requestSnapshot: { + prompt: "Remove 
the coffee cup", + modelId: "qwen-image-2-pro", + aspectRatio: "1:1", + batchSize: 1, + style: "none", + referenceImages: [], + assetRefs: [{ assetId: "thread-asset-1", role: "edit" }], + modelParams: { + promptExtend: true, + }, + }, + status: "succeeded", + error: null, + createdAt: "2026-03-12T00:00:00.000Z", + completedAt: "2026-03-12T00:00:05.000Z", + }, + ], + }) + ); + generateMock.mockImplementation(async (_request, options) => { + const [target] = options.targets ?? []; + if (!target) { + throw new Error("Expected exact retry to select a single target."); + } + + const resolvedRequest = await options.resolveRequest?.(target); + if (!resolvedRequest) { + throw new Error("Expected exact retry to resolve a dispatch request."); + } + + return { + modelId: "qwen-image-2-pro", + logicalModel: "image.qwen.v2.pro", + deploymentId: "dashscope-qwen-image-2-pro-primary", + runtimeProvider: "dashscope", + providerModel: "qwen-image-2.0-pro", + warnings: [], + images: [ + { + binaryData: Buffer.from([1, 2, 3]), + mimeType: "image/png", + }, + ], + }; + }); + + const app = await createApp(); + const response = await app.inject({ + method: "POST", + url: "/api/image-generate", + headers: { + Authorization: createBearerToken("user-1"), + }, + payload: { + prompt: "This should be ignored", + modelId: "seedream-v5", + aspectRatio: "16:9", + batchSize: 4, + style: "none", + referenceImages: [], + modelParams: {}, + retryOfTurnId: "turn-1", + retryMode: "exact", + }, + }); + + expect(response.statusCode).toBe(200); + + const body = response.json(); + expect(body.traceId).toEqual(expect.any(String)); + expect(body.runs[1]?.telemetry?.traceId).toBe(body.traceId); + + const createdGeneration = repositoryMock.createGeneration.mock.calls[0]?.[0] as { + run: { + operation: string; + telemetry: { + traceId: string; + }; + }; + }; + expect(createdGeneration.run.operation).toBe("image.edit"); + expect(createdGeneration.run.telemetry.traceId).toBe(body.traceId); + + const 
promptVersions = repositoryMock.createPromptVersions.mock.calls.flatMap( + ([input]) => (input as { versions: Array> }).versions + ) as Array<{ + stage: string; + attempt: number | null; + traceId: string | null; + }>; + + expect(promptVersions).toHaveLength(2); + expect(promptVersions.every((version) => version.stage === "dispatch")).toBe(true); + expect(promptVersions.every((version) => version.traceId === body.traceId)).toBe(true); + expect(promptVersions.map((version) => version.attempt)).toEqual([1, 1]); + + await app.close(); + }); + + it("tracks fallback dispatches without losing requested or selected target snapshots", async () => { + const { ProviderError } = await import("../providers/base/errors"); + const routeTargets = [ + createRouteTargetFixture({ + modelId: "qwen-image-2-pro", + deploymentId: "dashscope-qwen-image-2-pro-primary", + providerModel: "qwen-image-2.0-pro", + }), + createRouteTargetFixture({ + modelId: "qwen-image-2-pro", + deploymentId: "dashscope-qwen-image-2-pro-fallback", + providerModel: "qwen-image-2.0-pro-fallback", + promptCompiler: createPromptCompilerFixture({ + negativePromptStrategy: "merge_into_main", + sourceImageExecution: "unsupported", + referenceRoleHandling: { + reference: "compiled_to_text", + edit: "compiled_to_text", + variation: "compiled_to_text", + }, + continuityStrength: { + subject: "moderate", + style: "moderate", + composition: "weak", + text: "weak", + }, + }), + }), + ]; + const resolvedRequests: Array> = []; + + getRouteTargetsMock.mockReturnValue(routeTargets); + generateMock.mockImplementation(async (_request, options) => { + const [primaryTarget, fallbackTarget] = options.targets ?? 
[]; + if (!primaryTarget || !fallbackTarget) { + throw new Error("Expected both primary and fallback route targets."); + } + + const primaryRequest = await options.resolveRequest?.(primaryTarget); + if (!primaryRequest) { + throw new Error("Expected primary dispatch request."); + } + resolvedRequests.push(primaryRequest); + + try { + throw new ProviderError("Primary target timed out.", 503); + } catch (error) { + if (!(error instanceof ProviderError) || error.statusCode !== 503) { + throw error; + } + } + + const fallbackRequest = await options.resolveRequest?.(fallbackTarget); + if (!fallbackRequest) { + throw new Error("Expected fallback dispatch request."); + } + resolvedRequests.push(fallbackRequest); + + return { + modelId: "qwen-image-2-pro", + logicalModel: "image.qwen.v2.pro", + deploymentId: "dashscope-qwen-image-2-pro-fallback", + runtimeProvider: "dashscope", + providerModel: "qwen-image-2.0-pro-fallback", + warnings: ["Fallback target executed."], + images: [ + { + binaryData: Buffer.from([1, 2, 3]), + mimeType: "image/png", + revisedPrompt: "provider revised fallback prompt", + }, + ], + }; + }); + + const app = await createApp(); + const response = await app.inject({ + method: "POST", + url: "/api/image-generate", + headers: { + Authorization: createBearerToken("user-1"), + }, + payload: { + prompt: "Remove the coffee cup while preserving the poster text", + modelId: "qwen-image-2-pro", + aspectRatio: "1:1", + batchSize: 1, + style: "none", + negativePrompt: "blurry text, watermark", + referenceImages: [], + assetRefs: [{ assetId: "thread-asset-1", role: "edit" }], + promptIntent: { + preserve: [], + avoid: [], + styleDirectives: [], + continuityTargets: ["text"], + editOps: [], + }, + modelParams: { + promptExtend: true, + }, + }, + }); + + expect(response.statusCode).toBe(200); + + const body = response.json(); + expect(body.traceId).toEqual(expect.any(String)); + expect(resolvedRequests).toHaveLength(2); + 
expect(resolvedRequests[0]).toMatchObject({ + requestedTarget: { + deploymentId: "dashscope-qwen-image-2-pro-primary", + provider: "dashscope", + }, + negativePrompt: "blurry text, watermark", + }); + expect(resolvedRequests[1]).toMatchObject({ + requestedTarget: { + deploymentId: "dashscope-qwen-image-2-pro-fallback", + provider: "dashscope", + }, + }); + expect(resolvedRequests[1]?.negativePrompt).toBeUndefined(); + + const promptVersions = repositoryMock.createPromptVersions.mock.calls.flatMap( + ([input]) => (input as { versions: Array> }).versions + ) as Array<{ + stage: string; + targetKey: string | null; + attempt: number | null; + traceId: string | null; + providerEffectivePrompt?: string | null; + semanticLosses?: Array<{ code: string }>; + }>; + const compileArtifacts = promptVersions.filter((version) => version.stage === "compile"); + const dispatchArtifacts = promptVersions.filter((version) => version.stage === "dispatch"); + + expect(promptVersions.every((version) => version.traceId === body.traceId)).toBe(true); + expect(compileArtifacts.map((version) => version.targetKey)).toEqual([ + "dashscope:qwen-image-2.0-pro", + "dashscope:qwen-image-2.0-pro-fallback", + ]); + expect( + compileArtifacts.map((version) => version.semanticLosses?.map((loss) => loss.code) ?? []) + ).toEqual([ + expect.arrayContaining([ + "OPERATION_DEGRADED_TO_IMAGE_GENERATE", + "APPROXIMATED_AS_REGENERATION", + "ASSET_ROLE_DEGRADED_TO_REFERENCE_GUIDANCE", + "STYLE_REFERENCE_ROLE_COLLAPSED", + ]), + expect.arrayContaining([ + "OPERATION_DEGRADED_TO_IMAGE_GENERATE", + "APPROXIMATED_AS_REGENERATION", + "SOURCE_IMAGE_NOT_EXECUTABLE", + "STYLE_REFERENCE_ROLE_COLLAPSED", + "EXACT_TEXT_CONTINUITY_AT_RISK", + "NEGATIVE_PROMPT_DEGRADED_TO_TEXT", + ]), + ]); + expect( + dispatchArtifacts.map((version) => ({ + attempt: version.attempt, + targetKey: version.targetKey, + providerEffectivePrompt: version.providerEffectivePrompt ?? 
null, + })) + ).toEqual([ + { + attempt: 1, + targetKey: "dashscope:qwen-image-2.0-pro", + providerEffectivePrompt: null, + }, + { + attempt: 2, + targetKey: "dashscope:qwen-image-2.0-pro-fallback", + providerEffectivePrompt: null, + }, + { + attempt: 2, + targetKey: "dashscope:qwen-image-2.0-pro-fallback", + providerEffectivePrompt: "provider revised fallback prompt", + }, + ]); + + expect(repositoryMock.completeGenerationSuccess).toHaveBeenCalledWith( + expect.objectContaining({ + run: expect.objectContaining({ + telemetry: expect.objectContaining({ + traceId: body.traceId, + }), + executedTarget: expect.objectContaining({ + deploymentId: "dashscope-qwen-image-2-pro-fallback", + providerModel: "qwen-image-2.0-pro-fallback", + }), + }), + }) + ); + expect(body).toMatchObject({ + runtimeProvider: "dashscope", + providerModel: "qwen-image-2.0-pro-fallback", + runs: [ + expect.any(Object), + expect.objectContaining({ + telemetry: expect.objectContaining({ + traceId: body.traceId, + }), + requestedTarget: expect.objectContaining({ + deploymentId: "dashscope-qwen-image-2-pro-primary", + providerModel: "qwen-image-2.0-pro", + }), + selectedTarget: expect.objectContaining({ + deploymentId: "dashscope-qwen-image-2-pro-primary", + providerModel: "qwen-image-2.0-pro", + }), + executedTarget: expect.objectContaining({ + deploymentId: "dashscope-qwen-image-2-pro-fallback", + providerModel: "qwen-image-2.0-pro-fallback", + }), + }), + ], + }); + + await app.close(); + }); +}); diff --git a/server/src/routes/image-generate.test.ts b/server/src/routes/image-generate.test.ts index 9a84bf9..0453101 100644 --- a/server/src/routes/image-generate.test.ts +++ b/server/src/routes/image-generate.test.ts @@ -1,5 +1,6 @@ import { createHmac } from "node:crypto"; import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; +import type { AssetService } from "../assets/service"; const generateMock = vi.fn(); const getRouteTargetsMock = vi.fn(); @@ -60,6 +61,52 @@ const 
repositoryMock = { turnExists: vi.fn(), }; +let generatedAssetCounter = 0; + +const assetServiceMock = { + close: vi.fn(), + resolveProviderAssetRefs: vi.fn( + async ( + _userId: string, + assetRefs: Array<{ + assetId: string; + role: "reference" | "edit" | "variation"; + referenceType?: "style" | "content" | "controlnet"; + weight?: number; + }> + ) => + assetRefs.map((assetRef) => ({ + assetId: assetRef.assetId, + role: assetRef.role, + referenceType: assetRef.referenceType ?? "content", + weight: assetRef.weight ?? 1, + signedUrl: `https://assets.example.com/${assetRef.assetId}.png`, + mimeType: "image/png", + })) + ), + createGeneratedAsset: vi.fn(async (input: { createdAt: string; mimeType: string; buffer: Buffer }) => { + generatedAssetCounter += 1; + const assetId = `asset-generated-${generatedAssetCounter}`; + return { + created: true, + assetId, + name: `generated-${generatedAssetCounter}.png`, + type: input.mimeType, + size: input.buffer.byteLength, + source: "ai-generated" as const, + origin: "ai" as const, + contentHash: `hash-${generatedAssetCounter}`, + createdAt: input.createdAt, + updatedAt: input.createdAt, + objectUrl: `/api/assets/${assetId}/original?token=test`, + thumbnailUrl: `/api/assets/${assetId}/original?token=test`, + }; + }), + deleteAsset: vi.fn(async () => undefined), + createAssetEdges: vi.fn(async () => undefined), + deleteAssetEdges: vi.fn(async () => undefined), +}; + const createRouteTargetFixture = (input: { modelId: "qwen-image-2-pro" | "seedream-v5"; deploymentId?: string; @@ -177,6 +224,7 @@ const createApp = async () => { const app = Fastify(); app.decorate("chatStateRepository", repositoryMock); + app.decorate("assetService", assetServiceMock as unknown as AssetService); await app.register(imageGenerateRoute); return app; }; @@ -188,11 +236,17 @@ describe("imageGenerateRoute", () => { generateMock.mockReset(); getRouteTargetsMock.mockReset(); downloadGeneratedImageMock.mockReset(); + generatedAssetCounter = 0; 
Object.values(repositoryMock).forEach((mockFn) => { if ("mockReset" in mockFn) { mockFn.mockReset(); } }); + Object.values(assetServiceMock).forEach((mockFn) => { + if ("mockClear" in mockFn) { + mockFn.mockClear(); + } + }); getRouteTargetsMock.mockImplementation((request: { modelId: string }) => resolveRouteSelectionFixture(request.modelId) @@ -237,6 +291,12 @@ describe("imageGenerateRoute", () => { aspectRatio: "1:1", batchSize: 1, style: "none", + assetRefs: [ + { + assetId: "asset-reference-1", + role: "reference", + }, + ], referenceImages: [], modelParams: {}, }, @@ -345,42 +405,40 @@ describe("imageGenerateRoute", () => { expect(repositoryMock.completeGenerationSuccess).toHaveBeenCalledWith( expect.objectContaining({ conversationId: "conversation-1", - generatedImages: [ + generatedImages: [], + assets: [ expect.objectContaining({ - ownerUserId: "user-1", - visibility: "private", - mimeType: "image/png", - blobData: Buffer.from([9, 8, 7]), + id: "asset-generated-1", + assetType: "image", }), expect.objectContaining({ - ownerUserId: "user-1", - visibility: "private", - mimeType: "image/png", - blobData: Buffer.from([1, 2, 3]), + id: "asset-generated-2", + assetType: "image", }), ], results: [ expect.objectContaining({ - imageId: expect.any(String), - imageUrl: expect.stringMatching(/^\/api\/generated-images\/[^?]+\?token=/), + assetId: "asset-generated-1", + imageUrl: expect.stringMatching(/^\/api\/assets\/asset-generated-1\/original\?token=/), }), expect.objectContaining({ - imageId: expect.any(String), - imageUrl: expect.stringMatching(/^\/api\/generated-images\/[^?]+\?token=/), + assetId: "asset-generated-2", + imageUrl: expect.stringMatching(/^\/api\/assets\/asset-generated-2\/original\?token=/), }), ], }) ); const body = response.json(); - expect(body.imageId).toEqual(expect.any(String)); - expect(body.imageUrl).toMatch(/^\/api\/generated-images\/[^?]+\?token=/); + expect(body.imageId).toBeUndefined(); + 
expect(body.imageUrl).toMatch(/^\/api\/assets\/asset-generated-1\/original\?token=/); expect(body.images).toHaveLength(2); expect(body.images[0]).toMatchObject({ + assetId: "asset-generated-1", provider: "dashscope", model: "qwen-image-2.0-pro", }); - expect(body.images[0].imageUrl).toMatch(/^\/api\/generated-images\/[^?]+\?token=/); + expect(body.images[0].imageUrl).toMatch(/^\/api\/assets\/asset-generated-1\/original\?token=/); await app.close(); }); @@ -611,6 +669,7 @@ describe("imageGenerateRoute", () => { createdAt: "2026-03-12T00:00:00.000Z", completedAt: "2026-03-12T00:00:02.000Z", telemetry: { + traceId: "trace-run-moderation-1", providerRequestId: null, providerTaskId: null, latencyMs: 2000, @@ -661,6 +720,7 @@ describe("imageGenerateRoute", () => { createdAt: "2026-03-12T00:00:00.000Z", completedAt: "2026-03-12T00:00:05.000Z", telemetry: { + traceId: "trace-run-1", providerRequestId: null, providerTaskId: null, latencyMs: 5000, @@ -933,6 +993,7 @@ describe("imageGenerateRoute", () => { createdAt: "2026-03-12T00:00:00.000Z", completedAt: "2026-03-12T00:00:05.000Z", telemetry: { + traceId: "trace-run-1", providerRequestId: null, providerTaskId: null, latencyMs: 5000, @@ -1453,6 +1514,53 @@ describe("imageGenerateRoute", () => { await app.close(); }); + it("cleans up newly created canonical assets when later persistence fails", async () => { + generateMock.mockResolvedValue({ + modelId: "qwen-image-2-pro", + logicalModel: "image.qwen.v2.pro", + deploymentId: "dashscope-qwen-image-2-pro-primary", + runtimeProvider: "dashscope", + providerModel: "qwen-image-2.0-pro", + warnings: [], + images: [ + { + binaryData: Buffer.from([1, 2, 3]), + mimeType: "image/png", + }, + ], + }); + repositoryMock.completeGenerationSuccess.mockRejectedValueOnce( + new Error("Persisted run could not be committed.") + ); + + const app = await createApp(); + const response = await app.inject({ + method: "POST", + url: "/api/image-generate", + headers: { + Authorization: 
createBearerToken("user-1"), + }, + payload: { + prompt: "Studio portrait", + modelId: "qwen-image-2-pro", + aspectRatio: "1:1", + batchSize: 1, + style: "none", + referenceImages: [], + modelParams: {}, + }, + }); + + expect(response.statusCode).toBe(500); + expect(repositoryMock.completeGenerationFailure).toHaveBeenCalled(); + expect(assetServiceMock.deleteAsset).toHaveBeenCalledWith( + "user-1", + "asset-generated-1" + ); + + await app.close(); + }); + it("returns provider errors together with persisted identifiers", async () => { const { ProviderError } = await import("../providers/base/errors"); generateMock.mockRejectedValue(new ProviderError("Provider rejected request.", 429)); @@ -1478,12 +1586,14 @@ describe("imageGenerateRoute", () => { expect(response.statusCode).toBe(429); expect(response.json()).toMatchObject({ error: "Provider rejected request.", + traceId: expect.any(String), conversationId: "conversation-1", threadId: "conversation-1", turnId: expect.any(String), jobId: expect.any(String), runId: expect.any(String), }); + expect(response.headers["x-request-id"]).toEqual(expect.any(String)); expect(repositoryMock.completeGenerationFailure).toHaveBeenCalled(); await app.close(); diff --git a/server/src/routes/image-generate.ts b/server/src/routes/image-generate.ts index cced221..d9fd5cd 100644 --- a/server/src/routes/image-generate.ts +++ b/server/src/routes/image-generate.ts @@ -1,4 +1,4 @@ -import type { FastifyPluginAsync } from "fastify"; +import type { FastifyPluginAsync, FastifyReply } from "fastify"; import { ZodError } from "zod"; import type { PersistedAssetEdgeType, @@ -9,6 +9,7 @@ import type { PersistedRunRecord, PersistedRunTargetSnapshot, } from "../../../shared/chatImageTypes"; +import { createId } from "../../../shared/createId"; import type { PromptVersionRecord } from "../gateway/prompt/types"; import { requireAuthenticatedUser } from "../auth/user"; import { ChatPromptStateConflictError } from "../chat/persistence/types"; @@ -28,8 
+29,8 @@ import type { ResolvedRouteTarget } from "../gateway/router/types"; import { getFrontendImageModelById } from "../models/frontendRegistry"; import { ProviderError } from "../providers/base/errors"; import { downloadGeneratedImage } from "../shared/downloadGeneratedImage"; -import { createGeneratedImageCapability } from "../shared/generatedImageCapability"; import { getImageGenerationCapabilityWarnings } from "../shared/imageGenerationCapabilityWarnings"; +import { attachTraceIdHeader, getRequestTraceId } from "../shared/requestTrace"; import { imageGenerationRequestSchema, type ParsedImageGenerationRequest, @@ -43,14 +44,6 @@ import { const GENERATED_IMAGE_NORMALIZATION_CONCURRENCY = 2; -const createId = (prefix: string) => { - if (typeof crypto !== "undefined" && "randomUUID" in crypto) { - return crypto.randomUUID(); - } - - return `${prefix}-${Date.now()}-${Math.random().toString(16).slice(2, 8)}`; -}; - const cloneSnapshot = (value: T): T => JSON.parse(JSON.stringify(value)) as T; const uniqueWarnings = (warnings: Array) => @@ -65,6 +58,44 @@ const formatNormalizationWarning = (count: number) => count === 1 ? "was" : "were" } omitted.`; +type PersistedGenerationContext = { + conversationId: string; + turnId: string; + jobId: string; + runId: string; + attemptId: string; +}; + +const toPersistedGenerationResponse = ( + persistedGeneration: PersistedGenerationContext | null +) => + persistedGeneration + ? 
{ + conversationId: persistedGeneration.conversationId, + threadId: persistedGeneration.conversationId, + turnId: persistedGeneration.turnId, + jobId: persistedGeneration.jobId, + runId: persistedGeneration.runId, + } + : {}; + +const sendTraceableError = ( + reply: FastifyReply, + input: { + statusCode: number; + error: string; + traceId: string; + persistedGeneration?: PersistedGenerationContext | null; + } +) => { + attachTraceIdHeader(reply, input.traceId); + return reply.code(input.statusCode).send({ + error: input.error, + traceId: input.traceId, + ...toPersistedGenerationResponse(input.persistedGeneration ?? null), + }); +}; + const createRunTargetSnapshot = (input: PersistedRunTargetSnapshot): PersistedRunTargetSnapshot => ({ modelId: input.modelId, logicalModel: input.logicalModel, @@ -289,6 +320,7 @@ const findMatchingExactTarget = ( const buildPromptVersionRecord = (input: { runId: string; turnId: string; + traceId: string; version: number; stage: PromptVersionRecord["stage"]; compilerVersion: string; @@ -312,6 +344,7 @@ const buildPromptVersionRecord = (input: { id: createId("prompt-version"), runId: input.runId, turnId: input.turnId, + traceId: input.traceId, version: input.version, stage: input.stage, targetKey: input.targetKey ?? null, @@ -347,9 +380,15 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { }, }, async (request, reply) => { + const traceId = getRequestTraceId(request); + attachTraceIdHeader(reply, traceId); const userId = requireAuthenticatedUser(request); if (!userId) { - return reply.code(401).send({ error: "Unauthorized." 
}); + return sendTraceableError(reply, { + statusCode: 401, + error: "Unauthorized.", + traceId, + }); } const requestController = new AbortController(); @@ -402,15 +441,10 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { }; let payload; - let persistedGeneration: - | { - conversationId: string; - turnId: string; - jobId: string; - runId: string; - attemptId: string; - } - | null = null; + let persistedGeneration: PersistedGenerationContext | null = null; + const createdGeneratedAssetIds: string[] = []; + let createdAssetEdgeIds: string[] = []; + let generationAssetsCommitted = false; try { payload = imageGenerationRequestSchema.parse(request.body); @@ -419,7 +453,11 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { error instanceof ZodError ? "Invalid request payload." : "Request body could not be parsed."; - return reply.code(400).send({ error: message }); + return sendTraceableError(reply, { + statusCode: 400, + error: message, + traceId, + }); } try { @@ -429,8 +467,10 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { payload.conversationId && payload.threadId !== payload.conversationId ) { - return reply.code(400).send({ + return sendTraceableError(reply, { + statusCode: 400, error: "threadId and conversationId must match when both are provided.", + traceId, }); } @@ -439,7 +479,11 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { ? await repository.getConversationById(userId, requestedConversationId) : await repository.getOrCreateActiveConversation(userId); if (!conversation) { - return reply.code(404).send({ error: "Conversation not found." 
}); + return sendTraceableError(reply, { + statusCode: 404, + error: "Conversation not found.", + traceId, + }); } if (payload.retryOfTurnId) { @@ -449,8 +493,10 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { payload.retryOfTurnId ); if (!retryTurnExists) { - return reply.code(400).send({ + return sendTraceableError(reply, { + statusCode: 400, error: "retryOfTurnId does not belong to the selected conversation.", + traceId, }); } } @@ -476,9 +522,10 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { const retryRun = findRetryRun(snapshot.runs, payload.retryOfTurnId); const retryJob = retryRun ? findRetryJob(snapshot.jobs, retryRun) : null; if (!retryRun?.prompt || !retryJob?.requestSnapshot) { - return reply.code(400).send({ - error: - "Exact retry is unavailable because no prior execution snapshot was found.", + return sendTraceableError(reply, { + statusCode: 400, + error: "Exact retry is unavailable because no prior execution snapshot was found.", + traceId, }); } @@ -494,7 +541,11 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { const frontendModel = getFrontendImageModelById(effectivePayload.modelId); if (!frontendModel) { - return reply.code(400).send({ error: `Unsupported modelId: ${effectivePayload.modelId}.` }); + return sendTraceableError(reply, { + statusCode: 400, + error: `Unsupported modelId: ${effectivePayload.modelId}.`, + traceId, + }); } const requestedOperation = resolveRequestedOperation(effectivePayload.assetRefs); @@ -511,8 +562,10 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { const validationResult = compatibilityProbe.safeParse(effectivePayload); if (!validationResult.success) { const firstIssue = validationResult.error.issues[0]; - return reply.code(400).send({ + return sendTraceableError(reply, { + statusCode: 400, error: firstIssue?.message ?? 
"Request is incompatible with selected model.", + traceId, }); } @@ -535,16 +588,19 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { if (effectiveRetryMode === "exact" && payload.retryOfTurnId) { const retryRun = exactRetrySourceRun; if (!retryRun?.prompt) { - return reply.code(400).send({ - error: - "Exact retry is unavailable because no prior prompt snapshot was found.", + return sendTraceableError(reply, { + statusCode: 400, + error: "Exact retry is unavailable because no prior prompt snapshot was found.", + traceId, }); } const exactTarget = findMatchingExactTarget(routeTargets, retryRun); if (!exactTarget) { - return reply.code(400).send({ + return sendTraceableError(reply, { + statusCode: 400, error: "Exact retry target is no longer available. Use recompile retry instead.", + traceId, }); } @@ -602,6 +658,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { rewritePromptVersion = buildPromptVersionRecord({ runId: rewriteRunId, turnId, + traceId, version: 1, stage: "rewrite", compilerVersion: promptContext.compilerVersion, @@ -633,6 +690,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { return buildPromptVersionRecord({ runId: imageRunId, turnId, + traceId, version: index + 1, stage: "compile", targetKey: compiled.targetKey, @@ -741,6 +799,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { createdAt, completedAt: createdAt, telemetry: { + traceId, providerRequestId: null, providerTaskId: null, latencyMs: null, @@ -805,6 +864,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { createdAt, completedAt: null, telemetry: { + traceId, providerRequestId: null, providerTaskId: null, latencyMs: null, @@ -837,6 +897,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { let finalDispatchWarnings = [...initialPromptSnapshot.warnings]; const generated = await imageRuntimeRouter.generate(routingRequest, { signal: 
requestController.signal, + traceId, targets: routeTargets, resolveRequest: async (target) => { dispatchAttempt += 1; @@ -850,6 +911,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { buildPromptVersionRecord({ runId: imageRunId, turnId, + traceId, version: compilePromptVersions.length + dispatchAttempt, stage: "dispatch", targetKey: `${target.provider.id}:${target.deployment.providerModel}`, @@ -893,6 +955,10 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { deploymentId: target.deployment.id, provider: target.provider.id, }, + resolvedAssetRefs: await app.assetService.resolveProviderAssetRefs( + userId, + effectivePayload.assetRefs ?? [] + ), prompt: dispatchedPrompt, negativePrompt: undefined, }; @@ -911,6 +977,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { buildPromptVersionRecord({ runId: imageRunId, turnId, + traceId, version: compilePromptVersions.length + dispatchAttempt, stage: "dispatch", targetKey: compiled.targetKey, @@ -961,6 +1028,10 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { deploymentId: target.deployment.id, provider: target.provider.id, }, + resolvedAssetRefs: await app.assetService.resolveProviderAssetRefs( + userId, + effectivePayload.assetRefs ?? [] + ), prompt: compiled.dispatchedPrompt, negativePrompt: compiled.negativePrompt ?? 
undefined, }; @@ -973,12 +1044,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { async (image, index) => normalizeGeneratedImage(image, index) ); const normalizedResults: Array<{ - resultId: string; - assetId: string; buffer: Buffer; - imageId: string; - imageUrl: string; - privateTokenHash: string; provider: string; model: string; mimeType?: string; @@ -996,14 +1062,8 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { return; } - const capability = createGeneratedImageCapability(); normalizedResults.push({ - resultId: createId("chat-result"), - assetId: createId("thread-asset"), buffer: normalized.buffer, - imageId: capability.imageId, - imageUrl: capability.imageUrl, - privateTokenHash: capability.privateTokenHash, provider: generated.runtimeProvider, model: generated.providerModel, mimeType: normalized.mimeType, @@ -1015,11 +1075,13 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => { normalizationFailureCount += 1; firstNormalizationError ??= result.reason; - app.log.warn( + request.log.warn( { err: result.reason, imageIndex: index, conversationId: persistedGeneration?.conversationId ?? null, + turnId: persistedGeneration?.turnId ?? null, + runId: persistedGeneration?.runId ?? null, }, "Generated image result could not be normalized." 
      );
@@ -1027,12 +1089,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
       const normalizedImages = normalizedResults.reduce<
         Array<{
-          resultId: string;
-          assetId: string;
           buffer: Buffer;
-          imageId: string;
-          imageUrl: string;
-          privateTokenHash: string;
           provider: string;
           model: string;
           mimeType?: string;
@@ -1048,10 +1105,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         return accumulator;
       }, []);

-      const firstImageUrl = normalizedImages[0]?.imageUrl;
-      const firstImageId = normalizedImages[0]?.imageId;
-
-      if (!firstImageUrl) {
+      if (normalizedImages.length === 0) {
         if (firstNormalizationError) {
           throw firstNormalizationError;
         }
@@ -1071,15 +1125,60 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         finalPromptSnapshot,
         normalizedImages[0]?.revisedPrompt ?? null
       );
-      const assets = normalizedImages.map((image, index) => ({
+      const assetizedImages: Array<{
+        resultId: string;
+        assetId: string;
+        imageUrl: string;
+        thumbnailUrl: string;
+        created: boolean;
+        provider: string;
+        model: string;
+        mimeType?: string;
+        revisedPrompt: string | null;
+        index: number;
+      }> = [];
+      for (const [index, image] of normalizedImages.entries()) {
+        const createdAsset = await app.assetService.createGeneratedAsset({
+          userId,
+          name: `generated-${turnId}-${index + 1}`,
+          mimeType: image.mimeType ?? "image/png",
+          buffer: image.buffer,
+          createdAt: completedAt,
+          source: "ai-generated",
+          origin: "ai",
+          metadata: {
+            runtimeProvider: image.provider,
+            providerModel: image.model,
+            revisedPrompt: image.revisedPrompt ?? null,
+            index,
+          },
+        });
+        if (createdAsset.created) {
+          createdGeneratedAssetIds.push(createdAsset.assetId);
+        }
+
+        assetizedImages.push({
+          resultId: createId("chat-result"),
+          assetId: createdAsset.assetId,
+          imageUrl: createdAsset.objectUrl,
+          thumbnailUrl: createdAsset.thumbnailUrl,
+          created: createdAsset.created,
+          provider: image.provider,
+          model: image.model,
+          mimeType: createdAsset.type,
+          revisedPrompt: image.revisedPrompt,
+          index: image.index,
+        });
+      }
+      const assets = assetizedImages.map((image, index) => ({
         id: image.assetId,
         turnId,
         runId: imageRunId,
         assetType: "image" as const,
         label: `Generated image ${index + 1}`,
         metadata: {
-          imageId: image.imageId,
           imageUrl: image.imageUrl,
+          thumbnailUrl: image.thumbnailUrl,
           mimeType: image.mimeType ?? null,
           runtimeProvider: image.provider,
           providerModel: image.model,
@@ -1090,7 +1189,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         {
           id: createId("thread-locator"),
           assetId: image.assetId,
-          locatorType: "generated_image_store" as const,
+          locatorType: "remote_url" as const,
           locatorValue: image.imageUrl,
           mimeType: image.mimeType,
           expiresAt: null,
@@ -1109,6 +1208,13 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
           createdAt: completedAt,
         }))
       );
+      await app.assetService.createAssetEdges(
+        assetEdges.map((edge) => ({
+          ...edge,
+          conversationId: conversation.id,
+        }))
+      );
+      createdAssetEdgeIds = assetEdges.map((edge) => edge.id);
       const executedTarget = createRunTargetSnapshot({
         modelId: generated.modelId,
         logicalModel: generated.logicalModel,
@@ -1127,6 +1233,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         buildPromptVersionRecord({
           runId: imageRunId,
           turnId,
+          traceId,
           version: compilePromptVersions.length + dispatchAttempt + 1,
           stage: "dispatch",
           targetKey: `${generated.runtimeProvider}:${generated.providerModel}`,
@@ -1185,6 +1292,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         createdAt,
         completedAt: createdAt,
         telemetry: {
+          traceId,
           providerRequestId: null,
           providerTaskId: null,
           latencyMs: null,
@@ -1204,30 +1312,19 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         providerRequestId: generated.providerRequestId,
         providerTaskId: generated.providerTaskId,
         warnings: mergedWarnings,
-        generatedImages: normalizedImages.map((image) => ({
-          id: image.imageId,
-          ownerUserId: userId,
-          conversationId: conversation.id,
-          turnId,
-          mimeType: image.mimeType ?? "image/png",
-          sizeBytes: image.buffer.byteLength,
-          blobData: image.buffer,
-          visibility: "private",
-          privateTokenHash: image.privateTokenHash,
-          createdAt: completedAt,
-        })),
-        results: normalizedImages.map((image, index) => ({
+        generatedImages: [],
+        results: assetizedImages.map((image, index) => ({
           id: image.resultId,
           imageUrl: image.imageUrl,
-          imageId: image.imageId,
+          imageId: null,
           threadAssetId: image.assetId,
           runtimeProvider: image.provider,
           providerModel: image.model,
           mimeType: image.mimeType,
           revisedPrompt: image.revisedPrompt,
           index,
-          assetId: null,
-          saved: false,
+          assetId: image.assetId,
+          saved: true,
         })),
         assets,
         assetEdges,
@@ -1238,6 +1335,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         referencedAssetIds: effectivePayload.assetRefs?.map((assetRef) => assetRef.assetId) ?? [],
         telemetry: {
+          traceId,
           providerRequestId: generated.providerRequestId ?? null,
           providerTaskId: generated.providerTaskId ?? null,
           latencyMs: Date.now() - startedAt,
@@ -1246,6 +1344,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         },
         completedAt,
       });
+      generationAssetsCommitted = true;
       const imageRunRecord: PersistedRunRecord = {
         id: imageRunId,
@@ -1271,6 +1370,7 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         createdAt,
         completedAt,
         telemetry: {
+          traceId,
           providerRequestId: generated.providerRequestId ?? null,
           providerTaskId: generated.providerTaskId ?? null,
           latencyMs: Date.now() - startedAt,
@@ -1283,18 +1383,17 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
         turnId,
         jobId,
         runId: imageRunId,
+        traceId,
         modelId: generated.modelId,
         logicalModel: generated.logicalModel,
         deploymentId: generated.deploymentId,
         runtimeProvider: generated.runtimeProvider,
         providerModel: generated.providerModel,
         createdAt: completedAt,
-        imageId: firstImageId,
-        imageUrl: firstImageUrl,
-        images: normalizedImages.map((image) => ({
+        imageUrl: assetizedImages[0]?.imageUrl,
+        images: assetizedImages.map((image) => ({
           resultId: image.resultId,
           assetId: image.assetId,
-          imageId: image.imageId,
           imageUrl: image.imageUrl,
           provider: image.provider,
           model: image.model,
@@ -1308,8 +1407,10 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
       });
     } catch (error) {
       if (error instanceof ChatPromptStateConflictError) {
-        return reply.code(409).send({
+        return sendTraceableError(reply, {
+          statusCode: 409,
           error: "Conversation state changed during prompt compilation. Please retry.",
+          traceId,
         });
       }
@@ -1331,40 +1432,67 @@ export const imageGenerateRoute: FastifyPluginAsync = async (app) => {
           completedAt: new Date().toISOString(),
         });
       } catch (persistenceError) {
-        app.log.error(persistenceError);
+        request.log.error(
+          {
+            err: persistenceError,
+            conversationId: persistedGeneration.conversationId,
+            turnId: persistedGeneration.turnId,
+            jobId: persistedGeneration.jobId,
+            runId: persistedGeneration.runId,
+          },
+          "Failed to persist image generation failure state."
+        );
        }
      }

+      if (!generationAssetsCommitted && createdAssetEdgeIds.length > 0) {
+        await app.assetService.deleteAssetEdges(createdAssetEdgeIds).catch(() => undefined);
+      }
+      if (!generationAssetsCommitted && createdGeneratedAssetIds.length > 0) {
+        await Promise.allSettled(
+          createdGeneratedAssetIds.map((assetId) =>
+            app.assetService.deleteAsset(userId, assetId)
+          )
+        );
+      }
+
       if (requestController.signal.aborted || reply.raw.destroyed) {
         return reply;
       }

       if (error instanceof ProviderError) {
-        return reply.code(error.statusCode).send({
+        request.log.warn(
+          {
+            err: error,
+            conversationId: persistedGeneration?.conversationId ?? null,
+            turnId: persistedGeneration?.turnId ?? null,
+            jobId: persistedGeneration?.jobId ?? null,
+            runId: persistedGeneration?.runId ?? null,
+          },
+          "Image generation failed with a provider error."
+        );
+        return sendTraceableError(reply, {
+          statusCode: error.statusCode,
           error: error.message,
-          ...(persistedGeneration
-            ? {
-                conversationId: persistedGeneration.conversationId,
-                threadId: persistedGeneration.conversationId,
-                turnId: persistedGeneration.turnId,
-                jobId: persistedGeneration.jobId,
-                runId: persistedGeneration.runId,
-              }
-            : {}),
+          traceId,
+          persistedGeneration,
         });
       }

-      app.log.error(error);
-      return reply.code(500).send({
+      request.log.error(
+        {
+          err: error,
+          conversationId: persistedGeneration?.conversationId ?? null,
+          turnId: persistedGeneration?.turnId ?? null,
+          jobId: persistedGeneration?.jobId ?? null,
+          runId: persistedGeneration?.runId ?? null,
+        },
+        "Image generation failed."
+      );
+      return sendTraceableError(reply, {
+        statusCode: 500,
        error: "Image generation failed.",
-        ...(persistedGeneration
-          ?
{
-                conversationId: persistedGeneration.conversationId,
-                threadId: persistedGeneration.conversationId,
-                turnId: persistedGeneration.turnId,
-                jobId: persistedGeneration.jobId,
-                runId: persistedGeneration.runId,
-              }
-            : {}),
+        traceId,
+        persistedGeneration,
       });
     } finally {
       request.raw.removeListener("aborted", abortRequest);
diff --git a/server/src/shared/generatedImageCapability.ts b/server/src/shared/generatedImageCapability.ts
index 5ea6f7e..1c61e8d 100644
--- a/server/src/shared/generatedImageCapability.ts
+++ b/server/src/shared/generatedImageCapability.ts
@@ -1,4 +1,5 @@
-import { createHash, randomBytes, randomUUID } from "node:crypto";
+import { createHash, randomBytes } from "node:crypto";
+import { createId } from "../../../shared/createId";

 export type GeneratedImageVisibility = "private";

@@ -18,7 +19,7 @@ export const buildGeneratedImageUrl = (imageId: string, token: string) =>
   `/api/generated-images/${encodeURIComponent(imageId)}?token=${encodeURIComponent(token)}`;

 export const createGeneratedImageCapability = (): GeneratedImageCapability => {
-  const imageId = randomUUID();
+  const imageId = createId("generated-image");
   const privateToken = buildCapabilityToken();
   return {
     imageId,
diff --git a/server/src/shared/imageGenerationCapabilityFacts.test.ts b/server/src/shared/imageGenerationCapabilityFacts.test.ts
index 66bc1cd..483d6da 100644
--- a/server/src/shared/imageGenerationCapabilityFacts.test.ts
+++ b/server/src/shared/imageGenerationCapabilityFacts.test.ts
@@ -6,9 +6,6 @@ import {
   validateImageGenerationRequestAgainstModel,
 } from "./imageGenerationSchema";

-const createDataUrl = (byteLength: number) =>
-  `data:image/png;base64,${Buffer.alloc(byteLength, 1).toString("base64")}`;
-
 describe("image generation capability facts", () => {
   it("stay aligned across frontend model registry, catalog, and server validation", () => {
     const frontendModel = getFrontendImageModelById("qwen-image-2-pro");
@@ -47,13 +44,6 @@ describe("image generation capability facts", () => {
       width: 2048,
       height: 1024,
       style: "cinematic",
-      referenceImages: [
-        {
-          id: "ref-1",
-          url: createDataUrl(1024),
-          type: "content",
-        },
-      ],
       negativePrompt: "avoid blur",
       seed: 42,
       guidanceScale: 11,
@@ -62,7 +52,7 @@ describe("image generation capability facts", () => {
       modelParams: {
         promptExtend: true,
       },
-      assetRefs: [{ assetId: "thread-asset-1", role: "edit" }],
+      assetRefs: [{ assetId: "thread-asset-1", role: "edit", referenceType: "content" }],
     });

     expect(validationResult.success).toBe(false);
@@ -75,7 +65,7 @@ describe("image generation capability facts", () => {
     expect(issuePaths).toContain("steps");
   });

-  it("rejects oversized qwen reference images during compatibility validation", () => {
+  it("rejects weighted qwen reference assets during compatibility validation", () => {
     const frontendModel = getFrontendImageModelById("qwen-image-2-pro");
     expect(frontendModel).not.toBeNull();
     if (!frontendModel) {
@@ -91,11 +81,12 @@ describe("image generation capability facts", () => {
       modelId: "qwen-image-2-pro",
       aspectRatio: "1:1",
       style: "none",
-      referenceImages: [
+      assetRefs: [
         {
-          id: "ref-oversized",
-          url: createDataUrl(10 * 1024 * 1024 + 1),
-          type: "content",
+          assetId: "asset-ref-1",
+          role: "reference",
+          referenceType: "content",
+          weight: 0.5,
         },
       ],
       batchSize: 1,
@@ -112,7 +103,7 @@ describe("image generation capability facts", () => {
     expect(validationResult.error.issues).toEqual(
       expect.arrayContaining([
         expect.objectContaining({
-          path: ["referenceImages", 0, "url"],
+          path: ["assetRefs", 0, "weight"],
         }),
       ])
     );
diff --git a/server/src/shared/imageGenerationCapabilityWarnings.ts b/server/src/shared/imageGenerationCapabilityWarnings.ts
index 6c80d25..162dfa5 100644
--- a/server/src/shared/imageGenerationCapabilityWarnings.ts
+++ b/server/src/shared/imageGenerationCapabilityWarnings.ts
@@ -10,8 +10,11 @@ export const getImageGenerationCapabilityWarnings = (
   }

   const warnings: string[] = [];
-  if (!frontendModel.constraints.referenceImages.enabled && request.referenceImages.length > 0) {
-    const count = request.referenceImages.length;
+  const guidedAssetCount = (request.assetRefs ?? []).filter(
+    (assetRef) => assetRef.role === "reference"
+  ).length;
+  if (!frontendModel.constraints.referenceImages.enabled && guidedAssetCount > 0) {
+    const count = guidedAssetCount;
     warnings.push(
       `${frontendModel.label} ignores ${count} reference image${count === 1 ? "" : "s"}.`
     );
diff --git a/server/src/shared/imageGenerationSchema.ts b/server/src/shared/imageGenerationSchema.ts
index 365f169..47cf2b3 100644
--- a/server/src/shared/imageGenerationSchema.ts
+++ b/server/src/shared/imageGenerationSchema.ts
@@ -31,18 +31,6 @@ const appendIssue = (
   });
 };

-const estimateDataUrlBytes = (value: string): number | null => {
-  const trimmedValue = value.trim();
-  const match = /^data:[^;,]+;base64,([A-Za-z0-9+/=]+)$/i.exec(trimmedValue);
-  if (!match?.[1]) {
-    return null;
-  }
-
-  const encoded = match[1];
-  const padding = encoded.endsWith("==") ? 2 : encoded.endsWith("=") ? 1 : 0;
-  return Math.max(0, Math.floor((encoded.length * 3) / 4) - padding);
-};
-
 export const validateImageGenerationRequestAgainstModel = (
   payload: ParsedImageGenerationRequest,
   frontendModel: FrontendModelSpec,
@@ -127,21 +115,25 @@ export const validateImageGenerationRequestAgainstModel = (
     }
   }

+  const referenceAssetRefs = payload.assetRefs.filter((assetRef) => assetRef.role === "reference");
   if (capability.referenceImages.enabled) {
-    if (payload.referenceImages.length > capability.referenceImages.maxImages) {
+    if (referenceAssetRefs.length > capability.referenceImages.maxImages) {
       appendIssue(
         ctx,
-        ["referenceImages"],
+        ["assetRefs"],
         `${label} supports at most ${capability.referenceImages.maxImages} reference images.`
       );
     }

-    payload.referenceImages.forEach((referenceImage, index) => {
-      if (!capability.referenceImages.supportedTypes.includes(referenceImage.type)) {
+    referenceAssetRefs.forEach((referenceImage, index) => {
+      if (
+        referenceImage.referenceType &&
+        !capability.referenceImages.supportedTypes.includes(referenceImage.referenceType)
+      ) {
         appendIssue(
           ctx,
-          ["referenceImages", index, "type"],
-          `${label} does not support reference image type ${referenceImage.type}.`
+          ["assetRefs", index, "referenceType"],
+          `${label} does not support reference image type ${referenceImage.referenceType}.`
         );
       }

@@ -152,31 +144,15 @@ export const validateImageGenerationRequestAgainstModel = (
       ) {
         appendIssue(
           ctx,
-          ["referenceImages", index, "weight"],
+          ["assetRefs", index, "weight"],
           `${label} does not support reference image weights.`
         );
       }
-
-      if (typeof capability.referenceImages.maxFileSizeBytes === "number") {
-        const estimatedBytes = estimateDataUrlBytes(referenceImage.url);
-        if (
-          typeof estimatedBytes === "number" &&
-          estimatedBytes > capability.referenceImages.maxFileSizeBytes
-        ) {
-          appendIssue(
-            ctx,
-            ["referenceImages", index, "url"],
-            `${label} reference images must be ${Math.round(
-              capability.referenceImages.maxFileSizeBytes / 1024 / 1024
-            )} MB or smaller.`
-          );
-        }
-      }
     });
-  } else if (payload.referenceImages.length > 0) {
+  } else if (referenceAssetRefs.length > 0) {
     appendIssue(
       ctx,
-      ["referenceImages"],
+      ["assetRefs"],
       `${label} does not support reference images.`
     );
   }
diff --git a/server/src/shared/requestTrace.ts b/server/src/shared/requestTrace.ts
new file mode 100644
index 0000000..8619654
--- /dev/null
+++ b/server/src/shared/requestTrace.ts
@@ -0,0 +1,41 @@
+import type { FastifyReply, FastifyRequest } from "fastify";
+import { createId } from "../../../shared/createId";
+
+export const REQUEST_ID_HEADER = "x-request-id";
+const MAX_REQUEST_TRACE_ID_LENGTH = 128;
+const TRUSTED_REQUEST_TRACE_ID_PATTERN = /^[A-Za-z0-9][A-Za-z0-9._:-]{0,127}$/;
+
+const normalizeHeaderValue = (value: string | string[] | undefined) => {
+  const normalized =
+    typeof value === "string"
+      ? value.trim()
+      : Array.isArray(value)
+        ? value[0]?.trim()
+        : "";
+
+  return normalized ? normalized : null;
+};
+
+export const createRequestTraceId = (
+  headers: Record<string, string | string[] | undefined>,
+  options?: { trustProxyRequestId?: boolean }
+) => {
+  if (options?.trustProxyRequestId) {
+    const headerValue = normalizeHeaderValue(headers[REQUEST_ID_HEADER]);
+    if (
+      headerValue &&
+      headerValue.length <= MAX_REQUEST_TRACE_ID_LENGTH &&
+      TRUSTED_REQUEST_TRACE_ID_PATTERN.test(headerValue)
+    ) {
+      return headerValue;
+    }
+  }
+
+  return createId("req");
+};
+
+export const getRequestTraceId = (request: FastifyRequest) => request.id;
+
+export const attachTraceIdHeader = (reply: FastifyReply, traceId: string) => {
+  reply.header(REQUEST_ID_HEADER, traceId);
+};
diff --git a/shared/chatImageTypes.ts b/shared/chatImageTypes.ts
index c29a7d6..1629dde 100644
--- a/shared/chatImageTypes.ts
+++ b/shared/chatImageTypes.ts
@@ -126,6 +126,7 @@ export interface PersistedPromptArtifactRecord {
   id: string;
   runId: string;
   turnId: string;
+  traceId: string | null;
   version: number;
   stage: PersistedPromptArtifactStage;
   targetKey: string | null;
@@ -234,6 +235,7 @@ export interface PersistedPromptSnapshot {
 }

 export interface PersistedRunTelemetry {
+  traceId: string | null;
   providerRequestId: string | null;
   providerTaskId: string | null;
   latencyMs: number | null;
diff --git a/shared/createId.ts b/shared/createId.ts
new file mode 100644
index 0000000..319b12c
--- /dev/null
+++ b/shared/createId.ts
@@ -0,0 +1,16 @@
+let fallbackCounter = 0;
+
+const createFallbackId = () => {
+  fallbackCounter += 1;
+  return `${Date.now().toString(36)}-${fallbackCounter.toString(36)}`;
+};
+
+export const createId = (prefix?: string) => {
+  const baseId =
+    typeof globalThis.crypto !== "undefined" && typeof globalThis.crypto.randomUUID === "function"
+      ? globalThis.crypto.randomUUID()
+      : createFallbackId();
+  const normalizedPrefix = typeof prefix === "string" ? prefix.trim() : "";
+
+  return normalizedPrefix.length > 0 ? `${normalizedPrefix}-${baseId}` : baseId;
+};
diff --git a/shared/imageGeneration.ts b/shared/imageGeneration.ts
index 68dfa6f..d561758 100644
--- a/shared/imageGeneration.ts
+++ b/shared/imageGeneration.ts
@@ -109,6 +109,8 @@ export interface ReferenceImage {
 export interface ImageGenerationAssetRef {
   assetId: string;
   role: ImageGenerationAssetRefRole;
+  referenceType?: ReferenceImageType;
+  weight?: number;
 }

 export interface ImageAssetRefValidationIssue {
@@ -195,7 +197,7 @@ export interface GeneratedImage {
   resultId?: string;
   imageUrl: string;
   imageId?: string;
-  assetId?: string;
+  assetId: string;
   provider: ImageProviderId;
   model: string;
   mimeType?: string;
@@ -215,6 +217,7 @@ export interface ImageGenerationResponse {
   turnId: string;
   jobId: string;
   runId: string;
+  traceId: string;
   modelId: import("./imageModelCatalog").FrontendImageModelId;
   logicalModel: import("./imageModelCatalog").LogicalImageModelId;
   deploymentId: import("./imageModelCatalog").ImageDeploymentId;
diff --git a/shared/imageGenerationSchema.ts b/shared/imageGenerationSchema.ts
index e93d776..bc630bf 100644
--- a/shared/imageGenerationSchema.ts
+++ b/shared/imageGenerationSchema.ts
@@ -52,6 +52,8 @@ export const requestedImageGenerationTargetSchema = z.object({
 export const imageGenerationAssetRefSchema = z.object({
   assetId: z.string().trim().min(1),
   role: imageGenerationAssetRefRoleSchema.default("reference"),
+  referenceType: referenceImageTypeSchema.default("content"),
+  weight: z.number().min(0).max(1).optional(),
 });

 export const imagePromptIntentEditOpSchema = z.object({
diff --git a/src/features/canvas/CanvasAppBar.tsx b/src/features/canvas/CanvasAppBar.tsx
index d23261c..d9c1efc 100644
--- a/src/features/canvas/CanvasAppBar.tsx
+++ b/src/features/canvas/CanvasAppBar.tsx
@@ -1,22 +1,16 @@
-import { useNavigate } from "@tanstack/react-router";
 import { CirclePlus, Download, Redo2, Undo2 } from "lucide-react";
 import { Input } from "@/components/ui/input";
 import { useCanvasStore } from "@/stores/canvasStore";
 import { useCanvasHistory } from "./hooks/useCanvasHistory";
+import { useCanvasWorkbenchActions } from "./hooks/useCanvasWorkbenchActions";

 interface CanvasAppBarProps {
   onExport: () => void;
 }

 export function CanvasAppBar({ onExport }: CanvasAppBarProps) {
-  const navigate = useNavigate();
-  const activeWorkbenchId = useCanvasStore((state) => state.activeWorkbenchId);
-  const activeWorkbenchName = useCanvasStore((state) => {
-    const activeWorkbench = state.workbenches.find((entry) => entry.id === state.activeWorkbenchId);
-    return activeWorkbench?.name ?? "";
-  });
-  const createWorkbench = useCanvasStore((state) => state.createWorkbench);
-  const upsertWorkbench = useCanvasStore((state) => state.upsertWorkbench);
+  const { activeWorkbenchId, activeWorkbenchMeta, createWorkbenchAndNavigate, renameActiveWorkbench } =
+    useCanvasWorkbenchActions();
   const zoom = useCanvasStore((state) => state.zoom);
   const { canUndo, canRedo, undo, redo } = useCanvasHistory();

@@ -25,19 +19,9 @@ export function CanvasAppBar({ onExport }: CanvasAppBarProps) {
{activeWorkbenchId ? ( { - const currentWorkbench = useCanvasStore - .getState() - .workbenches.find((entry) => entry.id === activeWorkbenchId); - if (!currentWorkbench) { - return; - } - - void upsertWorkbench({ - ...currentWorkbench, - name: event.target.value || "\u672a\u547d\u540d\u5de5\u4f5c\u53f0", - }); + void renameActiveWorkbench(event.target.value); }} className="h-8 w-[220px] rounded-lg border-white/10 bg-white/[0.06] px-2.5 text-sm text-zinc-100 placeholder:text-zinc-500" placeholder="\u5de5\u4f5c\u53f0\u540d\u79f0" @@ -48,16 +32,7 @@ export function CanvasAppBar({ onExport }: CanvasAppBarProps) { ))} -
+ + ) : (
state.upsertElement); - const setElementDraftAdjustments = useCanvasRuntimeStore( - (state) => state.setElementDraftAdjustments - ); - const clearElementDraftAdjustments = useCanvasRuntimeStore( - (state) => state.clearElementDraftAdjustments - ); - const requestBoardPreview = useCanvasRuntimeStore((state) => state.requestBoardPreview); + const { + clearElementDraftAdjustments, + requestBoardPreview, + setElementDraftAdjustments, + } = useCanvasPreviewActions(); const [openSections, setOpenSections] = useState(createInitialOpenSections); const { activeWorkbench, committedSelectedElementIds, primarySelectedImageElement: imageElement } = useCanvasSelectionModel(); + const { setAdjustments } = useCanvasImagePropertyActions(imageElement); const committedImageElement = useMemo( () => resolvePrimarySelectedImageElement(activeWorkbench, committedSelectedElementIds), [activeWorkbench, committedSelectedElementIds] ); + const displayedImageElementId = imageElement?.id ?? null; const committedImageElementId = committedImageElement?.id ?? null; const committedImageElementIdRef = useRef(committedImageElementId); - - const asset = useAssetStore((state) => - imageElement - ? (state.assets.find((candidate) => candidate.id === imageElement.assetId) ?? null) - : null - ); - - const draftAdjustments = useCanvasRuntimeStore((state) => - imageElement ? state.draftAdjustmentsByElementId[imageElement.id] : undefined - ); + const displayedImageElementIdRef = useRef(displayedImageElementId); + const { asset } = useCanvasRuntimeAsset(imageElement?.assetId ?? null); + const draftAdjustments = useCanvasElementDraftAdjustments(imageElement?.id ?? 
null); const adjustments = useMemo( () => @@ -559,9 +554,32 @@ export function CanvasImageEditPanel({ children }: CanvasImageEditPanelProps) { committedImageElementIdRef.current = committedImageElementId; }, [clearElementDraftAdjustments, committedImageElementId]); + useEffect(() => { + const previousDisplayedImageElementId = displayedImageElementIdRef.current; + if ( + previousDisplayedImageElementId && + previousDisplayedImageElementId !== displayedImageElementId && + previousDisplayedImageElementId !== committedImageElementId + ) { + clearElementDraftAdjustments(previousDisplayedImageElementId); + } + displayedImageElementIdRef.current = displayedImageElementId; + }, [ + clearElementDraftAdjustments, + committedImageElementId, + displayedImageElementId, + ]); + useEffect( () => () => { const currentCommittedImageElementId = committedImageElementIdRef.current; + const currentDisplayedImageElementId = displayedImageElementIdRef.current; + if ( + currentDisplayedImageElementId && + currentDisplayedImageElementId !== currentCommittedImageElementId + ) { + clearElementDraftAdjustments(currentDisplayedImageElementId); + } if (currentCommittedImageElementId) { clearElementDraftAdjustments(currentCommittedImageElementId); } @@ -586,10 +604,7 @@ export function CanvasImageEditPanel({ children }: CanvasImageEditPanelProps) { return; } setElementDraftAdjustments(imageElement.id, nextAdjustments); - await upsertElement({ - ...imageElement, - adjustments: nextAdjustments, - }); + await setAdjustments(nextAdjustments); clearElementDraftAdjustments(imageElement.id); void requestBoardPreview(imageElement.id, "interactive"); }, @@ -598,7 +613,7 @@ export function CanvasImageEditPanel({ children }: CanvasImageEditPanelProps) { imageElement, requestBoardPreview, setElementDraftAdjustments, - upsertElement, + setAdjustments, ] ); diff --git a/src/features/canvas/CanvasLayerPanel.tsx b/src/features/canvas/CanvasLayerPanel.tsx index e74754b..2346eae 100644 --- 
a/src/features/canvas/CanvasLayerPanel.tsx +++ b/src/features/canvas/CanvasLayerPanel.tsx @@ -1,9 +1,7 @@ import { Eye, EyeOff, GripVertical, Layers3, Lock, Trash2, Unlock } from "lucide-react"; -import { memo, useCallback, useState } from "react"; +import { memo } from "react"; import { cn } from "@/lib/utils"; -import { useCanvasStore } from "@/stores/canvasStore"; import type { Asset, CanvasRenderableNode } from "@/types"; -import { getCanvasDescendantIds } from "./documentGraph"; import { canvasDockActionChipClassName, canvasDockBadgeClassName, @@ -19,9 +17,7 @@ import { canvasDockSectionClassName, canvasDockSectionMutedClassName, } from "./editDockTheme"; -import { useCanvasSelectionModel } from "./hooks/useCanvasSelectionModel"; -import { useCanvasInteraction } from "./hooks/useCanvasInteraction"; -import { useCanvasLayers } from "./hooks/useCanvasLayers"; +import { useCanvasLayerPanelModel } from "./hooks/useCanvasLayerPanelModel"; interface LayerRowProps { asset: Asset | null; @@ -154,141 +150,20 @@ const LayerRow = memo(function LayerRow({ export function CanvasLayerPanel() { const { - activeWorkbench, - layers, assetById, - activeWorkbenchId, - reparentNodes, - reorderElements, - toggleElementVisibility, - toggleElementLock, - deleteElements, - } = useCanvasLayers(); - const groupElements = useCanvasStore((state) => state.groupElements); - const ungroupElement = useCanvasStore((state) => state.ungroupElement); - const { displaySelectedElementIdSet, displaySelectedElementIds, primarySelectedElement } = - useCanvasSelectionModel(); - const { selectElement } = useCanvasInteraction(); - const [draggingId, setDraggingId] = useState(null); - - const reorder = useCallback( - (fromId: string, toId: string) => { - if (!activeWorkbenchId || fromId === toId) { - return; - } - const fromLayer = layers.find((layer) => layer.id === fromId); - const toLayer = layers.find((layer) => layer.id === toId); - if (!fromLayer || !toLayer) { - return; - } - - const 
targetParentId = toLayer.type === "group" ? toLayer.id : (toLayer.parentId ?? null); - if (targetParentId === fromId) { - return; - } - if ( - fromLayer.type === "group" && - activeWorkbench && - targetParentId && - getCanvasDescendantIds(activeWorkbench, fromLayer.id).includes(targetParentId) - ) { - return; - } - - if (fromLayer.parentId !== targetParentId) { - const targetSiblingIds = layers - .filter((layer) => layer.parentId === targetParentId) - .map((layer) => layer.id); - const targetIndex = - toLayer.type === "group" ? targetSiblingIds.length : targetSiblingIds.indexOf(toLayer.id); - void reparentNodes( - [fromId], - targetParentId, - targetIndex < 0 ? undefined : targetIndex - ); - return; - } - - const siblingIds = layers - .filter((layer) => layer.parentId === targetParentId) - .map((layer) => layer.id); - const fromIndex = siblingIds.indexOf(fromId); - const toIndex = siblingIds.indexOf(toId); - if (fromIndex < 0 || toIndex < 0) { - return; - } - const ordered = siblingIds.slice(); - const [moved] = ordered.splice(fromIndex, 1); - ordered.splice(toIndex, 0, moved); - void reorderElements(ordered.reverse(), targetParentId); - }, - [activeWorkbench, activeWorkbenchId, layers, reorderElements, reparentNodes] - ); - - const handleSelect = useCallback( - (layerId: string, additive: boolean) => { - selectElement(layerId, { additive }); - }, - [selectElement] - ); - - const handleDelete = useCallback( - (layerId: string) => { - if (!activeWorkbenchId) { - return; - } - void deleteElements([layerId]); - }, - [activeWorkbenchId, deleteElements] - ); - - const handleToggleVisibility = useCallback( - (layerId: string) => { - if (!activeWorkbenchId) { - return; - } - void toggleElementVisibility(layerId); - }, - [activeWorkbenchId, toggleElementVisibility] - ); - - const handleToggleLock = useCallback( - (layerId: string) => { - if (!activeWorkbenchId) { - return; - } - void toggleElementLock(layerId); - }, - [activeWorkbenchId, toggleElementLock] - ); - - const 
handleDragStart = useCallback((layerId: string) => { - setDraggingId(layerId); - }, []); - - const handleDrop = useCallback( - (layerId: string) => { - if (draggingId) { - reorder(draggingId, layerId); - } - setDraggingId(null); - }, - [draggingId, reorder] - ); - - const handleGroup = useCallback(() => { - if (!activeWorkbenchId || displaySelectedElementIds.length < 2) { - return; - } - void groupElements(displaySelectedElementIds); - }, [activeWorkbenchId, displaySelectedElementIds, groupElements]); - - const handleUngroup = useCallback(() => { - if (!activeWorkbenchId || primarySelectedElement?.type !== "group") { - return; - } - void ungroupElement(primarySelectedElement.id); - }, [activeWorkbenchId, primarySelectedElement, ungroupElement]); + displaySelectedElementIdSet, + displaySelectedElementIds, + handleDelete, + handleDragStart, + handleDrop, + handleGroup, + handleSelect, + handleToggleLock, + handleToggleVisibility, + handleUngroup, + layers, + primarySelectedElement, + } = useCanvasLayerPanelModel(); return (
diff --git a/src/features/canvas/CanvasPropertiesPanel.tsx b/src/features/canvas/CanvasPropertiesPanel.tsx index b88d6c4..a399d21 100644 --- a/src/features/canvas/CanvasPropertiesPanel.tsx +++ b/src/features/canvas/CanvasPropertiesPanel.tsx @@ -1,4 +1,4 @@ -import { memo, useMemo } from "react"; +import { memo } from "react"; import { SlidersHorizontal } from "lucide-react"; import { filmProfiles } from "@/data/filmProfiles"; import { Input } from "@/components/ui/input"; @@ -10,9 +10,7 @@ import { SelectValue, } from "@/components/ui/select"; import { cn } from "@/lib/utils"; -import { useAssetStore } from "@/stores/assetStore"; -import { useCanvasStore } from "@/stores/canvasStore"; -import type { CanvasElement } from "@/types"; +import type { CanvasTextFontSizeTier } from "@/types"; import { canvasDockBodyTextClassName, canvasDockEmptyStateClassName, @@ -27,15 +25,10 @@ import { canvasDockSectionClassName, canvasDockSectionMutedClassName, } from "./editDockTheme"; -import { useCanvasSelectionModel } from "./hooks/useCanvasSelectionModel"; import { - applyCanvasTextFontSizeTier, - CANVAS_TEXT_COLOR_OPTIONS, - CANVAS_TEXT_FONT_OPTIONS, CANVAS_TEXT_SIZE_TIER_OPTIONS, - getCanvasTextColorOption, - getCanvasTextFontOption, } from "./textStyle"; +import { useCanvasPropertiesPanelModel } from "./hooks/useCanvasPropertiesPanelModel"; interface CanvasPropertiesPanelProps { variant?: "embedded" | "standalone"; @@ -73,46 +66,28 @@ const NumberInput = ({ export const CanvasPropertiesPanel = memo(function CanvasPropertiesPanel({ variant = "standalone", }: CanvasPropertiesPanelProps) { - const upsertElement = useCanvasStore((state) => state.upsertElement); - const assets = useAssetStore((state) => state.assets); - const { activeWorkbench, primarySelectedElement: selected } = useCanvasSelectionModel(); - - const update = (patch: Partial) => { - if (!selected) { - return; - } - void upsertElement({ - ...selected, - ...patch, - } as CanvasElement); - }; - - const selectedAsset = 
useMemo(() => { - if (!selected || selected.type !== "image") { - return null; - } - return assets.find((asset) => asset.id === selected.assetId) ?? null; - }, [assets, selected]); - - const textFontOptions = useMemo(() => { - if (!selected || selected.type !== "text") { - return CANVAS_TEXT_FONT_OPTIONS; - } - const current = getCanvasTextFontOption(selected.fontFamily); - return CANVAS_TEXT_FONT_OPTIONS.some((option) => option.value === current.value) - ? CANVAS_TEXT_FONT_OPTIONS - : [...CANVAS_TEXT_FONT_OPTIONS, current]; - }, [selected]); - - const textColorOptions = useMemo(() => { - if (!selected || selected.type !== "text") { - return CANVAS_TEXT_COLOR_OPTIONS; - } - const current = getCanvasTextColorOption(selected.color); - return CANVAS_TEXT_COLOR_OPTIONS.some((option) => option.value === current.value) - ? CANVAS_TEXT_COLOR_OPTIONS - : [...CANVAS_TEXT_COLOR_OPTIONS, current]; - }, [selected]); + const { + activeWorkbench, + selected, + selectedAsset, + setFilmProfileId, + setFill, + setFontFamily, + setFontSizeTier, + setHeight, + setOpacity, + setRotation, + setStroke, + setStrokeWidth, + setTextAlign, + setTextColor, + setTextContent, + setWidth, + setX, + setY, + textColorOptions, + textFontOptions, + } = useCanvasPropertiesPanelModel(); const isEmbedded = variant === "embedded"; return ( @@ -184,33 +159,18 @@ export const CanvasPropertiesPanel = memo(function CanvasPropertiesPanel({

Transform

- update({ x: value })} /> - update({ y: value })} /> - update({ width: Math.max(1, value) })} - /> - update({ height: Math.max(1, value) })} - /> - update({ rotation: value })} - /> + + + + + update({ opacity: Math.max(0, Math.min(1, value)) })} + onChange={setOpacity} />
@@ -229,9 +189,7 @@ export const CanvasPropertiesPanel = memo(function CanvasPropertiesPanel({ update({ content: event.target.value })} + onChange={(event) => setTextContent(event.target.value)} className={canvasDockFieldClassName} />
- @@ -274,11 +229,7 @@ export const CanvasPropertiesPanel = memo(function CanvasPropertiesPanel({ - @@ -305,9 +256,7 @@ export const CanvasPropertiesPanel = memo(function CanvasPropertiesPanel({ update({ fill: event.target.value } as Partial)} + onChange={(event) => setFill(event.target.value)} className={canvasDockFieldClassName} /> @@ -339,9 +288,7 @@ export const CanvasPropertiesPanel = memo(function CanvasPropertiesPanel({ - update({ stroke: event.target.value } as Partial) - } + onChange={(event) => setStroke(event.target.value)} className={canvasDockFieldClassName} /> @@ -350,9 +297,7 @@ export const CanvasPropertiesPanel = memo(function CanvasPropertiesPanel({ value={selected.strokeWidth} min={0} step={0.5} - onChange={(value) => - update({ strokeWidth: Math.max(0, value) } as Partial) - } + onChange={setStrokeWidth} />
) : null} diff --git a/src/features/canvas/CanvasStoryPanel.tsx b/src/features/canvas/CanvasStoryPanel.tsx index 097f52f..0015d7b 100644 --- a/src/features/canvas/CanvasStoryPanel.tsx +++ b/src/features/canvas/CanvasStoryPanel.tsx @@ -1,22 +1,8 @@ import { Frame, LayoutTemplate, ScissorsLineDashed, Shield } from "lucide-react"; -import { useEffect, useMemo } from "react"; import { Button } from "@/components/ui/button"; import { Input } from "@/components/ui/input"; import { cn } from "@/lib/utils"; -import { useCanvasStore } from "@/stores/canvasStore"; -import type { CanvasPresetId } from "@/types"; -import { - applyCanvasPresetToDocument, - getStudioCanvasPreset, - STUDIO_CANVAS_PRESETS, -} from "./studioPresets"; -import { - appendCanvasSlice, - buildStripSlices, - clearCanvasSlices, - deleteCanvasSlice, - updateCanvasSlice, -} from "./slices"; +import { STUDIO_CANVAS_PRESETS } from "./studioPresets"; import { canvasDockActionChipClassName, canvasDockBadgeClassName, @@ -35,6 +21,7 @@ import { canvasDockSectionClassName, canvasDockSectionMutedClassName, } from "./editDockTheme"; +import { useCanvasStoryPanelModel } from "./hooks/useCanvasStoryPanelModel"; interface CanvasStoryPanelProps { selectedSliceId: string | null; @@ -42,74 +29,30 @@ interface CanvasStoryPanelProps { } export function CanvasStoryPanel({ selectedSliceId, onSelectSlice }: CanvasStoryPanelProps) { - const workbenches = useCanvasStore((state) => state.workbenches); - const activeWorkbenchId = useCanvasStore((state) => state.activeWorkbenchId); - const upsertWorkbench = useCanvasStore((state) => state.upsertWorkbench); - - const activeWorkbench = useMemo( - () => workbenches.find((document) => document.id === activeWorkbenchId) ?? null, - [workbenches, activeWorkbenchId] - ); - - const orderedSlices = useMemo( - () => activeWorkbench?.slices.slice().sort((left, right) => left.order - right.order) ?? 
[], - [activeWorkbench?.slices] - ); - - const selectedSlice = - orderedSlices.find((slice) => slice.id === selectedSliceId) ?? orderedSlices[0] ?? null; - const currentPreset = getStudioCanvasPreset(activeWorkbench?.presetId); - - useEffect(() => { - if (!selectedSliceId && orderedSlices[0]) { - onSelectSlice(orderedSlices[0].id); - return; - } - if (selectedSliceId && !orderedSlices.some((slice) => slice.id === selectedSliceId)) { - onSelectSlice(orderedSlices[0]?.id ?? null); - } - }, [onSelectSlice, orderedSlices, selectedSliceId]); + const { + activeWorkbench, + appendSlice, + applyPreset, + buildStripSlices, + clearSlices, + currentPreset, + deleteSelectedSlice, + orderedSlices, + selectSlice, + selectedSlice, + updateGuide, + updateSafeArea, + updateSelectedSliceName, + updateSelectedSliceNumberField, + } = useCanvasStoryPanelModel({ + selectedSliceId, + onSelectSlice, + }); if (!activeWorkbench) { return null; } - const commitDocument = (nextDocument: typeof activeWorkbench) => { - void upsertWorkbench(nextDocument); - }; - - const updateGuide = (key: keyof typeof activeWorkbench.guides, value: boolean) => { - commitDocument({ - ...activeWorkbench, - guides: { - ...activeWorkbench.guides, - [key]: value, - }, - }); - }; - - const updateSafeArea = (key: keyof typeof activeWorkbench.safeArea, rawValue: string) => { - const nextValue = Math.max(0, Number(rawValue) || 0); - commitDocument({ - ...activeWorkbench, - safeArea: { - ...activeWorkbench.safeArea, - [key]: nextValue, - }, - }); - }; - - const updateSelectedSlice = (patch: Parameters[2]) => { - if (!selectedSlice) { - return; - } - commitDocument(updateCanvasSlice(activeWorkbench, selectedSlice.id, patch)); - }; - - const applyPreset = (presetId: CanvasPresetId) => { - commitDocument(applyCanvasPresetToDocument(activeWorkbench, presetId)); - }; - return (
@@ -203,8 +146,7 @@ export function CanvasStoryPanel({ selectedSliceId, onSelectSlice }: CanvasStory variant="secondary" className={canvasDockActionChipClassName} onClick={() => { - onSelectSlice(null); - commitDocument(clearCanvasSlices(activeWorkbench)); + clearSlices(); }} > Single Frame @@ -216,9 +158,7 @@ export function CanvasStoryPanel({ selectedSliceId, onSelectSlice }: CanvasStory variant="secondary" className={canvasDockActionChipClassName} onClick={() => { - const nextDocument = buildStripSlices(activeWorkbench, count); - onSelectSlice(nextDocument.slices[0]?.id ?? null); - commitDocument(nextDocument); + buildStripSlices(count); }} > {count} Frames @@ -229,9 +169,7 @@ export function CanvasStoryPanel({ selectedSliceId, onSelectSlice }: CanvasStory variant="secondary" className={canvasDockActionChipClassName} onClick={() => { - const nextDocument = appendCanvasSlice(activeWorkbench); - onSelectSlice(nextDocument.slices[nextDocument.slices.length - 1]?.id ?? null); - commitDocument(nextDocument); + appendSlice(); }} > Add Frame @@ -246,7 +184,7 @@ export function CanvasStoryPanel({ selectedSliceId, onSelectSlice }: CanvasStory @@ -305,7 +239,7 @@ export function CanvasStoryPanel({ selectedSliceId, onSelectSlice }: CanvasStory updateSelectedSlice({ name: event.target.value })} + onChange={(event) => updateSelectedSliceName(event.target.value)} className={canvasDockFieldClassName} /> @@ -323,12 +257,7 @@ export function CanvasStoryPanel({ selectedSliceId, onSelectSlice }: CanvasStory min={0} value={Math.round(value)} onChange={(event) => - updateSelectedSlice({ - [key]: Math.max( - key === "x" || key === "y" ? 
0 : 1, - Number(event.target.value) || 0 - ), - }) + updateSelectedSliceNumberField(key, event.target.value) } className={canvasDockFieldClassName} /> diff --git a/src/features/canvas/CanvasTextToolbar.tsx b/src/features/canvas/CanvasTextToolbar.tsx index 5c9a4e6..8aaf580 100644 --- a/src/features/canvas/CanvasTextToolbar.tsx +++ b/src/features/canvas/CanvasTextToolbar.tsx @@ -9,8 +9,9 @@ import { type ReactNode, } from "react"; import { Check } from "lucide-react"; -import type { CanvasTextElement, CanvasTextFontSizeTier } from "@/types"; +import type { CanvasTextFontSizeTier } from "@/types"; import { cn } from "@/lib/utils"; +import type { CanvasTextEditorModel } from "./textRuntimeViewModel"; import { CANVAS_TEXT_COLOR_OPTIONS, CANVAS_TEXT_FONT_OPTIONS, @@ -28,7 +29,7 @@ type ToolbarMenu = "color" | "font" | "size" | null; const TOOLBAR_MENU_GAP = 10; export interface CanvasTextToolbarProps { - element: CanvasTextElement; + element: CanvasTextEditorModel; onColorChange: (value: string) => void; onFontFamilyChange: (value: string) => void; onFontSizeTierChange: (value: CanvasTextFontSizeTier) => void; diff --git a/src/features/canvas/CanvasViewport.tsx b/src/features/canvas/CanvasViewport.tsx index 2cfebba..2b748b3 100644 --- a/src/features/canvas/CanvasViewport.tsx +++ b/src/features/canvas/CanvasViewport.tsx @@ -1,347 +1,118 @@ import type Konva from "konva"; -import { - Fragment, - memo, - useCallback, - useEffect, - useMemo, - useRef, - useState, - type RefObject, -} from "react"; -import { unstable_batchedUpdates } from "react-dom"; import { Crosshair, Hand, Minus, MousePointer2, Plus } from "lucide-react"; -import { Layer, Line, Rect, Stage, Text as KonvaText } from "react-konva"; -import type { - CanvasRenderableElement, - CanvasRenderableNode, - CanvasShapeElement, - CanvasTextElement, -} from "@/types"; +import { useCallback, useEffect, useMemo, useRef, type RefObject } from "react"; +import { shallow } from "zustand/shallow"; +import type { 
CanvasShapeElement } from "@/types"; import { cn } from "@/lib/utils"; import { useCanvasStore } from "@/stores/canvasStore"; -import { useCanvasRuntimeStore } from "@/stores/canvasRuntimeStore"; -import { CanvasTextToolbar } from "./CanvasTextToolbar"; -import { ImageElement } from "./elements/ImageElement"; -import { ShapeElement } from "./elements/ShapeElement"; -import { getVisibleWorldGridBounds, GRID_SIZE, quantizeDragPosition } from "./grid"; -import type { CanvasOverlayRect } from "./overlayGeometry"; -import { - isSelectableSelectionTarget, - normalizeSelectionRect, - resolveCompletedMarqueeSelectionIds, - resolveMarqueeSelectionIds, - screenRectToWorldRect, - selectionDistanceExceedsThreshold, - type CanvasSelectionTarget, - type CanvasSelectionPoint, -} from "./selectionGeometry"; -import { - applyCanvasTextFontSizeTier, - CANVAS_TEXT_LINE_HEIGHT_MULTIPLIER, - fitCanvasTextElementToContent, -} from "./textStyle"; -import { TextElement, isCanvasTextElementEditable } from "./elements/TextElement"; -import { registerCanvasStage } from "./hooks/canvasStageRegistry"; +import { CanvasViewportOverlayHost } from "./CanvasViewportOverlayHost"; +import { CanvasViewportStageShell } from "./CanvasViewportStageShell"; +import { VIEWPORT_INSETS } from "./canvasViewportConstants"; +import { isCanvasTextElementEditable } from "./elements/TextElement"; +import { getVisibleWorldGridBounds, quantizeDragPosition } from "./grid"; +import { useCanvasActiveWorkbenchCommands } from "./hooks/useCanvasActiveWorkbenchCommands"; +import { useCanvasActiveWorkbenchState } from "./hooks/useCanvasActiveWorkbenchState"; import { useCanvasInteraction } from "./hooks/useCanvasInteraction"; -import { useCanvasViewportOverlay } from "./hooks/useCanvasViewportOverlay"; +import { useCanvasMarqueeSelection } from "./hooks/useCanvasMarqueeSelection"; import { useCanvasSelectionModel } from "./hooks/useCanvasSelectionModel"; +import { useCanvasTextRuntimeViewModel } from 
"./hooks/useCanvasTextRuntimeViewModel"; import { useCanvasTextSession } from "./hooks/useCanvasTextSession"; -import { selectionIdsEqual } from "./selectionModel"; -import { resolveCanvasToolController } from "./tools/toolControllers"; +import { useCanvasViewportLifecycle } from "./hooks/useCanvasViewportLifecycle"; +import { useCanvasViewportNavigation } from "./hooks/useCanvasViewportNavigation"; +import { useCanvasViewportToolOrchestrator } from "./hooks/useCanvasViewportToolOrchestrator"; +import { applyCanvasTextFontSizeTier } from "./textStyle"; +import type { CanvasToolName } from "./tools/toolControllers"; interface CanvasViewportProps { stageRef: RefObject; selectedSliceId?: string | null; } -const BOARD_SURFACE_NODE_ID = "canvas-background"; -const WORKSPACE_BACKGROUND_NODE_ID = "canvas-workspace-background"; -const WORKSPACE_DOT_GRID_NODE_ID = "canvas-workspace-grid"; -const DOT_RADIUS = 0.72; -const WORKSPACE_BACKGROUND_FILL = "rgb(38, 38, 38)"; -const WORKSPACE_DOT_FILL = "rgb(68, 68, 68)"; -const VIEWPORT_INSETS = { - top: 88, - right: 32, - bottom: 104, - left: 112, -}; -const FLOATING_TOOLBAR_GAP = 12; -const DEFAULT_TEXT_TOOLBAR_SIZE = { - width: 196, - height: 48, -}; -const DEFAULT_DIMENSIONS_BADGE_SIZE = { - width: 116, - height: 40, -}; -const MARQUEE_DRAG_THRESHOLD_PX = 4; -const CANVAS_SELECTION_ACCENT = "#f59e0b"; -const CANVAS_SELECTION_ACCENT_FILL = "rgba(245,158,11,0.12)"; - -const clamp = (value: number, min: number, max: number) => Math.min(max, Math.max(min, value)); - -const isInputLikeElement = (target: EventTarget | null) => { - if (!(target instanceof HTMLElement)) { - return false; - } - const tagName = target.tagName.toLowerCase(); - return ( - tagName === "input" || - tagName === "textarea" || - tagName === "select" || - target.isContentEditable - ); -}; - -const getSelectionOverlayRect = (node: Konva.Node): CanvasOverlayRect => { - const rect = node.getClientRect({ - skipShadow: true, - skipStroke: true, - }); - - return { - 
x: rect.x, - y: rect.y, - width: rect.width, - height: rect.height, - }; -}; - -interface MarqueeSelectionState { - additive: boolean; - baseSelectedIds: string[]; - currentCanvas: CanvasSelectionPoint; - currentScreen: CanvasSelectionPoint; - hasActivated: boolean; - startCanvas: CanvasSelectionPoint; - startScreen: CanvasSelectionPoint; +interface CanvasViewportControlsProps { + adjustZoom: (direction: "in" | "out") => void; + resetView: () => void; + setTool: (tool: CanvasToolName) => void; + shouldPan: boolean; + tool: CanvasToolName; } -interface MarqueeSelectionRenderState { - hasSession: boolean; - isDragging: boolean; - rect: CanvasOverlayRect | null; -} - -const EMPTY_MARQUEE_RENDER_STATE: MarqueeSelectionRenderState = { - hasSession: false, - isDragging: false, - rect: null, -}; - -const marqueeRenderStateEqual = ( - left: MarqueeSelectionRenderState, - right: MarqueeSelectionRenderState -) => { - const rectsEqual = - left.rect === right.rect || - (!!left.rect && - !!right.rect && - Math.abs(left.rect.x - right.rect.x) < 0.5 && - Math.abs(left.rect.y - right.rect.y) < 0.5 && - Math.abs(left.rect.width - right.rect.width) < 0.5 && - Math.abs(left.rect.height - right.rect.height) < 0.5); - - if (!rectsEqual) { - return false; - } - - if (left.hasSession !== right.hasSession || left.isDragging !== right.isDragging) { - return false; - } - return true; -}; - -function DotGrid({ - bounds, -}: { - bounds: { - x: number; - y: number; - width: number; - height: number; - }; -}) { - const [dotGridPattern, setDotGridPattern] = useState(null); - - useEffect(() => { - if (typeof document === "undefined") { - return; - } - - const canvas = document.createElement("canvas"); - canvas.width = GRID_SIZE; - canvas.height = GRID_SIZE; - - const context = canvas.getContext("2d"); - if (!context) { - return; - } - - context.clearRect(0, 0, GRID_SIZE, GRID_SIZE); - context.fillStyle = WORKSPACE_DOT_FILL; - context.beginPath(); - context.arc(0, 0, DOT_RADIUS, 0, Math.PI * 2, 
false); - context.fill(); - - const patternImage = new Image(); - let isActive = true; - patternImage.onload = () => { - if (isActive) { - setDotGridPattern(patternImage); - } - }; - patternImage.src = canvas.toDataURL("image/png"); - - return () => { - isActive = false; - patternImage.onload = null; - }; - }, []); - - if (!dotGridPattern || bounds.width <= 0 || bounds.height <= 0) { - return null; - } - +function CanvasViewportControls({ + adjustZoom, + resetView, + setTool, + shouldPan, + tool, +}: CanvasViewportControlsProps) { return ( - - ); -} - -interface CanvasElementsLayerProps { - dragBoundFunc: (position: { x: number; y: number }) => { x: number; y: number }; - editingTextDraft: CanvasTextElement | null; - editingTextId: string | null; - elements: CanvasRenderableElement[]; - interactivePreviewElementId: string | null; - onElementDragEnd: (elementId: string, x: number, y: number) => void; - onElementSelect: (elementId: string, additive: boolean) => void; - onTextElementDoubleClick: (elementId: string) => void; -} - -const CanvasElementsLayer = memo(function CanvasElementsLayer({ - dragBoundFunc, - editingTextDraft, - editingTextId, - elements, - interactivePreviewElementId, - onElementDragEnd, - onElementSelect, - onTextElementDoubleClick, -}: CanvasElementsLayerProps) { - return ( - <> - {elements.map((element) => { - if (element.type === "image") { - return ( - - ); - } - - if (element.type === "shape") { - return ( - - ); - } - - const liveTextElement = editingTextDraft?.id === element.id ? editingTextDraft : element; - return ( - - ); - })} - +
+
+
+
+
+
+
+
+
+
); -}); - -interface CanvasSelectionOutlineLayerProps { - selectedElements: Array; } -const CanvasSelectionOutlineLayer = memo(function CanvasSelectionOutlineLayer({ - selectedElements, -}: CanvasSelectionOutlineLayerProps) { - return ( - <> - {selectedElements.map((element) => { - const outlineElement = - element.type === "group" - ? { - id: element.id, - rotation: 0, - x: element.bounds.x, - y: element.bounds.y, - width: element.bounds.width, - height: element.bounds.height, - } - : element.type === "text" - ? fitCanvasTextElementToContent(element) - : element; - - return ( - - ); - })} - - ); -}); - export function CanvasViewport({ stageRef, selectedSliceId }: CanvasViewportProps) { - const activeWorkbenchId = useCanvasStore((state) => state.activeWorkbenchId); - const activeWorkbench = useCanvasStore((state) => - state.activeWorkbenchId - ? (state.workbenches.find((document) => document.id === state.activeWorkbenchId) ?? null) - : null + const { activeWorkbench, activeWorkbenchId } = useCanvasActiveWorkbenchState(); + const { upsertElement } = useCanvasActiveWorkbenchCommands(); + const availableWorkbenchIds = useCanvasStore( + (state) => state.workbenches.map((workbench) => workbench.id), + shallow ); const executeCommandInWorkbench = useCanvasStore((state) => state.executeCommandInWorkbench); - const upsertElement = useCanvasStore((state) => state.upsertElement); const upsertElementInWorkbench = useCanvasStore((state) => state.upsertElementInWorkbench); const tool = useCanvasStore((state) => state.tool); const activeShapeType = useCanvasStore((state) => state.activeShapeType); @@ -350,64 +121,30 @@ export function CanvasViewport({ stageRef, selectedSliceId }: CanvasViewportProp const setZoom = useCanvasStore((state) => state.setZoom); const viewport = useCanvasStore((state) => state.viewport); const setViewport = useCanvasStore((state) => state.setViewport); - const setSelectionPreviewElementIds = useCanvasRuntimeStore( - (state) => 
state.setSelectionPreviewElementIds - ); - const clearSelectionPreview = useCanvasRuntimeStore((state) => state.clearSelectionPreview); const { displaySelectedElementIds } = useCanvasSelectionModel(); const { selectedElementIds, setSelectedElementIds, selectElement, clearSelection } = useCanvasInteraction(); - const viewportContainerRef = useRef(null); - const textToolbarRef = useRef(null); - const dimensionsBadgeRef = useRef(null); - const textEditorRef = useRef(null); - const textEditorInputRef = useRef(null); - const [isSpacePressed, setIsSpacePressed] = useState(false); - const [isPanning, setIsPanning] = useState(false); - const [marqueeRenderState, setMarqueeRenderState] = useState( - EMPTY_MARQUEE_RENDER_STATE - ); - const panningAnchorRef = useRef<{ x: number; y: number } | null>(null); - const viewportAnchorRef = useRef<{ x: number; y: number } | null>(null); - const initializedDocumentIdsRef = useRef>(new Set()); - const marqueeRenderFrameRef = useRef(null); - const marqueeSelectionRef = useRef(null); - const marqueeSelectionTargetsRef = useRef([]); - const selectedElementIdsRef = useRef(selectedElementIds); - const [stageSize, setStageSize] = useState(() => ({ - width: 0, - height: 0, - })); const elementById = useMemo( () => new Map((activeWorkbench?.allNodes ?? []).map((element) => [element.id, element])), [activeWorkbench?.allNodes] ); const elementByIdRef = useRef(elementById); + + useEffect(() => { + elementByIdRef.current = elementById; + }, [elementById]); + const interactivePreviewElementId = useMemo( () => (displaySelectedElementIds.length === 1 ? displaySelectedElementIds[0]! : null), [displaySelectedElementIds] ); - const displaySelectedElements = useMemo( - () => - displaySelectedElementIds - .map((elementId) => { - const element = elementById.get(elementId); - if (!element) { - return null; - } - return element.type === "text" && editingTextDraft?.id === element.id - ? 
editingTextDraft - : element; - }) - .filter((element): element is CanvasRenderableNode | CanvasTextElement => Boolean(element)), - [displaySelectedElementIds, editingTextDraft, elementById] - ); const singleSelectedElement = useMemo(() => { if (selectedElementIds.length !== 1) { return null; } + return elementById.get(selectedElementIds[0]!) ?? null; }, [elementById, selectedElementIds]); @@ -417,54 +154,87 @@ export function CanvasViewport({ stageRef, selectedSliceId }: CanvasViewportProp ); const singleSelectedNonTextElement = useMemo( () => - singleSelectedElement && singleSelectedElement.type !== "text" ? singleSelectedElement : null, + singleSelectedElement && singleSelectedElement.type !== "text" + ? singleSelectedElement + : null, [singleSelectedElement] ); - const isDraggingMarqueeSelection = useCallback( - (state: MarqueeSelectionState) => - selectionDistanceExceedsThreshold( - state.startScreen, - state.currentScreen, - MARQUEE_DRAG_THRESHOLD_PX - ), - [] - ); - const isMarqueeDragging = marqueeRenderState.isDragging; - const hasMarqueeSession = marqueeRenderState.hasSession; + + const { fitView, isSpacePressed, stageSize, viewportContainerRef } = + useCanvasViewportLifecycle({ + activeWorkbench, + activeWorkbenchId, + insets: VIEWPORT_INSETS, + stageRef, + setViewport, + setZoom, + }); + const shouldPan = tool === "hand" || isSpacePressed; + const { + adjustZoom, + beginPanInteraction, + cursor, + endPanInteraction, + handleStageWheel, + resetView, + toCanvasPoint, + toScreenPoint, + updatePanInteraction, + } = useCanvasViewportNavigation({ + fitView, + shouldPan, + stageRef, + viewport, + zoom, + setViewport, + setZoom, + }); + const { + beginMarqueeInteraction, + commitMarqueeInteraction, + hasMarqueeSession, + isMarqueeDragging, + marqueeRenderState, + updateMarqueeInteraction, + } = useCanvasMarqueeSelection({ + activeWorkbench, + activeWorkbenchId, + stageRef, + tool, + viewport, + zoom, + selectedElementIds, + setSelectedElementIds, + }); const { 
- editingTextId, - editingTextDraft, - editingTextValue, - editingTextRenderElement, - trackedTextOverlayElement, - beginTextEdit, - handleTextValueChange, - handleTextInputKeyDown, - updateSelectedTextElement, + actions: textSessionActions, + session: textSession, } = useCanvasTextSession({ activeWorkbenchId, + availableWorkbenchIds, elementById, + selectedElementIds, singleSelectedTextElement, selectElement, clearSelection, upsertElementInWorkbench, executeCommandInWorkbench, - textEditorRef, - textToolbarRef, }); - - useEffect(() => { - selectedElementIdsRef.current = selectedElementIds; - }, [selectedElementIds]); - - useEffect(() => { - elementByIdRef.current = elementById; - }, [elementById]); + const textRuntimeViewModel = useCanvasTextRuntimeViewModel({ + activeWorkbenchId, + displaySelectedElementIds, + hasMarqueeSession, + isMarqueeDragging, + nodeById: elementById, + selectedElementIds, + textSession, + }); const thirdsGuideLines = useMemo(() => { if (!activeWorkbench || !activeWorkbench.guides.showThirds) { return []; } + return [ [activeWorkbench.width / 3, 0, activeWorkbench.width / 3, activeWorkbench.height], [(activeWorkbench.width * 2) / 3, 0, (activeWorkbench.width * 2) / 3, activeWorkbench.height], @@ -477,28 +247,13 @@ export function CanvasViewport({ stageRef, selectedSliceId }: CanvasViewportProp if (!activeWorkbench || !activeWorkbench.guides.showCenter) { return []; } + return [ [activeWorkbench.width / 2, 0, activeWorkbench.width / 2, activeWorkbench.height], [0, activeWorkbench.height / 2, activeWorkbench.width, activeWorkbench.height / 2], ]; }, [activeWorkbench]); - const fitZoom = useMemo(() => { - if (!activeWorkbench || stageSize.width <= 0 || stageSize.height <= 0) { - return 1; - } - const usableWidth = Math.max(1, stageSize.width - VIEWPORT_INSETS.left - VIEWPORT_INSETS.right); - const usableHeight = Math.max( - 1, - stageSize.height - VIEWPORT_INSETS.top - VIEWPORT_INSETS.bottom - ); - return clamp( - Math.min(usableWidth / 
activeWorkbench.width, usableHeight / activeWorkbench.height, 1), - 0.2, - 1 - ); - }, [activeWorkbench, stageSize.height, stageSize.width]); - const workspaceGridBounds = useMemo( () => getVisibleWorldGridBounds(viewport, zoom, stageSize), [stageSize, viewport, zoom] @@ -509,455 +264,44 @@ export function CanvasViewport({ stageRef, selectedSliceId }: CanvasViewportProp [] ); - const shouldPan = tool === "hand" || isSpacePressed; - - const toCanvasPoint = useCallback( - (stage: Konva.Stage) => { - const pointer = stage.getPointerPosition(); - if (!pointer) { - return null; - } - return { - x: (pointer.x - viewport.x) / zoom, - y: (pointer.y - viewport.y) / zoom, - }; - }, - [viewport.x, viewport.y, zoom] - ); - - const toScreenPoint = useCallback((stage: Konva.Stage) => { - const pointer = stage.getPointerPosition(); - if (!pointer) { - return null; - } - return { - x: pointer.x, - y: pointer.y, - }; - }, []); - - const buildMarqueeSelectionTargets = useCallback(() => { - const stage = stageRef.current; - if (!activeWorkbench || !stage) { - return []; - } - - const nextTargets: CanvasSelectionTarget[] = []; - for (const element of activeWorkbench.elements) { - if (!isSelectableSelectionTarget(element)) { - continue; - } - - const node = stage.findOne(`#${element.id}`); - if (!node) { - continue; - } - - nextTargets.push({ - id: element.id, - rect: screenRectToWorldRect(getSelectionOverlayRect(node), viewport, zoom), - }); - } - - return nextTargets; - }, [activeWorkbench, stageRef, viewport, zoom]); - - const cancelQueuedMarqueeSelection = useCallback(() => { - if (marqueeRenderFrameRef.current === null) { - return; - } - - cancelAnimationFrame(marqueeRenderFrameRef.current); - marqueeRenderFrameRef.current = null; - }, []); - - useEffect(() => { - cancelQueuedMarqueeSelection(); - marqueeSelectionRef.current = null; - marqueeSelectionTargetsRef.current = []; - unstable_batchedUpdates(() => { - clearSelectionPreview(); - setMarqueeRenderState((current) => - 
marqueeRenderStateEqual(current, EMPTY_MARQUEE_RENDER_STATE) - ? current - : EMPTY_MARQUEE_RENDER_STATE - ); - }); - }, [activeWorkbenchId, cancelQueuedMarqueeSelection, clearSelectionPreview]); - - const commitSelectedElementIds = useCallback( - (nextSelectedIds: string[]) => { - if (selectionIdsEqual(selectedElementIdsRef.current, nextSelectedIds)) { - return; - } - - selectedElementIdsRef.current = nextSelectedIds; - setSelectedElementIds(nextSelectedIds); - }, - [setSelectedElementIds] - ); - - const resolveMarqueeStateSelectionIds = useCallback( - (state: MarqueeSelectionState) => { - const selectionRect = normalizeSelectionRect(state.startCanvas, state.currentCanvas); - const targets = - marqueeSelectionTargetsRef.current.length > 0 - ? marqueeSelectionTargetsRef.current - : buildMarqueeSelectionTargets(); - if (marqueeSelectionTargetsRef.current.length === 0) { - marqueeSelectionTargetsRef.current = targets; - } - - return resolveMarqueeSelectionIds( - selectionRect, - targets, - state.baseSelectedIds, - state.additive - ); - }, - [buildMarqueeSelectionTargets] - ); - - const queueMarqueeRenderState = useCallback(() => { - if (marqueeRenderFrameRef.current !== null) { - return; - } - - marqueeRenderFrameRef.current = requestAnimationFrame(() => { - marqueeRenderFrameRef.current = null; - const nextState = marqueeSelectionRef.current; - if (!nextState) { - return; - } - - const nextPreviewSelectedIds = nextState.hasActivated - ? resolveMarqueeStateSelectionIds(nextState) - : null; - const nextRenderState: MarqueeSelectionRenderState = { - hasSession: true, - isDragging: nextState.hasActivated, - rect: nextState.hasActivated - ? normalizeSelectionRect(nextState.startCanvas, nextState.currentCanvas) - : null, - }; - - unstable_batchedUpdates(() => { - setSelectionPreviewElementIds(nextPreviewSelectedIds); - setMarqueeRenderState((current) => - marqueeRenderStateEqual(current, nextRenderState) ? 
current : nextRenderState - ); - }); - }); - }, [resolveMarqueeStateSelectionIds, setSelectionPreviewElementIds]); - - const beginPanInteraction = useCallback( - (screenPoint: CanvasSelectionPoint) => { - setIsPanning(true); - panningAnchorRef.current = screenPoint; - viewportAnchorRef.current = viewport; - }, - [viewport] - ); - - const updatePanInteraction = useCallback( - (screenPoint: CanvasSelectionPoint) => { - if (!isPanning || !panningAnchorRef.current || !viewportAnchorRef.current) { - return; - } - - setViewport({ - x: viewportAnchorRef.current.x + (screenPoint.x - panningAnchorRef.current.x), - y: viewportAnchorRef.current.y + (screenPoint.y - panningAnchorRef.current.y), - }); - }, - [isPanning, setViewport] - ); - - const endPanInteraction = useCallback(() => { - if (isPanning) { - setIsPanning(false); - } - panningAnchorRef.current = null; - viewportAnchorRef.current = null; - }, [isPanning]); - - const beginMarqueeInteraction = useCallback( - ({ - additive, - canvasPoint, - screenPoint, - }: { - additive: boolean; - canvasPoint: CanvasSelectionPoint; - screenPoint: CanvasSelectionPoint; - }) => { - const baseSelectedIds = additive ? selectedElementIdsRef.current : []; - const nextSelection: MarqueeSelectionState = { - additive, - baseSelectedIds, - currentCanvas: canvasPoint, - currentScreen: screenPoint, - hasActivated: false, - startCanvas: canvasPoint, - startScreen: screenPoint, - }; - cancelQueuedMarqueeSelection(); - marqueeSelectionRef.current = nextSelection; - marqueeSelectionTargetsRef.current = []; - unstable_batchedUpdates(() => { - clearSelectionPreview(); - setMarqueeRenderState((current) => { - const nextRenderState: MarqueeSelectionRenderState = { - hasSession: true, - isDragging: false, - rect: null, - }; - return marqueeRenderStateEqual(current, nextRenderState) ? 
current : nextRenderState;
-        });
-      });
-    },
-    [cancelQueuedMarqueeSelection, clearSelectionPreview]
-  );
-
-  const updateMarqueeInteraction = useCallback(
-    ({
-      canvasPoint,
-      screenPoint,
-    }: {
-      canvasPoint: CanvasSelectionPoint;
-      screenPoint: CanvasSelectionPoint;
-    }) => {
-      const currentSelection = marqueeSelectionRef.current;
-      if (!currentSelection) {
-        return;
-      }
-
-      const nextSelectionDraft: MarqueeSelectionState = {
-        ...currentSelection,
-        currentCanvas: canvasPoint,
-        currentScreen: screenPoint,
-      };
-      const nextSelection: MarqueeSelectionState = {
-        ...nextSelectionDraft,
-        hasActivated:
-          currentSelection.hasActivated || isDraggingMarqueeSelection(nextSelectionDraft),
-      };
-      marqueeSelectionRef.current = nextSelection;
-      if (!nextSelection.hasActivated) {
-        return;
-      }
-
-      queueMarqueeRenderState();
-      if (marqueeSelectionTargetsRef.current.length === 0) {
-        marqueeSelectionTargetsRef.current = buildMarqueeSelectionTargets();
-      }
-    },
-    [buildMarqueeSelectionTargets, isDraggingMarqueeSelection, queueMarqueeRenderState]
-  );
-
-  const commitMarqueeInteraction = useCallback(
-    ({
-      canvasPoint,
-      screenPoint,
-    }: {
-      canvasPoint: CanvasSelectionPoint | null;
-      screenPoint: CanvasSelectionPoint | null;
-    }) => {
-      const currentSelection = marqueeSelectionRef.current;
-      if (!currentSelection) {
-        return;
-      }
-
-      let nextSelection = currentSelection;
-      if (canvasPoint && screenPoint) {
-        nextSelection = {
-          ...currentSelection,
-          currentCanvas: canvasPoint,
-          currentScreen: screenPoint,
-        };
-      }
-
-      cancelQueuedMarqueeSelection();
-      const nextPreviewSelectedIds = nextSelection.hasActivated
-        ? resolveMarqueeStateSelectionIds(nextSelection)
-        : nextSelection.baseSelectedIds;
-      const nextSelectedIds = resolveCompletedMarqueeSelectionIds({
-        additive: nextSelection.additive,
-        baseSelectedIds: nextSelection.baseSelectedIds,
-        hasActivated: nextSelection.hasActivated,
-        nextSelectedIds: nextPreviewSelectedIds,
-      });
-
-      unstable_batchedUpdates(() => {
-        commitSelectedElementIds(nextSelectedIds);
-        clearSelectionPreview();
-        setMarqueeRenderState((current) =>
-          marqueeRenderStateEqual(current, EMPTY_MARQUEE_RENDER_STATE)
-            ? current
-            : EMPTY_MARQUEE_RENDER_STATE
-        );
-      });
-
-      marqueeSelectionRef.current = null;
-      marqueeSelectionTargetsRef.current = [];
-    },
-    [
-      cancelQueuedMarqueeSelection,
-      clearSelectionPreview,
-      commitSelectedElementIds,
-      resolveMarqueeStateSelectionIds,
-    ]
-  );
-
   const insertShapeElement = useCallback(
     (element: CanvasShapeElement) => {
       if (!activeWorkbenchId) {
         return;
       }
+      void upsertElement(element);
     },
     [activeWorkbenchId, upsertElement]
   );
 
-  const activeToolController = useMemo(
-    () => resolveCanvasToolController(tool, shouldPan),
-    [shouldPan, tool]
-  );
-
-  const activeToolContext = useMemo(
-    () => ({
-      activeWorkbenchId,
-      activeShapeType,
-      beginMarqueeSelection: beginMarqueeInteraction,
-      beginPan: beginPanInteraction,
-      beginTextEdit,
-      clearSelection,
-      commitMarqueeSelection: commitMarqueeInteraction,
-      endPan: endPanInteraction,
-      insertShape: insertShapeElement,
-      selectElement: (elementId: string) => {
-        selectElement(elementId);
-      },
-      setTool,
-      updateMarqueeSelection: updateMarqueeInteraction,
-      updatePan: updatePanInteraction,
-    }),
-    [
-      activeWorkbenchId,
-      activeShapeType,
-      beginMarqueeInteraction,
-      beginPanInteraction,
-      beginTextEdit,
-      clearSelection,
-      commitMarqueeInteraction,
-      endPanInteraction,
-      insertShapeElement,
-      selectElement,
-      setTool,
-      updateMarqueeInteraction,
-      updatePanInteraction,
-    ]
-  );
-
-  useEffect(() => {
-    if (tool === "select") {
-      return;
-    }
-
-    cancelQueuedMarqueeSelection();
-    marqueeSelectionRef.current = null;
-    marqueeSelectionTargetsRef.current = [];
-    unstable_batchedUpdates(() => {
-      clearSelectionPreview();
-      setMarqueeRenderState((current) =>
-        marqueeRenderStateEqual(current, EMPTY_MARQUEE_RENDER_STATE)
-          ? current
-          : EMPTY_MARQUEE_RENDER_STATE
-      );
-    });
-  }, [cancelQueuedMarqueeSelection, clearSelectionPreview, tool]);
-
-  useEffect(
-    () => () => {
-      cancelQueuedMarqueeSelection();
-      clearSelectionPreview();
-    },
-    [cancelQueuedMarqueeSelection, clearSelectionPreview]
-  );
-
-  const handleWorkspacePointerDown = useCallback(
-    (event: Konva.KonvaEventObject) => {
-      const stage = stageRef.current;
-      if (!stage || !activeWorkbench) {
-        return;
-      }
-
-      const isBackgroundTarget =
-        event.target === stage || event.target.id() === WORKSPACE_BACKGROUND_NODE_ID;
-      if (!isBackgroundTarget) {
-        return;
-      }
-
-      event.evt.preventDefault();
-      activeToolController.onPointerDown(activeToolContext, {
-        additive: Boolean(event.evt.shiftKey),
-        canvasPoint: toCanvasPoint(stage),
-        isBackgroundTarget,
-        screenPoint: toScreenPoint(stage),
-      });
-    },
-    [
-      activeWorkbench,
-      activeToolController,
-      activeToolContext,
-      stageRef,
-      toCanvasPoint,
-      toScreenPoint,
-    ]
-  );
-
-  const handleWorkspacePointerMove = useCallback(
-    (event?: Konva.KonvaEventObject) => {
-      const stage = stageRef.current;
-      if (!stage || !activeToolController.onPointerMove) {
-        return;
-      }
-
-      event?.evt.preventDefault();
-      activeToolController.onPointerMove(activeToolContext, {
-        canvasPoint: toCanvasPoint(stage),
-        screenPoint: toScreenPoint(stage),
-      });
-    },
-    [
-      activeToolController,
-      activeToolContext,
-      stageRef,
-      toCanvasPoint,
-      toScreenPoint,
-    ]
-  );
-
-  const handleWorkspacePointerUp = useCallback(
-    (event?: Konva.KonvaEventObject) => {
-      if (!activeToolController.onPointerUp && !shouldPan) {
-        return;
-      }
-      event?.evt.preventDefault();
-      const stage = stageRef.current;
-      activeToolController.onPointerUp?.(activeToolContext, {
-        canvasPoint: stage ? toCanvasPoint(stage) : null,
-        screenPoint: stage ? toScreenPoint(stage) : null,
-      });
+  const {
+    handleWorkspacePointerDown,
+    handleWorkspacePointerMove,
+    handleWorkspacePointerUp,
+  } = useCanvasViewportToolOrchestrator({
+    activeShapeType,
+    activeWorkbench,
+    activeWorkbenchId,
+    beginMarqueeInteraction,
+    beginPanInteraction,
+    beginTextEdit: textSessionActions.begin,
+    clearSelection,
+    commitMarqueeInteraction,
+    endPanInteraction,
+    insertShapeElement,
+    selectElement: (elementId: string) => {
+      selectElement(elementId);
     },
-    [
-      activeToolController,
-      activeToolContext,
-      shouldPan,
-      stageRef,
-      toCanvasPoint,
-      toScreenPoint,
-    ]
-  );
+    setTool,
+    shouldPan,
+    stageRef,
+    toCanvasPoint,
+    toScreenPoint,
+    tool,
+    updateMarqueeInteraction,
+    updatePanInteraction,
+  });
 
   const handleElementSelect = useCallback(
     (elementId: string, additive: boolean) => {
@@ -991,153 +335,42 @@ export function CanvasViewport({ stageRef, selectedSliceId }: CanvasViewportProp
   const handleTextElementDoubleClick = useCallback(
     (elementId: string) => {
       const element = elementByIdRef.current.get(elementId);
-      if (element?.type !== "text" || !isCanvasTextElementEditable(element)) {
+      if (!element?.type || element.type !== "text" || !isCanvasTextElementEditable(element)) {
         return;
       }
-      beginTextEdit(element);
+      textSessionActions.begin(element);
     },
-    [beginTextEdit]
+    [textSessionActions]
   );
 
-  useEffect(() => {
-    registerCanvasStage(stageRef.current);
-    return () => {
-      registerCanvasStage(null);
-    };
-  }, [stageRef, activeWorkbenchId]);
+  const handleTextColorChange = useCallback(
+    (color: string) => {
+      textSessionActions.updateDraft((element) => ({
+        ...element,
+        color,
+      }));
+    },
+    [textSessionActions]
+  );
 
-  useEffect(() => {
-    const container = viewportContainerRef.current;
-    if (!container) {
-      return;
-    }
+  const handleTextFontFamilyChange = useCallback(
+    (fontFamily: string) => {
+      textSessionActions.updateDraft((element) => ({
+        ...element,
+        fontFamily,
+      }));
+    },
+    [textSessionActions]
+  );
 
-    const updateStageSize = (width: number, height: number) => {
-      setStageSize((current) =>
-        current.width === width && current.height === height ? current : { width, height }
+  const handleTextFontSizeTierChange = useCallback(
+    (fontSizeTier: Parameters[1]) => {
+      textSessionActions.updateDraft((element) =>
+        applyCanvasTextFontSizeTier(element, fontSizeTier)
       );
-    };
-
-    const measure = () => {
-      const rect = container.getBoundingClientRect();
-      updateStageSize(Math.round(rect.width), Math.round(rect.height));
-    };
-
-    measure();
-
-    if (typeof ResizeObserver === "undefined") {
-      window.addEventListener("resize", measure);
-      return () => {
-        window.removeEventListener("resize", measure);
-      };
-    }
-
-    const observer = new ResizeObserver((entries) => {
-      const entry = entries[0];
-      if (!entry) {
-        return;
-      }
-      updateStageSize(Math.round(entry.contentRect.width), Math.round(entry.contentRect.height));
-    });
-    observer.observe(container);
-
-    return () => {
-      observer.disconnect();
-    };
-  }, []);
-
-  useEffect(() => {
-    if (!activeWorkbench || stageSize.width <= 0 || stageSize.height <= 0) {
-      return;
-    }
-    if (initializedDocumentIdsRef.current.has(activeWorkbench.id)) {
-      return;
-    }
-    initializedDocumentIdsRef.current.add(activeWorkbench.id);
-    const usableWidth = Math.max(1, stageSize.width - VIEWPORT_INSETS.left - VIEWPORT_INSETS.right);
-    const usableHeight = Math.max(
-      1,
-      stageSize.height - VIEWPORT_INSETS.top - VIEWPORT_INSETS.bottom
-    );
-    setZoom(fitZoom);
-    setViewport({
-      x: Math.round(VIEWPORT_INSETS.left + (usableWidth - activeWorkbench.width * fitZoom) / 2),
-      y: Math.round(VIEWPORT_INSETS.top + (usableHeight - activeWorkbench.height * fitZoom) / 2),
-    });
-  }, [activeWorkbench, fitZoom, setViewport, setZoom, stageSize.height, stageSize.width]);
 
   const { selectionOverlay, toolbarPosition, dimensionsBadgePosition, editingTextLayout } =
-    useCanvasViewportOverlay({
-      stageRef,
-      stageSize,
-      viewport,
-      zoom,
-      selectedElementIds,
-      editingTextId,
-      editingTextRenderElement,
-      trackedTextOverlayElement,
-      singleSelectedNonTextElement,
-      textToolbarRef,
-      dimensionsBadgeRef,
-      toolbarSize: DEFAULT_TEXT_TOOLBAR_SIZE,
-      dimensionsBadgeSize: DEFAULT_DIMENSIONS_BADGE_SIZE,
-      floatingToolbarGap: FLOATING_TOOLBAR_GAP,
-      activeWorkbenchUpdatedAt: activeWorkbench?.updatedAt,
-    });
-
-  useEffect(() => {
-    const handleKeyDown = (event: KeyboardEvent) => {
-      if (event.code === "Space" && !isInputLikeElement(event.target)) {
-        event.preventDefault();
-        setIsSpacePressed(true);
-      }
-    };
-    const handleKeyUp = (event: KeyboardEvent) => {
-      if (event.code === "Space") {
-        setIsSpacePressed(false);
-        setIsPanning(false);
-        panningAnchorRef.current = null;
-        viewportAnchorRef.current = null;
-      }
-    };
-    window.addEventListener("keydown", handleKeyDown);
-    window.addEventListener("keyup", handleKeyUp);
-    return () => {
-      window.removeEventListener("keydown", handleKeyDown);
-      window.removeEventListener("keyup", handleKeyUp);
-    };
-  }, []);
-
-  const adjustZoom = (direction: "in" | "out") => {
-    const scaleBy = 1.08;
-    const nextZoom = clamp(direction === "in" ? zoom * scaleBy : zoom / scaleBy, 0.2, 4);
-    setZoom(nextZoom);
-  };
-
-  const resetView = () => {
-    if (!activeWorkbench) {
-      return;
-    }
-    const usableWidth = Math.max(1, stageSize.width - VIEWPORT_INSETS.left - VIEWPORT_INSETS.right);
-    const usableHeight = Math.max(
-      1,
-      stageSize.height - VIEWPORT_INSETS.top - VIEWPORT_INSETS.bottom
-    );
-    setZoom(fitZoom);
-    setViewport({
-      x: Math.round(VIEWPORT_INSETS.left + (usableWidth - activeWorkbench.width * fitZoom) / 2),
-      y: Math.round(VIEWPORT_INSETS.top + (usableHeight - activeWorkbench.height * fitZoom) / 2),
-    });
-  };
-
-  const showTextToolbar = Boolean(
-    selectionOverlay && editingTextRenderElement && selectedElementIds.length === 1
-  );
-  const showTextEditor = Boolean(
-    !hasMarqueeSession && !isMarqueeDragging && editingTextId && editingTextRenderElement
-  );
-  const showDimensionsBadge = Boolean(
-    selectionOverlay && singleSelectedNonTextElement && selectedElementIds.length === 1
+    },
+    [textSessionActions]
   );
 
   if (!activeWorkbench) {
@@ -1149,352 +382,62 @@ export function CanvasViewport({ stageRef, selectedSliceId }: CanvasViewportProp
   }
 
   return (
-
-      {
-        event.evt.preventDefault();
-        const stage = stageRef.current;
-        if (!stage) {
-          return;
-        }
-        const pointer = stage.getPointerPosition();
-        if (!pointer) {
-          return;
-        }
-        const scaleBy = 1.08;
-        const direction = event.evt.deltaY > 0 ? -1 : 1;
-        const nextZoom = clamp(direction > 0 ? zoom * scaleBy : zoom / scaleBy, 0.2, 4);
-        const worldPoint = {
-          x: (pointer.x - viewport.x) / zoom,
-          y: (pointer.y - viewport.y) / zoom,
-        };
-        setZoom(nextZoom);
-        setViewport({
-          x: pointer.x - worldPoint.x * nextZoom,
-          y: pointer.y - worldPoint.y * nextZoom,
-        });
-      }}
-      onMouseDown={(event: Konva.KonvaEventObject) => {
-        const stage = stageRef.current;
-        if (!stage) {
-          return;
-        }
-        const isBackgroundTarget =
-          event.target === stage || event.target.id() === WORKSPACE_BACKGROUND_NODE_ID;
-        if (!isBackgroundTarget) {
-          return;
-        }
-        handleWorkspacePointerDown(event);
-      }}
-      onTouchStart={(event: Konva.KonvaEventObject) => {
-        handleWorkspacePointerDown(event);
-      }}
-      onMouseMove={(event: Konva.KonvaEventObject) => {
-        handleWorkspacePointerMove(event);
-      }}
-      onTouchMove={(event: Konva.KonvaEventObject) => {
-        handleWorkspacePointerMove(event);
-      }}
-      onMouseUp={(event: Konva.KonvaEventObject) => {
-        handleWorkspacePointerUp(event);
-      }}
-      onTouchEnd={(event: Konva.KonvaEventObject) => {
-        handleWorkspacePointerUp(event);
-      }}
-      onTouchCancel={(event: Konva.KonvaEventObject) => {
-        handleWorkspacePointerUp(event);
-      }}
-    >
-
-
-
-
-
-
-
-      {activeWorkbench.guides.showSafeArea ? (
-
-      ) : null}
-
-      {thirdsGuideLines.map((points, index) => (
-
-      ))}
-
-      {centerGuideLines.map((points, index) => (
-
-      ))}
-
-
-
-
-
-
-
-
-
-
-
-      {isMarqueeDragging && marqueeRenderState.rect ? (
-
-      ) : null}
-
-
-
-      {activeWorkbench.slices.map((slice) => {
-        const selected = slice.id === selectedSliceId;
-        return (
-
-
-
-
-        );
-      })}
-
-
-
-      {showDimensionsBadge && singleSelectedNonTextElement ? (
-
-          {Math.round(
-            singleSelectedNonTextElement.type === "group"
-              ? singleSelectedNonTextElement.bounds.width
-              : singleSelectedNonTextElement.width
-          )}{" "}
-          x{" "}
-          {Math.round(
-            singleSelectedNonTextElement.type === "group"
-              ? singleSelectedNonTextElement.bounds.height
-              : singleSelectedNonTextElement.height
-          )}
-
-      ) : null}
-
-      {showTextToolbar && editingTextRenderElement && selectionOverlay ? (
-          {
-            updateSelectedTextElement((element) => ({
-              ...element,
-              color,
-            }));
-          }}
-          onFontFamilyChange={(fontFamily) => {
-            updateSelectedTextElement((element) => ({
-              ...element,
-              fontFamily,
-            }));
-          }}
-          onFontSizeTierChange={(fontSizeTier) => {
-            updateSelectedTextElement((element) =>
-              applyCanvasTextFontSizeTier(element, fontSizeTier)
-            );
-          }}
-        />
-      ) : null}
-
-      {showTextEditor && editingTextRenderElement && editingTextLayout ? (
-          {
-            event.stopPropagation();
-          }}
-        >
-