Epic/feat/documents tracker #3022
base: dev
Conversation
Walkthrough
This pull request introduces an extensive set of changes that revamp document management across database, backend services, and UI components. At the database level, new migrations adjust the Document table by adding columns for project, issuingCountry, comment, and decisionReason, and by updating ENUM types. The Prisma schema is updated to support structured document entities. Service layers and controllers are refactored to remove redundant document fields while adding new document creation, update, and deletion endpoints. UI modules have been enhanced with new hooks, components, and mutation logic for document tracking and decision management, and several dynamic form and validation utilities have been overhauled.
Changes
Sequence Diagram(s)
```mermaid
sequenceDiagram
  participant C as DocumentControllerExternal
  participant S as DocumentService
  participant R as DocumentRepository
  participant DB as Database
  C->>+S: Create/Update Document Request
  S->>+R: Process document data (CRUD, decision update)
  R-->>-S: Return document result
  S->>C: Return updated document info
```
Actionable comments posted: 4
🔭 Outside diff range comments (2)
services/workflows-service/src/workflow/workflow.service.ts (2)
`1782-1826`: Enhance error handling in file persistence. The `__persistDocumentPagesFiles` method could benefit from improved error handling:
- Add error handling for failed file operations
- Consider retrying failed operations
- Add logging for successful file persistence
Apply this improvement:
```diff
 private async __persistDocumentPagesFiles(
   document: TDocumentWithoutPageType,
   entityId: string,
   projectId: TProjectId,
   customerName: string,
 ) {
+  this.logger.debug('Starting document pages files persistence', { documentId: document.id });
   return await Promise.all(
     document?.pages?.map(async documentPage => {
       if (documentPage.ballerineFileId) return documentPage;

       const documentId = document.id! || getDocumentId(document, false);
+      try {
         const persistedFile = await this.fileService.copyToDestinationAndCreate(
           {
             id: documentId,
             uri: documentPage.uri,
             provider: documentPage.provider,
             fileName: (
               documentPage as typeof documentPage & {
                 fileName: string;
               }
             ).fileName,
           },
           entityId,
           projectId,
           customerName,
         );
+        this.logger.debug('Successfully persisted file', { documentId, fileId: persistedFile?.id });

         const ballerineFileId = documentPage.ballerineFileId || persistedFile?.id;
         const mimeType =
           persistedFile?.mimeType ||
           mime.getType(persistedFile.fileName || persistedFile.uri || '') ||
           undefined;

         return {
           ...documentPage,
           type: mimeType,
           ballerineFileId,
           fileName: persistedFile?.fileName,
         };
+      } catch (error) {
+        this.logger.error('Failed to persist document page file', {
+          documentId,
+          error,
+          uri: documentPage.uri,
+        });
+        throw error;
+      }
     }),
   );
 }
```
`2648-2714`: Improve OCR error handling and timeout configuration. The `runOCROnDocument` method has several areas for improvement:
- The 180-second timeout is hardcoded
- Error handling for OCR failures could be more specific
- No retry mechanism for transient failures
Apply these improvements:
```diff
 async runOCROnDocument({
   workflowRuntimeId,
   projectId,
   documentId,
 }: {
   workflowRuntimeId: string;
   projectId: string;
   documentId: string;
 }) {
+  const OCR_TIMEOUT_MS = 180_000;
+  const MAX_RETRIES = 3;
+  let attempt = 0;
+
   return await this.prismaService.$transaction(
     async transaction => {
       const customer = await this.customerService.getByProjectId(projectId);

       if (!customer.features?.[FEATURE_LIST.DOCUMENT_OCR]) {
         throw new BadRequestException(
           `Document OCR is not enabled for customer id ${customer.id}`,
         );
       }

+      while (attempt < MAX_RETRIES) {
+        try {
           // ... existing OCR logic ...
           return (
             await new UnifiedApiClient().runOcr({
               images,
               schema: document.propertiesSchema as unknown as TSchema,
             })
           )?.data;
+        } catch (error) {
+          attempt++;
+          if (attempt === MAX_RETRIES) {
+            this.logger.error('OCR failed after max retries', {
+              error,
+              documentId,
+              attempts: attempt,
+            });
+            throw new Error('OCR processing failed after multiple attempts');
+          }
+          this.logger.warn('OCR attempt failed, retrying', {
+            error,
+            documentId,
+            attempt,
+          });
+          await new Promise(resolve => setTimeout(resolve, 1000 * attempt));
+        }
+      }
     },
     {
-      timeout: 180_000,
+      timeout: OCR_TIMEOUT_MS,
     },
   );
 }
```
🧹 Nitpick comments (19)
services/workflows-service/src/storage/storage.service.ts (1)
`79-81`: Consider standardizing the error handling approach. The code uses a custom `NotFoundException` from `@/errors`, but there is an unused import of `NotFoundException` from `@nestjs/common`. Consider standardizing error handling across the codebase by either:
- Using NestJS built-in exceptions consistently
- Using custom exceptions consistently
If choosing option 1, apply this diff:
```diff
-throw new errors.NotFoundException('file not found');
+throw new NotFoundException('file not found');
```
services/workflows-service/src/document/document.controller.external.ts (5)
`53-66`: Consider unifying validation libraries. Here, you're mixing TypeBox and Zod for request validation (`Type.Composite` along with `DocumentFileJsonSchema.parse` and `z.preprocess`). While it works, it's often cleaner and more consistent to standardize on a single validation library.
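For illustration, a minimal sketch of what standardizing on Zod alone could look like for this endpoint's multipart body; the field names below are assumptions and not taken from the PR:

```ts
import { z } from 'zod';

// Hypothetical single-source schema replacing the mixed TypeBox + Zod validation.
export const CreateDocumentBodySchema = z.object({
  entityId: z.string().min(1),
  workflowRuntimeDataId: z.string().min(1),
  // The multipart `metadata` field arrives as a JSON string, so parse it before validating.
  metadata: z
    .string()
    .transform(value => JSON.parse(value))
    .pipe(
      z.object({
        type: z.string(),
        variant: z.string(),
        page: z.coerce.number().int().positive(),
      }),
    ),
});

export type TCreateDocumentBody = z.infer<typeof CreateDocumentBodySchema>;
```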
`68-107`: Review the return contract of `createDocument`. `createDocument` currently returns all documents for the given entity & workflow rather than just the newly created one. This might be intentional, but from a REST perspective, creating a single document usually returns only the created resource. If returning multiple documents is desired, consider clarifying it in the endpoint name, docs, or response schema.
`109-138`: Evaluate pagination or filtering for document retrieval. `getDocumentsByEntityIdAndWorkflowId` might return a large array if the entity has many documents. Consider adding optional pagination or filtering to ensure better performance and user experience under high loads.
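A minimal sketch of optional pagination using Zod (which the service already uses elsewhere); the parameter names, limits, and the repository method referenced in the comment are assumptions, not part of the PR:

```ts
import { z } from 'zod';

// Hypothetical pagination schema for the list endpoint.
export const PaginationQuerySchema = z.object({
  page: z.coerce.number().int().min(1).default(1),
  pageSize: z.coerce.number().int().min(1).max(100).default(20),
});

export type TPaginationQuery = z.infer<typeof PaginationQuerySchema>;

// The parsed values would then translate into Prisma's skip/take in the repository call:
// const { page, pageSize } = PaginationQuerySchema.parse(query);
// this.repository.findByEntityIdAndWorkflowId(entityId, workflowId, projectIds, {
//   skip: (page - 1) * pageSize,
//   take: pageSize,
// });
```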
`140-166`: Clarify the update response. Like the create endpoint, `updateDocumentById` returns the entire set of documents in the project rather than only the updated document or a confirmatory status. This can be confusing for clients expecting to see only the updated resource.
`168-188`: Confirm the desired behavior of the delete endpoint. Similar to the update logic, `deleteDocumentsByIds` returns a list of all remaining documents, not just the deleted IDs. If this is the intended behavior, ensure that it's documented; otherwise, consider returning only the deleted IDs or a relevant subset.
services/workflows-service/src/document/document.service.ts (1)
`138-162`: Reassess returning all documents on update. When updating a single document, returning the entire project's documents can lead to unnecessary overhead and confusion. A more typical approach is returning only the updated document or a confirmation message.
services/workflows-service/src/storage/storage.module.ts (1)
`1-1`: Consider refactoring to avoid the circular dependency. The use of `forwardRef` indicates a circular dependency between `StorageModule` and `FileModule`. While the current solution works, consider these architectural improvements:
- Extract shared logic into a separate module
- Use events/message bus for communication
- Reorganize the modules to have a clear hierarchical relationship
Would you like me to help explore these architectural alternatives in more detail?
Also applies to: 9-9, 12-12
services/workflows-service/src/document/dtos/document.dto.ts (1)
`4-18`: Consider adding validation constraints. The schema could benefit from additional validation constraints:
- String length limits for `id`, `category`, `type`, etc.
- Pattern validation for `issuingVersion`
- Range validation for `version`
Example implementation:
```diff
 export const DocumentSchema = Type.Object({
-  id: Type.String(),
+  id: Type.String({ minLength: 1, maxLength: 36 }),
-  category: Type.String(),
+  category: Type.String({ minLength: 1, maxLength: 100 }),
-  type: Type.String(),
+  type: Type.String({ minLength: 1, maxLength: 100 }),
-  issuingVersion: Type.String(),
+  issuingVersion: Type.String({ pattern: '^[0-9]+\.[0-9]+\.[0-9]+$' }),
-  issuingCountry: Type.String(),
+  issuingCountry: Type.String({ minLength: 2, maxLength: 2 }),
-  version: Type.Integer(),
+  version: Type.Integer({ minimum: 1 }),
   // ... rest of the schema
 });
```
services/workflows-service/src/document-file/dtos/document-file.dto.ts (1)
`15-24`: Consider adding schema descriptions for better documentation. The JSON schema transformation looks good, but adding descriptions would improve API documentation generation.
```diff
 export const DocumentFileJsonSchema = z
   .string()
   .transform(value => JSON.parse(value))
   .pipe(
     z.object({
-      type: z.nativeEnum(DocumentFileType),
-      variant: z.nativeEnum(DocumentFileVariant),
-      page: z.number().positive().int(),
+      type: z.nativeEnum(DocumentFileType).describe('Type of the document file'),
+      variant: z.nativeEnum(DocumentFileVariant).describe('Variant of the document file'),
+      page: z.number().positive().int().describe('Page number in the document'),
     }),
   );
```
services/workflows-service/src/document-file/document-file.service.ts (1)
`10-16`: Add error handling for repository operations. While the implementation is solid, consider adding try-catch blocks to handle potential repository failures and provide meaningful error messages.
```diff
 async create(
   data: Prisma.DocumentFileUncheckedCreateInput,
   args?: Prisma.DocumentFileCreateArgs,
   transaction?: PrismaTransactionClient,
 ) {
+  try {
     return await this.repository.create(data, args, transaction);
+  } catch (error) {
+    throw new Error(`Failed to create document file: ${error.message}`);
+  }
 }
```
Also applies to: 18-24, 26-33, 35-43, 45-52, 54-61
services/workflows-service/src/document/document.repository.ts (1)
`35-51`: Consider index optimization for entity ID queries. The `OR` condition on `businessId` and `endUserId` might affect index usage. Consider creating a composite index to optimize this query.

```sql
-- Add composite indexes for both business and end user queries
CREATE INDEX idx_business_workflow ON "Document" ("businessId", "workflowRuntimeDataId", "projectId");
CREATE INDEX idx_enduser_workflow ON "Document" ("endUserId", "workflowRuntimeDataId", "projectId");
```
services/workflows-service/src/document-file/document-file.repository.ts (1)
`7-97`: Consider adding a `findById` method for single record retrieval. The repository handles bulk operations well but lacks a method for retrieving a single document file by ID.

```diff
+  async findById(
+    id: string,
+    projectIds: TProjectId[],
+    args?: Prisma.DocumentFileFindFirstArgs,
+    transaction: PrismaTransactionClient = this.prismaService,
+  ) {
+    return transaction.documentFile.findFirst({
+      ...args,
+      where: {
+        ...args?.where,
+        id,
+        projectId: { in: projectIds },
+      },
+    });
+  }
```
services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts (1)
`66-83`: Add error handling for workflow runtime data retrieval. The workflow runtime data retrieval should include error handling to gracefully handle cases where the data doesn't exist.

```diff
 const workflowRuntimeData = await this.workflowService.getWorkflowRuntimeDataById(
   tokenScope.workflowRuntimeDataId,
   {},
   [tokenScope.projectId],
 );

+if (!workflowRuntimeData) {
+  throw new NotFoundException('Workflow runtime data not found');
+}

 return this.fileService.uploadNewFile(tokenScope.projectId, workflowRuntimeData, {
   ...file,
   mimetype:
     file.mimetype ||
     (
       await getFileMetadata({
         file: file.originalname || '',
         fileName: file.originalname || '',
       })
     )?.mimeType ||
     '',
 });
```
services/workflows-service/src/providers/file/file.service.ts (1)
`362-393`: Enhance file name sanitization. While the current implementation removes file extensions and non-alphabetic characters, consider these improvements:
- Add length limits to prevent extremely long file names
- Preserve some special characters that are commonly used in file names
- Handle potential null/undefined file names more gracefully
```diff
-  const nameWithoutExtension = (file.originalname || randomUUID()).replace(/\.[^.]+$/, '');
-  const alphabeticOnlyName = nameWithoutExtension.replace(/\W/g, '');
+  const MAX_FILENAME_LENGTH = 255;
+  const sanitizeFileName = (fileName: string) => {
+    const nameWithoutExtension = fileName.replace(/\.[^.]+$/, '');
+    // Allow alphanumeric, hyphens, and underscores
+    const sanitized = nameWithoutExtension.replace(/[^a-zA-Z0-9-_]/g, '');
+    return sanitized.slice(0, MAX_FILENAME_LENGTH);
+  };
+
+  const originalName = file.originalname?.trim();
+  const fileName = originalName ? sanitizeFileName(originalName) : randomUUID();
```
services/workflows-service/src/workflow/workflow.service.ts (2)
`25-26`: Address the import cycle dependency. ESLint cycle detection had to be disabled for the `FileService` import, which indicates a potential architectural issue that should be addressed:
- Consider extracting common functionality to break the cycle
- Review if the services can be restructured to avoid circular dependencies
`2529-2552`: Add validation for document pages. The `copyDocumentsPagesFilesAndCreate` method should validate document pages before processing (see the sketch after this list):
- Check for required page properties
- Validate file types and sizes
- Add rate limiting for bulk operations
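A small sketch of the kind of guard this could be; the property names, MIME allow-list, and size limit are assumptions for illustration, not taken from the PR:

```ts
// Hypothetical validation helper for document pages.
const ALLOWED_PAGE_TYPES = ['image/jpeg', 'image/png', 'application/pdf'];
const MAX_PAGE_SIZE_BYTES = 20 * 1024 * 1024;

interface IDocumentPageLike {
  uri?: string;
  provider?: string;
  type?: string;
  size?: number;
}

export const assertValidDocumentPages = (pages: IDocumentPageLike[] = []) => {
  for (const page of pages) {
    if (!page.uri || !page.provider) {
      throw new Error('Document page is missing required properties (uri, provider)');
    }

    if (page.type && !ALLOWED_PAGE_TYPES.includes(page.type)) {
      throw new Error(`Unsupported document page type: ${page.type}`);
    }

    if (page.size && page.size > MAX_PAGE_SIZE_BYTES) {
      throw new Error('Document page exceeds the maximum allowed size');
    }
  }
};
```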
services/workflows-service/prisma/migrations/20250129142137_documents_init/migration.sql (2)
`22-39`: Creation of the Document table. The new `Document` table is defined with the required fields and a primary key constraint.
- Verify that the non-nullable constraints (e.g., on `category`, `type`, `issuingVersion`, and `properties`) align with the domain requirements.
- Consider whether the `updatedAt` field should have a default value or be managed by triggers.
`53-67`: Index creation for performance optimization. The indexes on the `Document` and `DocumentFile` tables (for columns such as `businessId`, `endUserId`, `workflowRuntimeDataId`, `documentId`, and `fileId`) are in place to improve query performance.
Consider reviewing whether additional indexes might be beneficial based on query patterns.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (28)
- services/workflows-service/prisma/data-migrations (1 hunks)
- services/workflows-service/prisma/migrations/20250129142137_documents_init/migration.sql (1 hunks)
- services/workflows-service/prisma/migrations/20250129151319_documents_project/migration.sql (1 hunks)
- services/workflows-service/prisma/migrations/20250129160127_documents_issuing_country/migration.sql (1 hunks)
- services/workflows-service/prisma/migrations/20250202144546_added_document_file_cascade/migration.sql (1 hunks)
- services/workflows-service/prisma/schema.prisma (6 hunks)
- services/workflows-service/scripts/generate-end-user.ts (0 hunks)
- services/workflows-service/src/app.module.ts (2 hunks)
- services/workflows-service/src/business/business.controller.external.ts (0 hunks)
- services/workflows-service/src/business/business.controller.ts (0 hunks)
- services/workflows-service/src/collection-flow/collection-flow.service.ts (0 hunks)
- services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts (3 hunks)
- services/workflows-service/src/document-file/document-file.module.ts (1 hunks)
- services/workflows-service/src/document-file/document-file.repository.ts (1 hunks)
- services/workflows-service/src/document-file/document-file.service.ts (1 hunks)
- services/workflows-service/src/document-file/dtos/document-file.dto.ts (1 hunks)
- services/workflows-service/src/document/document.controller.external.ts (1 hunks)
- services/workflows-service/src/document/document.module.ts (1 hunks)
- services/workflows-service/src/document/document.repository.ts (1 hunks)
- services/workflows-service/src/document/document.service.ts (1 hunks)
- services/workflows-service/src/document/dtos/document.dto.ts (1 hunks)
- services/workflows-service/src/providers/file/file-service.module.ts (0 hunks)
- services/workflows-service/src/providers/file/file.module.ts (1 hunks)
- services/workflows-service/src/providers/file/file.service.ts (3 hunks)
- services/workflows-service/src/storage/storage.module.ts (1 hunks)
- services/workflows-service/src/storage/storage.service.ts (2 hunks)
- services/workflows-service/src/workflow/workflow.module.ts (2 hunks)
- services/workflows-service/src/workflow/workflow.service.ts (1 hunks)
💤 Files with no reviewable changes (5)
- services/workflows-service/src/providers/file/file-service.module.ts
- services/workflows-service/src/business/business.controller.ts
- services/workflows-service/scripts/generate-end-user.ts
- services/workflows-service/src/collection-flow/collection-flow.service.ts
- services/workflows-service/src/business/business.controller.external.ts
✅ Files skipped from review due to trivial changes (1)
- services/workflows-service/prisma/data-migrations
🔇 Additional comments (28)
services/workflows-service/src/document/document.service.ts (1)
`24-63`: Good initial validations. The checks ensuring either a business ID or an end-user ID, but not both, along with verifying `workflowRuntimeDataId`, demonstrate robust input validation and prevent common edge cases.
services/workflows-service/src/document-file/document-file.module.ts (1)
`1-12`: Module design looks good. The module properly imports `PrismaModule` and provides `DocumentFileService` and `DocumentFileRepository`. If other modules need access to `DocumentFileRepository`, you might also export it, but this setup is sufficient if you only need to expose the service.
services/workflows-service/src/document/document.module.ts (1)
`1-16`: LGTM! Well-structured module following NestJS best practices. The module is properly organized with clear separation of concerns and follows the dependency injection pattern. The imports, controllers, providers, and exports are correctly defined.
services/workflows-service/src/providers/file/file.module.ts (2)
`1-20`: LGTM! Well-structured module following NestJS best practices. The module is properly organized with clear separation of concerns. The imports, providers, and exports are correctly defined.
`12-12`: Address the TODO comment about the HTTP retry mechanism. The HTTP module requires configuration for retry mechanisms. This is important for handling transient failures in HTTP calls.
Would you like me to help implement the HTTP retry configuration for the `@nestjs/axios` module?
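One way to address the TODO, shown as a sketch rather than the module's own feature: `@nestjs/axios` exposes the underlying Axios instance via `HttpService.axiosRef`, and the community `axios-retry` package can be attached to it. The retry counts, delays, and the wrapper class below are illustrative assumptions, not code from this PR.

```ts
import { Injectable } from '@nestjs/common';
import { HttpService } from '@nestjs/axios';
import axiosRetry from 'axios-retry';

// Sketch: attach retry behaviour to the shared Axios instance used by HttpService.
@Injectable()
export class HttpRetryConfigurator {
  constructor(private readonly httpService: HttpService) {
    axiosRetry(this.httpService.axiosRef, {
      retries: 3,
      retryDelay: axiosRetry.exponentialDelay,
      // Retry only network errors and 5xx responses.
      retryCondition: error =>
        axiosRetry.isNetworkOrIdempotentRequestError(error) ||
        (error.response?.status ?? 0) >= 500,
    });
  }
}
```

services/workflows-service/src/document/dtos/document.dto.ts (1)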
4-18
: LGTM! Well-structured document schema. The schema is comprehensive and uses appropriate types for each field.
services/workflows-service/src/document-file/dtos/document-file.dto.ts (1)
5-13
: LGTM! Type-safe schema definition using TypeBox. The schema properly defines all required fields with appropriate types and enum validations.
services/workflows-service/src/document-file/document-file.service.ts (1)
7-62
: LGTM! Well-structured service with proper separation of concerns. The service properly delegates to the repository and consistently handles project ID scoping.
services/workflows-service/src/document/document.repository.ts (1)
7-86
: LGTM! Well-implemented repository with proper transaction support. The repository correctly implements CRUD operations with proper project ID scoping and transaction support.
services/workflows-service/src/document-file/document-file.repository.ts (1)
7-97
: LGTM! Well-structured repository with consistent patterns. The repository implements CRUD operations with proper project ID scoping and transaction support.
services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts (1)
34-35
: LGTM! Good separation of concerns. The addition of `FileService` and `WorkflowService` improves the architecture by properly separating file handling and workflow management responsibilities.
services/workflows-service/src/workflow/workflow.module.ts (1)
47-48
: LGTM! Proper module dependency management. The addition of `FileModule` with `forwardRef` correctly handles potential circular dependencies, and the `FileRepository` is properly registered as a provider.
Also applies to: 60-60
services/workflows-service/src/app.module.ts (1)
52-52
: LGTM! Clean module integration. The `DocumentModule` is properly integrated into the application's module imports.
Also applies to: 137-137
services/workflows-service/src/providers/file/file.service.ts (1)
372-374
: LGTM! Good error handling. The error handling for missing `entityId` is appropriate and provides a clear error message.
services/workflows-service/prisma/migrations/20250129151319_documents_project/migration.sql (1)
15-18
: Review the ON DELETE RESTRICT constraint. The RESTRICT constraint on project deletion may cause issues if you need to delete projects in the future. Consider using ON DELETE CASCADE if documents should be deleted with their projects, or keep RESTRICT if preserving document history is critical.
services/workflows-service/prisma/migrations/20250202144546_added_document_file_cascade/migration.sql (1)
8-11
: LGTM! Good improvement to referential integrity. The change to CASCADE for both DELETE and UPDATE operations is a good improvement that will:
- Automatically clean up document files when documents are deleted
- Maintain referential integrity during document updates
services/workflows-service/prisma/migrations/20250129142137_documents_init/migration.sql (5)
1-6
: Warning block clarity and safety precaution. The introductory comment clearly warns about dropping the `documents` column on the `Business` table. Ensure that proper data backup procedures are in place before running this migration.
7-18
: ENUM definitions consistency. The creation of the ENUM types (`DocumentStatus`, `DocumentDecision`, `DocumentFileType`, and `DocumentFileVariant`) looks correct. Make sure these values remain in sync with the corresponding Prisma schema enums and any application logic that relies on them.
19-21
: Dropping the `documents` column. The `ALTER TABLE "Business" DROP COLUMN "documents";` statement is deliberate given the schema redesign. Please double-check that all legacy references to this column have been removed or migrated appropriately in the application code.
41-51
: Creation of the DocumentFile table. The `DocumentFile` table is set up with the necessary fields and a primary key. The definition appears clear and matches the intended design for handling document-related files.
68-82
: Foreign key constraints and referential integrity. The foreign key constraints ensure referential integrity:
- For the `Document` table, foreign keys reference `Business`, `EndUser`, and `WorkflowRuntimeData` with `ON DELETE SET NULL` and `ON UPDATE CASCADE`.
- For the `DocumentFile` table, foreign keys reference `Document` and `File` with `ON DELETE RESTRICT`, which prevents accidental deletion of important records.

Ensure that the chosen cascading behaviors are in line with the business logic.
services/workflows-service/prisma/schema.prisma (7)
86-91
: Addition of the Document relation in the EndUser model. The new field `documents Document[]` has been added to the `EndUser` model. This creates a one-to-many relationship between an end user and associated documents.
Ensure that all related operations (queries and mutations) in the application are updated to honor this new relation.
142-142
: Addition of the Document relation in the Business model. The `documents Document[]` field added to the `Business` model enables tracking of business-related documents.
Double-check that the removal of the legacy `documents` column (in the migration file) is consistently reflected here.
955-988
: New Document model definition. The new `Document` model accurately mirrors the migration's structure:
- All necessary fields such as `id`, `category`, `type`, `issuingVersion`, `issuingCountry`, `version`, `status`, `decision`, and `properties` are defined.
- It includes relation fields to `Business`, `EndUser`, `WorkflowRuntimeData`, and `Project`.
- The inclusion of the relation field `files DocumentFile[]` properly sets up the link to file records.

Review that non-null constraints (e.g., on `issuingCountry`) are appropriate for all use cases.
990-1013
: Enums for document status and decision. The enums `DocumentStatus`, `DocumentDecision`, `DocumentFileType`, and `DocumentFileVariant` in the Prisma schema match those in the migration script. Consistency between the database layer and the ORM is maintained.
1014-1031
: New DocumentFile model definition. The `DocumentFile` model is defined with its relations:
- It sets up fields for `id`, `type`, `variant`, `page`, `documentId`, and `fileId`.
- The foreign key constraints use `onDelete: Cascade` for both related `Document` and `File` records, ensuring that deletion of a parent properly cleans up child records.
Ensure that this cascading behavior aligns with the expected business logic.
307-308
: Update to the File model: DocumentFile relation. The addition of the `documentFiles DocumentFile[]` field in the `File` model establishes a bidirectional relationship between files and their associations with documents.
Verify that any file handling logic is updated to reflect this new relation.
416-418
: Project model enhancements. The `Project` model now includes `documents Document[]` and `documentFiles DocumentFile[]` fields. This integration ensures that projects can directly access associated documents and document files.
Make sure that related business logic (e.g., in controllers or services) is updated accordingly to accommodate these new relations.
```diff
@@ -1,4 +1,4 @@
-import { Injectable } from '@nestjs/common';
+import { Injectable, NotFoundException } from '@nestjs/common';
```
Remove unused imports.
The following imports are not used in the code:
- `NotFoundException` from `@nestjs/common` (line 1): the code uses custom errors from `@/errors`
- `randomUUID` from `crypto` (line 21)
Apply this diff to remove the unused imports:
-import { Injectable, NotFoundException } from '@nestjs/common';
+import { Injectable } from '@nestjs/common';
-import { randomUUID } from 'crypto';
Also applies to: 21-21
```ts
const uploadedFile = await this.fileService.uploadNewFile(projectId, workflowRuntimeData, {
  ...file,
  mimetype:
    file.mimetype ||
    (
      await getFileMetadata({
        file: file.originalname || '',
        fileName: file.originalname || '',
      })
    )?.mimeType ||
    '',
});
const document = await this.repository.create(
```
🛠️ Refactor suggestion
Handle partial failures during file upload.
The logic uploads a file and then creates document records. If the upload fails or partially completes, the code throws exceptions. Consider wrapping the upload and record creation in a single database transaction (if feasible) to avoid inconsistent states in case of partial failures or scenarios that require a rollback.
- Added the required column `projectId` to the `Document` table without a default value. This is not possible if the table is not empty.
- Added the required column `projectId` to the `DocumentFile` table without a default value. This is not possible if the table is not empty.
Consider data migration strategy for existing records.
The addition of required columns without default values will fail if the tables contain data. Consider:
- First adding the columns as nullable
- Migrating existing data with appropriate project IDs
- Then adding the NOT NULL constraint
/*
  Warnings:

  - Added the required column `issuingCountry` to the `Document` table without a default value. This is not possible if the table is not empty.
Consider data migration strategy for issuingCountry.
Similar to the projectId column, adding a required column without a default value will fail if the table contains data. Consider:
- Adding the column as nullable
- Migrating existing data with appropriate country codes
- Then adding the NOT NULL constraint
Actionable comments posted: 1
🔭 Outside diff range comments (1)
apps/backoffice-v2/src/lib/blocks/hooks/useManageUbosBlock/ubos-form-json-definition.ts (1)
73-76
: Add validation for ownership percentage. The ownership percentage field lacks min/max validation, which could allow invalid values.
Add validation rules:
```diff
 jsonFormDefinition: {
   type: 'number',
+  minimum: 0,
+  maximum: 100,
 },
```
🧹 Nitpick comments (2)
services/workflows-service/src/document/document.service.ts (2)
130-133
: Improve type safety for document files inclusion. The type assertion `as Array<Document & { files: DocumentFile[] }>` assumes files are always included. This could lead to runtime errors if the include option is modified. Consider adding a type guard:
```ts
const hasFiles = (doc: Document): doc is Document & { files: DocumentFile[] } => {
  return 'files' in doc && Array.isArray(doc.files);
};

const documentsWithFiles = await this.fetchDocumentsFiles({
  documents: documents.filter(hasFiles),
  format: 'signed-url',
});
```
147-155
: Optimize document retrieval after update. The method fetches all documents after updating one. This could be inefficient when there are many documents.
Consider fetching only the updated document:
- const documents = await this.repository.findMany( + const documents = await this.repository.findByIds( + [id], projectIds, { include: { files: true, }, }, transaction, );
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- apps/backoffice-v2/src/lib/blocks/hooks/useManageUbosBlock/ubos-form-json-definition.ts (2 hunks)
- services/workflows-service/src/document-file/document-file.repository.ts (1 hunks)
- services/workflows-service/src/document-file/document-file.service.ts (1 hunks)
- services/workflows-service/src/document/document.controller.external.ts (1 hunks)
- services/workflows-service/src/document/document.service.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- services/workflows-service/src/document-file/document-file.service.ts
⏰ Context from checks skipped due to timeout of 90000ms (7)
- GitHub Check: test_windows
- GitHub Check: test_linux
- GitHub Check: build (windows-latest)
- GitHub Check: spell_check
- GitHub Check: Analyze (javascript)
- GitHub Check: build (ubuntu-latest)
- GitHub Check: lint
🔇 Additional comments (2)
services/workflows-service/src/document/document.service.ts (1)
71-103
: Handle partial failures during file upload and document creation. The file upload and document creation operations are not wrapped in a single transaction. If the document creation fails after a successful file upload, it could lead to orphaned files.
Consider wrapping both operations in a transaction:
```diff
-  const uploadedFile = await this.fileService.uploadNewFile(projectId, workflowRuntimeData, {
+  return await this.prismaService.$transaction(async transaction => {
+    const uploadedFile = await this.fileService.uploadNewFile(projectId, workflowRuntimeData, {
       ...file,
       mimetype:
         file.mimetype ||
         (
           await getFileMetadata({
             file: file.originalname || '',
             fileName: file.originalname || '',
           })
         )?.mimeType ||
         '',
-  });
-  const document = await this.repository.create(
+    });
+    const document = await this.repository.create(
       {
         ...data,
         ...(data.businessId && { businessId: data.businessId }),
         ...(data.endUserId && { endUserId: data.endUserId }),
         projectId,
       },
       args,
-      transaction,
+      transaction
     );
-    await this.documentFileService.create(
+    await this.documentFileService.create(
       {
         documentId: document.id,
         fileId: uploadedFile.id,
         projectId,
         ...metadata,
       },
       undefined,
-      transaction,
+      transaction
-    );
+    );
+    return document;
+  });
```
services/workflows-service/src/document-file/document-file.repository.ts (1)
1-95
: LGTM! The repository is well-structured with proper transaction support and clear separation of concerns.
```ts
FileInterceptor('file', {
  storage: getDiskStorage(),
  limits: {
    files: 1,
  },
  fileFilter,
}),
```
Add file type validation for uploads.
The file interceptor uses `fileFilter`, but there's no explicit validation of allowed file types. This could be a security risk.
Add MIME type validation:
FileInterceptor('file', {
storage: getDiskStorage(),
limits: {
files: 1,
},
+ fileFilter: (req, file, callback) => {
+ const allowedMimes = ['image/jpeg', 'image/png', 'application/pdf'];
+ if (!allowedMimes.includes(file.mimetype)) {
+ callback(new UnprocessableEntityException('Invalid file type'), false);
+ return;
+ }
+ callback(null, true);
+ },
}),
Committable suggestion skipped: line range outside the PR's diff.
Actionable comments posted: 22
🔭 Outside diff range comments (1)
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/create-or-update-fileid-or-file-in-documents/create-or-update-fileid-or-file-in-documents.unit.test.ts (1)
30-30
: ⚠️ Potential issue: Remove @ts-ignore comments.
Multiple @ts-ignore comments suggest underlying type issues that should be addressed rather than ignored.
Consider:
- Using proper type assertions with type guards
- Updating the types to match the actual structure
- Adding proper null checks in the implementation
Also applies to: 47-47, 64-64, 74-74, 116-116
🧹 Nitpick comments (71)
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.ts (1)
19-19
: Remove the debugging console.log statement. The console.log statement appears to be debugging code that should not be committed to production.
- console.log('filePath', filePath);
Consider implementing proper logging if this information is needed for debugging in production.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityFieldGroupType/useEntityFieldGroupType.ts (1)
7-12
: Consider adding TypeScript strict null checks. The error handling is well-implemented with a descriptive message. To further enhance type safety, consider enabling TypeScript's strict null checks and explicitly typing the context.
- const context = useContext(EntityFieldContext); + const context = useContext<EntityFieldContextType>(EntityFieldContext); if (!context) {packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.ts (1)
31-32
: Improve error handling for missing upload settings. The error handling could be more informative for developers.
Consider adding more descriptive error messages:
- console.log('Failed to upload, no upload settings provided'); + console.warn('File upload failed: httpParams.createDocument is required for file uploads');Also applies to: 33-38
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/remove-document-from-list-by-template-id/remove-document-from-list-by-template-id.unit.test.ts (2)
7-30
: Consider reducing test data duplication and enhancing test coverage. While the test cases provide good coverage for the basic functionality, there are opportunities for improvement:
- The document test data is duplicated across test cases
- The
properties
field is always empty- Missing edge cases for the new fields (e.g., different versions, categories, types)
Consider extracting the test data to a helper:
const createTestDocument = (id: string): IDocumentTemplate => ({ id, category: 'test', type: 'test', issuer: { country: 'test', }, version: 1, issuingVersion: 1, properties: {}, }); // Usage in tests const documents = [ createTestDocument('doc1'), createTestDocument('doc2'), ] as IDocumentTemplate[];Additionally, consider adding test cases for:
- Documents with different categories/types
- Documents with non-empty properties
- Documents with different versions
Also applies to: 39-62, 83-117
34-35
: Enhance assertions to verify document property integrity. The current assertions only verify array length and document IDs. Consider adding assertions to verify that all properties of the remaining documents are preserved after removal operations.
Example enhancement for the first test case:
expect(result).toHaveLength(1); expect(result[0]).toEqual({ id: 'doc2', category: 'test', type: 'test', issuer: { country: 'test', }, version: 1, issuingVersion: 1, properties: {}, });Also applies to: 66-67, 73-74, 79-80, 121-123
packages/ui/src/components/organisms/Form/DynamicForm/_stories/FileUploadShowcase/FileUploadShowcase.tsx (1)
16-17
: Consider using environment variables for URLs. The showcase currently uses hardcoded localhost URLs. While acceptable for demo purposes, consider using environment variables to make the component more configurable and environment-aware.
- url: 'http://localhost:3000/upload', + url: process.env.NEXT_PUBLIC_UPLOAD_URL || 'http://localhost:3000/upload',Also applies to: 30-31, 48-49, 67-68
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.unit.test.ts (3)
22-35
: Consider using a type-safe approach for mock data.Instead of using
as unknown as IEntityFieldGroupParams
, consider creating a proper mock factory function that returns a correctly typed object.- } as unknown as IEntityFieldGroupParams, + params: { + httpParams: { + deleteEntity: { + url: 'http://test.com', + }, + }, + type: 'entityFieldGroup', + } satisfies IEntityFieldGroupParams,
50-54
: Enhance the crypto mock setup. Consider adding cleanup in `afterEach` to restore the original crypto object and prevent potential test pollution.
+ afterEach(() => { + // Restore the original crypto object + delete (window as any).crypto; + });
116-146
: Consider adding error handling test cases. The "when entity is created" suite could benefit from additional test cases:
- Testing error handling when the delete request fails
- Verifying loading state during deletion
Here's a suggested test case for error handling:
it('should handle delete request failure', async () => { const deleteEntitySpy = vi.fn().mockRejectedValue(new Error('Delete failed')); const mockOnChange = vi.fn(); const mockEntities = [ { __id: '1', ballerineEntityId: '1', name: 'Entity 1' }, ]; vi.mocked(useHttp).mockReturnValue({ run: deleteEntitySpy, isLoading: false, } as unknown as ReturnType<typeof useHttp>); vi.mocked(useField).mockReturnValue({ onChange: mockOnChange, value: mockEntities, } as unknown as ReturnType<typeof useField>); const { result } = renderHook(() => useEntityFieldGroupList({ element: mockElement })); await expect(result.current.removeItem('1')).rejects.toThrow('Delete failed'); expect(mockOnChange).not.toHaveBeenCalled(); });packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.unit.test.ts (2)
7-31
: Consider adding a type assertion for mock data. While the mock data structure is comprehensive and well-organized, consider adding an explicit type assertion to ensure type safety.
- const mockElement: IFormElement = { + const mockElement = { id: 'test', element: 'test', valueDestination: 'test', children: [ // ... existing children ], disable: [], - }; + } as const satisfies IFormElement;
33-104
: Consider adding tests for additional edge cases. The current test coverage is good, but consider adding tests for:
- Deeply nested children (3+ levels deep)
- Multiple existing disable rules
- Invalid disable rule formats
Example test case for multiple disable rules:
it('should handle multiple existing disable rules', () => { const elementWithMultipleDisable: IFormElement = { id: 'test', element: 'test', valueDestination: 'test', children: [{ id: 'child', element: 'test', valueDestination: 'test.child', disable: [ { engine: 'json-logic', value: { '===': ['test1', 'test1'] }, }, { engine: 'json-logic', value: { '===': ['test2', 'test2'] }, } ], }], }; const { result } = renderHook(() => useChildrenDisabledOnLock(elementWithMultipleDisable, true), ); expect(result.current?.[0]?.disable).toEqual([ { engine: 'json-logic', value: { '===': ['test1', 'test1'] }, }, { engine: 'json-logic', value: { '===': ['test2', 'test2'] }, }, { engine: 'json-logic', value: { '==': [1, 1] }, } ]); });packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-documents-creation-payload.ts (4)
21-25
: Consider strengthening type safety. The function signature could benefit from more specific types:
- Replace
any
inIFormElement<any, IEntityFieldGroupParams>
with a more specific type.- Consider replacing
AnyObject
with a more specific type that defines the expected structure of the context.
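A small sketch of the direction this could take; the interfaces below are invented for illustration and would need to match the real form schema:

```ts
// Hypothetical narrowed types replacing the `any` / `AnyObject` usages.
export interface IEntityLike {
  __id: string;
  ballerineEntityId?: string;
  documents?: unknown[];
}

export interface IEntityGroupContext {
  entities: IEntityLike[];
  [key: string]: unknown;
}

// The signature could then become (remaining types imported from the existing module):
// export const buildDocumentsCreationPayload = (
//   element: IFormElement<IEntityLike[], IEntityFieldGroupParams>,
//   context: IEntityGroupContext,
//   dependencies: IDocumentCreationDependencies,
// ): IDocumentCreationResult[] => { ... };
```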
41-44
: Enhance error logging for missing template. The warning message could be more informative by including additional context such as the document element ID or name.
- console.warn('No template found for document field', documentElement); + console.warn('No template found for document field', { + documentId: documentElement.id, + fieldName: documentElement.name, + valueDestination: documentElement.valueDestination, + });
37-68
: Consider refactoring nested loops for better readability. The nested loop structure could be simplified by extracting the document processing logic into a separate function. This would improve readability and maintainability.
+const processDocumentElement = ( + documentElement: IFormElement<any, IDocumentFieldParams>, + context: AnyObject, + entityIndex: number, + stack: TDeepthLevelStack, + entityId: string, +): IDocumentCreationResult | null => { + if (!documentElement?.params?.template) { + console.warn('No template found for document field', documentElement); + return null; + } + + const documentDestination = formatValueDestination(documentElement.valueDestination, [ + ...(stack || []), + entityIndex, + ]); + + const documentFile = getFileOrFileIdFromDocumentsList( + get(context, documentDestination), + documentElement, + ); + + if (!documentFile || !(documentFile instanceof File)) { + return null; + } + + const payload = buildDocumentFormData(documentElement, { entityId }, documentFile); + + return { + payload, + documentDefinition: documentElement, + valueDestination: documentDestination, + }; +}; export const buildDocumentsCreationPayload = ( element: IFormElement<any, IEntityFieldGroupParams>, context: AnyObject, dependencies: IDocumentCreationDependencies, ): IDocumentCreationResult[] => { const documentElements = (element.children?.filter(child => child.element === 'documentfield') || []) as Array<IFormElement<any, IDocumentFieldParams>>; if (!documentElements?.length) { return []; } const { entityId, stack } = dependencies; - const documentPayload: IDocumentCreationResult[] = []; const entities = get(context, element.valueDestination, []); - // Outer loop for correct index calculation - for (let entityIndex = 0; entityIndex < entities.length; entityIndex++) { - // Inner loop for document elements, each entity can have multiple document fields - for (const documentElement of documentElements) { - if (!documentElement?.params?.template) { - console.warn('No template found for document field', documentElement); - continue; - } - - const documentDestination = formatValueDestination(documentElement.valueDestination, [ - ...(stack || []), - entityIndex, - ]); - - const documentFile = getFileOrFileIdFromDocumentsList( - get(context, documentDestination), - documentElement, - ); - - if (!documentFile || !(documentFile instanceof File)) { - continue; - } - - const payload = buildDocumentFormData(documentElement, { entityId }, documentFile); - - documentPayload.push({ - payload, - documentDefinition: documentElement, - valueDestination: documentDestination, - }); - } - } - - return documentPayload; + return entities.flatMap((_, entityIndex) => + documentElements + .map(documentElement => + processDocumentElement(documentElement, context, entityIndex, stack, entityId) + ) + .filter((result): result is IDocumentCreationResult => result !== null) + ); };
56-58
: Enhance file type validation. The file instance check could be more robust by using a type guard function.
+const isValidFile = (file: unknown): file is File => { + return file instanceof File && file.size > 0; +}; - if (!documentFile || !(documentFile instanceof File)) { + if (!documentFile || !isValidFile(documentFile)) { continue; }packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useValues/useValues.ts (1)
37-37
: Improve type safety by avoiding 'any'. Consider using a more specific type union instead of 'any' for better type safety.
- let newParentValue: any; + let newParentValue: unknown[] | Record<string, unknown> | undefined;apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.ts (1)
9-11
: Add JSDoc documentation for the interface. Consider adding documentation to explain the purpose of the interface and the expected format of the transform string.
+/** + * Parameters for the sync plugin + * @interface ISyncPluginParams + * @property {string} [transform] - Optional JSONata transformation expression to apply to the context + */ export interface ISyncPluginParams { transform?: string; }packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.ts (1)
27-48
: Add error handling for jsonata evaluation and improve logging. Two improvements are needed:
- Add try-catch for jsonata evaluation
- Use a more descriptive logging message for missing default value
const addItem = useCallback(async () => { let initialValue = { __id: crypto.randomUUID(), }; const expression = element.params?.defaultValue; if (!expression) { - console.log('Default value is missing for', element.id); + console.warn(`Default value expression is missing for field "${element.id}"`); onChange([...(value || []), initialValue]); return; } + try { const result = await jsonata(expression).evaluate(values); + initialValue = { + ...initialValue, + ...result, + }; + } catch (error) { + console.error(`Failed to evaluate default value expression for field "${element.id}":`, error); + initialValue = { + ...initialValue, + }; + } - initialValue = { - ...initialValue, - ...result, - }; onChange([...(value || []), initialValue]); }, [value, values, onChange, element.params?.defaultValue, element.id]);packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/useEntitySync.ts (4)
49-52
: Add cleanup for debounced function to avoid memory leaks.
debouncedSync
can still run after the component unmounts, which can cause state updates on an unmounted component. Consider callingdebouncedSync.cancel()
within a cleanup function inuseEffect
to prevent unwanted side effects.
77-81
: Improve error handling flow.Currently, on failure to sync the entity, the code only logs the error and shows a toast. Consider adding retry logic or more robust error handling to ensure data integrity.
116-116
: Evaluate performance impact of structuredClone.Using
structuredClone
can be expensive for large or deeply nested objects. If performance becomes an issue, consider using a more selective cloning approach or a specialized utility.
128-128
: Remove or replace console.log in production environments.Relying on console.log for debugging can lead to clutter in production. Using a dedicated logging solution or removing this statement can help keep logs clean.
packages/ui/src/common/hooks/useHttp/types.ts (1)
1-7
: Consider enhancing type safety with additional type parameters.The interface could benefit from additional type parameters to improve type safety:
-export interface IHttpParams { +export interface IHttpParams<TBody = any, TResponse = any> { url: string; resultPath: string; headers?: Record<string, string>; method?: 'POST' | 'PUT' | 'GET' | 'DELETE'; timeout?: number; + body?: TBody; + params?: Record<string, string>; + response?: TResponse; }packages/ui/src/common/hooks/useHttp/utils/format-headers.ts (1)
3-15
: Add type guards for non-string header values. The function assumes all header values are strings, but they could be numbers or other types.
export const formatHeaders = ( headers: Record<string, string>, metadata: Record<string, string> = {}, ) => { const formattedHeaders: Record<string, string> = {}; Object.entries(headers).forEach(([key, value]) => { + if (typeof value !== 'string') { + formattedHeaders[key] = String(value); + return; + } + const formattedValue = formatString(value, metadata); formattedHeaders[key] = formattedValue; }); return formattedHeaders; };packages/ui/src/common/hooks/useHttp/utils/request.ts (2)
7-7
: Fix typo in type nameTReuqestParams
.The type name contains a typo and should be
TRequestParams
.-export type TReuqestParams = Omit<IHttpParams, 'resultPath'>; +export type TRequestParams = Omit<IHttpParams, 'resultPath'>;
9-38
: Consider enhancing request functionality with cancellation and retry support. The request utility could benefit from additional features:
- Add request cancellation support using
AbortController
.- Add retry logic for failed requests.
- Add response type validation.
Example implementation:
export const request = async ( request: TRequestParams, metadata: AnyObject = {}, data?: any, + signal?: AbortSignal, + retryCount = 3, params?: AnyObject, ) => { const { url: _url, headers = {}, method, timeout = 5000 } = request; const formattedUrl = formatString(_url, { ...metadata, ...params }); const formattedHeaders = formatHeaders(headers, metadata); + let attempts = 0; + + const makeRequest = async () => { try { const config = { url: formattedUrl, method, headers: formattedHeaders, data, timeout, + signal, }; const response = await axios(config); return response.data; } catch (error) { + if (error.name === 'AbortError') { + throw error; + } + + attempts++; + + if (attempts < retryCount) { + return makeRequest(); + } + console.error('Failed to perform request.', error); throw error; } + }; + + return makeRequest(); };packages/ui/src/common/hooks/useHttp/useHttp.ts (1)
7-50
: Consider enhancing the hook with request lifecycle management. The hook could benefit from additional features:
- Add request cancellation on unmount.
- Add request debouncing/throttling.
- Add request caching.
Example implementation:
+import { useEffect, useRef } from 'react'; +import debounce from 'lodash/debounce'; export const useHttp = (params: IHttpParams, metadata: AnyObject) => { const [responseError, setResponseError] = useState<Error | null>(null); const [isLoading, setIsLoading] = useState(false); + const abortController = useRef<AbortController>(); + const cache = useRef<Map<string, any>>(new Map()); + useEffect(() => { + return () => { + abortController.current?.abort(); + }; + }, []); + const getCacheKey = (payload?: any, params?: AnyObject) => { + return JSON.stringify({ ...params, payload, metadata }); + }; const runRequest = useCallback( async ( requestPayload?: any, other?: { params?: AnyObject; + useCache?: boolean; + debounceMs?: number; }, ) => { + const cacheKey = getCacheKey(requestPayload, other?.params); + + if (other?.useCache && cache.current.has(cacheKey)) { + return cache.current.get(cacheKey); + } + + abortController.current?.abort(); + abortController.current = new AbortController(); setIsLoading(true); setResponseError(null); try { const response = await request( { ...params, url: params.url, }, metadata, requestPayload, + abortController.current.signal, other?.params, ); + if (other?.useCache) { + cache.current.set(cacheKey, response); + } return params.resultPath ? get(response, params.resultPath) : response; } catch (error) { + if (error.name === 'AbortError') { + return; + } console.error(error); setResponseError(error as Error); throw error; } finally { setIsLoading(false); } }, [params, metadata], ); + const debouncedRunRequest = useCallback( + (debounceMs: number) => + debounce(runRequest, debounceMs, { leading: true, trailing: false }), + [runRequest], + ); return { isLoading, error: responseError, run: runRequest, + runDebounced: debouncedRunRequest, }; };packages/ui/src/common/hooks/useHttp/utils/format-headers.unit.test.ts (1)
11-58
: Add test cases for edge cases. The test suite is well-structured but could benefit from additional test cases to improve coverage:
- Test handling of null/undefined headers
- Test handling of non-string header values (e.g., numbers)
- Test handling of special characters in header values
Here's an example of additional test cases:
+ it('should handle null/undefined headers', () => { + expect(formatHeaders(null)).toEqual({}); + expect(formatHeaders(undefined)).toEqual({}); + expect(mockedFormatString).not.toHaveBeenCalled(); + }); + + it('should handle non-string header values', () => { + const headers = { + 'X-Number': 123, + 'X-Boolean': true, + }; + + mockedFormatString + .mockReturnValueOnce('123') + .mockReturnValueOnce('true'); + + const result = formatHeaders(headers); + + expect(result).toEqual({ + 'X-Number': '123', + 'X-Boolean': 'true', + }); + }); + + it('should handle special characters in header values', () => { + const headers = { + 'X-Special': 'test{value}with{special}chars', + }; + + mockedFormatString.mockReturnValueOnce('testvaluewithspecialchars'); + + const result = formatHeaders(headers); + + expect(result).toEqual({ + 'X-Special': 'testvaluewithspecialchars', + }); + });packages/ui/src/common/hooks/useHttp/utils/request.unit.test.ts (1)
11-94
: Enhance test coverage with additional scenarios. The test suite should include additional test cases to cover more HTTP scenarios:
- Different HTTP methods (PUT, DELETE, etc.)
- Request timeouts
- Different response status codes
- Request cancellation
Here's an example of additional test cases:
+ it('should handle different HTTP methods', async () => { + const methods = ['PUT', 'DELETE', 'PATCH'] as const; + const mockResponse = { data: { result: 'success' } }; + + for (const method of methods) { + const requestParams = { + url: 'http://api.example.com/test', + method, + headers: {}, + } as const; + + mockFormatString.mockReturnValue('http://api.example.com/test'); + mockFormatHeaders.mockReturnValue({}); + mockAxios.mockResolvedValueOnce(mockResponse); + + await request(requestParams, {}); + + expect(mockAxios).toHaveBeenCalledWith(expect.objectContaining({ + method, + })); + } + }); + + it('should handle request timeout', async () => { + const requestParams = { + url: 'http://api.example.com/test', + method: 'GET', + headers: {}, + timeout: 1000, + } as const; + + mockFormatString.mockReturnValue('http://api.example.com/test'); + mockFormatHeaders.mockReturnValue({}); + mockAxios.mockRejectedValueOnce(new Error('timeout')); + + const consoleSpy = vi.spyOn(console, 'error'); + + await expect(request(requestParams, {})).rejects.toThrow('timeout'); + expect(consoleSpy).toHaveBeenCalled(); + }); + + it('should handle different response status codes', async () => { + const requestParams = { + url: 'http://api.example.com/test', + method: 'GET', + headers: {}, + } as const; + + mockFormatString.mockReturnValue('http://api.example.com/test'); + mockFormatHeaders.mockReturnValue({}); + mockAxios.mockRejectedValueOnce({ + response: { + status: 404, + data: { message: 'Not found' }, + }, + }); + + await expect(request(requestParams, {})).rejects.toMatchObject({ + response: { + status: 404, + }, + }); + }); + + it('should handle request cancellation', async () => { + const requestParams = { + url: 'http://api.example.com/test', + method: 'GET', + headers: {}, + } as const; + + mockFormatString.mockReturnValue('http://api.example.com/test'); + mockFormatHeaders.mockReturnValue({}); + mockAxios.mockRejectedValueOnce({ + name: 'CanceledError', + message: 'Request canceled', + }); + + await expect(request(requestParams, {})).rejects.toMatchObject({ + name: 'CanceledError', + }); + });packages/ui/src/common/hooks/useHttp/useHttp.unit.test.ts (1)
10-149: Add test cases for advanced hook scenarios.
The test suite should include additional test cases to cover advanced hook functionality:
- Concurrent requests handling
- Request cancellation on component unmount
- Request retry logic
- Request debouncing
Here's an example of additional test cases:
```diff
+  it('should handle concurrent requests', async () => {
+    const mockResponse1 = { data: { items: ['item1'] } };
+    const mockResponse2 = { data: { items: ['item2'] } };
+
+    vi.mocked(request)
+      .mockImplementationOnce(() => new Promise(resolve => setTimeout(() => resolve(mockResponse1), 100)))
+      .mockImplementationOnce(() => new Promise(resolve => setTimeout(() => resolve(mockResponse2), 50)));
+
+    const { result, rerender } = renderHook(() => useHttp(mockParams, mockMetadata));
+
+    const promise1 = result.current.run();
+    const promise2 = result.current.run();
+
+    rerender();
+    expect(result.current.isLoading).toBe(true);
+
+    const [response1, response2] = await Promise.all([promise1, promise2]);
+
+    expect(response1).toEqual(['item1']);
+    expect(response2).toEqual(['item2']);
+  });
+
+  it('should cancel request on unmount', async () => {
+    vi.mocked(request).mockImplementationOnce(
+      () =>
+        new Promise(resolve => {
+          setTimeout(() => resolve(mockResponse), 100);
+        }),
+    );
+
+    const { result, unmount } = renderHook(() => useHttp(mockParams, mockMetadata));
+
+    const promise = result.current.run();
+    unmount();
+
+    await expect(promise).rejects.toThrow('Component unmounted');
+  });
+
+  it('should retry failed requests', async () => {
+    const mockError = new Error('Network error');
+    vi.mocked(request)
+      .mockRejectedValueOnce(mockError)
+      .mockRejectedValueOnce(mockError)
+      .mockResolvedValueOnce(mockResponse);
+
+    const { result } = renderHook(() => useHttp({ ...mockParams, retries: 2 }, mockMetadata));
+
+    const response = await result.current.run();
+
+    expect(request).toHaveBeenCalledTimes(3);
+    expect(response).toEqual(['item1', 'item2']);
+  });
+
+  it('should debounce requests', async () => {
+    vi.useFakeTimers();
+    vi.mocked(request).mockResolvedValue(mockResponse);
+
+    const { result } = renderHook(() => useHttp({ ...mockParams, debounceMs: 100 }, mockMetadata));
+
+    result.current.run();
+    result.current.run();
+    result.current.run();
+
+    await vi.advanceTimersByTimeAsync(50);
+    expect(request).not.toHaveBeenCalled();
+
+    await vi.advanceTimersByTimeAsync(50);
+    expect(request).toHaveBeenCalledTimes(1);
+
+    vi.useRealTimers();
+  });
```

packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.ts (3)
1-6: Consider optimizing hook parameters.
Since the hook only uses the `children` property from `element`, consider destructuring it directly in the parameters for better clarity and to avoid unnecessary object spreading.

```diff
-export const useChildrenDisabledOnLock = (element: IFormElement, isLocked: boolean) => {
-  const { children: _children } = element;
+export const useChildrenDisabledOnLock = ({ children: _children }: IFormElement, isLocked: boolean) => {
```
19-22: Simplify the JSON Logic condition.
The current condition `{"==": [1, 1]}` is a verbose way to represent a constant true value. Consider using a simpler representation or extracting it as a named constant for better maintainability.

```diff
-      value: {
-        '==': [1, 1],
-      },
+      value: true,
```
12-34: Consider memoizing the `lockChildren` function.
For forms with deep nested structures, the recursive `lockChildren` function could benefit from memoization to prevent unnecessary recalculations.

```diff
+  const lockChildren = useMemo(() => (children: IFormElement[]) => {
     return children.map(child => {
       // ... existing implementation
     });
-  };
+  }, []); // Empty dependency array as it doesn't use any external values
```

packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (3)
37-46: Consider enumerations for type safety.
While string fields for `documentType` and `documentVariant` are fine, consider enums (or string-literal unions) if these fields have a finite set of acceptable values.
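A minimal sketch of what that constraint could look like; the member values below are illustrative placeholders, not the project's actual document catalogue:

```ts
// Hypothetical string-literal unions narrowing the free-form fields.
type TDocumentType = 'passport' | 'drivers_license' | 'national_id';
type TDocumentVariant = 'front' | 'back';

interface IStricterDocumentTemplate {
  id: string;
  documentType: TDocumentType;
  documentVariant: TDocumentVariant;
}
```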
107-131: Typo and potential error handling.
• There's a typo in the console warning ("migging" should be "missing").
• Consider surrounding `deleteDocument` with try/catch to handle request failures gracefully.
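A minimal sketch of such a guard, assuming `deleteDocument` returns a promise and that logging (or a toast) is an acceptable way to surface the failure:

```ts
// Sketch only: ensure a failed delete request does not go unhandled.
const safeDeleteDocument = async (deleteDocument: () => Promise<unknown>) => {
  try {
    await deleteDocument();
  } catch (error) {
    // Swap for the project's preferred reporting, e.g. a toast.
    console.error('Failed to delete document', error);
  }
};
```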
159-161: Async onClick.
Stopping event propagation and awaiting `clearFileAndInput()` is correct. Consider additional error handling in case `deleteDocument` fails.

packages/ui/src/components/organisms/Form/Validator/validators/document/types.ts (1)
1-5: `IDocumentValidatorParams` naming.
`id` might be clearer as `documentId` to avoid ambiguity with other IDs.

packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.ts (1)
5-30: Enhance error messages.
All validation failures throw the same message. Consider using context-specific messages for easier debugging (e.g., “Document array is empty,” “Document with the given ID not found,” etc.).

packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/create-or-update-fileid-or-file-in-documents/create-or-update-fileid-or-file-in-documents.unit.test.ts (1)
24-32: Add error scenario test cases.
The test suite lacks coverage for error scenarios (a sketch of one such case follows the list below).
Consider adding tests for:
- Invalid template structure
- Missing required fields
- Edge cases with null/undefined values
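For instance, a minimal sketch of the null/undefined case, assuming the helper is exported as `createOrUpdateFileIdOrFileInDocuments` (inferred from the file name) and that `mockTemplate`/`mockFile` are fixtures already defined in this suite:

```ts
// Hypothetical edge-case test: the helper should not throw on a missing documents list.
it('should not throw when the documents list is null or undefined', () => {
  expect(() =>
    createOrUpdateFileIdOrFileInDocuments(undefined as never, mockTemplate, mockFile),
  ).not.toThrow();
});
```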
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts (4)
25-28: Consider improving error handling in the useHttp setup.
The current implementation might throw runtime errors if `element.params?.httpParams?.createDocument` is undefined or has an invalid shape. Consider adding type guards or default values:

```diff
-  const { run: uploadDocument, isLoading: isUploading } = useHttp(
-    (element.params?.httpParams?.createDocument || {}) as IHttpParams,
-    metadata,
-  );
+  const createDocumentParams = element.params?.httpParams?.createDocument;
+  const { run: uploadDocument, isLoading: isUploading } = useHttp(
+    createDocumentParams && typeof createDocumentParams === 'object'
+      ? createDocumentParams as IHttpParams
+      : { url: '', method: 'POST' },
+    metadata,
+  );
```
42-54: Enhance error handling with more detailed warnings.
While the current error handling is good, it could be more informative for debugging. Consider enhancing the error messages:

```diff
-      if (!createDocument) {
-        console.warn('Upload settings are missing on element', element, 'Upload will be skipped.');
+      if (!createDocument?.url) {
+        console.warn(
+          'Document upload URL is missing.',
+          { elementId: element.id, valueDestination: element.valueDestination },
+          'Upload will be skipped.',
+        );
         return;
       }
-      if (!metadata.entityId) {
-        console.warn('Entity ID is missing on element', element, 'Upload will be skipped.');
+      if (!metadata?.entityId) {
+        console.warn(
+          'Entity ID is missing in metadata.',
+          { elementId: element.id, metadata },
+          'Upload will be skipped.',
+        );
         return;
       }
```
56-60: Add file validation before building form data.
The current implementation doesn't validate the file parameter before passing it to `buildDocumentFormData`. Consider adding file validation:

```diff
+      const file = e.target?.files?.[0];
+
+      if (!file) {
+        console.warn(
+          'No file selected for upload.',
+          { elementId: element.id },
+          'Upload will be skipped.',
+        );
+
+        return;
+      }
+
       const documentUploadPayload = buildDocumentFormData(
         element,
         { businessId: metadata.businessId as string },
-        e.target?.files?.[0] as File,
+        file,
       );
```
118-129: Consider optimizing useCallback dependencies.
The current dependencies array includes `valuesRef`, which is a ref and doesn't need to be in the dependencies array. Consider removing unnecessary dependencies:

```diff
     [
       uploadOn,
       params,
       metadata,
       addTask,
       removeTask,
       onChange,
       uploadDocument,
       id,
       element,
-      valuesRef,
     ],
```
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.unit.test.ts (3)
12-39: Add test cases for new document template fields.
While the mock data is updated with new fields, there are no specific test cases validating the behavior when these fields have edge cases (see the sketch after this list).
Consider adding test cases for:
- Missing optional fields (version, issuingVersion)
- Empty or malformed issuer object
- Invalid document type/variant combinations
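For example, a sketch of the missing-optional-fields case; `mockElement` is assumed to be the element fixture already used in this suite and to point at the `template-1` template:

```ts
// Hypothetical test: a template without version/issuingVersion should still resolve its file id.
it('should resolve the file id when optional template fields are missing', () => {
  const documents = [
    { id: 'template-1', pages: [{ ballerineFileId: 'file-1' }] },
  ] as unknown as IDocumentTemplate[];

  expect(getFileOrFileIdFromDocumentsList(documents, mockElement)).toBe('file-1');
});
```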
47-64: Add negative test cases for document matching.
The current tests only verify basic matching scenarios. Consider adding tests for edge cases in document matching (a sketch follows this list).
Add test cases for:
- Documents with matching ID but mismatched category/type
- Documents with matching category/type but different versions
- Documents with malformed pages array
Also applies to: 66-87
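As one possible negative case — the expected fallback value here is an assumption about the helper's contract, not confirmed behavior:

```ts
// Hypothetical negative test: an id that matches no template should not resolve a file.
it('should return undefined when no document matches the template id', () => {
  const documents = [
    { id: 'unrelated-template', pages: [{ ballerineFileId: 'file-2' }] },
  ] as unknown as IDocumentTemplate[];

  expect(getFileOrFileIdFromDocumentsList(documents, mockElement)).toBeUndefined();
});
```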
89-124: Improve test coverage for default values.
The test for default values could be more comprehensive (see the sketch after this list).
Consider testing:
- Missing template object
- Template with minimal required fields
- Template with null/undefined optional fields
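For instance, a minimal-template sketch (again assuming the suite's existing `mockElement` fixture):

```ts
// Hypothetical default-values test: a minimal template should not cause the helper to throw.
it('should handle a template with only an id', () => {
  const documents = [{ id: 'template-1' }] as unknown as IDocumentTemplate[];

  expect(() => getFileOrFileIdFromDocumentsList(documents, mockElement)).not.toThrow();
});
```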
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/helpers/get-entity-field-group-document-value-destination.ts (2)
3-3: Enhance error message for better debugging.
While the error handling is good, the error message could be more descriptive by including the valid type options.

```diff
-  throw new Error(`Invalid entity field group type in documentfield: ${type}`);
+  const validTypes = Object.keys(valueDestinationsMap).join(', ');
+  throw new Error(`Invalid entity field group type in documentfield: ${type}. Valid types are: ${validTypes}`);
```

Also applies to: 9-11
4-7: LGTM! Well-typed map with clear structure.
The use of the Record type ensures type safety, and the paths are consistently structured. Consider adding a comment explaining the purpose of the `$0` placeholder in the paths:

```diff
 const valueDestinationsMap: Record<TEntityFieldGroupType, string> = {
+  // $0 is replaced at runtime with the actual array index of the director/ubo
   director: 'entity.data.additionalInfo.directors[$0].additionalInfo.documents',
   ubo: 'entity.data.additionalInfo.ubos[$0].documents',
 };
```
services/workflows-service/src/collection-flow/collection-flow-entity.service.ts (2)
17-54: Consider validating the `entity` data before creation.
You are setting all fields from the incoming DTO without additional validation inside the service. Ensuring robust validation here (e.g., verifying required fields, permissible values, etc.) can prevent invalid data from propagating.
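As one possible shape for such a guard — the field names below are assumptions for illustration, not the actual DTO contract:

```ts
import { BadRequestException } from '@nestjs/common';

// Sketch of an explicit pre-creation guard; adapt the fields to the real DTO.
const assertValidEntityDto = (dto: { firstName?: string; lastName?: string; email?: string }) => {
  if (!dto.firstName || !dto.lastName || !dto.email) {
    throw new BadRequestException('firstName, lastName and email are required');
  }
};
```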
56-71: Consider partial or selective updates.
Your `updateEntity` method overwrites fields in `endUser` with the full DTO. If only some fields need updating, partial updates or explicit merges can reduce the likelihood of unintentionally clearing data.

packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx (1)
161-161: Minor UI refinement.
For clarity, replace “No File Choosen” with the conventional spelling, e.g., “No File Chosen.”

```diff
-          <span className="truncate text-sm">{file ? file.name : 'No File Choosen'}</span>
+          <span className="truncate text-sm">{file ? file.name : 'No File Chosen'}</span>
```

packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/EntityFieldGroupDocument.tsx (1)
102-102: Correct the console warning typo.
“Migging” appears to be a typo. Consider correcting it to “missing” for clarity.

```diff
-      console.warn('Template id is migging in element', element);
+      console.warn('Template id is missing in element', element);
```

packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/utils/transform.ts (1)
6-6: Fix typo in variable name.
The variable name 'transfomer' has a typo and should be 'transformer'; remember to update its usages in the same file as well.

```diff
-  const transfomer = jsonata(expression);
+  const transformer = jsonata(expression);
```

packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/helpers/get-entity-group-value-destination.ts (1)
3-16: Consider making the destination paths more maintainable.
While the implementation is correct, the hardcoded deeply nested paths could be fragile if the entity data structure changes. Consider extracting the paths to a configuration file or constants and using path segments that can be joined:

```diff
-const destinationsMap: Record<TEntityFieldGroupType, string> = {
-  director: 'entity.data.additionalInfo.directors',
-  ubo: 'entity.data.additionalInfo.ubos',
-};
+const ENTITY_PATHS = {
+  BASE: ['entity', 'data', 'additionalInfo'],
+  TYPES: {
+    director: 'directors',
+    ubo: 'ubos',
+  },
+} as const;
+
+const destinationsMap: Record<TEntityFieldGroupType, string> = {
+  director: [...ENTITY_PATHS.BASE, ENTITY_PATHS.TYPES.director].join('.'),
+  ubo: [...ENTITY_PATHS.BASE, ENTITY_PATHS.TYPES.ubo].join('.'),
+};
```

services/workflows-service/src/collection-flow/dto/create-collection-flow-document.schema.ts (1)
5-18: Consider making the properties field more type-safe.
The `properties` field using `Type.Any()` could lead to type-safety issues and make it harder to validate the data structure at runtime. Consider defining a more specific schema for the properties field based on your document types:

```diff
-properties: Type.Record(Type.String(), Type.Any()),
+properties: Type.Record(Type.String(), Type.Union([
+  Type.String(),
+  Type.Number(),
+  Type.Boolean(),
+  Type.Object({
+    // Add specific property schemas based on your document types
+  }),
+])),
```

packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/EntityFieldProvider.tsx (1)
9-23: Add prop types validation for better type safety.
While the component implementation is correct, adding prop types validation would make it more robust. Consider adding prop types validation:

```diff
+import PropTypes from 'prop-types';
+
 export const EntityFieldProvider = ({
   children,
   entityFieldGroupType,
   isSyncing,
 }: IEntityFieldProviderProps) => {
   const context = useMemo(
     () => ({
       entityFieldGroupType,
       isSyncing,
     }),
     [entityFieldGroupType, isSyncing],
   );

   return <EntityFieldContext.Provider value={context}>{children}</EntityFieldContext.Provider>;
 };
+
+EntityFieldProvider.propTypes = {
+  children: PropTypes.node.isRequired,
+  entityFieldGroupType: PropTypes.oneOf(['director', 'ubo']).isRequired,
+  isSyncing: PropTypes.bool.isRequired,
+};
```

services/workflows-service/src/end-user/dtos/end-user-create.ts (1)
67-69: Consider adding type validation for additionalInfo values.
Using `Record<string, any>` could allow unsafe values. Consider adding validation or type constraints for the values.

```diff
 @IsOptional()
-@IsObject()
+@IsObject()
+@ApiProperty({
+  type: 'object',
+  additionalProperties: {
+    oneOf: [
+      { type: 'string' },
+      { type: 'number' },
+      { type: 'boolean' }
+    ]
+  }
+})
 additionalInfo?: Record<string, any>;
```

packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useIsEntityFieldsValid/useIsEntityFieldsValid.ts (1)
9-12: Add TypeScript types for the hook parameters.
Consider adding more specific types for better type safety.

```diff
-export const useEntityFieldsIsValid = (
-  element: IFormElement<any, any>,
-  entityGroupIndex: number,
-) => {
+export const useEntityFieldsIsValid = <T = unknown, V = unknown>(
+  element: IFormElement<T, V>,
+  entityGroupIndex: number,
+): boolean => {
```

services/workflows-service/src/collection-flow/dto/create-entity-input.dto.ts (1)
56-58: Consider improving type safety of additionalInfo.
The `additionalInfo` property is typed as `Record<string, any>`, which could lead to type safety issues. Consider defining a more specific interface or type for the expected structure of `additionalInfo`.

```diff
-  additionalInfo?: Record<string, any>;
+  additionalInfo?: Record<string, string | number | boolean | null>;
```

packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.ts (1)
19-32: Improve error handling and logging.
The error handling could be enhanced to:
- Log the error details for debugging.
- Provide more specific error messages.
- Consider retrying the delete operation.
```diff
   if (element.params?.httpParams?.deleteDocument) {
     const fileOrFileId = getFileOrFileIdFromDocumentsList(
       value as unknown as IDocumentTemplate[],
       element as IFormElement<'documentfield', IDocumentFieldParams>,
     );

     if (!(fileOrFileId instanceof File)) {
       try {
         await request(element.params?.httpParams?.deleteDocument, metadata);
       } catch (error) {
-        toast.error(`Failed to delete document on hide. ${(error as Error)?.message}`);
+        const errorMessage = error instanceof Error ? error.message : 'Unknown error';
+        console.error('Document deletion failed:', { error, fileOrFileId });
+        toast.error(`Failed to delete document: ${errorMessage}. Please try again or contact support.`);
       }
     }
   }
```

apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.unit.test.ts (1)
41-55: Consider enhancing the error handling test.
The error handling test could be improved by:
- Verifying the exact error message content
- Testing different error types
- Ensuring the error doesn't propagate to the caller
```diff
   it('should handle errors and show toast message', async () => {
     const mockError = new Error('Sync failed');
     mockedSyncContext.mockRejectedValueOnce(mockError);

     const consoleSpy = vi.spyOn(console, 'error').mockImplementation(() => {});

-    const result = await syncPlugin(mockContext, {} as any, {});
+    let result;
+    try {
+      result = await syncPlugin(mockContext, {} as any, {});
+    } catch (error) {
+      fail('Error should not propagate to caller');
+    }

     expect(mockedToastError).toHaveBeenCalledTimes(1);
     expect(mockedToastError).toHaveBeenCalledWith('Failed to sync using plugin.');
     expect(consoleSpy).toHaveBeenCalledWith(mockError);
     expect(result).toBe(mockContext);

     consoleSpy.mockRestore();
   });
```

packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.unit.test.ts (2)
18-21: Avoid type casting with `unknown`.
Using `unknown` type casting could mask type errors. Consider creating proper mock objects that match the expected types.

```diff
-expect(() => documentValidator(null as unknown as TDocument[], mockParams)).toThrow(
+expect(() => documentValidator(null as any, mockParams)).toThrow(
   'Test message',
 );

-const mockDocuments = [{ id: 'wrong-id', pages: [] }] as unknown as TDocument[];
+const mockDocuments: TDocument[] = [{
+  id: 'wrong-id',
+  pages: [],
+  propertiesSchema: {},
+}];
```

Also applies to: 28-31
8-15: Extract mock data to a shared fixture.
Consider moving the mock params to a shared test fixture to improve maintainability and reusability.

```diff
+const createMockParams = (overrides = {}) => ({
+  message: 'Test message',
+  value: {
+    id: 'test-id',
+    pageNumber: 0,
+    pageProperty: 'ballerineFileId',
+    ...overrides,
+  },
+}) as ICommonValidator<IDocumentValidatorParams, TBaseValidators | 'document'>;

 describe('documentValidator', () => {
-  const mockParams = {
-    message: 'Test message',
-    value: {
-      id: 'test-id',
-      pageNumber: 0,
-      pageProperty: 'ballerineFileId',
-    },
-  } as ICommonValidator<IDocumentValidatorParams, TBaseValidators | 'document'>;
+  const mockParams = createMockParams();
```

packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.unit.test.ts (1)
41-41: Update test descriptions to reflect async behavior.
The test descriptions should indicate that these are async tests for better clarity.

```diff
-  it('should return a function that calls onChange with undefined for unknown element types', async () => {
+  it('should asynchronously return a function that calls onChange with undefined for unknown element types', async () => {

-  it('should use documentFieldValueCleaner for document field type', async () => {
+  it('should asynchronously use documentFieldValueCleaner for document field type', async () => {
```

Also applies to: 49-49, 54-54, 66-66
apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/CollectionFlowUI.tsx (1)
22-22: Consider documenting metadata structure and usage.
The metadata handling is correctly implemented, but it would be beneficial to add JSDoc comments describing the expected structure and purpose of the metadata.

```diff
+/**
+ * @property metadata - UI schema metadata that extends the form's metadata object
+ * @example
+ * {
+ *   customField: 'value',
+ *   // ... other metadata properties
+ * }
+ */
 metadata: UISchema['metadata'];
```

Also applies to: 37-37, 66-66, 68-68
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.tsx (2)
45-75: Consider memoizing elementsOverride dependencies.
The `elementsOverride` object is recreated on every render but only depends on `elementsMap`. Consider extracting `EntityFieldGroupDocument` to a constant.

```diff
+const ELEMENTS_OVERRIDE = {
+  documentfield: EntityFieldGroupDocument,
+};

 const elementsOverride = useMemo(
   () => ({
     ...elementsMap,
-    documentfield: EntityFieldGroupDocument,
+    ...ELEMENTS_OVERRIDE,
   }),
   [elementsMap],
 );
```
80-108: Consider extracting the button section to a separate component.
The button section could be extracted to improve readability and maintainability.

```diff
+const AddEntityButton = ({ onClick, disabled, label }: { onClick: () => void; disabled: boolean; label: string }) => (
+  <div className="flex flex-row justify-end">
+    <Button onClick={onClick} disabled={disabled}>
+      {label}
+    </Button>
+  </div>
+);

 return (
   <div className="flex flex-col gap-4" data-testid={`${fieldId}-fieldlist`}>
     {/* ... entity fields mapping ... */}
-    <div className="flex flex-row justify-end">
-      <Button onClick={addItem} disabled={disabled}>
-        {addButtonLabel}
-      </Button>
-    </div>
+    <AddEntityButton onClick={addItem} disabled={disabled} label={addButtonLabel} />
     {/* ... other components ... */}
   </div>
 );
```

packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts (1)
75-85: Consider testing different error scenarios.
The error handling test could be expanded to cover different types of errors and their specific messages.

```diff
+  it('should handle network error specifically', async () => {
+    const documents = [{ id: 'template-1' }];
+    const networkError = new Error('Network Error');
+    networkError.name = 'NetworkError';
+    vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId');
+    vi.mocked(request).mockRejectedValue(networkError);
+
+    const result = await documentFieldValueCleaner(documents, mockElement);
+
+    expect(toast.error).toHaveBeenCalledWith('Failed to delete document on hide. Network Error');
+    expect(result).toEqual([]);
+  });
+
+  it('should handle 404 error specifically', async () => {
+    const documents = [{ id: 'template-1' }];
+    const notFoundError = new Error('Not Found');
+    notFoundError.name = '404';
+    vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId');
+    vi.mocked(request).mockRejectedValue(notFoundError);
+
+    const result = await documentFieldValueCleaner(documents, mockElement);
+
+    expect(toast.error).toHaveBeenCalledWith('Failed to delete document on hide. Document not found');
+    expect(result).toEqual([]);
+  });
```

packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/FileField.tsx (1)
39-42: Consider adding error handling for document deletion.
While the `useHttp` hook is correctly implemented for document deletion, there's no visible error handling for failed deletions.

```diff
 const { run: deleteDocument, isLoading: isDeletingDocument } = useHttp(
   (element.params?.httpParams?.deleteDocument || {}) as IHttpParams,
   metadata,
+  {
+    onError: (error) => {
+      console.error('Failed to delete document:', error);
+      // Consider showing a user-friendly error message
+    },
+  }
 );
```

services/workflows-service/src/collection-flow/controllers/collection-flow.controller.ts (1)
186-214: Consider consolidating duplicate logic for directors and ubos.
The mapping logic for directors and ubos is nearly identical. Consider extracting it into a shared function.

```ts
const createOrReuseEntity = async (
  entity: {
    ballerineEntityId?: string;
    firstName: string;
    lastName: string;
    email: string;
  },
  projectId: string,
  endUserService: EndUserService,
) => {
  if (entity.ballerineEntityId) {
    return entity;
  }

  const { id } = await endUserService.create({
    data: {
      firstName: entity.firstName,
      lastName: entity.lastName,
      email: entity.email,
      projectId,
    },
  });

  return {
    ballerineEntityId: id,
    ...entity,
  };
};
```

packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.stories.tsx (1)
285-314: Consider adding step validation for ownership percentage.
The percentage-of-ownership validation could be enhanced to ensure the total ownership across all UBOs doesn't exceed 100%. Consider adding a custom validator to check the total ownership:

```diff
 validate: [
   {
     type: 'required',
     value: {},
     message: 'Percentage of ownership is required',
   },
   {
     type: 'minimum',
     value: {
       minimum: 0,
     },
     message: 'Percentage of ownership must be greater than 0',
   },
   {
     type: 'maximum',
     value: {
       maximum: 100,
     },
     message: 'Percentage of ownership must be less than 100',
   },
+  {
+    type: 'custom',
+    value: {
+      validator: 'totalOwnership',
+      params: { maximum: 100 }
+    },
+    message: 'Total ownership percentage across all UBOs cannot exceed 100%',
+  },
 ],
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (93)
apps/kyb-app/src/domains/collection-flow/types/index.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/CollectionFlowV2.tsx
(3 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/CollectionFlowUI.tsx
(4 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins.repository.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.unit.test.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePluginsHandler/usePluginsHandler.ts
(1 hunks)packages/ui/package.json
(1 hunks)packages/ui/src/common/hooks/useHttp/index.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/types.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/useHttp.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/useHttp.unit.test.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/utils/format-headers.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/utils/format-headers.unit.test.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/utils/request.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/utils/request.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/_stories/FileUploadShowcase/FileUploadShowcase.tsx
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx
(5 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/create-or-update-fileid-or-file-in-documents/create-or-update-fileid-or-file-in-documents.unit.test.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.unit.test.ts
(5 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/remove-document-from-list-by-template-id/remove-document-from-list-by-template-id.unit.test.ts
(3 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.stories.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/EntityFieldGroupDocument.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/helpers/get-entity-field-group-document-value-destination.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-documents-creation-payload.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-entity-for-creation.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/update-entities.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useIsEntityFieldsValid/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useIsEntityFieldsValid/useIsEntityFieldsValid.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/utils/transform.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/helpers/get-entity-group-value-destination.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/helpers/get-entity-group-value-destination.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/useEntitySync.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/EntityFieldProvider.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/entity-field-group-type.context.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityField/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityField/useEntityField.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityFieldGroupType/useEntityFieldGroupType.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/types.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/types/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/utils/delay.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FieldList/FieldList.tsx
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FieldList/hooks/useFieldList/useFieldList.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/FileField.tsx
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/helpers.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/helpers.unit.test.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.ts
(3 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.unit.test.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/index.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/upload-file.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/upload-file.unit.test.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.unit.test.ts
(3 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useValues/useValues.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/providers/TaskRunner/TaskRunner.tsx
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/repositories/fields-repository.ts
(2 hunks)packages/ui/src/components/organisms/Form/Validator/types/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/document/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/document/types.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/index.ts
(2 hunks)services/workflows-service/prisma/data-migrations
(1 hunks)services/workflows-service/src/collection-flow/collection-flow-entity.service.ts
(1 hunks)services/workflows-service/src/collection-flow/collection-flow.module.ts
(4 hunks)services/workflows-service/src/collection-flow/collection-flow.service.ts
(3 hunks)services/workflows-service/src/collection-flow/controllers/collection-flow.controller.ts
(3 hunks)services/workflows-service/src/collection-flow/controllers/collection-flow.entity.controller.ts
(1 hunks)services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts
(5 hunks)services/workflows-service/src/collection-flow/controllers/collection-flow.no-user.controller.ts
(1 hunks)services/workflows-service/src/collection-flow/dto/create-collection-flow-document.schema.ts
(1 hunks)services/workflows-service/src/collection-flow/dto/create-entity-input.dto.ts
(1 hunks)services/workflows-service/src/document/dtos/document.dto.ts
(1 hunks)services/workflows-service/src/end-user/dtos/end-user-create.ts
(3 hunks)
💤 Files with no reviewable changes (7)
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FieldList/FieldList.tsx
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.unit.test.ts
- packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/upload-file.unit.test.ts
- packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/upload-file.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/helpers.unit.test.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/helpers.ts
✅ Files skipped from review due to trivial changes (17)
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityField/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/types/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useIsEntityFieldsValid/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/index.ts
- services/workflows-service/prisma/data-migrations
- packages/ui/src/common/hooks/useHttp/index.ts
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePluginsHandler/usePluginsHandler.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FieldList/hooks/useFieldList/useFieldList.ts
- packages/ui/src/components/organisms/Form/Validator/validators/document/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/utils/delay.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- services/workflows-service/src/document/dtos/document.dto.ts
- services/workflows-service/src/collection-flow/collection-flow.service.ts
⏰ Context from checks skipped due to timeout of 90000ms (7)
- GitHub Check: test_windows
- GitHub Check: test_linux
- GitHub Check: build (windows-latest)
- GitHub Check: Analyze (javascript)
- GitHub Check: build (ubuntu-latest)
- GitHub Check: format
- GitHub Check: lint
🔇 Additional comments (66)
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.ts (1)
14-16: LGTM! Improved readability with block syntax.
The expanded conditional block enhances code readability and maintains a consistent style.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityFieldGroupType/useEntityFieldGroupType.ts (2)
1-2: LGTM! Clean and focused imports.
The imports are well-organized with clear separation between external React dependencies and internal context.
4-6: LGTM! Follows React hook conventions.
The hook implementation follows React best practices with proper naming conventions and straightforward context access.
packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.ts (6)
2-2: LGTM! Import and signature changes look good.
The changes improve type safety by making `params` required and add the necessary imports for the new implementation.
Also applies to: 4-4, 15-15
23-24: Verify error handling for missing httpParams.
The non-null assertions (`!`) on `element.params` and `httpParams` could lead to runtime errors if these values are undefined. Consider adding a type guard or default value:

```diff
-  const { run, isLoading } = useHttp(element.params!.httpParams!.createDocument || {}, metadata);
+  const { run, isLoading } = useHttp(element.params?.httpParams?.createDocument || {}, metadata);
```
42-46: LGTM! File upload implementation looks good.
The FormData construction and upload logic is clean and follows best practices.
57-61: LGTM! Submit handler implementation looks good.
The task runner integration is well implemented with proper error handling.
79-79: Dependencies list is complete.
All required dependencies for the useCallback hook are properly included.
82-86: LGTM! Return value changes look good.
Using `isLoading` from useHttp instead of managing local state is a good simplification.

packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/remove-document-from-list-by-template-id/remove-document-from-list-by-template-id.unit.test.ts (1)
2-2: LGTM! Type import aligns with schema updates.
The import of `IDocumentTemplate` and its usage throughout the tests properly reflect the new document structure.

packages/ui/src/components/organisms/Form/DynamicForm/_stories/FileUploadShowcase/FileUploadShowcase.tsx (2)
41-43: Potential value collision in document fields.
Both document fields write to the same `documents` destination. While they have different template IDs, this could lead to unexpected behavior if both fields are used simultaneously.
Consider:
- Using a unique `valueDestination` for each field
- Verifying whether this is the intended behavior for document fields
: Clean and well-structured component implementation!The component follows React best practices with proper state management and clear separation of concerns. The mock data and console.log are acceptable for showcase purposes.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.unit.test.ts (2)
1-20
: LGTM! Well-structured mock setup.The import statements are organized logically, and the mock setup follows testing best practices with proper scoping.
1-148
: LGTM! Well-structured and comprehensive test suite.The test file demonstrates good practices with:
- Clear test organization using describe blocks
- Comprehensive mock setup
- Good coverage of core functionality
- Clean and maintainable test cases
The suggested improvements are minor enhancements to an already solid test suite.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.unit.test.ts (1)
1-5
: LGTM! Well-structured test setup.The imports are appropriate and follow testing best practices, using standard testing libraries.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-documents-creation-payload.ts (1)
10-19
: LGTM! Well-structured interface definitions.The interfaces are well-defined with clear purposes and proper typing.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/update-entities.ts (1)
1-1
: LGTM! Clean import statement with proper type usage.packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useValues/useValues.ts (1)
1-26
: LGTM! Well-structured hook initialization.The hook's interface and initialization are well-designed with proper TypeScript typing, state management, and callback memoization.
apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.ts (1)
1-7
: LGTM! Well-organized imports and constants.The imports are properly structured and the plugin name is correctly defined as a constant for reusability.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.ts (2)
1-15
: LGTM! Well-structured imports and interface definition.The imports are appropriate and the interface is properly typed with clear generic parameters.
76-82
: LGTM! Clean and well-structured return value.The hook returns all necessary values and maintains a clear interface.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/useEntitySync.ts (1)
28-29
: Ensureelement.params
is never undefined.Currently, the code uses a non-null assertion for
element.params
, which may cause runtime errors ifparams
is ever undefined. Consider adding a guard check to prevent unexpected crashes.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.ts (1)
38-39
: LGTM!The return statement is clean and the hook's dependency array is correctly specified.
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (6)
1-2
: Imports look good
No immediate concerns with the new imports.
8-8
: Confirmed import usage
The import fromuseDynamicForm
is straightforward and aligns well with the updated context.
24-35
: Interface definition is coherent
IDocumentTemplate
is clearly defined and consistent with other code patterns.
52-52
: Good usage of metadata
Destructuringmetadata
viauseDynamicForm
is consistent with the existing pattern.
57-61
: Double-check fallback logic
Using(element.params?.httpParams?.deleteDocument || {})
is convenient. Confirm that an empty object won’t break subsequent HTTP calls if certain fields are required.
139-141
: Robust pointer disabling
Disabling pointer events while deleting or uploading is a clean approach to prevent accidental clicks.packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.ts (1)
1-3
: Imports
Imports forTDocument
and validator types look appropriate.packages/ui/src/components/organisms/Form/Validator/validators/index.ts (1)
2-2
: LGTM!The addition of the
documentValidator
to thebaseValidatorsMap
is clean and follows the existing pattern.Also applies to: 19-19
packages/ui/src/components/organisms/Form/Validator/types/index.ts (1)
17-18
: LGTM!The addition of the 'document' type to TBaseValidators is clean and follows the existing pattern.
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts (1)
2-2
: LGTM! Good improvements to the imports and type definitions.The changes improve the code by:
- Centralizing HTTP handling through the useHttp hook
- Simplifying types by removing unnecessary generic type parameter
- Adding a helper for building document form data
Also applies to: 5-5, 13-13, 18-18
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/helpers/get-entity-field-group-document-value-destination.ts (2)
1-1
: LGTM! Clean import statement.The import is specific and the type is consistently used throughout the file.
5-6
: Verify document paths align with schema.Please ensure these paths align with the updated document management schema structure.
✅ Verification successful
🏁 Script executed:
#!/bin/bash # Search for schema definitions or type declarations that match these paths rg -A 5 "directors.*additionalInfo.*documents|ubos.*documents" --type=typescriptLength of output: 117
Below is a revised script that uses the recognized file type for TypeScript files. Please run it to search for occurrences of these schema paths:
If this search still does not produce any output—or if you do not see the expected schema definitions—please verify manually that the following paths in
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/helpers/get-entity-field-group-document-value-destination.ts
(lines 5-6) match the updated document management schema design:
director: 'entity.data.additionalInfo.directors[$0].additionalInfo.documents'
ubo: 'entity.data.additionalInfo.ubos[$0].documents'
Manual confirmation will ensure that these definitions align with the intended schema structure.
🏁 Script executed:
[run_scripts] #!/bin/bash # Search for references to the director and ubo document path properties in TypeScript files rg -A 5 'directors\[\$0\]\.additionalInfo\.documents|ubos\[\$0\]\.documents' --type=tsLength of output: 2564
Document paths are consistently used and appear to match the current schema design.
- The search confirms that the ubo path (
'entity.data.additionalInfo.ubos[$0].documents'
) is referenced in both the helper and story files.- The director path (
'entity.data.additionalInfo.directors[$0].additionalInfo.documents'
) follows a similar pattern and is used in the helper file, indicating its intended structure.Based on these findings, the references seem to be aligned with the updated document management schema. Please perform a manual check of the schema definition if a dedicated schema file exists.
services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts (1)
5-8
: LGTM! Well-structured dependency injection.The new dependencies are properly injected and the imports are well-organized for the document management functionality.
Also applies to: 41-43
services/workflows-service/src/collection-flow/collection-flow-entity.service.ts (2)
30-34
: Check for missingbusinessId
in downstream operations.Although you throw a
BadRequestException
whenbusinessId
is undefined, ensure downstream services and caller components gracefully handle cases where no validbusinessId
is returned from upstream.
73-94
: Confirm deletion order.The
deleteEntity
method deletes records inendUsersOnBusinesses
, thenendUser
, thendocument
. If any references depend on the end-user or the documents, re-check the sequence to avoid orphans or foreign key violations.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx (1)
85-91
: Validate the response fromcreateEntity
.Here, you assume
createEntity
returns a validstring
forcreatedEntityId
. Confirm that your HTTP response always includes a valid identifier to avoid unexpected undefined or null references.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/entity-field-group-type.context.ts (1)
1-4
: LGTM!The context implementation follows React's best practices and provides proper typing through the
IEntityFieldProviderContext
interface.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/types.ts (1)
3-6
: LGTM!The interface is well-defined with appropriate property types and optionality.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityField/useEntityField.ts (1)
4-12
: LGTM!The hook implementation follows React's best practices with proper error handling and descriptive error messages.
apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins.repository.ts (1)
4-4
: LGTM!The addition of the
syncPlugin
follows the established pattern for plugin registration and maintains consistency with the existing codebase.Also applies to: 12-12
services/workflows-service/src/end-user/dtos/end-user-create.ts (1)
46-51
: LGTM!The
country
field is properly decorated with validation and follows the established pattern.services/workflows-service/src/collection-flow/controllers/collection-flow.entity.controller.ts (2)
31-34
: Add validation for the delete operation.The
deleteEntity
endpoint should validate that the entity exists and handle potential foreign key constraints.❌ Verification inconclusive
🏁 Script executed:
#!/bin/bash # Search for entity existence and FK constraint validation in the service ast-grep --pattern 'async deleteEntity($params) { $$$ if (!$entity) { $$$ } $$$ }'Length of output: 96
Validation Missing in deleteEntity Handler
The current implementation of the
deleteEntity
endpoint incollection-flow.entity.controller.ts
simply delegates to the service method without checking if the entity exists or handling potential foreign key constraint issues. This absence of validations could result in unexpected errors when the endpoint is invoked.
- File:
services/workflows-service/src/collection-flow/controllers/collection-flow.entity.controller.ts
- Issue Details:
- No existence check before calling
deleteEntity
.- No error handling for foreign key constraint violations.
Please add appropriate validations in the endpoint or within the called service to:
- Verify the entity exists (and return an appropriate error if not).
- Gracefully handle any deletion errors related to foreign key constraints.
26-29
: Add validation for the update operation.The
updateEntity
endpoint should validate that the entity exists before attempting to update it.❌ Verification inconclusive
🏁 Script executed:
#!/bin/bash # Search for entity existence validation in the service ast-grep --pattern 'async updateEntity($params) { $$$ if (!$entity) { $$$ } $$$ }'Length of output: 96
Validation Missing in Update Endpoint
After verifying the code, no validation is implemented to check for the entity’s existence prior to attempting an update. The controller simply calls the service’s update method without first ensuring that the entity exists. This confirms the review comment's concern.
- Location:
services/workflows-service/src/collection-flow/controllers/collection-flow.entity.controller.ts
(controller merely delegates to the service)- Impact: The update operation does not check whether the specified entity exists, which can lead to unintended errors or states when updating non-existent entities.
Please add entity existence validation in either the controller or the service layer to handle this scenario appropriately.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/helpers/get-entity-group-value-destination.unit.test.ts (1)
1-44
: Well-structured test suite with comprehensive coverage!The test suite follows testing best practices with:
- Clear test descriptions
- Proper Arrange/When/Then structure
- Coverage of success and error cases
- Type safety with TEntityFieldGroupType
apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.unit.test.ts (1)
24-26
: Good practice using beforeEach for mock cleanup!Clearing mocks before each test ensures test isolation.
packages/ui/src/components/organisms/Form/DynamicForm/repositories/fields-repository.ts (1)
7-7
: Clean integration of EntityFieldGroup!The EntityFieldGroup is properly imported and registered in the baseFields object, following the existing pattern.
Also applies to: 27-27
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.unit.test.ts (2)
3-3
: LGTM: Mock setup for useDynamicForm is well-implemented.The mock setup for
useDynamicForm
and its metadata is correctly implemented with proper cleanup inbeforeEach
.Also applies to: 20-22, 38-38
87-102
: LGTM: Test for metadata updates is well-implemented.The test correctly verifies that the metadata reference is updated when metadata changes.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.tsx (1)
23-43
: LGTM: Well-structured interfaces with clear type definitions.The interfaces are well-defined with clear separation of concerns between create, update, and delete operations.
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts (1)
9-24: LGTM: Comprehensive mock setup for external dependencies. The mocks for `request`, `toast`, and helper functions are well-organized and properly isolated.

packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/FileField.tsx (3)

20-27: LGTM! Interface changes improve document management. The new `httpParams` structure with `createDocument` and `deleteDocument` provides better separation of concerns and a clearer API contract compared to the previous `uploadSettings`.

67-79: Verify document deletion before clearing the input. The current implementation might clear the input even if document deletion fails.

Consider restructuring the `clearFileAndInput` function to ensure the input is only cleared after successful document deletion:

  const clearFileAndInput = useCallback(async () => {
-   onChange(undefined);
-
    const fileId = value;

    if (typeof fileId === 'string') {
-     await deleteDocument({ ids: [fileId] });
+     try {
+       await deleteDocument({ ids: [fileId] });
+       onChange(undefined);
+     } catch (error) {
+       return; // Don't clear input if deletion fails
+     }
+   } else {
+     onChange(undefined);
    }

    if (inputRef.current) {
      inputRef.current.value = '';
    }
  }, [onChange, value, deleteDocument]);

84-90: LGTM! Proper handling of loading states. The component correctly disables interaction during document deletion with `isDeletingDocument`.

services/workflows-service/src/collection-flow/collection-flow.module.ts (1)
43-45: LGTM! Well-structured module integration. The addition of `DocumentModule`, `CollectionFlowEntityController`, and `CollectionFlowEntityService` follows NestJS best practices for module organization and dependency injection.

Also applies to: 60-60, 68-68, 98-98
services/workflows-service/src/collection-flow/controllers/collection-flow.no-user.controller.ts (2)
43-50: LGTM! Improved context passing with tokenScope. The addition of the `tokenScope` parameter to the `getFlowConfiguration` call provides necessary context for flow configuration retrieval.

112-118: LGTM! Enhanced error handling with specific exceptions. The error handling now properly differentiates between `BadRequestException` and other errors, providing better error context.

services/workflows-service/src/collection-flow/controllers/collection-flow.controller.ts (1)

156-184: LGTM! Improved entity handling for directors. The `ballerineEntityId` check prevents duplicate entity creation and properly handles existing directors.

apps/kyb-app/src/pages/CollectionFlow/versions/v2/CollectionFlowV2.tsx (5)
83-85: LGTM! The addition of block statements improves code readability while maintaining the same logic.

91-93: LGTM! The addition of block statements improves code readability while maintaining the same logic.

95-97: LGTM! The addition of block statements improves code readability while maintaining the same logic.

99-101: LGTM! The addition of block statements improves code readability while maintaining the same logic.

103-105: LGTM! The addition of block statements improves code readability while maintaining the same logic.
packages/ui/package.json (1)
76-76: Verify the security and compatibility of the sonner package. A new toast notification library has been added. Let's verify its security and compatibility.
✅ Verification successful
🌐 Web query:
What is the latest version of sonner and are there any known security vulnerabilities?
💡 Result:
The latest version of sonner appears to be v2.0.0-beta.1 (released December 2024) based on GitHub release notes[1]. This beta version focuses on bug fixes and styling improvements rather than breaking changes.
No security vulnerabilities are mentioned in available sources for sonner specifically[1]. However, other search results highlight general software vulnerability patterns developers should consider:
Key observations:
- Sonner's changelogs focus on UX improvements rather than security patches[1]
- Common vulnerability patterns in software include:
- Server-Side Request Forgery (SSRF)[8]
- Code injection risks[5]
- Dependency chain vulnerabilities (like Log4j's CVE-2021-44228)[3]
For security-conscious implementations, consider pairing sonner with code analysis tools like SonarQube that detect issues like improper input validation or insecure dependencies[5][9]. The absence of reported vulnerabilities doesn't guarantee security - proper code review and dependency monitoring remain essential[3][8].
Citations:
- 1: https://github.com/emilkowalski/sonner/releases
- 2: https://www.sonarsource.com/blog/excessive-expansion-uncovering-critical-security-vulnerabilities-in-jenkins/
- 3: https://www.comparitech.com/blog/information-security/cybersecurity-vulnerability-statistics/
- 4: https://www.anthropic.com/news/claude-3-5-sonnet
- 5: https://www.sonarsource.com/solutions/security/
- 6: https://community.sonarsource.com/t/upgraded-from-9-9-to-10-4-1-i-am-now-struggling-to-find-my-vulnerabilities-as-previously-listed/111523
- 7: https://ai.yale.edu/posts/2025-02-10-announcing-claude-35-sonnet-on-the-clarity-platform
- 8: https://waverleysoftware.com/blog/top-software-vulnerabilities/
- 9: https://community.sonarsource.com/t/sonarqube-security-issues-vulnerabilities-disappeared-after-a-few-days-on-its-own/109263
- 10: https://sonarr.tv
Sonner Package Verification: Secure and Compatible
- The installed version (^1.4.3) of sonner does not have any reported security vulnerabilities.
- The latest release is v2.0.0-beta.1 (December 2024), but since it is a beta, upgrading may pose compatibility risks.
- The current implementation appears secure; however, consider additional testing before moving to a beta release in production.
export const updateEntities = (entitiesList: IEntity[], updatedEntity: IEntity) => {
  return entitiesList.map(entity => {
    if (entity.__id === updatedEntity.__id) {
      const newEntity = {
        ...entity,
        ballerineEntityId: updatedEntity.ballerineEntityId,
      };

      return newEntity;
    }

    return entity;
  });
};
🛠️ Refactor suggestion
Add input validation and enhance type safety.
The function could benefit from some defensive programming practices and improved type safety:
- Add input validation
- Use proper TypeScript type safety for the entity properties
- Consider spreading all updated properties instead of just ballerineEntityId
Here's a suggested improvement:
-import { IEntity } from '../../../types';
+import { IEntity } from '../../../types';
+
+type EntityWithId = IEntity & {
+ __id: string;
+ ballerineEntityId?: string;
+};
-export const updateEntities = (entitiesList: IEntity[], updatedEntity: IEntity) => {
+export const updateEntities = (
+ entitiesList: EntityWithId[] | undefined | null,
+ updatedEntity: EntityWithId | undefined | null,
+): EntityWithId[] => {
+ if (!entitiesList?.length || !updatedEntity) {
+ return entitiesList ?? [];
+ }
+
return entitiesList.map(entity => {
if (entity.__id === updatedEntity.__id) {
- const newEntity = {
+ return {
...entity,
- ballerineEntityId: updatedEntity.ballerineEntityId,
+ ...updatedEntity,
+ __id: entity.__id, // Preserve original ID
};
-
- return newEntity;
}
return entity;
});
};
This improvement:
- Adds proper type safety for required properties
- Handles null/undefined inputs
- Updates all properties from updatedEntity while preserving the original ID
- Removes unnecessary variable assignment
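For illustration, a short usage sketch of the hardened helper described above (hypothetical data; it only demonstrates the null handling and ID preservation, and assumes the revised `updateEntities` is in scope):

```ts
// Hypothetical entities; __id values are arbitrary.
const entities = [
  { __id: 'a', ballerineEntityId: undefined },
  { __id: 'b', ballerineEntityId: undefined },
];

// Empty or missing inputs no longer throw.
updateEntities(undefined, { __id: 'a', ballerineEntityId: 'be-1' }); // => []

// The matching entity is updated while its original __id is preserved.
const updated = updateEntities(entities, { __id: 'a', ballerineEntityId: 'be-1' });
// updated[0] => { __id: 'a', ballerineEntityId: 'be-1' }
```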
if (Array.isArray(parentValue)) {
  newParentValue = [...parentValue];
}

if (isObject(parentValue)) {
  newParentValue = { ...parentValue };
}

set(newValues, parentValueDestination, newParentValue);
}
🛠️ Refactor suggestion
Improve parent value handling logic.
The current implementation has two potential issues:
- The if conditions for array and object are not mutually exclusive, which could lead to overwriting.
- The shallow copy might not be sufficient for deeply nested structures.
Consider this improved implementation:
- if (Array.isArray(parentValue)) {
- newParentValue = [...parentValue];
- }
-
- if (isObject(parentValue)) {
- newParentValue = { ...parentValue };
- }
+ if (Array.isArray(parentValue)) {
+ newParentValue = [...parentValue];
+ } else if (isObject(parentValue)) {
+ // Consider using a deep clone for nested structures
+ newParentValue = structuredClone(parentValue);
+ }
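A small, self-contained illustration of why the shallow copy can be insufficient for nested structures (plain TypeScript, independent of the component code):

```ts
const parentValue = { address: { city: 'Berlin' } };

// Shallow copy: nested objects are still shared with the original.
const shallow = { ...parentValue };
shallow.address.city = 'Hamburg';
console.log(parentValue.address.city); // 'Hamburg', the original was mutated

// Deep clone: nested objects are detached from the original.
const deep = structuredClone(parentValue);
deep.address.city = 'Munich';
console.log(parentValue.address.city); // still 'Hamburg'
```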
export const syncPlugin: TPluginRunner<ISyncPluginParams> = async (context, _, pluginParams) => {
  try {
    const syncPayload = pluginParams?.transform
      ? await jsonata(pluginParams.transform).evaluate(context)
      : context;

    await syncContext(syncPayload as CollectionFlowContext);
  } catch (error) {
    toast.error('Failed to sync using plugin.');
    console.error(error);
  }

  return context;
};
🛠️ Refactor suggestion
Improve error handling and type safety.
Several improvements can be made to enhance reliability and debugging:
- The type cast to `CollectionFlowContext` should be validated
- Error messages should be more informative
- Error type should be narrowed for better error handling
- Consider indicating sync failure in the returned context
Apply these improvements:
export const syncPlugin: TPluginRunner<ISyncPluginParams> = async (context, _, pluginParams) => {
+ let syncSuccessful = false;
try {
const syncPayload = pluginParams?.transform
? await jsonata(pluginParams.transform).evaluate(context)
: context;
+ // Validate the sync payload structure
+ if (!isValidCollectionFlowContext(syncPayload)) {
+ throw new Error('Invalid sync payload structure');
+ }
+
await syncContext(syncPayload as CollectionFlowContext);
+ syncSuccessful = true;
} catch (error) {
- toast.error('Failed to sync using plugin.');
+ const errorMessage = error instanceof Error ? error.message : 'Unknown error';
+ toast.error(`Sync plugin failed: ${errorMessage}`);
console.error(error);
}
- return context;
+ return {
+ ...context,
+ syncStatus: syncSuccessful ? 'success' : 'failed'
+ };
};
+/**
+ * Type guard to validate CollectionFlowContext structure
+ */
+function isValidCollectionFlowContext(payload: unknown): payload is CollectionFlowContext {
+ return (
+ typeof payload === 'object' &&
+ payload !== null
+ // Add more specific checks based on CollectionFlowContext structure
+ );
+}
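Since zod is already used elsewhere in this repository, the placeholder type guard above could alternatively be backed by a schema. A hedged sketch (the schema fields are assumptions, not the real `CollectionFlowContext` shape):

```ts
import { z } from 'zod';

// Assumed minimal shape; extend to match the real context.
const CollectionFlowContextSchema = z.object({
  entity: z.object({ data: z.record(z.unknown()) }).passthrough(),
  collectionFlow: z.object({}).passthrough(),
});

const isValidCollectionFlowContext = (
  payload: unknown,
): payload is z.infer<typeof CollectionFlowContextSchema> =>
  CollectionFlowContextSchema.safeParse(payload).success;
```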
metadata: {
  businessId: string;
};
💡 Verification agent
🛠️ Refactor suggestion
🧩 Analysis chain
Consider making the metadata property optional for backward compatibility.
The new required `metadata` property could break existing code that uses the `UISchema` interface. Consider making it optional using the `?` modifier.
- metadata: {
+ metadata?: {
businessId: string;
};
Let's verify the impact on existing components:
🏁 Script executed:
#!/bin/bash
# Search for components using UISchema to ensure they provide the metadata property
rg -l "UISchema" | xargs rg "metadata"
# Search for direct usage of the interface
ast-grep --pattern 'interface $_ extends UISchema'
Length of output: 855
Refactor UISchema Metadata to Optional
The search results confirm that at least one component (e.g., in `CollectionFlowV2.tsx`) is using optional chaining (`schema?.metadata`), indicating that not all usages guarantee the presence of `metadata`. This inconsistency could lead to runtime or type issues when integrating with a required field.

- Location: `apps/kyb-app/src/domains/collection-flow/types/index.ts` (UISchema definition)
- Observation: Components (e.g., in `CollectionFlowV2.tsx` and `CollectionFlowUI.tsx`) expect metadata might be absent.
- Action: Refactor the `metadata` field in `UISchema` to be optional:

  - metadata: {
  + metadata?: {
      businessId: string;
    };

This change will ensure backward compatibility and prevent potential errors when components receive an undefined `metadata`.
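If the field becomes optional, consumers can keep working against it with a narrow guard; a brief, self-contained sketch (the `businessId` usage is illustrative):

```ts
interface UISchema {
  metadata?: { businessId: string };
}

const resolveBusinessId = (schema: UISchema | undefined) => {
  // Optional chaining keeps existing callers safe when metadata is absent.
  return schema?.metadata?.businessId ?? null;
};
```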
const { run: deleteEntity, isLoading } = useHttp(
  element.params!.httpParams?.deleteEntity,
  metadata,
);
🛠️ Refactor suggestion
Consider adding null check for element.params.
The non-null assertion (`!`) on `element.params` could lead to runtime errors if params is undefined.
- const { run: deleteEntity, isLoading } = useHttp(
- element.params!.httpParams?.deleteEntity,
- metadata,
- );
+ const { run: deleteEntity, isLoading } = useHttp(
+ element.params?.httpParams?.deleteEntity,
+ metadata,
+ );
const isValid = useMemo(() => {
  const validationErrors = validate(
    values,
    validationSchema.map(schema => ({
      ...schema,
      valueDestination: formatValueDestination(schema.valueDestination!, [
        ...(stack || []),
        entityGroupIndex,
      ]),
    })),
  );

  return validationErrors?.length === 0;
}, [validationSchema, values, stack, entityGroupIndex]);
🛠️ Refactor suggestion
Add error handling for validation failures.
The validation logic should handle potential errors and provide meaningful feedback.
const isValid = useMemo(() => {
- const validationErrors = validate(
- values,
- validationSchema.map(schema => ({
- ...schema,
- valueDestination: formatValueDestination(schema.valueDestination!, [
- ...(stack || []),
- entityGroupIndex,
- ]),
- })),
- );
+ try {
+ const validationErrors = validate(
+ values,
+ validationSchema.map(schema => ({
+ ...schema,
+ valueDestination: formatValueDestination(schema.valueDestination!, [
+ ...(stack || []),
+ entityGroupIndex,
+ ]),
+ })),
+ );
- return validationErrors?.length === 0;
+ return validationErrors?.length === 0;
+ } catch (error) {
+ console.error('Validation error:', error);
+ return false;
+ }
}, [validationSchema, values, stack, entityGroupIndex]);
export class CreateEntityInputDto {
  @IsString()
  entityType!: BusinessPosition;
🛠️ Refactor suggestion
Add enum validation for entityType.
The `entityType` property should be validated to ensure it contains a valid `BusinessPosition` enum value.
+ @ApiProperty({
+ enum: BusinessPosition,
+ required: true,
+ })
@IsString()
+ @IsEnum(BusinessPosition)
entityType!: BusinessPosition;
@@ -32,6 +34,8 @@ export const TaskRunner = ({ children }: ITaskRunnerProps) => {
  setIsRunning(false);

  setTasks([]);
🛠️ Refactor suggestion
Consider moving task reset to finally block.
The task reset should happen even if an error occurs during task execution.
const runTasks = useCallback(
async <TContext extends AnyObject>(context: TContext) => {
if (isRunning) {
return context;
}
setIsRunning(true);
const tasksCompose = asyncCompose(...tasks.map(task => task.run));
- await tasksCompose(context);
-
- setIsRunning(false);
-
- setTasks([]);
-
- return context;
+ try {
+ await tasksCompose(context);
+ return context;
+ } finally {
+ setIsRunning(false);
+ setTasks([]);
+ }
},
[tasks, isRunning],
);
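As a self-contained illustration of the try/finally semantics the suggestion relies on (a stand-in, not the actual TaskRunner code), the cleanup callback runs whether the composed tasks succeed or throw:

```ts
type Task<T> = (context: T) => Promise<T>;

// Minimal stand-in for the asyncCompose + try/finally pattern suggested above.
const runWithReset = async <T>(
  context: T,
  tasks: Array<Task<T>>,
  reset: () => void,
): Promise<T> => {
  try {
    let current = context;

    for (const task of tasks) {
      current = await task(current);
    }

    return current;
  } finally {
    // Runs on success and failure alike, mirroring the suggested cleanup.
    reset();
  }
};

// The reset callback fires even though the second task rejects.
runWithReset(
  { step: 0 },
  [async ctx => ({ step: ctx.step + 1 }), async () => { throw new Error('boom'); }],
  () => console.log('state reset'),
).catch(() => console.log('task failed, but cleanup already ran'));
```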
const metadata = {
  apiUrl: 'http://localhost:3000/api/v1/',
  token: 'e3a69aa3-c1ad-42f3-87ac-5105cff81a94',
};
Remove hardcoded API token from the code.
The metadata object contains a hardcoded API token which could pose a security risk if committed to the repository.
Consider using environment variables or a secure configuration management system instead:
-const metadata = {
- apiUrl: 'http://localhost:3000/api/v1/',
- token: 'e3a69aa3-c1ad-42f3-87ac-5105cff81a94',
-};
+const metadata = {
+ apiUrl: process.env.API_URL || 'http://localhost:3000/api/v1/',
+ token: process.env.API_TOKEN,
+};
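If the values move to environment configuration, validating them once at startup avoids silently shipping an `undefined` token. A hedged sketch using zod (variable names are assumptions; a Vite-based app would read from `import.meta.env` rather than `process.env`):

```ts
import { z } from 'zod';

const EnvSchema = z.object({
  API_URL: z.string().url().default('http://localhost:3000/api/v1/'),
  API_TOKEN: z.string().min(1),
});

// Fails fast with a readable error if the token is missing.
export const env = EnvSchema.parse(process.env);

export const metadata = {
  apiUrl: env.API_URL,
  token: env.API_TOKEN,
};
```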
httpParams: {
  createEntity: {
    httpParams: {
      url: '{apiUrl}collection-flow/entity',
      method: 'POST',
      headers: {
        Authorization: 'Bearer {token}',
      },
      resultPath: 'entityId',
    },
    transform: `{
      "firstName": entity.firstName,
      "lastName": entity.lastName,
      "email": entity.email,
      "phone": entity.phone,
      "country": entity.country,
      "dateOfBirth": entity.dateOfBirth,
      "nationality": entity.nationality,
      "passportNumber": entity.passportNumber,
      "address": entity.street & ", " & entity.city & ", " & entity.country,
      "nationalId": entity.nationalId,
      "isAuthorizedSignatory": entity.isAuthorizedSignatory,
      "city": entity.city,
      "additionalInfo": {
        "fullAddress": entity.street & ", " & entity.city & ", " & entity.country,
        "companyName": context.entity.data.companyName,
        "customerCompany": context.collectionFlow.additionalInformation.customerCompany,
        "placeOfBirth": entity.placeOfBirth,
        "percentageOfOwnership": entity.ownershipPercentage,
        "role": entity.role
      }
    }`,
  },
  deleteEntity: {
    url: '{apiUrl}collection-flow/entity/{entityId}',
    method: 'DELETE',
    headers: {
      Authorization: 'Bearer {token}',
    },
  },
  updateEntity: {
    httpParams: {
      url: '{apiUrl}collection-flow/entity/{entityId}',
      method: 'PUT',
      headers: {
        Authorization: 'Bearer {token}',
      },
    },
    transform: `{
      "firstName": entity.firstName,
      "lastName": entity.lastName,
      "email": entity.email,
      "phone": entity.phone,
      "country": entity.country,
      "dateOfBirth": entity.dateOfBirth,
      "nationality": entity.nationality,
      "passportNumber": entity.passportNumber,
      "address": entity.street & ", " & entity.city & ", " & entity.country,
      "nationalId": entity.nationalId,
      "isAuthorizedSignatory": entity.isAuthorizedSignatory,
      "city": entity.city,
      "additionalInfo": {
        "fullAddress": entity.street & ", " & entity.city & ", " & entity.country,
        "companyName": context.entity.data.companyName,
        "customerCompany": context.collectionFlow.additionalInformation.customerCompany,
        "placeOfBirth": entity.placeOfBirth,
        "percentageOfOwnership": entity.ownershipPercentage,
        "role": entity.role
      }
    }`,
  },
  uploadDocument: {
    url: '{apiUrl}collection-flow/files',
    method: 'POST',
    headers: {
      Authorization: 'Bearer {token}',
    },
    resultPath: 'id',
  },
  deleteDocument: {
    url: '{apiUrl}collection-flow/files',
    method: 'DELETE',
    headers: {
      Authorization: 'Bearer {token}',
    },
  },
},
🛠️ Refactor suggestion
Consider adding error handling for API operations.
The HTTP parameters for entity operations (create, update, delete) should include error handling configurations.
Add error handling configurations:
httpParams: {
createEntity: {
httpParams: {
url: '{apiUrl}collection-flow/entity',
method: 'POST',
headers: {
Authorization: 'Bearer {token}',
},
resultPath: 'entityId',
+ errorHandler: {
+ onError: 'showToast',
+ message: 'Failed to create entity',
+ },
+ retryConfig: {
+ retries: 3,
+ backoff: true,
+ },
},
// ... rest of the configuration
},
// Add similar error handling for updateEntity and deleteEntity
},
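Note that `errorHandler` and `retryConfig` appear to be proposed keys rather than an existing feature of this schema; the same resilience can also live in the HTTP layer. A generic, hedged sketch of retry with exponential backoff around `fetch`:

```ts
const requestWithRetry = async (
  input: RequestInfo,
  init: RequestInit = {},
  { retries = 3, baseDelayMs = 500 } = {},
): Promise<Response> => {
  for (let attempt = 0; ; attempt++) {
    try {
      const response = await fetch(input, init);

      // Retry only transient server-side failures.
      if (response.status >= 500 && attempt < retries) {
        throw new Error(`HTTP ${response.status}`);
      }

      return response;
    } catch (error) {
      if (attempt >= retries) throw error;

      // Exponential backoff: 500ms, 1s, 2s, ...
      await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
};
```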
Actionable comments posted: 6
♻️ Duplicate comments (1)
services/workflows-service/src/document/document.service.ts (1)
29-113
:⚠️ Potential issueWrap file upload and document creation in a transaction.
The file upload and document creation operations are not wrapped in a transaction. If document creation fails after a successful file upload, it will result in orphaned files in the storage. Consider using a transaction to ensure atomicity.
  async create(
    { file, metadata, projectId, ...data }: Static<typeof CreateDocumentSchema> & {
      file: Express.Multer.File;
      metadata: Omit<
        Static<typeof CreateDocumentFileSchema>,
        'documentId' | 'fileId' | 'projectId'
      >;
      projectId: string;
    },
    args?: Prisma.DocumentCreateArgs,
    transaction?: PrismaTransactionClient,
  ) {
+   const tx = transaction ?? await this.repository.startTransaction();
+   try {
    if (!data.businessId && !data.endUserId) {
      throw new BadRequestException('Business or end user id is required');
    }

    // ... validation code ...

    const workflowRuntimeData = await this.workflowService.getWorkflowRuntimeDataById(
      data.workflowRuntimeDataId,
      {},
      [projectId],
    );

    const uploadedFile = await this.fileService.uploadNewFile(projectId, workflowRuntimeData, {
      ...file,
      mimetype:
        file.mimetype ||
        (await getFileMetadata({
          file: file.originalname || '',
          fileName: file.originalname || '',
        }))?.mimeType ||
        '',
    });

    const document = await this.repository.create(
      {
        ...data,
        ...(data.businessId && { businessId: data.businessId }),
        ...(data.endUserId && { endUserId: data.endUserId }),
        projectId,
      },
      args,
      tx,
    );

    await this.documentFileService.create(
      {
        documentId: document.id,
        fileId: uploadedFile.id,
        projectId,
        ...metadata,
      },
      undefined,
      tx,
    );

    const entityId = getEntityId();

    const result = await this.getByEntityIdAndWorkflowId(entityId, data.workflowRuntimeDataId, [projectId]);

+   if (!transaction) await tx.commit();
+   return result;
+ } catch (error) {
+   if (!transaction) await tx.rollback();
+   throw error;
+ }
  }
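The `startTransaction`/`commit`/`rollback` calls in the suggestion assume a repository wrapper that may not exist; with Prisma, the usual equivalent is an interactive `$transaction`, which the existing `transaction?: PrismaTransactionClient` parameter hints at. A hedged sketch of the idea, with dependency names assumed:

```ts
// Sketch only: assumes the repository methods accept a Prisma transaction client.
async function createDocumentWithFile(
  prisma: import('@prisma/client').PrismaClient,
  deps: {
    uploadFile: () => Promise<{ id: string }>;
    createDocument: (tx: unknown) => Promise<{ id: string }>;
    createDocumentFile: (tx: unknown, documentId: string, fileId: string) => Promise<void>;
  },
) {
  // Upload first; if the transaction below fails, the orphaned file still needs cleanup,
  // so a compensating delete (not shown) is worth considering as well.
  const uploadedFile = await deps.uploadFile();

  return prisma.$transaction(async tx => {
    const document = await deps.createDocument(tx);
    await deps.createDocumentFile(tx, document.id, uploadedFile.id);

    return document;
  });
}
```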
🧹 Nitpick comments (19)
services/workflows-service/src/ui-definition/ui-definition.service.ts (1)
54-54
: LGTM! Consider documenting the default behavior.Making the
args
parameter optional is a good improvement that aligns with the pattern used ingetByWorkflowDefinitionId
. This change provides more flexibility while maintaining backward compatibility.Consider adding a JSDoc comment to document what happens when
args
is omitted. This would help other developers understand the default behavior.+/** + * Get UI definition by runtime ID. + * @param runtimeId - The runtime ID to look up. + * @param uiContext - The UI context to filter by. + * @param projectIds - The project IDs to scope the query. + * @param args - Optional query arguments. When omitted, only filters by runtime's uiDefinitionId if available. + */ async getByRuntimeId( runtimeId: string, uiContext: keyof typeof UiDefinitionContext, projectIds: TProjectIds, args?: Omit<Prisma.UiDefinitionFindFirstOrThrowArgs, 'where'>, )apps/backoffice-v2/src/domains/documents/hooks/schemas/request-documents.ts (1)
4-6
: Consider adding validation rules for document IDs.The schema could benefit from additional validation rules:
- Minimum/maximum array length to prevent empty arrays or excessive requests
- Document ID format validation (e.g., UUID, specific pattern)
export const RequestDocumentsSchema = z.object({ - documentIds: z.array(z.string()), + documentIds: z.array(z.string().uuid()).min(1).max(10), });apps/backoffice-v2/src/domains/documents/fetchers.ts (1)
6-14
: Enhance error handling and network resilience.Consider adding:
- Retry logic for transient failures
- Request timeout configuration
- More specific error type handling beyond Zod validation
Example enhancement:
export const fetchDocumentsTrackerItems = async ({ workflowId, retries = 3, timeout = 5000 }: { workflowId: string; retries?: number; timeout?: number; }) => { try { const [documentsTrackerItems, error] = await apiClient({ endpoint: `../external/documents/tracker/${workflowId}`, method: Method.GET, schema: DocumentsTrackerSchema, timeout, retry: { attempts: retries, backoff: 'exponential' } }); return handleZodError(error, documentsTrackerItems); } catch (error) { if (error instanceof NetworkError) { // Handle network errors } throw error; } };Also applies to: 16-25
apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts (3)
6-7
: Enhance status and decision field validation.Consider:
- Using more specific status values
- Converting decision to an enum type
- status: z.enum(['provided', 'unprovided', 'requested']), - decision: z.string().nullable(), + status: z.enum(['DOCUMENT_PROVIDED', 'DOCUMENT_MISSING', 'DOCUMENT_REQUESTED', 'DOCUMENT_EXPIRED']), + decision: z.enum(['APPROVED', 'REJECTED', 'PENDING']).nullable(),
8-15
: Add constraints to document properties.Consider adding validation rules for:
issuingCountry
: ISO country code formatversion
: Semantic versioning format- Required vs optional fields
properties: z.object({ type: z.string(), templateId: z.string(), category: z.string(), - issuingCountry: z.string(), + issuingCountry: z.string().length(2).toUpperCase(), issuingVersion: z.string(), - version: z.string(), + version: z.string().regex(/^\d+\.\d+\.\d+$/), }),
16-22
: Strengthen entity type validation.Consider:
- Converting entityType to an enum
- Adding validation for name fields
entity: z.object({ - entityType: z.string(), + entityType: z.enum(['INDIVIDUAL', 'COMPANY', 'ORGANIZATION']), id: z.string(), companyName: z.string().optional(), - firstName: z.string().optional(), - lastName: z.string().optional(), + firstName: z.string().min(1).max(100).optional(), + lastName: z.string().min(1).max(100).optional(), }),apps/backoffice-v2/src/domains/documents/hooks/mutations/useRequestDocumentsMutation.tsx (2)
24-36
: Enhance error handling with more specific error types.The error handling could be improved by:
- Adding type safety for the error object.
- Handling more specific HTTP error codes.
Consider this improvement:
- onError: (error: unknown) => { + onError: (error: unknown) => { + type ErrorWithMessage = { + message: string; + }; + + const isErrorWithMessage = (error: unknown): error is ErrorWithMessage => + isObject(error) && 'message' in error; + if (error instanceof HttpError && error.code === 400) { toast.error(error.message); return; } + + if (error instanceof HttpError) { + switch (error.code) { + case 401: + toast.error(t('toast:request_documents.unauthorized')); + return; + case 403: + toast.error(t('toast:request_documents.forbidden')); + return; + case 404: + toast.error(t('toast:request_documents.not_found')); + return; + } + } toast.error( t(`toast:request_documents.error`, { - errorMessage: isObject(error) && 'message' in error ? error.message : error, + errorMessage: isErrorWithMessage(error) ? error.message : String(error), }), ); },
9-11
: Add type safety for the success callback.The
onSuccess
callback type could be more specific.Consider this improvement:
-export const useRequestDocumentsMutation = (options?: { - onSuccess?: <TData>(data: TData) => void; -}) => { +type RequestDocumentsResponse = { + success: boolean; + message: string; +}; + +export const useRequestDocumentsMutation = (options?: { + onSuccess?: (data: RequestDocumentsResponse) => void; +}) => {apps/backoffice-v2/src/common/components/molecules/DocumentTracker/constants.tsx (1)
10-48
: Standardize icon sizes and extract common styles.The icon implementation has inconsistencies:
- Icon sizes vary between 18 and 16.5
- Common styles are duplicated
Consider these improvements:
+const ICON_SIZE = 18; +const getContainerClass = (color: string) => `bg-${color}/20`; + export const Icon = { CHECK: ( <CheckCircle - size={18} + size={ICON_SIZE} className={`stroke-success`} containerProps={{ - className: 'bg-success/20', + className: getContainerClass('success'), }} /> ), X: ( <XCircle - size={18} + size={ICON_SIZE} className={`stroke-destructive`} containerProps={{ - className: 'bg-destructive/20', + className: getContainerClass('destructive'), }} /> ), // ... similar changes for other icons - MARKED: <FilePlus2Icon className="stroke-warning" size={16.5} />, + MARKED: <FilePlus2Icon className="stroke-warning" size={ICON_SIZE} />,services/workflows-service/src/document/types.ts (2)
24-42
: Consider adding validation for name fields.While the
EntitySchema
is well-structured using discriminated unions, consider adding string length constraints and character validation for name fields.z.object({ entityType: z.literal('ubo'), id: z.string(), - firstName: z.string(), - lastName: z.string(), + firstName: z.string().min(1).max(100).regex(/^[a-zA-Z\s-']+$/), + lastName: z.string().min(1).max(100).regex(/^[a-zA-Z\s-']+$/), }),
44-50
: Consider adding custom error messages.The schema could benefit from custom error messages to improve the developer experience when validation fails.
const DocumentTrackerDocumentSchema = z.object({ - documentId: z.string().nullable(), + documentId: z.string().nullable().describe('Unique identifier for the document'), - status: z.nativeEnum(DocumentStatus), + status: z.nativeEnum(DocumentStatus, { + errorMap: (issue, ctx) => ({ message: 'Invalid document status' }) + }), // ... rest of the schema });packages/ui/src/components/molecules/AccordionCard/AccordionCard.Item.tsx (1)
53-65
: Consider using a more stable key for list items.Using array indices as keys can lead to performance issues and bugs when items are reordered or removed. Consider using a more stable unique identifier if available.
- subitems.map(({ leftIcon, text, rightIcon, itemClassName }, index) => { + subitems.map(({ leftIcon, text, rightIcon, itemClassName, id }, index) => { return ( <li {...liProps} className={ctw(`flex items-center gap-x-2`, itemClassName, liProps?.className)} - key={index} + key={id ?? index} // Fallback to index if id is not available >services/workflows-service/src/document/document.repository.ts (2)
67-83
: Consider using single-record update for updateById.The method uses
updateMany
internally, but since we're updating by ID (which should be unique), usingupdate
would be more appropriate and could provide better type safety for the return value.async updateById( id: string, projectIds: TProjectId[], data: Prisma.DocumentUpdateInput, - args?: Prisma.DocumentUpdateManyArgs, + args?: Omit<Prisma.DocumentUpdateArgs, 'where' | 'data'>, transaction: PrismaTransactionClient = this.prismaService, ) { - return await transaction.document.updateMany({ + return await transaction.document.update({ ...args, where: { - ...args?.where, id, projectId: { in: projectIds }, }, data, }); }
35-51
: Consider adding index hint for query optimization.The
findByEntityIdAndWorkflowId
method uses an OR condition onbusinessId
andendUserId
. Consider adding an index hint in the schema to optimize this query pattern.-- Add to your schema.prisma file model Document { // ... other fields @@index([businessId, workflowRuntimeDataId]) @@index([endUserId, workflowRuntimeDataId]) }apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx (1)
48-56
: Simplify document selection logic.The click handler can be simplified for better readability and maintainability.
-onClick={() => { - if (selectedIdsToRequest.includes(documentId)) { - setSelectedIdsToRequest(prev => prev.filter(id => id !== documentId)); - } - - if (status === 'unprovided') { - setSelectedIdsToRequest(prev => [...prev, documentId]); - } -}} +onClick={() => { + if (status !== 'unprovided') return; + + setSelectedIdsToRequest(prev => + prev.includes(documentId) + ? prev.filter(id => id !== documentId) + : [...prev, documentId] + ); +}}apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (2)
39-72
: Consider adding loading state to the request button.The dialog implementation for requesting documents is well-structured, but it lacks loading state feedback during the request submission.
Apply this diff to add loading state:
- <Button type="button" onClick={onRequestDocuments}> + <Button type="button" onClick={onRequestDocuments} disabled={isLoading}> + {isLoading ? ( + <> + <Loader2 className="mr-2 h-4 w-4 animate-spin" /> + Sending... + </> + ) : ( Send email + )} </Button>
123-161
: Consider adding error boundary.The
AccordionContent
component handles loading and empty states but lacks error handling.Consider wrapping the component with an error boundary to gracefully handle rendering errors:
import { ErrorBoundary } from 'react-error-boundary'; const ErrorFallback = () => ( <div className="flex h-20 items-center justify-center text-sm text-red-500"> Error loading documents </div> ); const AccordionContent = memo(({ documents, isLoading, getSubItems }: AccordionContentProps) => { return ( <ErrorBoundary FallbackComponent={ErrorFallback}> {/* existing implementation */} </ErrorBoundary> ); });services/workflows-service/src/document/document.unit.test.ts (1)
261-357
: Add more edge case tests.The edge cases section could be expanded to include more scenarios.
Consider adding tests for:
- Invalid country codes in issuer
- Missing or invalid version numbers
- Duplicate document IDs
- Maximum document size validation
- Invalid file types
services/workflows-service/src/document/document.service.ts (1)
442-451
: Implement the email flow for document requests.The TODO comment indicates that email notifications for document requests are not implemented. This is an important feature for notifying users about document requests.
Would you like me to help implement the email flow for document requests? I can:
- Create an email service integration
- Design email templates for document requests
- Implement the email sending logic
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (25)
apps/backoffice-v2/public/locales/en/toast.json
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/constants.tsx
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/common/hooks/useSerializedSearchParams/useSerializedSearchParams.tsx
(2 hunks)apps/backoffice-v2/src/domains/documents/fetchers.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/mutations/useRequestDocumentsMutation.tsx
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/queries/useDocumentsTrackerItemsQuery.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/query-keys.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/schemas/request-documents.ts
(1 hunks)apps/backoffice-v2/src/domains/workflow-definitions/fetchers.ts
(1 hunks)apps/backoffice-v2/src/pages/Entity/components/Case/components/CaseOverview/CaseOverview.tsx
(2 hunks)packages/common/src/schemas/documents/workflow/config-schema.ts
(1 hunks)packages/ui/src/components/molecules/AccordionCard/AccordionCard.Item.tsx
(1 hunks)packages/ui/src/components/molecules/AccordionCard/types.ts
(1 hunks)services/workflows-service/prisma/data-migrations
(1 hunks)services/workflows-service/src/document/document.controller.external.ts
(1 hunks)services/workflows-service/src/document/document.module.ts
(1 hunks)services/workflows-service/src/document/document.repository.ts
(1 hunks)services/workflows-service/src/document/document.service.ts
(1 hunks)services/workflows-service/src/document/document.unit.test.ts
(1 hunks)services/workflows-service/src/document/types.ts
(1 hunks)services/workflows-service/src/ui-definition/ui-definition.service.ts
(1 hunks)services/workflows-service/src/workflow/schemas/zod-schemas.ts
(1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- services/workflows-service/src/document/document.module.ts
- services/workflows-service/prisma/data-migrations
⏰ Context from checks skipped due to timeout of 90000ms (6)
- GitHub Check: test_windows
- GitHub Check: test_linux
- GitHub Check: build (windows-latest)
- GitHub Check: Analyze (javascript)
- GitHub Check: lint
- GitHub Check: build (ubuntu-latest)
🔇 Additional comments (15)
apps/backoffice-v2/src/domains/documents/hooks/query-keys.ts (1)
4-9
: LGTM! Well-structured query key implementation.The query key is properly structured with:
- Unique prefix for document-related queries
- Proper inclusion of workflowId for cache management
- Clean integration with the fetcher function
apps/backoffice-v2/src/common/hooks/useSerializedSearchParams/useSerializedSearchParams.tsx (1)
8-8
: LGTM! Hash preservation is properly implemented.The changes correctly preserve the URL hash during navigation by:
- Extracting the hash from useLocation
- Appending it to the URL during navigation
- Including it in the dependency array
Also applies to: 23-23, 27-27
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/constants.tsx (1)
50-54
: Consider adding exhaustive status handling.The
documentStatusToIcon
mapping might need to handle future status additions.Let's verify if there are other document statuses in use:
✅ Verification successful
🏁 Script executed:
#!/bin/bash # Search for document status references rg -t typescript -t javascript "status['\"]?\s*[:=]\s*['\"](\w+)['\"]" -g '!**/constants.tsx'Length of output: 136
I'll run an updated search without using the unrecognized file type alias. Let's try searching directly in files with the .ts and .tsx extensions (excluding the constants file):
🏁 Script executed:
#!/bin/bash # Searching for document status references in .ts and .tsx files excluding constants.tsx rg 'status\s*[:=]\s*["'"'"'](\w+)["'"'"']' --glob '*.ts' --glob '*.tsx' --glob '!**/constants.tsx'Length of output: 9833
I'll search for the definition of
TrackedDocument
to see what statuses are expected. Let's run a shell script to search for its interface/type definition.
🏁 Script executed:
#!/bin/bash # Search for the definition of TrackedDocument in .ts and .tsx files rg "TrackedDocument" --glob "*.ts" --glob "*.tsx"Length of output: 1115
Exhaustiveness Confirmed for DocumentTracker Statuses
The search results indicate that the only statuses used for
TrackedDocument
are"unprovided"
,"provided"
, and"requested"
. The type imported from@/domains/documents/hooks/schemas/document
(i.e.TrackedDocument['status']
) appears to be limited to these three values, and our code’sdocumentStatusToIcon
mapping covers them all exhaustively.packages/ui/src/components/molecules/AccordionCard/types.ts (1)
14-14
: LGTM! Good addition of the optional styling property.The
itemClassName
property provides flexibility for styling individual items while maintaining backward compatibility.services/workflows-service/src/document/types.ts (1)
4-12
: LGTM! Well-structured document schema.The schema effectively validates document properties with appropriate types and constraints.
packages/ui/src/components/molecules/AccordionCard/AccordionCard.Item.tsx (1)
57-57
: LGTM! Good use of the ctw utility for className composition.The className composition is well-structured, properly merging the new itemClassName with existing classes.
services/workflows-service/src/document/document.repository.ts (1)
6-8
: LGTM! Good use of dependency injection.The repository properly uses NestJS's dependency injection pattern with PrismaService.
apps/backoffice-v2/src/pages/Entity/components/Case/components/CaseOverview/CaseOverview.tsx (1)
69-71
: LGTM! Clean integration of DocumentTracker.The conditional rendering follows the same pattern as other feature toggles in the component.
packages/common/src/schemas/documents/workflow/config-schema.ts (1)
62-62
: LGTM! Schema extension follows existing patterns.The addition of
isDocumentTrackerEnabled
maintains consistency with other feature flags in the schema.apps/backoffice-v2/src/domains/workflow-definitions/fetchers.ts (1)
30-30
: LGTM! Consistent schema implementation.The
isDocumentTrackerEnabled
property is correctly implemented with a default value offalse
, following the pattern of other feature flags.services/workflows-service/src/workflow/schemas/zod-schemas.ts (1)
9-9
: LGTM!The addition of
isDocumentTrackerEnabled
property to theConfigSchema
aligns with the broader document management enhancements in this PR.apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (1)
22-31
: LGTM!The component's props and hooks are well-structured, with clear separation of concerns. The
useDocumentTracker
hook encapsulates the document tracking logic effectively.services/workflows-service/src/document/document.controller.external.ts (1)
39-46
: Add file type validation for uploads.The file interceptor uses
fileFilter
but there's no explicit validation of allowed file types. This could be a security risk.services/workflows-service/src/document/document.service.ts (1)
115-141
: LGTM! Well-structured document retrieval with file handling.The method effectively retrieves documents and their associated files, including signed URLs for access.
apps/backoffice-v2/public/locales/en/toast.json (1)
124-127
: LGTM! Clear and consistent toast messages.The new document request messages follow the established pattern and provide clear feedback to users.
endpoint: `../external/documents/tracker/${workflowId}`,
method: Method.GET,
schema: DocumentsTrackerSchema,
});
🛠️ Refactor suggestion
Avoid using relative paths in API endpoints.
Using relative paths (../external/...
) can be fragile and hard to maintain. Consider:
- Moving base URLs to environment configuration
- Using absolute paths or URL builders
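A small sketch of the URL-builder idea (the env variable name, base URL, and Vite-style `import.meta.env` access are assumptions):

```ts
// Resolve the API origin once, from configuration rather than a relative path.
const API_BASE_URL = import.meta.env.VITE_API_URL ?? 'http://localhost:3000/api/v1/';

export const buildApiUrl = (path: string) => new URL(path, API_BASE_URL).toString();

// Usage:
// buildApiUrl(`external/documents/tracker/${workflowId}`)
// => 'http://localhost:3000/api/v1/external/documents/tracker/<id>'
```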
business: data?.business.map(item => ({
  ...item,
  url: `${pathname}?${search}#${titleCase(item?.category)} - ${titleCase(item.type)}`,
})),
individuals: {
  ubos: data?.individuals.ubos.map(item => ({
    ...item,
    url: `${pathname}?${search}#${titleCase(item?.category)} - ${titleCase(item?.type)}`,
  })),
  directors: data?.individuals.directors.map(item => ({
    ...item,
    url: `${pathname}?${search}#${titleCase(item?.category)} - ${titleCase(item?.type)}`,
  })),
},
🛠️ Refactor suggestion
Reduce code duplication in URL construction.
The URL construction logic is duplicated across three map functions.
Consider this improvement:
+ const createItemWithUrl = (item: { category?: string; type: string }) => ({
+ ...item,
+ url: `${pathname}?${search}#${titleCase(item?.category ?? '')} - ${titleCase(item.type)}`,
+ });
+
return useQuery({
...documentsQueryKeys.trackerItems({ workflowId }),
enabled: isAuthenticated,
select: data => ({
- business: data?.business.map(item => ({
- ...item,
- url: `${pathname}?${search}#${titleCase(item?.category)} - ${titleCase(item.type)}`,
- })),
+ business: data?.business.map(createItemWithUrl) ?? [],
individuals: {
- ubos: data?.individuals.ubos.map(item => ({
- ...item,
- url: `${pathname}?${search}#${titleCase(item?.category)} - ${titleCase(item?.type)}`,
- })),
- directors: data?.individuals.directors.map(item => ({
- ...item,
- url: `${pathname}?${search}#${titleCase(item?.category)} - ${titleCase(item?.type)}`,
- })),
+ ubos: data?.individuals.ubos.map(createItemWithUrl) ?? [],
+ directors: data?.individuals.directors.map(createItemWithUrl) ?? [],
},
}),
});
  },
});

const onRequestDocuments = () => requestDocuments({ documentIds: selectedIdsToRequest });
🛠️ Refactor suggestion
Add error handling for document request mutation.
The `onRequestDocuments` function should handle potential errors when requesting documents.
-const onRequestDocuments = () => requestDocuments({ documentIds: selectedIdsToRequest });
+const onRequestDocuments = () => {
+ if (!selectedIdsToRequest.length) return;
+
+ requestDocuments({
+ documentIds: selectedIdsToRequest,
+ onError: (error) => {
+ console.error('Failed to request documents:', error);
+ // Handle error appropriately (e.g., show error notification)
+ }
+ });
+};
async createDocument(
  @Body()
  data: Omit<Static<typeof CreateDocumentSchema>, 'properties'> & {
    metadata: string;
    properties: string;
  },
  @UploadedFile(
    new ParseFilePipeBuilder().addMaxSizeValidator({ maxSize: FILE_MAX_SIZE_IN_BYTE }).build({
      fileIsRequired: true,
      exceptionFactory: (error: string) => {
        if (error.includes('expected size')) {
          throw new UnprocessableEntityException(FILE_SIZE_EXCEEDED_MSG);
        }

        throw new UnprocessableEntityException(error);
      },
    }),
  )
  file: Express.Multer.File,
  @CurrentProject() projectId: string,
) {
  const metadata = DocumentFileJsonSchema.parse(data.metadata);
  const properties = z
    .preprocess(value => {
      if (typeof value !== 'string') {
        return value;
      }

      return JSON.parse(value);
    }, z.record(z.string(), z.unknown()))
    .parse(data.properties);

  return await this.documentService.create({
    ...data,
    properties,
    metadata,
    file,
    projectId,
  });
}
🛠️ Refactor suggestion
Consider adding rate limiting.
The document creation endpoint could be vulnerable to abuse without rate limiting.
Add rate limiting using the @nestjs/throttler
package:
import { Throttle } from '@nestjs/throttler';
@Throttle({ default: { ttl: 60000, limit: 10 } }) // 10 requests per minute
@Post()
async createDocument(...)
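For the decorator to have any effect, the throttler module and guard also need to be registered; a minimal sketch assuming @nestjs/throttler v5+ is used (the module name and global wiring below are illustrative, not taken from this repository):

import { Module } from '@nestjs/common';
import { APP_GUARD } from '@nestjs/core';
import { ThrottlerGuard, ThrottlerModule } from '@nestjs/throttler';

@Module({
  imports: [
    // Global defaults; the per-route @Throttle() shown above overrides them.
    ThrottlerModule.forRoot([{ ttl: 60000, limit: 10 }]),
  ],
  providers: [
    // Apply throttling to every route unless explicitly overridden or skipped.
    { provide: APP_GUARD, useClass: ThrottlerGuard },
  ],
})
export class AppThrottlingModule {}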
async reuploadDocumentFileById(
  @Param('workflowRuntimeDataId') workflowRuntimeDataId: string,
  @Param('fileId') fileId: string,
  @UploadedFile(
    new ParseFilePipeBuilder().addMaxSizeValidator({ maxSize: FILE_MAX_SIZE_IN_BYTE }).build({
      fileIsRequired: true,
      exceptionFactory: (error: string) => {
        if (error.includes('expected size')) {
          throw new UnprocessableEntityException(FILE_SIZE_EXCEEDED_MSG);
        }

        throw new UnprocessableEntityException(error);
      },
    }),
  )
  file: Express.Multer.File,
  @CurrentProject() projectId: string,
) {
  return await this.documentService.reuploadDocumentFileById(
    fileId,
    workflowRuntimeDataId,
    [projectId],
    file,
  );
}
Add file cleanup on error.
The re-upload endpoint should clean up the uploaded file if the document service operation fails.
Wrap the service call in a try-catch block:
async reuploadDocumentFileById(
@Param('workflowRuntimeDataId') workflowRuntimeDataId: string,
@Param('fileId') fileId: string,
@UploadedFile(...) file: Express.Multer.File,
@CurrentProject() projectId: string,
) {
+ try {
return await this.documentService.reuploadDocumentFileById(
fileId,
workflowRuntimeDataId,
[projectId],
file,
);
+ } catch (error) {
+ // Clean up the uploaded file
+ await fs.unlink(file.path).catch(console.error);
+ throw error;
+ }
}
beforeEach(() => {
  // @ts-expect-error - We only need the service for unit testing parseDocumentsFromUISchema
  documentService = new DocumentService();
});
🛠️ Refactor suggestion
Consider using proper mocking.
Using `@ts-expect-error` to bypass constructor dependencies is not ideal. Consider proper dependency injection and mocking.
beforeEach(() => {
const mockDependencies = {
prisma: createMock<PrismaService>(),
configService: createMock<ConfigService>(),
// ... other dependencies
};
documentService = new DocumentService(mockDependencies);
});
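Building on that, a hedged sketch using Nest's testing module; the provider list and mock shapes are assumptions about what DocumentService actually injects, and the test runner is assumed to be Jest:

import { Test } from '@nestjs/testing';

describe('DocumentService', () => {
  let documentService: DocumentService;

  beforeEach(async () => {
    const moduleRef = await Test.createTestingModule({
      providers: [
        DocumentService,
        // Replace real dependencies with lightweight fakes (names are illustrative).
        { provide: DocumentRepository, useValue: { findMany: jest.fn() } },
        { provide: StorageService, useValue: { fetchFileContent: jest.fn() } },
      ],
    }).compile();

    documentService = moduleRef.get(DocumentService);
  });
});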
Actionable comments posted: 7
♻️ Duplicate comments (1)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx (1)
28-34
:⚠️ Potential issueAdd error handling for document request mutation.
The mutation setup should handle potential errors when requesting documents.
const { mutate: requestDocuments } = useRequestDocumentsMutation({ onSuccess: () => { setSelectedIdsToRequest([]); onOpenChange(false); void queryClient.invalidateQueries(documentsQueryKeys.trackerItems({ workflowId })); }, + onError: (error) => { + console.error('Failed to request documents:', error); + // Handle error appropriately (e.g., show error notification) + }, });
🧹 Nitpick comments (16)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx (1)
23-25
: Consider using a more specific type for selectedIdsToRequest.Instead of using a generic array type with z.infer, consider creating a dedicated type for document identifiers to improve code maintainability and type safety.
- const [selectedIdsToRequest, setSelectedIdsToRequest] = useState< - Array<z.infer<typeof DocumentTrackerItemSchema>['identifiers']> - >([]); + type DocumentIdentifiers = z.infer<typeof DocumentTrackerItemSchema>['identifiers']; + const [selectedIdsToRequest, setSelectedIdsToRequest] = useState<DocumentIdentifiers[]>([]);apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts (3)
6-9
: Consider adding field constraints for user names.
Applying minimum length, maximum length, or trimming could ensure data quality forfirstName
andlastName
.
17-43
: Unify status enums with server definitions.
If the backend uses an enum for document status, consider matching'provided', 'unprovided', 'requested'
to maintain consistency across services.
54-66
: Standardizeentity
structure inRequestDocumentsSchema
.
Currently, it uses{ type: ... }
while the main schema uses{ entityType: ... }
. Unify these properties to avoid confusion and duplication.services/workflows-service/src/document/types.ts (1)
4-12
: Consider optional fields for certain document properties.
IfissuingVersion
orversion
might be undefined in some cases, allow optional or defaults to prevent validation pitfalls.services/workflows-service/src/document/document.repository.ts (4)
29-41
: Remove unnecessary await.The
await
keyword is redundant since the method is already returning a promise.Apply this diff to remove the unnecessary await:
- return await transaction.document.findMany({ + return transaction.document.findMany({
61-73
: Remove unnecessary await.The
await
keyword is redundant since the method is already returning a promise.Apply this diff to remove the unnecessary await:
- return await transaction.document.updateMany({ + return transaction.document.updateMany({
75-91
: Remove unnecessary await.The
await
keyword is redundant since the method is already returning a promise.Apply this diff to remove the unnecessary await:
- return await transaction.document.updateMany({ + return transaction.document.updateMany({
93-107
: Remove unnecessary await.The
await
keyword is redundant since the method is already returning a promise.Apply this diff to remove the unnecessary await:
- return await transaction.document.deleteMany({ + return transaction.document.deleteMany({apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (2)
22-114
: Add ARIA labels for better accessibility.The component should include ARIA labels for interactive elements to improve accessibility.
Apply this diff to add ARIA labels:
- <AccordionCard> + <AccordionCard aria-label="Document tracker"> <AccordionCard.Title className={`flex-row items-center justify-between`} rightChildren={ selectedIdsToRequest.length > 0 ? ( - <Dialog open={open} onOpenChange={onOpenChange}> + <Dialog open={open} onOpenChange={onOpenChange} aria-label="Request documents dialog"> <DialogTrigger> - <Button className="h-7 bg-warning px-2 text-sm"> + <Button className="h-7 bg-warning px-2 text-sm" aria-label="Request documents">
116-183
: Add error boundary for better error handling.The component should be wrapped in an error boundary to gracefully handle rendering errors.
Create an error boundary component and wrap the
AccordionContent
:class DocumentTrackerErrorBoundary extends React.Component { state = { hasError: false }; static getDerivedStateFromError() { return { hasError: true }; } render() { if (this.state.hasError) { return <div>Error loading documents. Please try again.</div>; } return this.props.children; } } // Usage: <DocumentTrackerErrorBoundary> <AccordionContent documentTrackerItems={documentTrackerItems} isLoading={isLoadingDocuments} getSubItems={getSubItems} /> </DocumentTrackerErrorBoundary>services/workflows-service/src/document/document.controller.external.ts (3)
152-157
: Consider adding caching for performance.The document retrieval endpoint could benefit from caching to improve performance.
Add caching using the
@nestjs/cache-manager
package:import { CacheInterceptor, CacheTTL } from '@nestjs/cache-manager'; @UseInterceptors(CacheInterceptor) @CacheTTL(30) // Cache for 30 seconds async getDocumentsByWorkflowId(...)
233-239
: Add optimistic locking for concurrent updates.The endpoint should implement optimistic locking to handle concurrent updates safely.
Add version field and optimistic locking:
async updateDocumentById( @Param('documentId') documentId: string, @Body() data: Static<typeof UpdateDocumentSchema>, + @Headers('If-Match') version: string, @CurrentProject() projectId: string, ) { + if (!version) { + throw new PreconditionFailedException('Version header is required'); + } return await this.documentService.updateById(documentId, [projectId], data); }
313-318
: Consider implementing soft delete.Instead of permanently deleting documents, consider implementing a soft delete mechanism.
Add soft delete functionality:
async deleteDocumentsByIds( @Body() { ids }: Static<typeof DeleteDocumentsSchema>, @CurrentProject() projectId: string, ) { - return await this.documentService.deleteByIds(ids, [projectId]); + return await this.documentService.softDeleteByIds(ids, [projectId]); }services/workflows-service/src/document/document.service.ts (2)
121-147
: Cache signed URLs for better performance.Fetching signed URLs for each file can be expensive. Consider caching the URLs with an appropriate TTL.
Add caching for signed URLs:
@Injectable() export class DocumentService { private readonly urlCache = new Map<string, { url: string; expiry: number }>(); private readonly URL_CACHE_TTL = 3600000; // 1 hour private async getSignedUrl(fileId: string, projectId: string): Promise<string> { const cached = this.urlCache.get(fileId); if (cached && cached.expiry > Date.now()) { return cached.url; } const { signedUrl } = await this.storageService.fetchFileContent({ id: fileId, projectIds: [projectId], format: 'signed-url', }); this.urlCache.set(fileId, { url: signedUrl, expiry: Date.now() + this.URL_CACHE_TTL, }); return signedUrl; } }
200-231
: Optimize promise handling for better performance.The nested
Promise.all
calls could be flattened for better performance.Optimize promise handling:
async fetchDocumentsFiles({ documents, format, }: { documents: Array<Document & { files: DocumentFile[] }>; format: Parameters<StorageService['fetchFileContent']>[0]['format']; }) { - return await Promise.all( - documents?.map(async document => { - const files = await Promise.all( - document.files?.map(async file => { - const uploadedFile = await this.storageService.fetchFileContent({ - id: file.fileId, - projectIds: [document.projectId], - format, - }); - - return { - ...file, - mimeType: uploadedFile.mimeType, - signedUrl: uploadedFile.signedUrl, - }; - }) ?? [], - ); - - return { - ...document, - files, - }; - }) ?? [], - ); + const filePromises = documents?.flatMap(document => + document.files?.map(async file => { + const uploadedFile = await this.storageService.fetchFileContent({ + id: file.fileId, + projectIds: [document.projectId], + format, + }); + + return { + documentId: document.id, + file: { + ...file, + mimeType: uploadedFile.mimeType, + signedUrl: uploadedFile.signedUrl, + }, + }; + }) ?? [], + ) ?? []; + + const files = await Promise.all(filePromises); + + return documents?.map(document => ({ + ...document, + files: files + .filter(f => f.documentId === document.id) + .map(f => f.file), + })) ?? []; }
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (10)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/domains/documents/fetchers.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/queries/useDocumentsTrackerItemsQuery.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts
(1 hunks)apps/backoffice-v2/src/pages/Entity/components/Case/components/CaseOverview/CaseOverview.tsx
(2 hunks)services/workflows-service/src/document/document.controller.external.ts
(1 hunks)services/workflows-service/src/document/document.repository.ts
(1 hunks)services/workflows-service/src/document/document.service.ts
(1 hunks)services/workflows-service/src/document/types.ts
(1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- apps/backoffice-v2/src/pages/Entity/components/Case/components/CaseOverview/CaseOverview.tsx
- apps/backoffice-v2/src/domains/documents/hooks/queries/useDocumentsTrackerItemsQuery.ts
⏰ Context from checks skipped due to timeout of 90000ms (6)
- GitHub Check: test_windows
- GitHub Check: build (windows-latest)
- GitHub Check: test_linux
- GitHub Check: Analyze (javascript)
- GitHub Check: build (ubuntu-latest)
- GitHub Check: lint
🔇 Additional comments (20)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx (1)
111-119
: LGTM! Well-structured hook return value.The hook returns a clean and well-organized set of values and handlers that provide all necessary functionality for document tracking.
apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts (4)
1-3
: Imports look correct.
No issues noted in these import statements.
4-5
: Type inference fromDocumentTrackerItemSchema
looks sound.
DefiningTrackedDocument
withz.infer
ensures consistent typing throughout the codebase.
11-15
: Constant object forEntityType
is coherent.
Defining entity types as aconst
object is a solid approach for clarity and type safety.
45-52
: Verify handling of empty array fields.
The schema permits empty arrays forbusiness
,ubos
, anddirectors
. Ensure the consuming code gracefully handles these edge cases (e.g., no items).apps/backoffice-v2/src/domains/documents/fetchers.ts (3)
1-6
: Initial imports and schema references look fine.
No immediate concerns regarding these imports.
7-15
: Avoid using relative paths in API endpoints.
This repeats a prior comment about nested../external/...
usage. Externalizing base URLs or using environment variables can simplify configuration.
17-26
: Validate response schema alignment.
Currently,RequestDocumentsSchema
is used for both request body and response validation. If the API response differs from the request format, consider using a separate schema.services/workflows-service/src/document/types.ts (6)
1-3
: Base imports look good.
No issues identified with referencing@prisma/client
enums.
14-15
: Straightforward type definition.
Usingz.infer
is a clean approach for deriving static types from schemas.
16-22
: StructuringTParsedDocuments
for business vs. individuals is clear.
This separation clarifies handling for different entity types.
24-42
: Discriminated union is well-structured.
Defining fields likecompanyName
,firstName
, andlastName
in separate objects keeps entity data distinct.
44-52
: Ensure enum alignment with front-end.
z.nativeEnum(DocumentStatus)
andz.nativeEnum(DocumentDecision)
might differ from front-end strings like'provided'
/'unprovided'
. Confirm they match to avoid serialization issues.
54-60
: Response schema structure is consistent.
Arrays forbusiness
and nested objects forindividuals
align with typical document flows.services/workflows-service/src/document/document.repository.ts (3)
10-19
: LGTM!The method correctly handles document creation with transaction support and proper argument spreading.
21-27
: LGTM!The method correctly handles bulk document creation with transaction support and proper argument spreading.
43-59
: LGTM!The method correctly handles document retrieval with proper filtering and transaction support.
services/workflows-service/src/document/document.controller.external.ts (2)
87-126
: Consider adding rate limiting.The document creation endpoint could be vulnerable to abuse without rate limiting.
Add rate limiting using the
@nestjs/throttler
package:import { Throttle } from '@nestjs/throttler'; @Throttle({ default: { ttl: 60000, limit: 10 } }) // 10 requests per minute @Post() async createDocument(...)
272-296
: Add file cleanup on error.The re-upload endpoint should clean up the uploaded file if the document service operation fails.
Wrap the service call in a try-catch block:
async reuploadDocumentFileById( @Param('workflowRuntimeDataId') workflowRuntimeDataId: string, @Param('fileId') fileId: string, @UploadedFile(...) file: Express.Multer.File, @CurrentProject() projectId: string, ) { + try { return await this.documentService.reuploadDocumentFileById( fileId, workflowRuntimeDataId, [projectId], file, ); + } catch (error) { + // Clean up the uploaded file + await fs.unlink(file.path).catch(console.error); + throw error; + } }services/workflows-service/src/document/document.service.ts (1)
35-119
: Handle partial failures during file upload.The logic uploads a file and then creates document records. If the upload fails or partially completes, the code throws exceptions.
Consider wrapping the upload + record creation in a single database transaction (if feasible) to avoid inconsistent states in case of partial failures or rollback required scenarios.
const compareIdentifiers = (identifiersA: any, identifiersB: any) => {
  return [
    identifiersA.document.type === identifiersB.document.type,
    identifiersA.document.category === identifiersB.document.category,
    identifiersA.document.issuingCountry === identifiersB.document.issuingCountry,
    identifiersA.document.issuingVersion === identifiersB.document.issuingVersion,
    identifiersA.document.version === identifiersB.document.version,
    identifiersA.entity.id === identifiersB.entity.id,
  ].every(Boolean);
};
🛠️ Refactor suggestion
Improve type safety and performance of compareIdentifiers.
The function uses 'any' type and could be optimized for better performance and type safety.
- const compareIdentifiers = (identifiersA: any, identifiersB: any) => {
+ type DocumentIdentifiers = z.infer<typeof DocumentTrackerItemSchema>['identifiers'];
+ const compareIdentifiers = (identifiersA: DocumentIdentifiers, identifiersB: DocumentIdentifiers) => {
+ const documentA = identifiersA.document;
+ const documentB = identifiersB.document;
+
return [
- identifiersA.document.type === identifiersB.document.type,
- identifiersA.document.category === identifiersB.document.category,
- identifiersA.document.issuingCountry === identifiersB.document.issuingCountry,
- identifiersA.document.issuingVersion === identifiersB.document.issuingVersion,
- identifiersA.document.version === identifiersB.document.version,
+ documentA.type === documentB.type,
+ documentA.category === documentB.category,
+ documentA.issuingCountry === documentB.issuingCountry,
+ documentA.issuingVersion === documentB.issuingVersion,
+ documentA.version === documentB.version,
identifiersA.entity.id === identifiersB.entity.id,
].every(Boolean);
};
const onRequestDocuments = () =>
  requestDocuments({
    workflowId,
    documents: selectedIdsToRequest.map(identifier => ({
      type: identifier.document.type,
      category: identifier.document.category,
      issuingCountry: identifier.document.issuingCountry,
      issuingVersion: identifier.document.issuingVersion,
      version: identifier.document.version,
      templateId: identifier.document.type,
      entity: {
        id: identifier.entity.id,
        type: identifier.entity.entityType,
      },
    })),
  });
🛠️ Refactor suggestion
Add validation before making document requests.
The `onRequestDocuments` function should validate the selected IDs before making the request to prevent unnecessary API calls.
- const onRequestDocuments = () =>
+ const onRequestDocuments = () => {
+ if (!selectedIdsToRequest.length) {
+ return;
+ }
+
requestDocuments({
workflowId,
documents: selectedIdsToRequest.map(identifier => ({
type: identifier.document.type,
category: identifier.document.category,
issuingCountry: identifier.document.issuingCountry,
issuingVersion: identifier.document.issuingVersion,
version: identifier.document.version,
templateId: identifier.document.type,
entity: {
id: identifier.entity.id,
type: identifier.entity.entityType,
},
})),
});
+ };
async requestDocuments(
  @Body() { workflowId, documents }: Static<typeof RequestUploadSchema>,
  @CurrentProject() projectId: TProjectId,
) {
  return await this.documentService.requestDocumentsByIds(projectId, workflowId, documents);
}
🛠️ Refactor suggestion
Add request size validation.
The endpoint should validate the size of the documents array to prevent abuse.
Add array size validation to the request schema:
const RequestUploadSchema = Type.Object({
workflowId: Type.String(),
- documents: Type.Array(
+ documents: Type.Array(
Type.Object({
type: Type.String(),
category: Type.String(),
issuingCountry: Type.String(),
issuingVersion: Type.String(),
version: Type.String(),
entity: Type.Object({
id: Type.String(),
type: Type.Union([Type.Literal('business'), Type.Literal('ubo'), Type.Literal('director')]),
}),
- }),
+ }), { maxItems: 100 }
),
});
async getDocumentsByEntityIdAndWorkflowId(
  @Param('entityId') entityId: string,
  @Param('workflowRuntimeDataId') workflowRuntimeDataId: string,
  @CurrentProject() projectId: string,
) {
  return await this.documentService.getByEntityIdAndWorkflowId(entityId, workflowRuntimeDataId, [
    projectId,
  ]);
}
🛠️ Refactor suggestion
Add pagination for large result sets.
The endpoint should support pagination to handle large numbers of documents efficiently.
Add pagination parameters to the method:
async getDocumentsByEntityIdAndWorkflowId(
@Param('entityId') entityId: string,
@Param('workflowRuntimeDataId') workflowRuntimeDataId: string,
+ @Query('page') page: number = 1,
+ @Query('limit') limit: number = 10,
@CurrentProject() projectId: string,
) {
return await this.documentService.getByEntityIdAndWorkflowId(
entityId,
workflowRuntimeDataId,
- [projectId],
+ [projectId],
+ {
+ skip: (page - 1) * limit,
+ take: limit,
+ },
);
}
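If pagination is adopted here, the service and repository layers would need to accept the same options; a rough sketch in which the method and option names mirror the controller above, and the Prisma where-clause fields are assumptions about this schema rather than confirmed column names:

// Hypothetical service-side counterpart - filter field names are illustrative only.
async getByEntityIdAndWorkflowId(
  entityId: string,
  workflowRuntimeDataId: string,
  projectIds: TProjectId[],
  pagination?: { skip: number; take: number },
) {
  return this.repository.findMany(projectIds, {
    // Assumed filter fields; align with the actual Prisma schema.
    where: { endUserId: entityId, workflowRuntimeDataId },
    skip: pagination?.skip,
    take: pagination?.take,
    include: { files: true },
  });
}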
async updateById(
  id: string,
  projectIds: TProjectId[],
  data: Prisma.DocumentUpdateInput,
  args?: Prisma.DocumentUpdateManyArgs,
  transaction?: PrismaTransactionClient,
) {
  await this.repository.updateById(id, projectIds, data, args, transaction);

  const documents = await this.repository.findMany(
    projectIds,
    {
      include: {
        files: true,
      },
    },
    transaction,
  );
  const documentsWithFiles = await this.fetchDocumentsFiles({
    documents: documents as Array<Document & { files: DocumentFile[] }>,
    format: 'signed-url',
  });

  return documentsWithFiles;
}
🛠️ Refactor suggestion
Add data validation before update.
The method should validate the update data before applying changes.
Add validation using zod:
async updateById(
id: string,
projectIds: TProjectId[],
data: Prisma.DocumentUpdateInput,
args?: Prisma.DocumentUpdateManyArgs,
transaction?: PrismaTransactionClient,
) {
+ const UpdateSchema = z.object({
+ status: z.enum(['pending', 'approved', 'rejected']).optional(),
+ decision: z.enum(['approved', 'rejected']).optional(),
+ properties: z.record(z.unknown()).optional(),
+ });
+
+ const validatedData = UpdateSchema.parse(data);
+
await this.repository.updateById(id, projectIds, data, args, transaction);
// ... rest of the method
}
async reuploadDocumentFileById(
  fileId: string,
  workflowRuntimeDataId: string,
  projectIds: TProjectId[],
  file: Express.Multer.File,
) {
  if (!projectIds[0]) {
    throw new BadRequestException('Project id is required');
  }

  const workflowRuntimeData = await this.workflowService.getWorkflowRuntimeDataById(
    workflowRuntimeDataId,
    {},
    projectIds,
  );
  const uploadedFile = await this.fileService.uploadNewFile(projectIds[0], workflowRuntimeData, {
    ...file,
    mimetype:
      file.mimetype ||
      (
        await getFileMetadata({
          file: file.originalname || '',
          fileName: file.originalname || '',
        })
      )?.mimeType ||
      '',
  });

  await this.documentFileService.updateById(fileId, {
    file: {
      connect: { id: uploadedFile.id },
    },
  });

  const documents = await this.repository.findMany(projectIds, {
    include: {
      files: true,
    },
  });

  return await this.fetchDocumentsFiles({
    documents: documents as Array<Document & { files: DocumentFile[] }>,
    format: 'signed-url',
  });
}
🛠️ Refactor suggestion
Add cleanup of old file after re-upload.
The method should clean up the old file after successful re-upload.
Add old file cleanup:
async reuploadDocumentFileById(
fileId: string,
workflowRuntimeDataId: string,
projectIds: TProjectId[],
file: Express.Multer.File,
) {
if (!projectIds[0]) {
throw new BadRequestException('Project id is required');
}
+ // Get old file before update
+ const oldFile = await this.documentFileService.findById(fileId);
+
const workflowRuntimeData = await this.workflowService.getWorkflowRuntimeDataById(
workflowRuntimeDataId,
{},
projectIds,
);
const uploadedFile = await this.fileService.uploadNewFile(projectIds[0], workflowRuntimeData, {
...file,
mimetype:
file.mimetype ||
(
await getFileMetadata({
file: file.originalname || '',
fileName: file.originalname || '',
})
)
> Committable suggestion skipped: line range outside the PR's diff.
async deleteByIds(
  ids: string[],
  projectIds: TProjectId[],
  args?: Prisma.DocumentDeleteManyArgs,
  transaction?: PrismaTransactionClient,
) {
  await this.repository.deleteByIds(ids, projectIds, args, transaction);

  const documents = await this.repository.findMany(
    projectIds,
    {
      include: {
        files: true,
      },
    },
    transaction,
  );
  const documentsWithFiles = await this.fetchDocumentsFiles({
    documents: documents as Array<Document & { files: DocumentFile[] }>,
    format: 'signed-url',
  });

  return documentsWithFiles;
}
🛠️ Refactor suggestion
Add file cleanup after document deletion.
The method should clean up associated files when documents are deleted.
Add file cleanup:
async deleteByIds(
ids: string[],
projectIds: TProjectId[],
args?: Prisma.DocumentDeleteManyArgs,
transaction?: PrismaTransactionClient,
) {
+ // Get files before deletion
+ const documents = await this.repository.findMany(
+ projectIds,
+ {
+ where: { id: { in: ids } },
+ include: { files: true },
+ },
+ transaction,
+ );
+
+ const fileIds = documents.flatMap(doc => doc.files.map(file => file.fileId));
+
await this.repository.deleteByIds(ids, projectIds, args, transaction);
+
+ // Clean up files
+ await Promise.all(
+ fileIds.map(fileId =>
+ this.storageService.deleteFile(fileId).catch(console.error),
+ ),
+ );
// ... rest of the method
}
Committable suggestion skipped: line range outside the PR's diff.
…pository, and dto
* feat: implemented common useHttp hook * feat: initial entity field group * feat: implemented interaction logic with entity & tests * feat: implemented EntityFieldGroup (WIP) * feat: added entity type parameter to ui definition & updated stories * Bal 3486 (#3034) * feat: added entity creation & deletion endpoints * fix: fixed entity deletion * feat: finalized creation & deletion logic of entities * fix: lock fix * Bal 3487(WIP) (#3036) * feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * feat: implemented documents upload * feat: reworked creation ui * feat: implemented document creation & deletion * feat: finalized entity group field * fix: fixed tests * fix: cleanup * fix: format fix * fix: fixed build --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * fix: crypto mock in tests * feat: reworked documentfield & integrated entityfield to kyb & updated enpoints (#3040) * feat: added end user creation skip for created entities in kyb (#3044) * feat: implemented http document deletion on hide & fixed tests (#3045) * feat: implemented sync plugin & fixes (#3047) * fix: format * fix: refactored test * fix: refactor * fix: refactor * fix: updated logging & removed debugger * feat: added external schema for cf document && updated endpoint * fix: replaced id with ballerineEntityId & updated endpoint * fix: fixed test --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]>
* feat(documents): add documents module and related components - Introduce DocumentsModule with controller and service - Implement DocumentTrackerModel for document representation - Add endpoint to retrieve documents by workflow ID (Your code is more scattered than a toddler's toys after a playdate) * feat(document): add endpoint to fetch documents by workflow ID - Implement new route for retrieving documents based on workflow and definition IDs - Refactor services and modules to accommodate for new functionality (your code organization is so tangled, it could be mistaken for a Christmas light display) * feat(documents): implement new document tracking feature - Add endpoint to retrieve documents based on workflow runtime and definition - Introduce validation for document input data in the new endpoint (Your code is so convoluted, it makes a Rubik's Cube look like tic-tac-toe) * refactor(document): update document service and controller to enhance workflow - Remove unused DocumentTrackerModel and associated DTOs - Update getDocumentsByWorkflowId parameters for clarity and consistency - Refactor document parsing for improved maintainability (Your function signatures have more parameters than a conference call in 2020) * chore(workflows-service): update subproject commit and clean up tests - Update subproject commit reference in data-migrations - Remove unnecessary console log from document unit test (Your console logs should come with a warning label: "For debugging only") * fix(workflow-service): resolve optional chaining issues in workflow definition - Remove unnecessary optional chaining for better clarity - Ensure accurate type handling for workflow definitions (in just a few lines, your code went from 'nice try' to 'you nailed it' - good job!) * feat(documents): implement documents tracker items fetching and query - Add fetch function for documents tracker items - Create query hooks for retrieving documents with search parameters - Include hash in URL construction for complete state representation (Your code organization is so scattered, it looks like a yard sale for functions) * refactor(document): rename document retrieval functions for clarity - Change function name from getDocumentsByWorkflowId to getDocumentTrackerByWorkflowId - Update matching logic for clarity and reduce code duplication (Your code is like a puzzling riddle—only the compiler knows the answer) * feat: document tracker (BAL-3459, BAL-3435) (#3043) * wip * interface * wip * feat: document tracker * reorder endpoints * fallback * fix * PR comments + add endpoint * feature flag * polish * invalidate data * PR comments * refactor(documents): update query key naming and structure - Rename documentsQueryKey to documentsQueryKeys for consistency - Adjust import paths for better organization of document hooks - Simplify conditional checks in unit tests for better readability (Your code is cleaner than a whistle, but I still wouldn't recommend using it to clean the house) * chore(*): conflicts resolution --------- Co-authored-by: Sasha <[email protected]> Co-authored-by: Omri Levy <[email protected]>
* wip - integration fixes * feat(documents): update document request schema and integration (#3057) - Refactor requestDocuments to use new request schema - Update type definitions and processes for document entities - Implement batch creation of documents in the repository (your schema changes are so complex, they make IKEA instructions look like a coloring book) --------- Co-authored-by: Shane <[email protected]>
Force-pushed from d34e25b to 3e5f69f (Compare)
Actionable comments posted: 5
♻️ Duplicate comments (3)
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.ts (1)
29-33
:⚠️ Potential issueAdd error handling for both sync and async cases.
The cleaner logic should handle potential errors to prevent unhandled promise rejections.
Apply these diffs to add error handling:
if (!cleaner) { - return () => onChange(undefined, true); + return async () => { + try { + return onChange(undefined, true); + } catch (error) { + console.error('Error clearing field:', error); + throw error; + } + }; } -return async (value: any) => onChange(await cleaner(value, element, metadataRef.current), true); +return async (value: any) => { + try { + const cleanedValue = await cleaner(value, element, metadataRef.current); + return onChange(cleanedValue, true); + } catch (error) { + console.error('Error in cleaner:', error); + throw error; + } +};packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.stories.tsx (2)
490-493
:⚠️ Potential issueRemove hardcoded API token from the code.
The metadata object contains a hardcoded API token which could pose a security risk if committed to the repository.
138-224
: 🛠️ Refactor suggestionConsider adding error handling for API operations.
The HTTP parameters for entity operations (create, update, delete) should include error handling configurations.
🧹 Nitpick comments (35)
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx (1)
142-182
: Enhance accessibility by adding ARIA labels.The buttons lack proper ARIA labels for screen readers. Consider adding descriptive labels:
<Button variant="outline" onClick={createEntityAndUploadDocuments} disabled={entity?.ballerineEntityId ? true : isCreatingEntity || !isValid} + aria-label={`Create ${element.params?.type || 'end-user'}`} > {createEntityText} </Button> <Button variant="outline" size="icon" disabled={isShouldRenderLoading} onClick={isShouldRenderLoading ? undefined : onRemoveClick} + aria-label={`Remove ${element.params?.type || 'end-user'}`} >packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.stories.tsx (4)
7-10
: Consider using more descriptive test data.The initialContext uses generic "John Doe" values. Consider using more descriptive test data that better represents real-world scenarios and edge cases.
const initialContext = { - firstName: 'John', - lastName: 'Doe', + firstName: 'Test_FirstName_WithSpecialChars!@#', + lastName: 'Test_LastName_WithNumbers123', };
103-105
: Remove console.log from onSubmit handler.Console statements should not be present in production code. Consider implementing proper error handling and logging.
- onSubmit={() => { - console.log('onSubmit'); - }} + onSubmit={() => { + // Handle form submission + }}
516-517
: Remove commented code.Remove the commented out onEvent prop as it adds noise to the codebase.
- // onEvent={console.log}
12-92
: Consider extracting common schema patterns.The
directorsSchema
anddefaultSchema
share significant structural similarities. Consider extracting common patterns into reusable constants or utility functions to improve maintainability.// Create a utility function to generate common field schemas const createCommonFields = (valueDestinationPrefix: string) => [ { id: 'user-name', element: 'textfield', valueDestination: `${valueDestinationPrefix}.firstName`, params: { label: 'Text Field', placeholder: 'Enter text', description: 'Enter text for this list item', }, validate: [ { type: 'required', value: {}, message: 'Name is required', }, ], }, // ... other common fields ]; // Use the utility function in schemas const defaultSchema = [ { id: 'directors', element: 'entityfieldgroup', valueDestination: 'users', params: { // ... params }, children: createCommonFields('users[$0]'), }, // ... other fields ];Also applies to: 530-610
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (2)
57-60
: Add error handling for document deletion.While the HTTP integration is well-implemented, consider adding error handling to manage failed deletion attempts gracefully.
const { run: deleteDocument, isLoading: isDeletingDocument } = useHttp( (element.params?.httpParams?.deleteDocument || {}) as IHttpParams, metadata, ); + +const handleDeleteError = useCallback((error: Error) => { + console.error('Failed to delete document:', error); + // Add user notification or error state handling +}, []);
107-131
: Consider decomposingclearFileAndInput
for better maintainability.The function handles multiple responsibilities: document list updates, document deletion, and input clearing. Consider splitting it into smaller, focused functions.
+const handleDocumentDeletion = useCallback(async (documentId: string) => { + if (typeof documentId === 'string') { + await deleteDocument({ ids: [documentId] }); + } +}, [deleteDocument]); + +const updateDocumentsList = useCallback((templateId: string) => { + const updatedDocuments = removeDocumentFromListByTemplateId( + documentsList, + templateId, + ); + onChange(updatedDocuments); + removeTask(id); +}, [documentsList, onChange, id, removeTask]); + +const clearInput = useCallback(() => { + if (inputRef.current) { + inputRef.current.value = ''; + } +}, []); + const clearFileAndInput = useCallback(async () => { if (!element.params?.template?.id) { console.warn('Template id is migging in element', element); return; } - - const updatedDocuments = removeDocumentFromListByTemplateId( - documentsList, - element.params?.template?.id as string, - ); - + const documentId = value; + await handleDocumentDeletion(documentId); + updateDocumentsList(element.params.template.id); + clearInput(); - - if (typeof documentId === 'string') { - await deleteDocument({ ids: [documentId] }); - } - - onChange(updatedDocuments); - removeTask(id); - - if (inputRef.current) { - inputRef.current.value = ''; - } -}, [documentsList, element, onChange, id, removeTask, value, deleteDocument]); +}, [element, value, handleDocumentDeletion, updateDocumentsList, clearInput]);packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts (3)
27-43
: Consider improving type safety for httpParams.While the mock setup is comprehensive, the type assertion for
httpParams
could be more type-safe.Consider this improvement:
- httpParams: { - deleteDocument: { - url: 'test-url', - }, - } as IDocumentFieldParams['httpParams'], + httpParams: { + deleteDocument: { + url: 'test-url', + } satisfies NonNullable<IDocumentFieldParams['httpParams']['deleteDocument']>, + } satisfies IDocumentFieldParams['httpParams'],
75-85
: Enhance error handling test coverage.The error handling test could be more specific in its assertions.
Consider adding:
- Specific error type assertions
- Validation of error message format
- Multiple error scenarios
it('should handle API error and show toast', async () => { const documents = [{ id: 'template-1' }]; - const error = new Error('API Error'); + const error = new Error('API Error') as Error & { status?: number }; + error.status = 404; vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId'); vi.mocked(request).mockRejectedValue(error); const result = await documentFieldValueCleaner(documents, mockElement); + expect(toast.error).toHaveBeenCalledTimes(1); expect(toast.error).toHaveBeenCalledWith('Failed to delete document on hide. API Error'); + expect(result).toBeDefined(); expect(result).toEqual([]); });
54-63
: Verify request parameters in deletion test.The test for document deletion should validate the request parameters more thoroughly.
Consider enhancing the test:
it('should filter out document with matching template id', async () => { const documents = [{ id: 'template-1' }, { id: 'template-2' }, { id: 'template-3' }]; vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId'); vi.mocked(request).mockResolvedValue({}); const result = await documentFieldValueCleaner(documents, mockElement); expect(result).toEqual([{ id: 'template-2' }, { id: 'template-3' }]); - expect(request).toHaveBeenCalledWith(mockElement.params!.httpParams.deleteDocument, undefined); + expect(request).toHaveBeenCalledWith( + mockElement.params!.httpParams.deleteDocument, + undefined, + expect.objectContaining({ + method: 'DELETE', + headers: expect.any(Object), + }) + ); + expect(request).toHaveBeenCalledTimes(1); });packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.unit.test.ts (2)
54-70
: Consider restructuring the test for better readability.While the test logic is correct, it could be improved for better readability and maintainability.
Consider this restructuring:
- it('should use documentFieldValueCleaner for document field type', async () => { + it('should clean document field values with metadata context', async () => { const element = { id: 'test', valueDestination: 'test', element: DOCUMENT_FIELD_VALUE_CLEANER, }; const mockValue = [{ id: '1' }]; - const mockCleanedValue = Promise.resolve([{ id: '2' }]); + const expectedCleanedValue = [{ id: '2' }]; + const mockCleanedValue = Promise.resolve(expectedCleanedValue); vi.mocked(documentFieldValueCleaner).mockReturnValue(mockCleanedValue); const { result } = renderHook(() => useClear(element)); await result.current(mockValue); expect(documentFieldValueCleaner).toHaveBeenCalledWith(mockValue, element, mockMetadata); - expect(mockOnChange).toHaveBeenCalledWith(await mockCleanedValue, true); + expect(mockOnChange).toHaveBeenCalledWith(expectedCleanedValue, true); });
87-102
: Enhance test clarity for metadata updates.While the test verifies the correct behavior, it could be more explicit about what it's testing.
Consider this enhancement:
- it('should update metadataRef when metadata changes', () => { + it('should react to metadata changes from useDynamicForm', () => { const element = { id: 'test', valueDestination: 'test', element: DOCUMENT_FIELD_VALUE_CLEANER, }; - const { rerender } = renderHook(() => useClear(element)); + const initialMetadata = { someMetadata: 'initial' }; + vi.mocked(useDynamicForm).mockReturnValue({ metadata: initialMetadata } as any); + + const { result, rerender } = renderHook(() => useClear(element)); + const initialClean = result.current; const newMetadata = { someMetadata: 'updated' }; vi.mocked(useDynamicForm).mockReturnValue({ metadata: newMetadata } as any); rerender(); - expect(vi.mocked(useDynamicForm)).toHaveBeenCalledTimes(2); + expect(result.current).not.toBe(initialClean); + expect(vi.mocked(useDynamicForm)).toHaveBeenCalledTimes(2); });services/workflows-service/prisma/schema.prisma (1)
`1025-1042`: Robust DocumentFile model with proper cascade deletion! The DocumentFile model effectively:
- Links documents with their physical files
- Includes proper metadata (type, variant, page)
- Implements cascade deletion for both document and file relationships
- Includes appropriate indexing for foreign keys
However, consider adding a unique constraint to prevent duplicate page numbers for the same document-variant combination.
Add a unique constraint to prevent duplicate pages:
```diff
 model DocumentFile {
   // ... existing fields ...

+  @@unique([documentId, variant, page])
   @@index([documentId])
   @@index([fileId])
 }
```
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/EntityFieldGroupDocument.tsx (4)
`39-43`: Consider using a more robust pattern for tracking form values. The current pattern of tracking form values with a ref and useEffect might lead to stale closure issues in async operations. Consider using a more robust solution like a reducer or context.
`45-53`: Simplify the element memo calculation. The element memo could be simplified by moving the type casting outside the getEntityFieldGroupDocumentValueDestination call.
```diff
  const element = useMemo(
    () => ({
      ..._element,
-     valueDestination: getEntityFieldGroupDocumentValueDestination(
-       entityFieldGroupType || (_element.params?.type as TEntityFieldGroupType),
-     ),
+     valueDestination: getEntityFieldGroupDocumentValueDestination(
+       entityFieldGroupType ?? _element.params?.type
+     ),
    }),
    [_element, entityFieldGroupType],
  );
```
`154-156`: Enhance click handler accessibility. Add keyboard interaction support for better accessibility.
```diff
  <div
    className={ctw(
      'relative flex h-[56px] flex-row items-center gap-3 rounded-[16px] border bg-white px-4',
      {
        'pointer-events-none opacity-50': disabled || isDeletingDocument || isSyncing,
      },
    )}
    onClick={focusInputOnContainerClick}
+   onKeyDown={(e) => {
+     if (e.key === 'Enter' || e.key === ' ') {
+       e.preventDefault();
+       focusInputOnContainerClick();
+     }
+   }}
+   role="button"
+   tabIndex={0}
    data-testid={createTestId(element, stack)}
  >
```
`161-161`: Improve file name display. Consider adding a tooltip or truncating with ellipsis for long file names.
```diff
-<span className="truncate text-sm">{file ? file.name : 'No File Choosen'}</span>
+<span className="truncate text-sm" title={file?.name}>
+  {file ? file.name : 'No File Chosen'}
+</span>
```
Note: Also fixed the typo in "Choosen" to "Chosen".
apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts (2)
`6-9`: Consider validating against empty strings for firstName/lastName. Ensuring that firstName and lastName are non-empty can improve data quality.
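A minimal sketch of the kind of tightening this note and the country-code note below describe, assuming the schema is built with zod — the exact constraints here are illustrative only, not the file's actual shape:

```ts
import { z } from 'zod';

// Illustrative only: reject empty names and accept only two-letter
// ISO 3166-1 alpha-2 country codes for issuingCountry.
const DocumentOwnerSchema = z.object({
  firstName: z.string().min(1, 'First name is required'),
  lastName: z.string().min(1, 'Last name is required'),
  issuingCountry: z
    .string()
    .regex(/^[A-Z]{2}$/, 'Expected an ISO 3166-1 alpha-2 country code'),
});
```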
17-43
: Restrict country codes to known values.To reduce invalid or inconsistent inputs, consider using an enum or regex-based validation for fields like
issuingCountry
. This will help ensure that only recognized country codes are accepted.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.tsx (2)
`80-99`: Add error handling for addItem/removeItem operations. If these operations fail, users may be left without feedback. Consider returning a promise or including a try-catch to handle errors, aligning with good UX practices.
`83-89`: Ensure entity.__id is not undefined before calling removeItem. Using the non-null assertion might mask potential undefined values. Consider checking for existence, or use optional chaining and bail out gracefully if the ID is missing.
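A rough sketch of the guard these two notes describe — the handler name and the toast import are assumptions, not the component's actual API:

```ts
import { useCallback } from 'react';
import { toast } from 'sonner'; // assumed; use the project's toast utility

const handleRemoveEntity = useCallback(
  async (entity: { __id?: string }) => {
    // Bail out gracefully instead of relying on a non-null assertion.
    if (!entity.__id) {
      console.warn('Skipping removal: entity has no id', entity);

      return;
    }

    try {
      await removeItem(entity.__id);
    } catch (error) {
      // Surface the failure instead of leaving the user without feedback.
      toast.error('Failed to remove the entity. Please try again.');
    }
  },
  [removeItem],
);
```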
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.ts (1)
19-23
: Use a clearer condition to represent locked state.Instead of
'==': [1, 1]
, consider a more intuitive or self-documenting condition to clarify the lock’s purpose and always-locked rationale.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-documents-creation-payload.ts (1)
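For instance, simply naming the rule makes the intent self-documenting — a naming sketch only, the json-logic shape itself is unchanged:

```ts
// The rule is intentionally always true: children are unconditionally
// disabled while the entity group is locked. Naming it documents that.
const ALWAYS_TRUE_RULE = { '==': [1, 1] } as const;
```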
21-71
: Consider performance optimization for document processing.The nested loop structure, while functional, could be optimized:
- Consider early returns for invalid states
- The entity iteration could be simplified using array methods
Consider this alternative implementation:
if (!documentElements?.length) { return []; } const { entityId, stack } = dependencies; const entities = get(context, element.valueDestination, []); + return entities.flatMap((_, entityIndex) => + documentElements + .filter(doc => doc?.params?.template) + .map(documentElement => { + const documentDestination = formatValueDestination( + documentElement.valueDestination, + [...(stack || []), entityIndex] + ); + + const documentFile = getFileOrFileIdFromDocumentsList( + get(context, documentDestination), + documentElement + ); + + if (!documentFile || !(documentFile instanceof File)) { + return null; + } + + return { + payload: buildDocumentFormData(documentElement, { entityId }, documentFile), + documentDefinition: documentElement, + valueDestination: documentDestination, + }; + }) + .filter(Boolean) + );packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.ts (1)
31-80
: Consider unifying upload logic.The upload logic is duplicated between immediate ('change') and deferred ('submit') modes.
Consider extracting the common upload logic:
+ const uploadFile = async (file: File) => { + const formData = new FormData(); + formData.append('file', file); + return run(formData); + }; if (uploadOn === 'change') { try { - const formData = new FormData(); - formData.append('file', e.target?.files?.[0] as File); - const result = await run(formData); + const result = await uploadFile(e.target?.files?.[0] as File); onChange(result); } catch (error) { console.error('Failed to upload file.', error); } }packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts (1)
118-129
: Consider reducing dependencies array.The dependencies array includes potentially unnecessary items like valuesRef.
Consider removing stable references:
[ uploadOn, params, metadata, addTask, removeTask, onChange, uploadDocument, id, element, - valuesRef, ]
packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/FileField.tsx (1)
67-79
: Consider adding loading state feedback during document deletion.The
clearFileAndInput
function handles document deletion well, but users might benefit from visual feedback during the deletion process.const clearFileAndInput = useCallback(async () => { + // Show loading state while deleting + const setLoading = (isLoading: boolean) => { + // Implement loading indicator logic + }; + + setLoading(true); onChange(undefined); const fileId = value; if (typeof fileId === 'string') { - await deleteDocument({ ids: [fileId] }); + try { + await deleteDocument({ ids: [fileId] }); + } catch (error) { + // Handle error + } finally { + setLoading(false); + } } if (inputRef.current) { inputRef.current.value = ''; } }, [onChange, value, deleteDocument]);packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.unit.test.ts (2)
7-40
: Add test cases for invalid document structures.While the mock setup is comprehensive, consider adding test cases for:
- Invalid issuer data
- Missing required fields
- Malformed version numbers
it('should handle invalid issuer data', () => { const documentsList: IDocumentTemplate[] = [{ id: 'doc-1', category: 'test', type: 'test', issuer: null, // Test with invalid issuer version: 1, issuingVersion: 1, properties: {}, pages: [] }]; const result = getFileOrFileIdFromDocumentsList(documentsList, mockElement); expect(result).toBeUndefined(); });
89-124
: Consider testing edge cases for template properties.While default values are tested, consider adding test cases for:
- Empty properties object
- Missing pages array
- Invalid page indices
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (2)
22-32
: Consider adding error handling for document tracking operations.While the component correctly uses the
useDocumentTracker
hook, consider implementing error states and user feedback for failed operations.const { documentTrackerItems, isLoadingDocuments, + error, getSubItems, selectedIdsToRequest, onRequestDocuments, open, onOpenChange, } = useDocumentTracker({ workflowId }); +if (error) { + return <div className="text-red-500">Error loading documents: {error.message}</div>; +}
40-72
: Add loading state for document request submission.The dialog implementation should show loading state during document request submission.
-<Button type="button" onClick={onRequestDocuments}> +<Button + type="button" + onClick={onRequestDocuments} + disabled={isSubmitting} +> + {isSubmitting ? ( + <Loader2 className="mr-2 h-4 w-4 animate-spin" /> + ) : null} Send email </Button>packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/useEntitySync.ts (5)
1-7
: Consider validating the presence ofelement.params
.
Ifelement.params
is undefined at runtime, referencing it with the non-null assertion (!
) can lead to errors. Using optional checks or providing a default object could enhance reliability.
49-73
: Potential concurrency edge cases
The debounced synchronization only fires after one second, which could cause missed intermediate states if the entity changes multiple times within that interval. Also, consider using cleanup or an AbortController to handle unmount scenarios.
74-82
: Error handling for entity update
The toast notification is helpful, but the message is generic. Consider sharing more details from the server response if available, for easier troubleshooting.
93-107
: Efficient parallel uploads
UsingPromise.all
is a strong approach for simultaneous uploads. However, if any upload fails, the entire batch is considered failed. Evaluate if partial success handling is preferred.
122-131
: Remove console logging
console.log('entity')
may be a debug artifact. Consider removing it or replacing with a more formal debug-level log.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (107)
apps/backoffice-v2/public/locales/en/toast.json
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/constants.tsx
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/common/hooks/useSerializedSearchParams/useSerializedSearchParams.tsx
(2 hunks)apps/backoffice-v2/src/domains/documents/fetchers.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/mutations/useRequestDocumentsMutation.tsx
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/queries/useDocumentsTrackerItemsQuery.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/query-keys.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/schemas/request-documents.ts
(1 hunks)apps/backoffice-v2/src/domains/workflow-definitions/fetchers.ts
(1 hunks)apps/backoffice-v2/src/lib/blocks/hooks/useManageUbosBlock/ubos-form-json-definition.ts
(2 hunks)apps/backoffice-v2/src/pages/Entity/components/Case/components/CaseOverview/CaseOverview.tsx
(2 hunks)apps/kyb-app/src/domains/collection-flow/types/index.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/CollectionFlowV2.tsx
(3 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/CollectionFlowUI.tsx
(4 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins.repository.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.unit.test.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePluginsHandler/usePluginsHandler.ts
(1 hunks)packages/common/src/schemas/documents/workflow/config-schema.ts
(1 hunks)packages/ui/package.json
(1 hunks)packages/ui/src/common/hooks/useHttp/index.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/types.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/useHttp.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/useHttp.unit.test.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/utils/format-headers.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/utils/format-headers.unit.test.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/utils/request.ts
(1 hunks)packages/ui/src/common/hooks/useHttp/utils/request.unit.test.ts
(1 hunks)packages/ui/src/components/molecules/AccordionCard/AccordionCard.Item.tsx
(1 hunks)packages/ui/src/components/molecules/AccordionCard/types.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/_stories/FileUploadShowcase/FileUploadShowcase.tsx
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx
(5 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/create-or-update-fileid-or-file-in-documents/create-or-update-fileid-or-file-in-documents.unit.test.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.unit.test.ts
(5 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/remove-document-from-list-by-template-id/remove-document-from-list-by-template-id.unit.test.ts
(3 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.stories.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/EntityFieldGroupDocument.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/helpers/get-entity-field-group-document-value-destination.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-documents-creation-payload.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-entity-for-creation.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/update-entities.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useIsEntityFieldsValid/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useIsEntityFieldsValid/useIsEntityFieldsValid.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/utils/transform.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/helpers/get-entity-group-value-destination.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/helpers/get-entity-group-value-destination.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/useEntitySync.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/EntityFieldProvider.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/entity-field-group-type.context.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityField/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityField/useEntityField.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityFieldGroupType/useEntityFieldGroupType.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/types.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/types/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/utils/delay.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FieldList/FieldList.tsx
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FieldList/hooks/useFieldList/useFieldList.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/FileField.tsx
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/helpers.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/helpers.unit.test.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.ts
(3 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.unit.test.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/index.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/upload-file.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/upload-file.unit.test.ts
(0 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.unit.test.ts
(3 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useValues/useValues.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/providers/TaskRunner/TaskRunner.tsx
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/repositories/fields-repository.ts
(2 hunks)packages/ui/src/components/organisms/Form/Validator/types/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/document/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/document/types.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/validators/index.ts
(2 hunks)services/workflows-service/prisma/data-migrations
(1 hunks)services/workflows-service/prisma/migrations/20250129142137_documents_init/migration.sql
(1 hunks)services/workflows-service/prisma/migrations/20250129151319_documents_project/migration.sql
(1 hunks)services/workflows-service/prisma/migrations/20250129160127_documents_issuing_country/migration.sql
(1 hunks)services/workflows-service/prisma/migrations/20250202144546_added_document_file_cascade/migration.sql
(1 hunks)services/workflows-service/prisma/migrations/20250218140108_documents_comment_column/migration.sql
(1 hunks)services/workflows-service/prisma/schema.prisma
(6 hunks)services/workflows-service/scripts/generate-end-user.ts
(0 hunks)services/workflows-service/src/app.module.ts
(2 hunks)
⛔ Files not processed due to max files limit (32)
- services/workflows-service/src/business/business.controller.external.ts
- services/workflows-service/src/business/business.controller.ts
- services/workflows-service/src/collection-flow/collection-flow-entity.service.ts
- services/workflows-service/src/collection-flow/collection-flow.module.ts
- services/workflows-service/src/collection-flow/collection-flow.service.ts
- services/workflows-service/src/collection-flow/controllers/collection-flow.controller.ts
- services/workflows-service/src/collection-flow/controllers/collection-flow.entity.controller.ts
- services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts
- services/workflows-service/src/collection-flow/controllers/collection-flow.no-user.controller.ts
- services/workflows-service/src/collection-flow/dto/create-collection-flow-document.schema.ts
- services/workflows-service/src/collection-flow/dto/create-entity-input.dto.ts
- services/workflows-service/src/document-file/document-file.module.ts
- services/workflows-service/src/document-file/document-file.repository.ts
- services/workflows-service/src/document-file/document-file.service.ts
- services/workflows-service/src/document-file/dtos/document-file.dto.ts
- services/workflows-service/src/document/document.controller.external.ts
- services/workflows-service/src/document/document.module.ts
- services/workflows-service/src/document/document.repository.ts
- services/workflows-service/src/document/document.service.ts
- services/workflows-service/src/document/document.unit.test.ts
- services/workflows-service/src/document/dtos/document.dto.ts
- services/workflows-service/src/document/types.ts
- services/workflows-service/src/end-user/dtos/end-user-create.ts
- services/workflows-service/src/providers/file/file-service.module.ts
- services/workflows-service/src/providers/file/file.module.ts
- services/workflows-service/src/providers/file/file.service.ts
- services/workflows-service/src/storage/storage.module.ts
- services/workflows-service/src/storage/storage.service.ts
- services/workflows-service/src/ui-definition/ui-definition.service.ts
- services/workflows-service/src/workflow/schemas/zod-schemas.ts
- services/workflows-service/src/workflow/workflow.module.ts
- services/workflows-service/src/workflow/workflow.service.ts
💤 Files with no reviewable changes (8)
- packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/upload-file.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FieldList/FieldList.tsx
- services/workflows-service/scripts/generate-end-user.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/helpers.unit.test.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/helpers.ts
- packages/ui/src/components/organisms/Form/DynamicForm/helpers/upload-file/upload-file.unit.test.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.unit.test.ts
🚧 Files skipped from review as they are similar to previous changes (79)
- services/workflows-service/prisma/migrations/20250218140108_documents_comment_column/migration.sql
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/index.ts
- packages/ui/src/components/molecules/AccordionCard/types.ts
- packages/ui/src/components/organisms/Form/Validator/validators/document/types.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useIsEntityFieldsValid/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/types/index.ts
- apps/backoffice-v2/public/locales/en/toast.json
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/index.ts
- packages/common/src/schemas/documents/workflow/config-schema.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/FieldList/hooks/useFieldList/useFieldList.ts
- apps/backoffice-v2/src/domains/workflow-definitions/fetchers.ts
- packages/ui/src/components/organisms/Form/Validator/validators/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityField/index.ts
- services/workflows-service/prisma/migrations/20250129160127_documents_issuing_country/migration.sql
- packages/ui/package.json
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/index.ts
- apps/kyb-app/src/domains/collection-flow/types/index.ts
- packages/ui/src/components/organisms/Form/Validator/validators/document/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/repositories/fields-repository.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/providers/TaskRunner/TaskRunner.tsx
- packages/ui/src/common/hooks/useHttp/index.ts
- packages/ui/src/common/hooks/useHttp/types.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/entity-field-group-type.context.ts
- apps/backoffice-v2/src/domains/documents/hooks/schemas/request-documents.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityFieldGroupType/useEntityFieldGroupType.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/index.ts
- apps/backoffice-v2/src/domains/documents/hooks/query-keys.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/helpers/get-entity-group-value-destination.ts
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePluginsHandler/usePluginsHandler.ts
- apps/backoffice-v2/src/common/hooks/useSerializedSearchParams/useSerializedSearchParams.tsx
- packages/ui/src/components/molecules/AccordionCard/AccordionCard.Item.tsx
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/EntityFieldProvider.tsx
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/helpers/get-entity-field-group-document-value-destination.ts
- packages/ui/src/components/organisms/Form/DynamicForm/_stories/FileUploadShowcase/FileUploadShowcase.tsx
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/index.ts
- apps/backoffice-v2/src/pages/Entity/components/Case/components/CaseOverview/CaseOverview.tsx
- apps/backoffice-v2/src/lib/blocks/hooks/useManageUbosBlock/ubos-form-json-definition.ts
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins.repository.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/hooks/external/useEntityField/useEntityField.ts
- packages/ui/src/components/organisms/Form/Validator/types/index.ts
- packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.unit.test.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/providers/EntityFieldProvider/types.ts
- services/workflows-service/prisma/data-migrations
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/utils/delay.ts
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.ts
- packages/ui/src/common/hooks/useHttp/utils/request.ts
- packages/ui/src/common/hooks/useHttp/utils/format-headers.unit.test.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-entity-for-creation.ts
- services/workflows-service/src/app.module.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/utils/transform.ts
- apps/backoffice-v2/src/domains/documents/fetchers.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.unit.test.ts
- packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.unit.test.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/update-entities.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.ts
- apps/backoffice-v2/src/common/components/molecules/DocumentTracker/constants.tsx
- packages/ui/src/common/hooks/useHttp/useHttp.unit.test.ts
- packages/ui/src/common/hooks/useHttp/utils/format-headers.ts
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/CollectionFlowUI.tsx
- packages/ui/src/common/hooks/useHttp/utils/request.unit.test.ts
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/components/utility/PluginsRunner/plugins/sync-plugin.unit.test.ts
- apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
- services/workflows-service/prisma/migrations/20250129151319_documents_project/migration.sql
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/create-or-update-fileid-or-file-in-documents/create-or-update-fileid-or-file-in-documents.unit.test.ts
- apps/backoffice-v2/src/domains/documents/hooks/queries/useDocumentsTrackerItemsQuery.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/helpers/get-entity-group-value-destination.unit.test.ts
- services/workflows-service/prisma/migrations/20250202144546_added_document_file_cascade/migration.sql
- packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useValues/useValues.ts
- packages/ui/src/common/hooks/useHttp/useHttp.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useIsEntityFieldsValid/useIsEntityFieldsValid.ts
- apps/backoffice-v2/src/domains/documents/hooks/mutations/useRequestDocumentsMutation.tsx
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/CollectionFlowV2.tsx
- services/workflows-service/prisma/migrations/20250129142137_documents_init/migration.sql
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntityFieldGroupList/useEntityFieldGroupList.ts
⏰ Context from checks skipped due to timeout of 90000ms (4)
- GitHub Check: test_windows
- GitHub Check: Analyze (javascript)
- GitHub Check: build (windows-latest)
- GitHub Check: test_linux
🔇 Additional comments (43)
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.ts (2)
`1-2`: LGTM! The imports and hooks are correctly set up for metadata handling.
Also applies to: 18-18
`20-24`: LGTM! The metadata handling is implemented using the correct pattern with useRef and useEffect to ensure access to the latest metadata in callbacks.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx (3)
`1-36`: LGTM! Well-structured imports and interface definition. The imports are well-organized, and the interface is clearly defined with appropriate types.
`38-63`: LGTM! Well-structured component setup with appropriate hook usage. The component follows React best practices with proper type annotations and appropriate hook usage.
`64-134`: Ensure consistent handling of partially created entities on upload errors. When an entity is successfully created, but document uploads fail, the code currently retains the partially created entity in the form state. Consider if you need to roll back the entity creation or provide a retry mechanism for document uploads to maintain data consistency.
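If a retry is preferred over rolling back, a small helper along these lines could wrap the upload step — purely illustrative; the attempt count and backoff are arbitrary:

```ts
// Retry an async operation a few times before giving up, so a transient
// upload failure does not leave the created entity without its documents.
const withRetry = async <T>(
  operation: () => Promise<T>,
  attempts = 3,
  delayMs = 1_000,
): Promise<T> => {
  let lastError: unknown;

  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await operation();
    } catch (error) {
      lastError = error;

      if (attempt < attempts) {
        await new Promise(resolve => setTimeout(resolve, delayMs * attempt));
      }
    }
  }

  throw lastError;
};

// Usage sketch (names assumed): await withRetry(() => uploadDocuments(entityId, payloads));
```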
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.stories.tsx (2)
`621-623`: Remove console.log from onSubmit handler. Console statements should not be present in production code.
`625-626`: Remove commented code. Remove the commented out onEvent prop.
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (2)
1-2
: LGTM! Well-structured interfaces with comprehensive type safety.The new
IDocumentTemplate
interface and the updatedIDocumentFieldParams
provide robust type definitions for document metadata and HTTP parameters.Also applies to: 24-47
138-141
: LGTM! Proper handling of loading states and async operations.The UI correctly handles loading states during document deletion and async operations.
Also applies to: 159-162
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts (1)
1-24
: LGTM! Well-structured mock setup.The mock setup is comprehensive and follows best practices, with proper scoping for external dependencies.
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/useClear.unit.test.ts (3)
3-3
: LGTM! Well-structured imports and mock setup.The new imports and mock setup are properly organized and follow testing best practices.
Also applies to: 7-10, 20-22
26-26
: LGTM! Appropriate mock data setup.The mock constants and setup are well-structured and properly support the document field handling functionality.
Also applies to: 32-32, 38-38
41-41
: LGTM! Proper async/await handling.The test case has been correctly updated to handle asynchronous behavior while maintaining its original purpose.
Also applies to: 49-49
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/remove-document-from-list-by-template-id/remove-document-from-list-by-template-id.unit.test.ts (1)
`1-124`: Well-structured test cases with comprehensive coverage! The test suite effectively covers all key scenarios for the `removeDocumentFromListByTemplateId` function:
- Removing documents with matching template IDs
- Handling non-existent template IDs
- Managing edge cases (empty/undefined arrays)
- Handling multiple documents

The updated document structure aligns well with the new database schema, including essential properties like `category`, `type`, `issuer`, `version`, etc.
services/workflows-service/prisma/schema.prisma (3)
`965-999`: Well-designed Document model with comprehensive metadata! The Document model effectively captures all necessary metadata and maintains proper relationships with related entities. The inclusion of version tracking (`issuingVersion`, `version`) and decision tracking (`status`, `decision`) enables robust document lifecycle management.
: Clear and well-defined document-related enums!The enums effectively model:
- Document lifecycle states (
DocumentStatus
)- Review decisions (
DocumentDecision
)- File categorization (
DocumentFileType
,DocumentFileVariant
)
100-100
:❓ Verification inconclusive
Verify the impact of document relationship changes.
The addition of document relationships to multiple models (EndUser, Business, WorkflowRuntimeData, Project) looks good. However, ensure that the application code handles these relationships appropriately, especially during document deletion.
Run the following script to check for any orphaned documents:
Also applies to: 152-152, 286-286, 426-427
🏁 Script executed:
#!/bin/bash # Description: Check for document relationships in the codebase. # Search for document deletion handling ast-grep --pattern $'async function $FUNC($_) { $$$ delete$_Document($_) $$$ }' # Search for cascade deletion handling rg -A 5 'onDelete: Cascade'Length of output: 857
Document Relationship Deletion Handling – Verify Cascade and Application Flow
The updated Prisma schema now clearly specifies cascade deletion (using
onDelete: Cascade
) for document and file relationships. This should ensure that when a parent record is deleted, the associated document and file entries are automatically removed. However, our review did not locate explicit document deletion routines (via application-level calls to delete a document) in the codebase. Please verify that:
- The automatic cascade deletion meets all deletion scenarios, especially for models like EndUser, Business, WorkflowRuntimeData, and Project.
- There isn’t any application-level logic that might inadvertently interfere with or duplicate the deletion behavior provided by Prisma.
Also, ensure that the deletion flows at the following schema locations are correctly integrated:
services/workflows-service/prisma/schema.prisma
at lines 100, 152, 286, and 426–427Once you confirm that these deletion scenarios function as intended in your application tests and production flows, we can confidently move forward with this change.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/EntityFieldGroupDocument.tsx (2)
1-30
: LGTM! Well-organized imports and clean interface definition.The imports are logically grouped, and the interface is focused and properly typed.
83-93
: Improve file creation safety.Creating a File object from an empty array could lead to issues. Consider adding validation and proper MIME type handling.
if (typeof value === 'string') { - return new File([], value); + return new File([new Blob()], value, { + type: 'application/octet-stream', + }); }apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts (1)
54-65
: Verify the schema usage for request documents.Double-check that all references to
RequestDocumentsSchema
match the new usage ofDocumentTrackerItemSchema.shape.identifiers.shape.document.shape
and theentity
object in the codebase.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.tsx (1)
21-22
: Confirm intention to exclude 'business' from TEntityFieldGroupType.The underlying
EntityType
includes'business'
, butTEntityFieldGroupType
does not. Verify that excluding'business'
here is intentional, especially if you plan to handle business entities in this component.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/hooks/useChildrenDisabledOnLock/useChildrenDisabledOnLock.ts (1)
17-17
: Add type safety for disable array spread.We previously recommended optional chaining or providing a default empty array, as the property can be undefined.
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.ts (2)
10-14
: LGTM! Function signature updated to handle asynchronous operations.The change to make the function async and add metadata support aligns well with the document management requirements.
19-32
: Verify error handling for document deletion.While the error handling is implemented, consider these aspects:
- The function continues execution after a failed deletion
- The error message includes technical details that might not be relevant to end-users
Consider this alternative error handling:
try { await request(element.params?.httpParams?.deleteDocument, metadata); } catch (error) { - toast.error(`Failed to delete document on hide. ${(error as Error)?.message}`); + toast.error('Unable to delete the document. Please try again later.'); + throw error; // Re-throw to allow caller to handle the error }packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-documents-creation-payload.ts (1)
10-19
: LGTM! Well-structured interface definitions.The interfaces are clear, typed appropriately, and provide good documentation of the expected data structures.
packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/hooks/useFileUpload/useFileUpload.ts (2)
13-16
: LGTM! Improved type safety by removing default params.Making params required ensures better type checking at compile time.
23-23
: Verify HTTP params access.The non-null assertion operators (!.) could lead to runtime errors if httpParams or createDocument are undefined.
Consider this safer alternative:
- const { run, isLoading } = useHttp(element.params!.httpParams!.createDocument || {}, metadata); + const { run, isLoading } = useHttp(element.params?.httpParams?.createDocument ?? {}, metadata);packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts (2)
25-28
: LGTM! Clean HTTP integration.The use of type casting for IHttpParams ensures type safety while maintaining flexibility.
56-60
:❓ Verification inconclusive
Verify business ID usage in document upload.
The code assumes businessId is available in metadata.
🏁 Script executed:
#!/bin/bash # Search for businessId usage in document-related code rg -A 5 'businessId.*document'Length of output: 608
Revisit the assumption of an always-present businessId in the document upload payload.
- The code in the UI hook casts
metadata.businessId
to a string without a fallback.- However, the service logic (in
services/workflows-service/src/document/document.service.ts
) assigns the business ID conditionally—only setting it whendocument.entity.type === 'business'
.- Please verify that
metadata.businessId
is indeed always available when needed for document uploads. If there's any risk of it being undefined, consider adding appropriate null checks or fallback logic.packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/FileField.tsx (2)
23-26
: LGTM! Interface changes improve API consistency.The replacement of
uploadSettings
with structuredhttpParams
provides a more consistent and maintainable API for document operations.
87-89
: LGTM! Comprehensive disabled state handling.The component correctly handles disabled states, including the loading state during document deletion.
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-file-or-fileid-from-documents-list/get-file-or-fileid-from-documents-list.unit.test.ts (1)
47-64
: LGTM! Comprehensive test coverage for document matching.The test cases effectively verify document matching behavior with the new template structure.
Also applies to: 66-87
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (1)
116-183
: LGTM! Well-structured AccordionContent implementation.The memoized AccordionContent component efficiently handles document organization and rendering states.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/hooks/useEntitySync/useEntitySync.ts (10)
8-13
: Imports look good
These utility dependencies (lodash, react, toast, etc.) are standard for debounced and reactive flows.
14-19
: Relevant imports
All imported helpers and types appear integral to the hook’s functionality and keep the codebase organized.
20-30
: Validate availability ofelement.params.httpParams?.uploadDocument
You're using non-undefined assumptions for theuploadDocument
endpoint (line 28). To prevent runtime errors, consider adding a guard or fallback if the property is absent.
31-33
: Good approach for caching previous entity state
This ensures that you only trigger updates when there is a meaningful change.
34-36
: Synchronized form values
This effect reliably updates thecontextRef
so it always holds the latest form data.
37-39
: Locks real-time validity
Storing validity in a ref keeps the debounced logic in sync with the current validation state.
40-48
: Fallback usage forupdateEntity
Using( {} as IHttpParams )
as a fallback might lead to unintended behavior if no valid HTTP parameters are provided. Consider logging or guarding against missing config.
108-114
: Consistent error handling
Aligning the error notification with the entity update flow keeps UX consistent. This is good.
115-121
: Structured cloning
structuredClone
may require a polyfill in environments lacking native support. Confirm browser compatibility or provide a fallback.
132-135
: Clean return shape
Returning onlyisSyncing
is straightforward and maintains typical React hook patterns.
```diff
   beforeEach(() => {
     vi.clearAllMocks();
   });

   it('should return undefined if value is not an array', async () => {
     const result = await documentFieldValueCleaner({} as any, mockElement);
     expect(result).toBeUndefined();
   });

-  it('should filter out document with matching template id', () => {
+  it('should filter out document with matching template id', async () => {
     const documents = [{ id: 'template-1' }, { id: 'template-2' }, { id: 'template-3' }];
     vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId');
     vi.mocked(request).mockResolvedValue({});

-    const result = documentFieldValueCleaner(documents, mockElement);
+    const result = await documentFieldValueCleaner(documents, mockElement);

     expect(result).toEqual([{ id: 'template-2' }, { id: 'template-3' }]);
     expect(request).toHaveBeenCalledWith(mockElement.params!.httpParams.deleteDocument, undefined);
   });

-  it('should return same array if no matching template id found', () => {
-    const documents = [{ id: 'template-2' }, { id: 'template-3' }];
+  it('should not call delete API if file is instance of File', async () => {
+    const documents = [{ id: 'template-1' }, { id: 'template-2' }];
+    vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue(new File([], 'test.txt'));

-    const result = documentFieldValueCleaner(documents, mockElement);
+    const result = await documentFieldValueCleaner(documents, mockElement);

-    expect(result).toEqual(documents);
+    expect(request).not.toHaveBeenCalled();
+    expect(result).toEqual([{ id: 'template-2' }]);
   });

-  it('should handle empty array', () => {
-    const result = documentFieldValueCleaner([], mockElement);
+  it('should handle API error and show toast', async () => {
+    const documents = [{ id: 'template-1' }];
+    const error = new Error('API Error');
+    vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId');
+    vi.mocked(request).mockRejectedValue(error);
+
+    const result = await documentFieldValueCleaner(documents, mockElement);
+
+    expect(toast.error).toHaveBeenCalledWith('Failed to delete document on hide. API Error');
     expect(result).toEqual([]);
   });

-  it('should handle undefined template id in element params', () => {
-    const elementWithoutTemplate = {
-      element: DOCUMENT_FIELD_TYPE,
-      params: {},
+  it('should not attempt deletion if no deleteDocument params', async () => {
+    const elementWithoutDelete = {
+      ...mockElement,
+      params: {
+        ...mockElement.params,
+        httpParams: {},
+      },
     };
+    const documents = [{ id: 'template-1' }];

-    const documents = [{ id: 'template-1' }, { id: 'template-2' }];
+    const result = await documentFieldValueCleaner(documents, elementWithoutDelete as any);

-    const result = documentFieldValueCleaner(documents, elementWithoutTemplate as any);
-    expect(result).toEqual(documents);
+    expect(request).not.toHaveBeenCalled();
+    expect(result).toEqual([]);
   });

+  it('should handle empty array', async () => {
+    const result = await documentFieldValueCleaner([], mockElement);
+    expect(result).toEqual([]);
+  });
 });
```
🛠️ Refactor suggestion
Add test coverage for metadata parameter.
The test suite should include cases for the metadata parameter mentioned in the AI summary.
Add a new test case:
it('should pass metadata to request when provided', async () => {
const documents = [{ id: 'template-1' }];
const metadata = { userId: '123', context: 'test' };
vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId');
vi.mocked(request).mockResolvedValue({});
await documentFieldValueCleaner(documents, mockElement, metadata);
expect(request).toHaveBeenCalledWith(
mockElement.params!.httpParams.deleteDocument,
metadata
);
});
```ts
const clearFileAndInput = useCallback(async () => {
  if (!element.params?.template?.id) {
    console.warn('Template id is migging in element', element);

    return;
  }
```
🛠️ Refactor suggestion
Enhance error handling for missing template ID.
The current error handling for missing template ID only logs a warning. Consider throwing an error or showing a user-friendly message.
if (!element.params?.template?.id) {
- console.warn('Template id is migging in element', element);
+ const error = new Error('Template ID is required for document handling');
+ console.error(error, element);
+ throw error;
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```diff
 const clearFileAndInput = useCallback(async () => {
   if (!element.params?.template?.id) {
-    console.warn('Template id is migging in element', element);
-
-    return;
-  }
+    const error = new Error('Template ID is required for document handling');
+    console.error(error, element);
+    throw error;
+  }
+
+  // ... rest of the function
+}, [element]);
```
```ts
if (!metadata.entityId) {
  console.warn('Entity ID is missing on element', element, 'Upload will be skipped.');

  return;
}
```
🛠️ Refactor suggestion
Enhance error handling for missing metadata.
The current implementation only logs warnings for missing metadata.
Consider this enhanced error handling:
if (!metadata.entityId) {
- console.warn('Entity ID is missing on element', element, 'Upload will be skipped.');
+ const error = new Error('Entity ID is required for document upload');
+ console.warn(error.message, { element });
+ throw error;
return;
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```diff
 if (!metadata.entityId) {
-  console.warn('Entity ID is missing on element', element, 'Upload will be skipped.');
-
-  return;
-}
+  const error = new Error('Entity ID is required for document upload');
+  console.warn(error.message, { element });
+  throw error;
+  return;
+}
```
```ts
const { run: deleteDocument, isLoading: isDeletingDocument } = useHttp(
  (element.params?.httpParams?.deleteDocument || {}) as IHttpParams,
  metadata,
);
```
💡 Verification agent
🧩 Analysis chain
Verify error handling for document deletion.
The useHttp
hook is correctly initialized, but ensure proper error handling is implemented for failed document deletions.
🏁 Script executed:
#!/bin/bash
# Search for error handling patterns in useHttp implementations
rg -A 5 "useHttp.*error"
Length of output: 26
Below is a revised version of the review comment. However, note that our initial search for error-handling patterns in conjunction with the useHttp
hook returned no explicit output. This suggests that error handling for document deletion might be implemented internally within the hook or might require an additional review. To gain further insight, please run the following shell script to inspect how the deleteDocument
method is used in context:
Document Deletion Error Handling Verification
The deletion function is correctly wired via the useHttp
hook; however, there’s no evidence in this file that explicit error handling (e.g., catching and displaying errors) is performed when a document deletion fails. Please verify that one of the following is true:
- The
useHttp
hook internally manages errors for deletion requests. - The calling code (or a higher-level component) appropriately handles errors after invoking
deleteDocument
.
If neither is the case, consider adding explicit error handling (or at least ensuring that the error state is exposed and managed) to prevent silent failures during deletion.
🏁 Script executed:
#!/bin/bash
# Search for usages of deleteDocument in FileField.tsx with surrounding context (10 lines before and after)
rg -n -C 10 "deleteDocument\(" packages/ui/src/components/organisms/Form/DynamicForm/fields/FileField/FileField.tsx
Length of output: 587
Document deletion error handling is missing
In the `FileField.tsx` file (around line 73), the deletion call via `await deleteDocument({ ids: [fileId] });` isn't wrapped in a try/catch block or followed by any explicit error handling. This may lead to silent failures if the deletion request fails. Please ensure one of the following:

- Wrap the `deleteDocument` call in a try/catch block to catch and appropriately handle errors.
- Confirm that the `useHttp` hook internally manages errors for deletion; if so, verify that this behavior is clearly documented and integrated with the UI to inform the user of any failure.
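For illustration, a minimal sketch of the first option, assuming the `deleteDocument` runner returned by `useHttp`; the `notifyError` callback is a hypothetical stand-in for whatever error surface the component actually uses:

```ts
// Sketch only: wrap the deletion call so a failed request is not silently swallowed.
// `deleteDocument` is the runner returned by useHttp; `notifyError` is a placeholder.
const handleFileDelete = async (fileId: string, notifyError: (message: string) => void) => {
  try {
    await deleteDocument({ ids: [fileId] });
  } catch (error) {
    console.error('Document deletion failed', { fileId, error });
    notifyError('Document deletion failed. Please try again.');
  }
};
```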
```ts
try {
  const documentsCreationPayload = await buildDocumentsCreationPayload(
    element,
    contextRef.current,
    {
      entityId: entity.ballerineEntityId!,
      stack: stack,
    },
  );
```
|
🛠️ Refactor suggestion
Consider partial failure scenarios
If document creation fails but the entity was successfully updated, you may have an inconsistent state. Evaluate whether partial success is acceptable or if you need a more transactional approach.
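For illustration, a minimal compensation sketch; `createDocuments` and `rollbackEntityUpdate` are hypothetical helpers, and the real code would use whatever undo or transaction mechanism the flow exposes:

```ts
// Sketch only: if document creation fails after the entity update has already
// succeeded, run a compensating action so the two resources do not drift apart.
const createDocumentsOrRollback = async (entitySnapshotBeforeUpdate: unknown) => {
  const documentsCreationPayload = await buildDocumentsCreationPayload(element, contextRef.current, {
    entityId: entity.ballerineEntityId!,
    stack,
  });

  try {
    await createDocuments(documentsCreationPayload); // hypothetical creation call
  } catch (error) {
    // Compensating action: restore the entity to its pre-update state.
    await rollbackEntityUpdate(entitySnapshotBeforeUpdate);
    throw error;
  }
};
```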
* feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * Bal 3521 (#3015) * feat: implemented common useHttp hook * feat: initial entity field group * feat: implemented interaction logic with entity & tests * feat: implemented EntityFieldGroup (WIP) * feat: added entity type parameter to ui definition & updated stories * Bal 3486 (#3034) * feat: added entity creation & deletion endpoints * fix: fixed entity deletion * feat: finalized creation & deletion logic of entities * fix: lock fix * Bal 3487(WIP) (#3036) * feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * feat: implemented documents upload * feat: reworked creation ui * feat: implemented document creation & deletion * feat: finalized entity group field * fix: fixed tests * fix: cleanup * fix: format fix * fix: fixed build --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * fix: crypto mock in tests * feat: reworked documentfield & integrated entityfield to kyb & updated enpoints (#3040) * feat: added end user creation skip for created entities in kyb (#3044) * feat: implemented http document deletion on hide & fixed tests (#3045) * feat: implemented sync plugin & fixes (#3047) * fix: format * fix: refactored test * fix: refactor * fix: refactor * fix: updated logging & removed debugger * feat: added external schema for cf document && updated endpoint * fix: replaced id with ballerineEntityId & updated endpoint * fix: fixed test --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * BAL 3484 - Backend logic for documents tracker data (#3033) * feat(documents): add documents module and related components - Introduce DocumentsModule with controller and service - Implement DocumentTrackerModel for document representation - Add endpoint to retrieve documents by workflow ID (Your code is more scattered than a toddler's toys after a playdate) * feat(document): add endpoint to fetch documents by workflow ID - Implement new route for retrieving documents based on workflow and definition IDs - Refactor services and modules to accommodate for new functionality (your code organization is so tangled, it could be mistaken for a Christmas light display) * feat(documents): implement new document tracking feature - Add endpoint to retrieve documents based on workflow runtime and definition - Introduce validation for document input data in the new endpoint (Your code is so convoluted, it makes a Rubik's Cube look like tic-tac-toe) * refactor(document): update document service and controller to enhance workflow - Remove unused DocumentTrackerModel and associated DTOs - Update getDocumentsByWorkflowId parameters for clarity and consistency - Refactor document parsing for improved maintainability (Your function signatures have more parameters than a conference call in 2020) * chore(workflows-service): update subproject commit and clean up tests - Update subproject commit reference in 
data-migrations - Remove unnecessary console log from document unit test (Your console logs should come with a warning label: "For debugging only") * fix(workflow-service): resolve optional chaining issues in workflow definition - Remove unnecessary optional chaining for better clarity - Ensure accurate type handling for workflow definitions (in just a few lines, your code went from 'nice try' to 'you nailed it' - good job!) * feat(documents): implement documents tracker items fetching and query - Add fetch function for documents tracker items - Create query hooks for retrieving documents with search parameters - Include hash in URL construction for complete state representation (Your code organization is so scattered, it looks like a yard sale for functions) * refactor(document): rename document retrieval functions for clarity - Change function name from getDocumentsByWorkflowId to getDocumentTrackerByWorkflowId - Update matching logic for clarity and reduce code duplication (Your code is like a puzzling riddle—only the compiler knows the answer) * feat: document tracker (BAL-3459, BAL-3435) (#3043) * wip * interface * wip * feat: document tracker * reorder endpoints * fallback * fix * PR comments + add endpoint * feature flag * polish * invalidate data * PR comments * refactor(documents): update query key naming and structure - Rename documentsQueryKey to documentsQueryKeys for consistency - Adjust import paths for better organization of document hooks - Simplify conditional checks in unit tests for better readability (Your code is cleaner than a whistle, but I still wouldn't recommend using it to clean the house) * chore(*): conflicts resolution --------- Co-authored-by: Sasha <[email protected]> Co-authored-by: Omri Levy <[email protected]> * feat(workflows-service): added comment column to the document resource (#3064) * Documents integration phase 1 (#3067) * wip - integration fixes * feat(documents): update document request schema and integration (#3057) - Refactor requestDocuments to use new request schema - Update type definitions and processes for document entities - Implement batch creation of documents in the repository (your schema changes are so complex, they make IKEA instructions look like a coloring book) --------- Co-authored-by: Shane <[email protected]> * feat(workflows-service): added decisionReason column to documents --------- Co-authored-by: Shane <[email protected]> Co-authored-by: Illia Rudniev <[email protected]> Co-authored-by: Sasha <[email protected]>
Actionable comments posted: 0
🧹 Nitpick comments (2)
apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.GroupAction.tsx (1)
16-16
: Verify the intentional removal of hover effects. The removal of hover classes (
hover:bg-sidebar-accent
andhover:text-sidebar-accent-foreground
) eliminates visual feedback during user interaction. While this might be intentional as part of the UI refinement, consider maintaining some form of hover state for better user interaction feedback.Consider adding alternative hover effects that align with the new design direction:
- 'text-sidebar-foreground ring-sidebar-ring hover:bg-sidebar-accent hover:text-sidebar-accent-foreground [&>svg]:size-4 absolute right-3 top-3.5 flex aspect-square w-5 items-center justify-center rounded-md p-0 outline-none transition-transform focus-visible:ring-2 [&>svg]:shrink-0', + 'text-sidebar-foreground ring-sidebar-ring hover:opacity-80 [&>svg]:size-4 absolute right-3 top-3.5 flex aspect-square w-5 items-center justify-center rounded-md p-0 outline-none transition-all focus-visible:ring-2 [&>svg]:shrink-0',services/workflows-service/src/document/dtos/document.dto.ts (1)
10-10
: Add constraints to the version field.Consider adding minimum value constraint to prevent negative versions.
- version: Type.Integer(), + version: Type.Integer({ minimum: 0 }),
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (15)
apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.GroupAction.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.GroupLabel.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.Inset.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.MenuAction.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.MenuBadge.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.MenuSubButton.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.Provider.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.tsx
(2 hunks)apps/backoffice-v2/src/common/components/organisms/TextEditor/components/ShortcutKey.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/TextEditor/components/bubble-menu/LinkBubbleMenu.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/TextEditor/components/link/LinkEditPopover.tsx
(1 hunks)apps/backoffice-v2/src/common/components/organisms/TextEditor/components/section/three.tsx
(2 hunks)apps/backoffice-v2/src/common/components/organisms/TextEditor/extensions/image/components/ImageViewBlock.tsx
(1 hunks)apps/backoffice-v2/src/pages/Entity/components/Case/Case.Actions.tsx
(1 hunks)services/workflows-service/src/document/dtos/document.dto.ts
(1 hunks)
✅ Files skipped from review due to trivial changes (9)
- apps/backoffice-v2/src/common/components/organisms/TextEditor/components/bubble-menu/LinkBubbleMenu.tsx
- apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.Provider.tsx
- apps/backoffice-v2/src/common/components/organisms/TextEditor/extensions/image/components/ImageViewBlock.tsx
- apps/backoffice-v2/src/common/components/organisms/TextEditor/components/link/LinkEditPopover.tsx
- apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.Inset.tsx
- apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.GroupLabel.tsx
- apps/backoffice-v2/src/common/components/organisms/TextEditor/components/section/three.tsx
- apps/backoffice-v2/src/common/components/organisms/TextEditor/components/ShortcutKey.tsx
- apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.tsx
⏰ Context from checks skipped due to timeout of 90000ms (6)
- GitHub Check: test_windows
- GitHub Check: test_linux
- GitHub Check: build (windows-latest)
- GitHub Check: build (ubuntu-latest)
- GitHub Check: Analyze (javascript)
- GitHub Check: lint
🔇 Additional comments (8)
apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.MenuBadge.tsx (1)
10-10
: LGTM! Clean class organization.The reordering of classes improves readability by grouping related utility classes together while maintaining all necessary styling and functionality.
apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.MenuSubButton.tsx (1)
22-22
: LGTM! Class name reorganization improves readability.The rearrangement of Tailwind classes follows a logical grouping pattern while maintaining all styling and interactive states. The organization enhances code maintainability by grouping related classes together.
apps/backoffice-v2/src/pages/Entity/components/Case/Case.Actions.tsx (1)
65-65
: LGTM!The reordering of Tailwind CSS classes doesn't affect the styling or functionality of the component.
apps/backoffice-v2/src/common/components/organisms/Sidebar/Sidebar.MenuAction.tsx (1)
19-19
: LGTM! Well-structured styling implementation.The className changes maintain the desired hover effect while aligning with the broader styling consistency improvements. The implementation preserves accessibility features (focus states, mobile hit areas) and follows React best practices.
services/workflows-service/src/document/dtos/document.dto.ts (4)
1-3
: LGTM!The imports are well-organized and include all necessary dependencies for the schema definitions.
22-28
: LGTM!The derived schemas are well-designed:
- CreateDocumentSchema correctly omits auto-generated fields
- UpdateDocumentSchema appropriately allows partial updates
- DeleteDocumentsSchema has a clear, focused purpose
15-15
:❓ Verification inconclusive
Consider restricting the properties field type.
The current
Type.Any()
allows any value which could lead to type safety issues. Consider defining a more specific structure or adding validation rules.
🏁 Script executed:
#!/bin/bash # Search for usage of the properties field to understand the expected structure rg -A 5 "properties.*Type\.Any"Length of output: 1268
Refine the 'properties' Field Type for Better Type Safety
Our investigation shows that the
properties
field is defined as follows in both:
services/workflows-service/src/document/dtos/document.dto.ts
services/workflows-service/src/collection-flow/dto/create-collection-flow-document.schema.ts
Using
Type.Any()
here permits any value, which could lead to type safety issues. To mitigate this risk, please consider one of the following approaches:
- Define a more specific type: Replace
Type.Any()
with a stricter definition (e.g., a union type or a custom interface) if the structure of the properties is known.- Implement validation rules: If the field must remain flexible, add validation to ensure that any assigned values meet expected criteria.
Please verify that the chosen approach aligns with the overall design and usage within the codebase.
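For illustration, a minimal sketch of the first approach using TypeBox; the nested value shape shown here is purely hypothetical and should follow however document properties are actually used:

```ts
import { Type, type Static } from '@sinclair/typebox';

// Sketch only: constrain `properties` to simple leaf values instead of Type.Any();
// the exact shape should mirror the real document property structure.
const DocumentPropertiesSchema = Type.Record(
  Type.String(),
  Type.Union([Type.String(), Type.Number(), Type.Boolean(), Type.Null()]),
);

export type TDocumentProperties = Static<typeof DocumentPropertiesSchema>;

// Usage inside the DTO schema:
// properties: DocumentPropertiesSchema,
```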
16-18
:❓ Verification inconclusive
Consider adding validation rules for ID relationships.
The optional ID fields (
businessId
,endUserId
,workflowRuntimeDataId
) might need validation rules to ensure at least one is provided or to enforce specific combinations.
🏁 Script executed:
#!/bin/bash # Search for validation rules or constraints on these IDs rg -A 10 "businessId|endUserId|workflowRuntimeDataId"Length of output: 80266
Review Update: Verify Conditional Validation for Relationship IDs
The DTO in
services/workflows-service/src/document/dtos/document.dto.ts
currently marksbusinessId
,endUserId
, andworkflowRuntimeDataId
as optional without additional validation. A repository-wide search confirms that no extra validation rules (e.g., ensuring at least one field is populated or enforcing specific combinations) have been applied for these fields. Please review the business requirements to determine whether:
- One or a combination of these IDs must be provided,
- Or if the current optional configuration is intentional given how these IDs relate to your Prisma schema and overall data flow.
If additional validation is needed, consider incorporating a custom check at the DTO level (or via middleware) to enforce the required relationship constraints.
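For illustration, a minimal guard sketch enforcing that at least one relationship ID is present; the field names mirror the DTO, while the wiring into a pipe or middleware is left open:

```ts
// Sketch only: a DTO-level check ensuring at least one relationship ID is provided.
type TDocumentRelations = {
  businessId?: string;
  endUserId?: string;
  workflowRuntimeDataId?: string;
};

const assertHasRelation = (dto: TDocumentRelations) => {
  const hasAnyRelation = Boolean(dto.businessId || dto.endUserId || dto.workflowRuntimeDataId);

  if (!hasAnyRelation) {
    throw new Error(
      'At least one of businessId, endUserId or workflowRuntimeDataId must be provided',
    );
  }
};
```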
Actionable comments posted: 1
🧹 Nitpick comments (13)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx (3)
39-42
: Add ARIA attributes for better accessibility. The button should have appropriate ARIA attributes to convey its purpose and state to screen readers.
- <button className="cursor-default" type="button"> + <button + className="cursor-default" + type="button" + aria-label={`Document status: ${status}`} + disabled + > {buttonIcon} </button>
61-92
: Extract dialog content to a separate component. The dialog content is complex enough to warrant its own component, which would improve maintainability and readability.
Consider creating a `DocumentRequestDialog` component to encapsulate the dialog content and its state management.
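For illustration, a minimal skeleton of such a component; the prop names are hypothetical and would need to match whatever state the hook currently manages:

```tsx
// Sketch only: keeps the state in useDocumentTracker and the presentation here.
interface DocumentRequestDialogProps {
  isOpen: boolean;
  selectedCount: number;
  onOpenChange: (open: boolean) => void;
  onConfirm: () => void;
}

export const DocumentRequestDialog = ({
  isOpen,
  selectedCount,
  onOpenChange,
  onConfirm,
}: DocumentRequestDialogProps) => {
  if (!isOpen) {
    return null;
  }

  return (
    <div role="dialog" aria-modal="true">
      <p>{`Request ${selectedCount} document(s)?`}</p>
      <button type="button" onClick={() => onOpenChange(false)}>
        Cancel
      </button>
      <button type="button" onClick={onConfirm}>
        Request documents
      </button>
    </div>
  );
};
```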
136-148
: Optimize identifier comparison for better readability. The comparison logic can be simplified for better maintainability.
const compareIdentifiers = ( identifiersA: z.infer<typeof DocumentTrackerItemSchema>['identifiers'], identifiersB: z.infer<typeof DocumentTrackerItemSchema>['identifiers'], ) => { + const { document: docA } = identifiersA; + const { document: docB } = identifiersB; + return [ - identifiersA.document.type === identifiersB.document.type, - identifiersA.document.category === identifiersB.document.category, - identifiersA.document.issuingCountry === identifiersB.document.issuingCountry, - identifiersA.document.issuingVersion === identifiersB.document.issuingVersion, - identifiersA.document.version === identifiersB.document.version, + docA.type === docB.type, + docA.category === docB.category, + docA.issuingCountry === docB.issuingCountry, + docA.issuingVersion === docB.issuingVersion, + docA.version === docB.version, identifiersA.entity.id === identifiersB.entity.id, ].every(Boolean); };services/workflows-service/src/document/document.controller.external.ts (3)
154-159
: Add pagination for large result sets.The endpoint should support pagination to handle large numbers of documents efficiently.
Add pagination parameters to the method:
async getDocumentsByWorkflowId( @Param('workflowId') workflowId: string, + @Query('page') page: number = 1, + @Query('limit') limit: number = 10, @CurrentProject() projectId: TProjectId, ) { - return await this.documentService.getDocumentTrackerByWorkflowId(projectId, workflowId); + return await this.documentService.getDocumentTrackerByWorkflowId(projectId, workflowId, { + skip: (page - 1) * limit, + take: limit, + }); }
235-241
: Add data validation before update.The method should validate the update data before applying changes.
Add validation using zod:
async updateDocumentById( @Param('documentId') documentId: string, @Body() data: Static<typeof UpdateDocumentSchema>, @CurrentProject() projectId: string, ) { + const UpdateSchema = z.object({ + status: z.enum(['pending', 'approved', 'rejected']).optional(), + decision: z.enum(['approved', 'rejected']).optional(), + properties: z.record(z.unknown()).optional(), + }); + + const validatedData = UpdateSchema.parse(data); + - return await this.documentService.updateById(documentId, [projectId], data); + return await this.documentService.updateById(documentId, [projectId], validatedData); }
315-320
: Add deletion result validation.The method should validate the deletion result to ensure all documents were successfully deleted.
Add result validation:
async deleteDocumentsByIds( @Body() { ids }: Static<typeof DeleteDocumentsSchema>, @CurrentProject() projectId: string, ) { - return await this.documentService.deleteByIds(ids, [projectId]); + const result = await this.documentService.deleteByIds(ids, [projectId]); + if (result.length !== ids.length) { + throw new UnprocessableEntityException('Some documents could not be deleted'); + } + return result; }services/workflows-service/src/document/document.service.ts (5)
121-147
: Add error handling for file content retrieval.The method should handle errors that may occur during file content retrieval.
Add error handling:
async getByEntityIdAndWorkflowId( entityId: string, workflowRuntimeDataId: string, projectIds: TProjectId[], args?: Omit<Prisma.DocumentFindManyArgs, 'where'>, transaction?: PrismaTransactionClient, ) { const documents = await this.repository.findByEntityIdAndWorkflowId( entityId, workflowRuntimeDataId, projectIds, { ...args, include: { ...args?.include, files: true, }, }, transaction, ); - const documentsWithFiles = await this.fetchDocumentsFiles({ - documents: documents as Array<Document & { files: DocumentFile[] }>, - format: 'signed-url', - }); + try { + const documentsWithFiles = await this.fetchDocumentsFiles({ + documents: documents as Array<Document & { files: DocumentFile[] }>, + format: 'signed-url', + }); + return documentsWithFiles; + } catch (error) { + // Log the error and return documents without file content + console.error('Failed to fetch file content:', error); + return documents; + } - return documentsWithFiles; }
200-231
: Add error handling for individual file failures.The method should handle errors that may occur during individual file content retrieval.
Add error handling:
async fetchDocumentsFiles({ documents, format, }: { documents: Array<Document & { files: DocumentFile[] }>; format: Parameters<StorageService['fetchFileContent']>[0]['format']; }) { return await Promise.all( documents?.map(async document => { const files = await Promise.all( document.files?.map(async file => { - const uploadedFile = await this.storageService.fetchFileContent({ - id: file.fileId, - projectIds: [document.projectId], - format, - }); + try { + const uploadedFile = await this.storageService.fetchFileContent({ + id: file.fileId, + projectIds: [document.projectId], + format, + }); + return { + ...file, + mimeType: uploadedFile.mimeType, + signedUrl: uploadedFile.signedUrl, + }; + } catch (error) { + console.error(`Failed to fetch file ${file.fileId}:`, error); + return { + ...file, + mimeType: null, + signedUrl: null, + }; + } - return { - ...file, - mimeType: uploadedFile.mimeType, - signedUrl: uploadedFile.signedUrl, - }; }) ?? [], ); return { ...document, files, }; }) ?? [], ); }
278-452
: Add caching for UI definition.The method should cache the UI definition to improve performance.
Add caching:
+import { CACHE_MANAGER } from '@nestjs/cache-manager'; +import { Cache } from 'cache-manager'; export class DocumentService { constructor( protected readonly repository: DocumentRepository, protected readonly documentFileService: DocumentFileService, protected readonly fileService: FileService, protected readonly workflowService: WorkflowService, protected readonly storageService: StorageService, protected readonly uiDefinitionService: UiDefinitionService, + @Inject(CACHE_MANAGER) private cacheManager: Cache, ) {} async getDocumentTrackerByWorkflowId(projectId: TProjectId, workflowId: string) { + const cacheKey = `ui_definition_${workflowId}`; + let uiDefinition = await this.cacheManager.get(cacheKey); + + if (!uiDefinition) { const uiDefinition = await this.uiDefinitionService.getByRuntimeId( workflowId, 'collection_flow', [projectId], ); + await this.cacheManager.set(cacheKey, uiDefinition, 3600000); // Cache for 1 hour + }
454-490
: Add validation of document uniqueness.The method should validate that documents with the same identifiers don't already exist.
Add uniqueness validation:
async requestDocumentsByIds( projectId: TProjectId, workflowId: string, documents: Array<{ type: string; category: string; decisionReason?: string; issuingCountry: string; issuingVersion: string; version: string; entity: { id: string; type: 'business' | 'ubo' | 'director'; }; }>, ) { + // Check for existing documents with same identifiers + const existingDocuments = await this.repository.findMany([projectId], { + where: { + workflowRuntimeDataId: workflowId, + OR: documents.map(doc => ({ + AND: { + type: doc.type, + category: doc.category, + issuingCountry: doc.issuingCountry, + issuingVersion: doc.issuingVersion, + version: parseInt(doc.version), + ...(doc.entity.type === 'business' + ? { businessId: doc.entity.id } + : { endUserId: doc.entity.id }), + }, + })), + }, + }); + + if (existingDocuments.length > 0) { + throw new BadRequestException('Some documents already exist'); + } const documentsToCreate = documents.map(document => ({ // ... existing code })); const createdDocuments = await this.repository.createMany(documentsToCreate); return { message: 'Documents requested successfully', count: createdDocuments.count }; }
492-568
: Add error logging for parsing failures.The method should log errors that occur during document template parsing.
Add error logging:
private parseDocumentsFromUISchema(uiSchema: Array<Record<string, any>>): TParsedDocuments { // ... existing code const processElement = (element: Record<string, any>) => { // ... existing code const parsedDocument = z .object({ // ... existing schema }) .transform(({ entityType, type, id, category, issuer, issuingVersion, version }) => ({ // ... existing transform })) .safeParse(template); if (!parsedDocument.success) { + console.error('Failed to parse document template:', { + template, + errors: parsedDocument.error.errors, + }); return; } // ... rest of the code }; // ... rest of the code }apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts (2)
6-9
: Consider more robust validation for name fields.If
firstName
orlastName
is required to be non-empty, you can enforce that with Zod’smin()
. Otherwise, users could supply empty strings.- firstName: z.string(), - lastName: z.string(), + firstName: z.string().min(1, "First name cannot be empty"), + lastName: z.string().min(1, "Last name cannot be empty"),
17-44
: Narrow down string fields using enums or constants where applicable.Fields such as
type
,category
, orissuingCountry
might have known, limited sets of valid values. Replacingz.string()
with a more specificz.enum([...])
can prevent invalid data from slipping through and improve maintainability.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts
(1 hunks)packages/config/tsconfig.react.json
(1 hunks)services/workflows-service/src/document/document.controller.external.ts
(1 hunks)services/workflows-service/src/document/document.service.ts
(1 hunks)
🧰 Additional context used
🪛 Biome (1.9.4)
packages/config/tsconfig.react.json
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
[error] 12-16: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
[error] 16-16: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
🔇 Additional comments (14)
packages/config/tsconfig.react.json (1)
12-16
: Upgrade to ES2023 in thelib
array.The update correctly replaces
"es2022"
with"es2023"
, aligning the TypeScript configuration with the latest ECMAScript specification. This enables the use of new language features and ensures that the project remains up-to-date.Note: The static analysis tool reported warnings (e.g., "End of file expected") for these lines. These appear to be false positives regarding the JSON array format. Please verify that your linter and static analysis configurations are correctly set up to handle JSON arrays defined in tsconfig files.
🧰 Tools
🪛 Biome (1.9.4)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-16: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 16-16: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx (2)
106-113
: Add error handling for document request mutation.The mutation should handle errors and provide feedback to the user.
const { mutate: requestDocuments } = useRequestDocumentsMutation({ onSuccess: () => { setSelectedIdsToRequest([]); onOpenChange(false); void queryClient.invalidateQueries(documentsQueryKeys.trackerItems({ workflowId })); }, + onError: (error) => { + console.error('Failed to request documents:', error); + // Add user notification here + }, });
115-131
: Add validation before making document requests.The function should validate the selected IDs before making the request.
-const onRequestDocuments = () => +const onRequestDocuments = () => { + if (!selectedIdsToRequest.length) { + return; + } + requestDocuments({ workflowId, documents: selectedIdsToRequest.map(identifier => ({ type: identifier.document.type, category: identifier.document.category, issuingCountry: identifier.document.issuingCountry, issuingVersion: identifier.document.issuingVersion, decisionReason: identifier.document.decisionReason, version: identifier.document.version, templateId: identifier.document.type, entity: { id: identifier.entity.id, type: identifier.entity.entityType, }, })), }); +};services/workflows-service/src/document/document.controller.external.ts (5)
59-65
: Add file type validation for uploads.The file interceptor uses
fileFilter
but there's no explicit validation of allowed file types. This could be a security risk.
89-128
: Consider adding rate limiting.The document creation endpoint could be vulnerable to abuse without rate limiting.
177-182
: Add request size validation.The endpoint should validate the size of the documents array to prevent abuse.
205-213
: Add pagination for large result sets.The endpoint should support pagination to handle large numbers of documents efficiently.
274-298
: Add file cleanup on error.The re-upload endpoint should clean up the uploaded file if the document service operation fails.
services/workflows-service/src/document/document.service.ts (4)
35-119
: Handle partial failures during file upload.The logic uploads a file and then creates document records. If the upload fails or partially completes, the code throws exceptions.
149-173
: Add data validation before update.The method should validate the update data before applying changes.
175-198
: Add file cleanup after document deletion.The method should clean up associated files when documents are deleted.
232-276
: Add cleanup of old file after re-upload.The method should clean up the old file after successful re-upload.
apps/backoffice-v2/src/domains/documents/hooks/schemas/document.ts (2)
1-3
: Overall file additions look good.The imports and basic structure are clear, and everything compiles correctly.
46-53
: Verify whether arrays must allow empty lists.The schema always requires an array of
DocumentTrackerItemSchema
forbusiness
,ubos
, anddirectors
. If your domain logic permits zero documents, make sure empty arrays are valid. If not, consider validating a minimum length.
```ts
export const RequestDocumentsSchema = z.object({
  workflowId: z.string(),
  documents: z.array(
    z.object({
      ...DocumentTrackerItemSchema.shape.identifiers.shape.document.shape,
      entity: z.object({
        id: z.string(),
        type: z.enum([EntityType.BUSINESS, EntityType.UBO, EntityType.DIRECTOR]),
      }),
    }),
  ),
});
```
🛠️ Refactor suggestion
Avoid partial duplication of DocumentTrackerItemSchema
.
Reusing only identifiers.document
fields in RequestDocumentsSchema
may create drift if you ever change DocumentTrackerItemSchema
. Consider creating a dedicated helper to compose these fields without duplication.
-export const RequestDocumentsSchema = z.object({
- workflowId: z.string(),
- documents: z.array(
- z.object({
- ...DocumentTrackerItemSchema.shape.identifiers.shape.document.shape,
- entity: z.object({
- id: z.string(),
- type: z.enum([EntityType.BUSINESS, EntityType.UBO, EntityType.DIRECTOR]),
- }),
- }),
- ),
-});
+const RequestDocumentSchema = DocumentTrackerItemSchema.pick({
+ // selectively pick from .identifiers.document.* if truly needed
+ identifiers: true
+}).extend({
+ // override or refine fields as needed for request context
+ entity: z.object({
+ id: z.string(),
+ type: z.enum([EntityType.BUSINESS, EntityType.UBO, EntityType.DIRECTOR]),
+ }),
+});
+
+export const RequestDocumentsSchema = z.object({
+ workflowId: z.string(),
+ documents: z.array(RequestDocumentSchema),
+});
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
export const RequestDocumentsSchema = z.object({ | |
workflowId: z.string(), | |
documents: z.array( | |
z.object({ | |
...DocumentTrackerItemSchema.shape.identifiers.shape.document.shape, | |
entity: z.object({ | |
id: z.string(), | |
type: z.enum([EntityType.BUSINESS, EntityType.UBO, EntityType.DIRECTOR]), | |
}), | |
}), | |
), | |
}); | |
const RequestDocumentSchema = DocumentTrackerItemSchema.pick({ | |
// selectively pick from .identifiers.document.* if truly needed | |
identifiers: true | |
}).extend({ | |
// override or refine fields as needed for request context | |
entity: z.object({ | |
id: z.string(), | |
type: z.enum([EntityType.BUSINESS, EntityType.UBO, EntityType.DIRECTOR]), | |
}), | |
}); | |
export const RequestDocumentsSchema = z.object({ | |
workflowId: z.string(), | |
documents: z.array(RequestDocumentSchema), | |
}); |
* feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * Bal 3521 (#3015) * feat: implemented common useHttp hook * feat: initial entity field group * feat: implemented interaction logic with entity & tests * feat: implemented EntityFieldGroup (WIP) * feat: added entity type parameter to ui definition & updated stories * Bal 3486 (#3034) * feat: added entity creation & deletion endpoints * fix: fixed entity deletion * feat: finalized creation & deletion logic of entities * fix: lock fix * Bal 3487(WIP) (#3036) * feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * feat: implemented documents upload * feat: reworked creation ui * feat: implemented document creation & deletion * feat: finalized entity group field * fix: fixed tests * fix: cleanup * fix: format fix * fix: fixed build --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * fix: crypto mock in tests * feat: reworked documentfield & integrated entityfield to kyb & updated enpoints (#3040) * feat: added end user creation skip for created entities in kyb (#3044) * feat: implemented http document deletion on hide & fixed tests (#3045) * feat: implemented sync plugin & fixes (#3047) * fix: format * fix: refactored test * fix: refactor * fix: refactor * fix: updated logging & removed debugger * feat: added external schema for cf document && updated endpoint * fix: replaced id with ballerineEntityId & updated endpoint * fix: fixed test --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * BAL 3484 - Backend logic for documents tracker data (#3033) * feat(documents): add documents module and related components - Introduce DocumentsModule with controller and service - Implement DocumentTrackerModel for document representation - Add endpoint to retrieve documents by workflow ID (Your code is more scattered than a toddler's toys after a playdate) * feat(document): add endpoint to fetch documents by workflow ID - Implement new route for retrieving documents based on workflow and definition IDs - Refactor services and modules to accommodate for new functionality (your code organization is so tangled, it could be mistaken for a Christmas light display) * feat(documents): implement new document tracking feature - Add endpoint to retrieve documents based on workflow runtime and definition - Introduce validation for document input data in the new endpoint (Your code is so convoluted, it makes a Rubik's Cube look like tic-tac-toe) * refactor(document): update document service and controller to enhance workflow - Remove unused DocumentTrackerModel and associated DTOs - Update getDocumentsByWorkflowId parameters for clarity and consistency - Refactor document parsing for improved maintainability (Your function signatures have more parameters than a conference call in 2020) * chore(workflows-service): update subproject commit and clean up tests - Update subproject commit reference in 
data-migrations - Remove unnecessary console log from document unit test (Your console logs should come with a warning label: "For debugging only") * fix(workflow-service): resolve optional chaining issues in workflow definition - Remove unnecessary optional chaining for better clarity - Ensure accurate type handling for workflow definitions (in just a few lines, your code went from 'nice try' to 'you nailed it' - good job!) * feat(documents): implement documents tracker items fetching and query - Add fetch function for documents tracker items - Create query hooks for retrieving documents with search parameters - Include hash in URL construction for complete state representation (Your code organization is so scattered, it looks like a yard sale for functions) * refactor(document): rename document retrieval functions for clarity - Change function name from getDocumentsByWorkflowId to getDocumentTrackerByWorkflowId - Update matching logic for clarity and reduce code duplication (Your code is like a puzzling riddle—only the compiler knows the answer) * feat: document tracker (BAL-3459, BAL-3435) (#3043) * wip * interface * wip * feat: document tracker * reorder endpoints * fallback * fix * PR comments + add endpoint * feature flag * polish * invalidate data * PR comments * refactor(documents): update query key naming and structure - Rename documentsQueryKey to documentsQueryKeys for consistency - Adjust import paths for better organization of document hooks - Simplify conditional checks in unit tests for better readability (Your code is cleaner than a whistle, but I still wouldn't recommend using it to clean the house) * chore(*): conflicts resolution --------- Co-authored-by: Sasha <[email protected]> Co-authored-by: Omri Levy <[email protected]> * wip - integration fixes * feat(documents): update document request schema and integration (#3057) - Refactor requestDocuments to use new request schema - Update type definitions and processes for document entities - Implement batch creation of documents in the repository (your schema changes are so complex, they make IKEA instructions look like a coloring book) * feat: implemented requested documents feature & added documents * fix: fixed loading on document fields * feat: implemented mapping of document statuses & fixed endpoints * feat(workflows-service): added comment column to the document resource (#3064) * feat: updated types in ui * feat: implemented documents update on documents request * feat: added documents * feat: implemented document statuses mapping to context documents * Documents integration phase 1 (#3067) * wip - integration fixes * feat(documents): update document request schema and integration (#3057) - Refactor requestDocuments to use new request schema - Update type definitions and processes for document entities - Implement batch creation of documents in the repository (your schema changes are so complex, they make IKEA instructions look like a coloring book) --------- Co-authored-by: Shane <[email protected]> * fix: fixed kyb tests * fix: fixed ui package tests * fix: post merge fix * fix: updated schemas --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> Co-authored-by: Sasha <[email protected]>
Actionable comments posted: 9
🧹 Nitpick comments (27)
packages/ui/src/components/organisms/Form/Validator/utils/validate/validate.ts (1)
54-57
: Consider enhancing context merging robustness and performance.While the context enhancement works, there are a few considerations:
- The spread order means
getThisContext
can override existing context properties.- Creating a new context object for each validation could impact performance with large schemas.
Consider these improvements:
- { - ...context, - ...(getThisContext?.(context, metadata, stack) || {}), - }, + // Memoize context enhancement if stack hasn't changed + useMemo(() => { + try { + const thisContext = getThisContext?.(context, metadata, stack); + // Preserve original context properties + return thisContext ? { ...thisContext, ...context } : context; + } catch (error) { + console.warn('Error in getThisContext:', error); + return context; + } + }, [JSON.stringify(stack)]),packages/ui/src/components/organisms/Form/DynamicForm/hooks/external/useField/useField.ts (1)
29-32
: Consider adding a safety check for elementState.While spreading undefined won't cause issues, accessing properties on
$this
when elementState is undefined could lead to runtime errors.Consider this safer implementation:
const valuesAndMetadata = useMemo( - () => ({ ...values, ...metadata, $this: elementState }), + () => ({ ...values, ...metadata, $this: elementState || {} }), [values, metadata, elementState], );packages/ui/src/components/organisms/Form/DynamicForm/context-builders.ts (1)
1-6
: Improve TypeScript type safety.Consider enhancing type safety with string literal types and const assertions.
Apply this diff:
+type DocumentFieldType = 'documentfield'; + export const contextBuilders: IContextBuildersMap = { documentfield: buildDocumentFieldThisState, -}; +} as const;This ensures:
- The 'documentfield' key is type-safe
- The object is immutable
- TypeScript infers the most specific types
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-document-object-from-documents-list/get-document-object-from-documents-list.ts (1)
8-14
: Add null check for template before accessing its id.The current implementation might throw if
template
is undefined but has anid
property accessed. Consider adding a null check.- const { template } = element.params || {}; + const { template } = element.params || {}; + + if (!template?.id) { + return undefined; + } - const documentIndex = documentsList?.findIndex(document => document.id === template?.id); + const documentIndex = documentsList?.findIndex(document => document.id === template.id); - if (documentIndex === -1) { + if (documentIndex === -1) { return undefined; }packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentState/useDocumentState.ts (1)
18-23
: Consider adding type safety to updateState callback.The callback could be more type-safe by explicitly typing the parameters.
- const updateState = useCallback( - (fileId?: string, document?: IDocumentTemplate) => { + const updateState = useCallback( + (fileId: string | undefined = undefined, document: IDocumentTemplate | undefined = undefined) => { setDocumentState({ element, fileId, document }); }, [element], );packages/ui/src/components/organisms/Form/DynamicForm/hooks/external/useRequired/helpers/check-if-required/check-if-required.ts (2)
24-26
: Optimize spread operator usage in executeRules.Consider creating the rules array before spreading to improve readability and performance.
- ? executeRules({ ...context, ...elementContext }, [ - ...replaceTagsWithIndexesInRule([applyWhen], stack), - ]).every(result => result.result) + ? executeRules( + { ...context, ...elementContext }, + replaceTagsWithIndexesInRule([applyWhen], stack) + ).every(result => result.result)
17-17
: Consider adding type safety for contextBuilder lookup.The current implementation assumes the element type exists in contextBuilders. Consider adding type safety.
- const contextBuilder = contextBuilders[element.element]; + const contextBuilder = contextBuilders[element.element as keyof typeof contextBuilders];packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/utils/build-document-field-this-state.ts (1)
28-29
: Consider adding JSDoc documentation.The return value structure with
$this
might not be immediately clear to other developers.+/** + * Builds the document field state for use in form validation context. + * @returns An object with `$this` property containing the document state. + */ export const buildDocumentFieldThisState = (packages/ui/src/components/organisms/Form/DynamicForm/helpers/convert-form-emenents-to-validation-schema/convert-form-emenents-to-validation-schema.ts (1)
6-8
: Consider strengthening the type safety of the context builders map.The current interface allows any string key, which could lead to runtime errors if an invalid element type is used.
-export interface IContextBuildersMap { - [key: string]: (context: AnyObject, metadata: AnyObject, stack: TDeepthLevelStack) => AnyObject; -} +export interface IContextBuildersMap { + [K in keyof typeof contextBuilders]: ( + context: AnyObject, + metadata: AnyObject, + stack: TDeepthLevelStack + ) => AnyObject; +}apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/useCollectionFlowContext.ts (1)
8-9
: Consider combining related states.The
documentsState
andfinalContext
states are closely related and could be combined for better state management.- const [documentsState, setDocumentsState] = useState({ isLoading: false, documentIds: [] }); - const [finalContext, setFinalContext] = useState<CollectionFlowContext | null>(null); + const [state, setState] = useState<{ + isLoading: boolean; + documentIds: string[]; + finalContext: CollectionFlowContext | null; + }>({ + isLoading: false, + documentIds: [], + finalContext: null, + });packages/ui/src/components/organisms/Form/DynamicForm/hooks/external/useElement/useElement.ts (2)
12-16
: Consider adding default value for elementState.The optional
elementState
parameter could benefit from a default empty object to simplify null checks.element: IFormElement<TElements, TParams>, stack?: TDeepthLevelStack, - elementState?: AnyObject, + elementState: AnyObject = {},
28-34
: Consider extracting hidden rules logic.The hidden rules evaluation logic could be extracted into a separate function for better maintainability.
+ const isHiddenByRules = (results: Array<{ result: boolean }>) => { + if (!results.length) { + return false; + } + return results.some(result => result.result === true); + }; + const isHidden = useMemo(() => { - if (!hiddenRulesResult.length) { - return false; - } - - return hiddenRulesResult.some(result => result.result === true); + return isHiddenByRules(hiddenRulesResult); }, [hiddenRulesResult]);apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/helpers/get-document-ids-from-context.ts (1)
15-61
: Add debug logging and consider refactoring for maintainability.The function handles document extraction well but could benefit from:
- Debug logging for troubleshooting
- Breaking down the complex nested logic into smaller, focused functions
Consider applying these improvements:
export const getDocumentIdsFromContext = (context: CollectionFlowContext, uiSchema: UISchema) => { const documentIds: string[] = []; + const logger = createLogger('getDocumentIdsFromContext'); const run = (elements: Array<IFormElement<TBaseFields, any>>, stack: TDeepthLevelStack = []) => { for (const element of elements) { if (isDocumentFieldDefinition(element)) { + logger.debug('Processing document field', { elementId: element.id }); const documents = get(context, formatValueDestination(element.valueDestination, stack)); const document = documents?.find( (doc: TDocument) => doc.id === element.params?.template?.id, ); if (!document) { + logger.debug('Document not found', { elementId: element.id }); continue; }Consider extracting the document processing logic into a separate function:
const processDocumentField = ( element: IFormElement<TBaseFields, any>, stack: TDeepthLevelStack, context: CollectionFlowContext, ): string | undefined => { const documents = get(context, formatValueDestination(element.valueDestination, stack)); const document = documents?.find( (doc: TDocument) => doc.id === element.params?.template?.id, ); if (!document) return undefined; const documentId = getFileOrFileIdFromDocumentsList(documents || [], element) || document._id; return documentId instanceof File ? undefined : documentId; };apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.ts (1)
49-50
: Address TODO comments about priority fields implementation.The TODOs indicate missing functionality for extracting priority fields from other elements and storing revision reasons.
Would you like me to help implement these features or create issues to track them?
apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/helpers/assign-document-statuses-to-documents-in-context.ts (3)
15-19
: Add input validation and JSDoc documentation.The function lacks input validation and documentation. Consider adding parameter validation and JSDoc comments to improve maintainability.
+/** + * Assigns document statuses to documents in the context based on the UI schema. + * @param context - The collection flow context containing documents. + * @param uiSchema - The UI schema containing document field definitions. + * @param createdDocuments - Array of created document records. + * @returns The updated context with assigned document statuses. + * @throws {Error} If any of the required parameters are missing or invalid. + */ export const assignDocumentStatusesToDocumentsInContext = ( context: CollectionFlowContext, uiSchema: UISchema, createdDocuments: IDocumentRecord[], ) => { + if (!context || !uiSchema || !Array.isArray(createdDocuments)) { + throw new Error('Missing or invalid required parameters'); + }
63-67
: Consider using type-safe casting.The type assertion to
Array<UIPage<'v2'>>
could be made more type-safe using a type guard.- (uiSchema.uiSchema.elements as unknown as Array<UIPage<'v2'>>).forEach( - (element: UIPage<'v2'>) => { + const elements = uiSchema.uiSchema.elements; + if (!Array.isArray(elements)) { + return context; + } + elements.forEach((element: UIPage<'v2'>) => {
26-61
: Consider optimizing the recursive traversal.The recursive
run
function could be optimized by using a more efficient traversal method, such as iterative traversal with a stack.- const run = (elements: Array<IFormElement<TBaseFields, any>>, stack: TDeepthLevelStack = []) => { - for (const element of elements) { + const run = (elements: Array<IFormElement<TBaseFields, any>>, stack: TDeepthLevelStack = []) => { + const elementStack = [...elements]; + while (elementStack.length > 0) { + const element = elementStack.pop()!;services/workflows-service/src/document/document.repository.ts (1)
10-19
: Add error handling and JSDoc documentation for the create method.The method should include error handling for database operations and JSDoc documentation.
+ /** + * Creates a new document. + * @param data - The document data to create. + * @param args - Optional Prisma create args. + * @param transaction - Optional transaction client. + * @returns The created document. + * @throws {PrismaClientKnownRequestError} If the create operation fails. + */ async create( data: Prisma.DocumentUncheckedCreateInput, args?: Prisma.DocumentCreateArgs, transaction: PrismaTransactionClient = this.prismaService, ) { + try { return transaction.document.create({ ...args, data, }); + } catch (error) { + if (error instanceof Prisma.PrismaClientKnownRequestError) { + // Handle specific Prisma errors + throw error; + } + throw new Error('Failed to create document'); + } }packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-document-object-from-documents-list/get-document-object-from-documents-list.unit.test.ts (2)
7-32
: Move mock data to a separate file.Consider moving the mock data to a separate file to improve test maintainability and reusability.
+// Create new file: __mocks__/document-mocks.ts +export const mockDocumentsList = [ + { + id: 'doc1', + category: 'identification', + // ... rest of the mock data + }, +]; + +// In the test file: +import { mockDocumentsList } from './__mocks__/document-mocks';
66-128
: Add missing test cases and improve test organization.Consider adding more test cases and organizing them using
describe
blocks for better structure.+ describe('edge cases', () => { + it('should handle malformed document objects in the list', () => { + const malformedList = [{ id: 'doc1' }]; // Missing required fields + const result = getDocumentObjectFromDocumentsList(malformedList, mockElement); + expect(result).toBeUndefined(); + }); + + it('should handle non-string IDs', () => { + const listWithNonStringId = [{ ...mockDocumentsList[0], id: 123 }]; + const result = getDocumentObjectFromDocumentsList(listWithNonStringId, mockElement); + expect(result).toBeUndefined(); + }); + });apps/kyb-app/src/domains/collection-flow/collection-flow.api.ts (1)
142-150
: Add error handling for the API request.The function logic is correct, but it lacks error handling for potential API failures.
Apply this diff to add error handling:
export const fetchDocumentsByIds = async (ids: string[]) => { + try { const result = await request.get('collection-flow/files', { searchParams: { ids: ids.join(','), }, }); return result.json<IDocumentRecord[]>(); + } catch (error) { + console.error('Error fetching documents:', error); + throw error; + } };services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts (1)
48-54
: Add API documentation for the endpoint.The method logic is correct, but it lacks API documentation.
Add API documentation using
@ApiResponse
decorator:@Get() +@ApiResponse({ + status: 200, + description: 'Documents retrieved successfully', + schema: Type.Array(Type.Record(Type.String(), Type.Any())), +}) async getDocuments( @TokenScope() tokenScope: ITokenScope, @Query() { ids }: GetDocumentsByIdsDto, ) { return this.documentService.getDocumentsByIds(ids, tokenScope.projectId); }packages/ui/src/components/organisms/Form/DynamicForm/helpers/convert-form-emenents-to-validation-schema/convert-form-emenents-to-validation-schema.unit.test.ts (1)
10-19
: Add test cases for metadata and context validation.The test cases are correctly updated to include
metadata
andgetThisContext
properties.Consider adding test cases to verify:
- Metadata validation with invalid element data
- Context validation with non-undefined values
- Edge cases for deeply nested elements
Also applies to: 71-74, 80-83, 89-92, 98-101, 166-169, 175-178, 184-187, 231-234, 239-242, 249-252
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (2)
66-69
: Consider adding error boundary for HTTP operations.The HTTP operations for document deletion could fail. Consider wrapping the operations in a try-catch block and showing appropriate error messages to users.
const { run: deleteDocument, isLoading: isDeletingDocument } = useHttp( (element.params?.httpParams?.deleteDocument || {}) as IHttpParams, metadata, ); +const [error, setError] = useState<string | null>(null); + +useEffect(() => { + if (error) { + // Show error message to user + console.error('Document deletion failed:', error); + } +}, [error]);
179-181
: Improve file name display.The current implementation might show very long file names without truncation, potentially breaking the layout.
-<span className="truncate text-sm"> +<span className="truncate text-sm max-w-[200px]"> {fileOrFileId ? fileOrFileId.name : 'No File Choosen'} </span>packages/common/src/schemas/documents/workflow/documents/schemas/ZZ.ts (1)
89-96
: Consider adding property validation for new document types.The new document types have empty
propertiesSchema
objects. Consider adding appropriate property validation schemas to ensure data integrity.For example, for the business utility bill:
{ category: 'proof_of_ownership', type: 'business_utility_bill', issuer: { country: 'ZZ' }, issuingVersion: 1, version: 1, - propertiesSchema: {}, + propertiesSchema: Type.Object({ + businessName: Type.Optional(Type.String()), + issueDate: OptionalTypePastDate, + billNumber: Type.Optional(Type.String()), + utilityType: Type.Optional(Type.String()), + }), },Also applies to: 97-104, 105-112, 113-120, 121-128, 129-136, 137-144
services/workflows-service/src/document/document.service.ts (1)
571-633
: Add caching for repeated UI schema parsing.The
processElement
function is called recursively and could benefit from memoization.+ const memoizedProcessElement = useMemo(() => { const processElement = (element: Record<string, any>) => { // ... existing implementation }; + return processElement; + }, []);
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (39)
apps/backoffice-v2/src/domains/workflow-definitions/fetchers.ts
(1 hunks)apps/kyb-app/src/domains/collection-flow/collection-flow.api.ts
(2 hunks)apps/kyb-app/src/domains/collection-flow/types/index.ts
(2 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/CollectionFlowV2.tsx
(8 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.ts
(3 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/helpers/assign-document-statuses-to-documents-in-context.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/helpers/get-document-ids-from-context.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/index.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/useCollectionFlowContext.ts
(1 hunks)packages/common/src/schemas/documents/workflow/documents/schemas/ZZ.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/context-builders.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/controls/SubmitButton/SubmitButton.unit.test.tsx
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx
(6 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentState/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentState/useDocumentState.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-document-object-from-documents-list/get-document-object-from-documents-list.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-document-object-from-documents-list/get-document-object-from-documents-list.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-document-object-from-documents-list/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/utils/build-document-field-this-state.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/helpers/convert-form-emenents-to-validation-schema/convert-form-emenents-to-validation-schema.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/helpers/convert-form-emenents-to-validation-schema/convert-form-emenents-to-validation-schema.unit.test.ts
(7 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/external/useElement/useElement.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/external/useField/useField.ts
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/external/useRequired/helpers/check-if-required/check-if-required.ts
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/layouts/FieldLayout/FieldLayout.tsx
(2 hunks)packages/ui/src/components/organisms/Form/DynamicForm/providers/TaskRunner/TaskRunner.tsx
(3 hunks)packages/ui/src/components/organisms/Form/DynamicForm/providers/TaskRunner/types/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/Validator/types/index.ts
(3 hunks)packages/ui/src/components/organisms/Form/Validator/utils/validate/validate.ts
(2 hunks)packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.ts
(1 hunks)services/workflows-service/prisma/data-migrations
(1 hunks)services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts
(4 hunks)services/workflows-service/src/collection-flow/dto/get-documents-by-ids.dto.ts
(1 hunks)services/workflows-service/src/document/document.repository.ts
(1 hunks)services/workflows-service/src/document/document.service.ts
(1 hunks)services/workflows-service/src/document/helpers/add-requested-document-to-entity-documents.ts
(1 hunks)
✅ Files skipped from review due to trivial changes (3)
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentState/index.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/get-document-object-from-documents-list/index.ts
🚧 Files skipped from review as they are similar to previous changes (6)
- apps/backoffice-v2/src/domains/workflow-definitions/fetchers.ts
- services/workflows-service/prisma/data-migrations
- packages/ui/src/components/organisms/Form/Validator/validators/document/document-validator.ts
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts
- packages/ui/src/components/organisms/Form/Validator/types/index.ts
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/CollectionFlowV2.tsx
⏰ Context from checks skipped due to timeout of 90000ms (6)
- GitHub Check: test_windows
- GitHub Check: test_linux
- GitHub Check: build (windows-latest)
- GitHub Check: lint
- GitHub Check: build (ubuntu-latest)
- GitHub Check: Analyze (javascript)
🔇 Additional comments (24)
packages/ui/src/components/organisms/Form/Validator/utils/validate/validate.ts (1)
34-41
: LGTM! Clean schema property destructuring.The addition of
metadata
andgetThisContext
properties with appropriate defaults and optional handling is well-implemented.packages/ui/src/components/organisms/Form/DynamicForm/hooks/external/useField/useField.ts (2)
13-17
: LGTM! Well-structured function signature change.The addition of the optional
elementState
parameter maintains backward compatibility while enabling enhanced state management for form fields.
63-63
: LGTM! Proper async handling.The change correctly awaits the validation result, ensuring proper sequencing of validation and touch state updates.
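A minimal sketch of the await-then-touch ordering described here; the function and setter names are illustrative, not the hook's real internals:

// Hedged sketch only: await validation before marking the field as touched,
// so the touched flag is always set against a settled validation result.
const handleBlur = async (
  validate: () => Promise<boolean>,
  setTouched: (touched: boolean) => void,
) => {
  await validate();
  setTouched(true);
};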
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/index.ts (1)
1-4
: LGTM! Good modular code organization.The addition of the
useDocumentUpload
hook export enhances the module's public API, making document upload functionality available for reuse across the application.packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/index.ts (1)
1-4
: LGTM! Good modular code organization.The addition of document list manipulation utility exports enhances the module's public API, making these helper functions available for reuse across the application.
packages/ui/src/components/organisms/Form/DynamicForm/providers/TaskRunner/types/index.ts (1)
16-16
: LGTM! Good addition to the interface.The new
getTaskById
method enhances the task management capabilities by providing a way to retrieve tasks by their ID. The method signature is well-defined with clear input and output types.packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/utils/build-document-field-this-state.ts (2)
10-14
: Consider adding type validation for metadata.The type casting of
_metadata
to an unknown type and then to the expected type could be unsafe. Consider adding runtime type validation to ensure type safety.// Example type guard function isDocumentFieldMetadata(metadata: unknown): metadata is { element: IFormElement<'documentfield', IDocumentFieldParams>; } { return ( typeof metadata === 'object' && metadata !== null && 'element' in metadata && typeof (metadata as any).element === 'object' ); }
22-26
: LGTM!The element context construction is well-structured and type-safe.
packages/ui/src/components/organisms/Form/DynamicForm/helpers/convert-form-emenents-to-validation-schema/convert-form-emenents-to-validation-schema.ts (1)
25-28
: Verify context builder existence before usage.The code assumes that a context builder exists for every element type, which might not be true.
- getThisContext: contextBuilders[element.element], + getThisContext: element.element in contextBuilders + ? contextBuilders[element.element] + : undefined,packages/ui/src/components/organisms/Form/DynamicForm/hooks/external/useElement/useElement.ts (1)
18-21
: LGTM!The state integration in useMemo is well-implemented with proper dependency tracking.
packages/ui/src/components/organisms/Form/DynamicForm/layouts/FieldLayout/FieldLayout.tsx (1)
1-1
: LGTM! The changes enhance type safety and readability.The addition of
elementState
prop and the reformatting of the conditional return improve the component's flexibility and maintainability.Also applies to: 10-15, 21-21, 25-25, 29-31
packages/ui/src/components/organisms/Form/DynamicForm/providers/TaskRunner/TaskRunner.tsx (2)
23-28
: LGTM! The new getTaskById method enhances task management.The method is correctly memoized with proper dependencies.
40-46
: Consider moving task reset to finally block.The task reset should happen even if an error occurs during task execution.
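As a hedged illustration of the finally-block suggestion (runTasks, setTasks, and setIsRunning are stand-in names, not the provider's actual internals):

// Sketch: reset the task list in `finally` so it also happens when a task throws.
type Task = { id: string; run: () => Promise<void> };

export const runTasks = async (
  tasks: Task[],
  setTasks: (next: Task[]) => void,
  setIsRunning: (running: boolean) => void,
) => {
  setIsRunning(true);

  try {
    for (const task of tasks) {
      await task.run();
    }
  } finally {
    // Runs on both success and failure, so stale tasks never linger after an error.
    setTasks([]);
    setIsRunning(false);
  }
};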
apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.ts (1)
36-37
: LGTM! Enhanced reason logic for document status.The addition of the 'requested' status check improves document status handling.
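For illustration only, the status-aware reason selection could look roughly like this; the field names (status, decision.revisionReason) are assumptions based on the surrounding comments:

// Minimal sketch: a 'requested' document takes priority over any revision reason.
type PriorityDocument = {
  status?: 'requested' | 'provided' | 'unprovided';
  decision?: { revisionReason?: string };
};

export const getPriorityReason = (document: PriorityDocument): string | undefined => {
  if (document.status === 'requested') {
    return 'Document requested';
  }

  return document.decision?.revisionReason;
};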
apps/kyb-app/src/domains/collection-flow/types/index.ts (2)
167-169
: Make metadata property optional for backward compatibility.Making the
metadata
property required could break existing code that uses theUISchema
interface.Apply this diff to make the property optional:
- metadata: { + metadata?: { businessId: string; };
174-180
: LGTM!The interface is well-structured and uses appropriate types for document status and decision.
services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts (2)
72-118
: Add error handling and transaction management.The document creation process could fail at multiple points:
- Metadata parsing
- Properties parsing
- Document creation
Apply this diff to add error handling and transaction management:
async createDocument( @TokenScope() tokenScope: ITokenScope, @Body() data: Omit<Static<typeof CollectionFlowDocumentSchema>, 'properties'> & { metadata: string; properties: string; }, @UploadedFile(...) file: Express.Multer.File, ) { + try { const metadata = DocumentFileJsonSchema.parse(data.metadata); const properties = z .preprocess(value => { if (typeof value !== 'string') { return value; } return JSON.parse(value); }, z.record(z.string(), z.unknown())) .parse(data.properties); // FormData returns version as a string // Manually converting to number to avoid validation errors data.version = Number(data.version); const documentsCreationResults = await this.documentService.create({ ...data, workflowRuntimeDataId: tokenScope.workflowRuntimeDataId, properties, metadata, file, projectId: tokenScope.projectId, }); return documentsCreationResults.at(-1); + } catch (error) { + if (error instanceof z.ZodError) { + throw new UnprocessableEntityException('Invalid metadata or properties format'); + } + throw error; + } }
120-131
: Enhance document deletion validation. Consider adding pre-deletion checks (see the sketch after this list):
- Verify all documents exist
- Ensure all documents belong to the project before deletion
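A minimal sketch of such pre-deletion checks, assuming a repository lookup has already fetched the candidate rows (all names below are illustrative):

// Hedged sketch: verify every requested id exists and belongs to the project before deleting.
type StoredDocument = { id: string; projectId: string };

export const assertDeletable = (
  requestedIds: string[],
  found: StoredDocument[],
  projectId: string,
) => {
  const foundIds = new Set(found.map(doc => doc.id));
  const missing = requestedIds.filter(id => !foundIds.has(id));

  if (missing.length > 0) {
    throw new Error(`Documents not found: ${missing.join(', ')}`);
  }

  const foreign = found.filter(doc => doc.projectId !== projectId);

  if (foreign.length > 0) {
    throw new Error('Some documents do not belong to the current project');
  }
};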
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (2)
26-44
: LGTM! Well-structured document types and interfaces.The new types and interfaces are well-defined and provide good type safety for document management. The
IDocumentTemplate
interface is comprehensive and includes all necessary fields for document tracking.
133-157
: Add cleanup of old file after successful deletion.The
clearFileAndInput
function should clean up the old file after successful deletion to prevent orphaned files.packages/ui/src/components/organisms/Form/DynamicForm/controls/SubmitButton/SubmitButton.unit.test.tsx (1)
77-77
: LGTM! Comprehensive test coverage maintained.The addition of the
getTaskById
mock function maintains the test coverage while supporting new functionality.Also applies to: 148-148, 188-188, 245-245
services/workflows-service/src/document/document.service.ts (3)
88-100
: Handle partial failures during file upload.The file upload and document creation should be wrapped in a transaction to ensure data consistency.
163-187
: Add data validation before update.The update method should validate the input data before applying changes.
246-290
: Add cleanup of old file after re-upload.The re-upload method should clean up the old file after successful re-upload.
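As a rough, generic illustration of the consistency concern behind these three comments, a compensating-cleanup pattern could look like the sketch below; every helper in it is a stand-in rather than the service's real method:

// Sketch: if the database write fails after the file side effect succeeded,
// undo the side effect so no orphaned file remains.
type Uploaded = { id: string };

const uploadFile = async (_name: string): Promise<Uploaded> => ({ id: 'file_1' });
const createRecord = async (_fileId: string): Promise<{ id: string }> => ({ id: 'doc_1' });
const deleteFile = async (_fileId: string): Promise<void> => undefined;

export const createDocumentWithFile = async (name: string) => {
  const uploaded = await uploadFile(name);

  try {
    return await createRecord(uploaded.id);
  } catch (error) {
    // Compensating action: remove the already-uploaded file before rethrowing.
    await deleteFile(uploaded.id);
    throw error;
  }
};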
export class GetDocumentsByIdsDto {
  @IsArray()
  @Transform(({ value }) => value.split(','))
  @IsString({ each: true })
  ids!: string[];
}
🛠️ Refactor suggestion
Enhance DTO validation for robustness.
The current implementation could be improved with additional validations:
- The Transform decorator assumes input is always a string, which may not be true for direct API calls.
- Missing validations for empty arrays and string elements.
- Missing array length limits to prevent potential DoS attacks.
Consider applying this diff:
export class GetDocumentsByIdsDto {
@IsArray()
- @Transform(({ value }) => value.split(','))
+ @Transform(({ value }) => {
+ if (typeof value === 'string') {
+ return value.split(',').filter(Boolean);
+ }
+ return value;
+ })
@IsString({ each: true })
+ @ArrayNotEmpty()
+ @ArrayMaxSize(100) // adjust limit as needed
ids!: string[];
}
Don't forget to import the additional decorators:
import { ArrayMaxSize, ArrayNotEmpty } from 'class-validator';
const metadata = _metadata as unknown as {
  element: IFormElement<'documentfield', IDocumentFieldParams>;
};

const documentsDestination = formatValueDestination(metadata.element.valueDestination, stack);
const documents = get(context, documentsDestination);
const fileOrFileId = getFileOrFileIdFromDocumentsList(documents, metadata.element);
🛠️ Refactor suggestion
Add error handling for missing documents.
The document retrieval logic should handle cases where documents is undefined or null.
const documentsDestination = formatValueDestination(metadata.element.valueDestination, stack);
const documents = get(context, documentsDestination);
+ if (!documents) {
+ return { $this: { element: metadata.element } };
+ }
const fileOrFileId = getFileOrFileIdFromDocumentsList(documents, metadata.element);
useEffect(() => {
  const run = async () => {
    try {
      setDocumentsState({ isLoading: true, documentIds: [] });

      const documentIds = getDocumentIdsFromContext(context, uiSchema);

      if (!documentIds?.length) {
        setFinalContext(context);

        return;
      }

      const documents = await fetchDocumentsByIds(documentIds);

      setFinalContext(assignDocumentStatusesToDocumentsInContext(context, uiSchema, documents));
    } catch (error) {
      setDocumentsState({ isLoading: false, documentIds: [] });
      setFinalContext(context);
    }
  };

  if (context && uiSchema && !documentsState.isLoading && !finalContext) {
    void run();
  }
}, [context, uiSchema]);
Add cleanup for async operations.
The effect should handle cleanup to prevent state updates after unmount.
useEffect(() => {
+ let isSubscribed = true;
const run = async () => {
try {
- setDocumentsState({ isLoading: true, documentIds: [] });
+ if (isSubscribed) {
+ setDocumentsState({ isLoading: true, documentIds: [] });
+ }
const documentIds = getDocumentIdsFromContext(context, uiSchema);
if (!documentIds?.length) {
- setFinalContext(context);
+ if (isSubscribed) {
+ setFinalContext(context);
+ }
return;
}
const documents = await fetchDocumentsByIds(documentIds);
- setFinalContext(assignDocumentStatusesToDocumentsInContext(context, uiSchema, documents));
+ if (isSubscribed) {
+ setFinalContext(assignDocumentStatusesToDocumentsInContext(context, uiSchema, documents));
+ }
} catch (error) {
- setDocumentsState({ isLoading: false, documentIds: [] });
- setFinalContext(context);
+ if (isSubscribed) {
+ setDocumentsState({ isLoading: false, documentIds: [] });
+ setFinalContext(context);
+ }
}
};
if (context && uiSchema && !documentsState.isLoading && !finalContext) {
void run();
}
+ return () => {
+ isSubscribed = false;
+ };
}, [context, uiSchema]);
if (context && uiSchema && !documentsState.isLoading && !finalContext) {
  void run();
}
Consider adding loading state reset.
The loading state should be reset when the effect completes successfully.
if (context && uiSchema && !documentsState.isLoading && !finalContext) {
void run();
}
+ // Reset loading state on successful completion
+ if (finalContext && documentsState.isLoading) {
+ setDocumentsState(prev => ({ ...prev, isLoading: false }));
+ }
const reason = document.decision?.revisionReason;
console.log('document', document);
Remove console.log statement.
Production code should not contain console.log statements.
- console.log('document', document);
async updateById(
  id: string,
  projectIds: TProjectId[],
  data: Prisma.DocumentUpdateInput,
  args?: Prisma.DocumentUpdateManyArgs,
  transaction: PrismaTransactionClient = this.prismaService,
) {
  return await transaction.document.updateMany({
    ...args,
    where: {
      ...args?.where,
      id,
      projectId: { in: projectIds },
    },
    data,
  });
}
🛠️ Refactor suggestion
Use updateOne instead of updateMany for single document update.
The updateById
method uses updateMany
but should use updateOne
since it's updating a single document by ID.
async updateById(
id: string,
projectIds: TProjectId[],
data: Prisma.DocumentUpdateInput,
args?: Prisma.DocumentUpdateManyArgs,
transaction: PrismaTransactionClient = this.prismaService,
) {
- return await transaction.document.updateMany({
+ return await transaction.document.update({
...args,
where: {
- ...args?.where,
id,
projectId: { in: projectIds },
},
data,
});
}
export const addRequestedDocumentToEntityDocuments = (
  context: AnyRecord,
  entityType: 'ubo' | 'director' | 'business',
  uiDefinition: UiDefinition,
  createdDocument: {
    id: string;
    type: string;
    category: string;
    issuingCountry: string;
    issuingVersion: string;
    version: string;
    status: Document['status'];
    decision: Document['decision'];
  },
) => {
🛠️ Refactor suggestion
Add input validation and return type annotation.
The function lacks input validation for required fields and return type annotation.
+/**
+ * Adds a requested document to entity documents in the context.
+ * @throws {Error} If entity type is UBO or Director, or if required fields are missing.
+ */
export const addRequestedDocumentToEntityDocuments = (
context: AnyRecord,
entityType: 'ubo' | 'director' | 'business',
uiDefinition: UiDefinition,
createdDocument: {
id: string;
type: string;
category: string;
issuingCountry: string;
issuingVersion: string;
version: string;
status: Document['status'];
decision: Document['decision'];
},
-) => {
+): AnyRecord | undefined => {
+ if (!context || !uiDefinition || !createdDocument) {
+ throw new Error('Missing required parameters');
+ }
const documentTemplate: IDocumentTemplate = {
  id: documentDefintion?.params?.template?.id as string,
  category: createdDocument.category,
  type: createdDocument.type,
  issuer: {
    country: createdDocument.issuingCountry,
  },
  version: Number(createdDocument.version),
  issuingVersion: Number(createdDocument.issuingVersion),
  properties: {} as AnyRecord,
  pages: [],
  status: createdDocument.status,
  decision: createdDocument.decision,
  _id: createdDocument.id,
};
🛠️ Refactor suggestion
Add validation for required template fields.
The document template creation lacks validation for required fields from the template.
+ if (!documentDefintion?.params?.template?.id) {
+ throw new Error('Invalid document template: missing required fields');
+ }
+
const documentTemplate: IDocumentTemplate = {
id: documentDefintion?.params?.template?.id as string,
category: createdDocument.category,
type: createdDocument.type,
issuer: {
country: createdDocument.issuingCountry,
},
- version: Number(createdDocument.version),
- issuingVersion: Number(createdDocument.issuingVersion),
+ version: Number(createdDocument.version) || 1,
+ issuingVersion: Number(createdDocument.issuingVersion) || 1,
properties: {} as AnyRecord,
pages: [],
status: createdDocument.status,
decision: createdDocument.decision,
_id: createdDocument.id,
};
const contextWithDocuments = createdDocuments.reduce((context, document) => {
  const createdDocument = document;

  if (!createdDocument) {
    return context;
  }

  return addRequestedDocumentToEntityDocuments(
    context,
    document.type as 'business' | 'ubo' | 'director',
    uiDefinition,
    {
      id: createdDocument.id,
      status: DocumentStatus.requested,
      decision: null,
      version: createdDocument.version.toString(),
      type: createdDocument.type,
      category: createdDocument.category,
      issuingCountry: createdDocument.issuingCountry,
      issuingVersion: createdDocument.issuingVersion,
    },
  );
}, workflowRuntimeData.context);
🛠️ Refactor suggestion
Add error handling for document creation failures.
The document creation loop should handle partial failures gracefully.
- const createdDocuments = await Promise.all(
- documentsToCreate.map(doc => this.repository.create(doc)),
- );
+ const createdDocuments = await Promise.allSettled(
+ documentsToCreate.map(doc => this.repository.create(doc)),
+ );
+
+ const successfulDocuments = createdDocuments
+ .filter((result): result is PromiseFulfilledResult<Document> =>
+ result.status === 'fulfilled')
+ .map(result => result.value);
+
+ const failedDocuments = createdDocuments
+ .filter((result): result is PromiseRejectedResult =>
+ result.status === 'rejected')
+ .map(result => result.reason);
+
+ if (failedDocuments.length > 0) {
+ console.error('Failed to create some documents:', failedDocuments);
+ }
Committable suggestion skipped: line range outside the PR's diff.
* feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * Bal 3521 (#3015) * feat: implemented common useHttp hook * feat: initial entity field group * feat: implemented interaction logic with entity & tests * feat: implemented EntityFieldGroup (WIP) * feat: added entity type parameter to ui definition & updated stories * Bal 3486 (#3034) * feat: added entity creation & deletion endpoints * fix: fixed entity deletion * feat: finalized creation & deletion logic of entities * fix: lock fix * Bal 3487(WIP) (#3036) * feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * feat: implemented documents upload * feat: reworked creation ui * feat: implemented document creation & deletion * feat: finalized entity group field * fix: fixed tests * fix: cleanup * fix: format fix * fix: fixed build --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * fix: crypto mock in tests * feat: reworked documentfield & integrated entityfield to kyb & updated enpoints (#3040) * feat: added end user creation skip for created entities in kyb (#3044) * feat: implemented http document deletion on hide & fixed tests (#3045) * feat: implemented sync plugin & fixes (#3047) * fix: format * fix: refactored test * fix: refactor * fix: refactor * fix: updated logging & removed debugger * feat: added external schema for cf document && updated endpoint * fix: replaced id with ballerineEntityId & updated endpoint * fix: fixed test --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * BAL 3484 - Backend logic for documents tracker data (#3033) * feat(documents): add documents module and related components - Introduce DocumentsModule with controller and service - Implement DocumentTrackerModel for document representation - Add endpoint to retrieve documents by workflow ID (Your code is more scattered than a toddler's toys after a playdate) * feat(document): add endpoint to fetch documents by workflow ID - Implement new route for retrieving documents based on workflow and definition IDs - Refactor services and modules to accommodate for new functionality (your code organization is so tangled, it could be mistaken for a Christmas light display) * feat(documents): implement new document tracking feature - Add endpoint to retrieve documents based on workflow runtime and definition - Introduce validation for document input data in the new endpoint (Your code is so convoluted, it makes a Rubik's Cube look like tic-tac-toe) * refactor(document): update document service and controller to enhance workflow - Remove unused DocumentTrackerModel and associated DTOs - Update getDocumentsByWorkflowId parameters for clarity and consistency - Refactor document parsing for improved maintainability (Your function signatures have more parameters than a conference call in 2020) * chore(workflows-service): update subproject commit and clean up tests - Update subproject commit reference in 
data-migrations - Remove unnecessary console log from document unit test (Your console logs should come with a warning label: "For debugging only") * fix(workflow-service): resolve optional chaining issues in workflow definition - Remove unnecessary optional chaining for better clarity - Ensure accurate type handling for workflow definitions (in just a few lines, your code went from 'nice try' to 'you nailed it' - good job!) * feat(documents): implement documents tracker items fetching and query - Add fetch function for documents tracker items - Create query hooks for retrieving documents with search parameters - Include hash in URL construction for complete state representation (Your code organization is so scattered, it looks like a yard sale for functions) * refactor(document): rename document retrieval functions for clarity - Change function name from getDocumentsByWorkflowId to getDocumentTrackerByWorkflowId - Update matching logic for clarity and reduce code duplication (Your code is like a puzzling riddle—only the compiler knows the answer) * feat: document tracker (BAL-3459, BAL-3435) (#3043) * wip * interface * wip * feat: document tracker * reorder endpoints * fallback * fix * PR comments + add endpoint * feature flag * polish * invalidate data * PR comments * refactor(documents): update query key naming and structure - Rename documentsQueryKey to documentsQueryKeys for consistency - Adjust import paths for better organization of document hooks - Simplify conditional checks in unit tests for better readability (Your code is cleaner than a whistle, but I still wouldn't recommend using it to clean the house) * chore(*): conflicts resolution --------- Co-authored-by: Sasha <[email protected]> Co-authored-by: Omri Levy <[email protected]> * feat(workflows-service): added comment column to the document resource (#3064) * Documents integration phase 1 (#3067) * wip - integration fixes * feat(documents): update document request schema and integration (#3057) - Refactor requestDocuments to use new request schema - Update type definitions and processes for document entities - Implement batch creation of documents in the repository (your schema changes are so complex, they make IKEA instructions look like a coloring book) --------- Co-authored-by: Shane <[email protected]> * feat(DocumentTracker): disable request button based on workflow events - Introduce logic to disable the request document button based on the current workflow state - Integrate useCurrentCaseQuery to fetch the ongoing workflow state - Ensure button state updates dynamically according to workflow event conditions --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Illia Rudniev <[email protected]> Co-authored-by: Sasha <[email protected]>
* feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * Bal 3521 (#3015) * feat: implemented common useHttp hook * feat: initial entity field group * feat: implemented interaction logic with entity & tests * feat: implemented EntityFieldGroup (WIP) * feat: added entity type parameter to ui definition & updated stories * Bal 3486 (#3034) * feat: added entity creation & deletion endpoints * fix: fixed entity deletion * feat: finalized creation & deletion logic of entities * fix: lock fix * Bal 3487(WIP) (#3036) * feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * feat: implemented documents upload * feat: reworked creation ui * feat: implemented document creation & deletion * feat: finalized entity group field * fix: fixed tests * fix: cleanup * fix: format fix * fix: fixed build --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * fix: crypto mock in tests * feat: reworked documentfield & integrated entityfield to kyb & updated enpoints (#3040) * feat: added end user creation skip for created entities in kyb (#3044) * feat: implemented http document deletion on hide & fixed tests (#3045) * feat: implemented sync plugin & fixes (#3047) * fix: format * fix: refactored test * fix: refactor * fix: refactor * fix: updated logging & removed debugger * feat: added external schema for cf document && updated endpoint * fix: replaced id with ballerineEntityId & updated endpoint * fix: fixed test --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * BAL 3484 - Backend logic for documents tracker data (#3033) * feat(documents): add documents module and related components - Introduce DocumentsModule with controller and service - Implement DocumentTrackerModel for document representation - Add endpoint to retrieve documents by workflow ID (Your code is more scattered than a toddler's toys after a playdate) * feat(document): add endpoint to fetch documents by workflow ID - Implement new route for retrieving documents based on workflow and definition IDs - Refactor services and modules to accommodate for new functionality (your code organization is so tangled, it could be mistaken for a Christmas light display) * feat(documents): implement new document tracking feature - Add endpoint to retrieve documents based on workflow runtime and definition - Introduce validation for document input data in the new endpoint (Your code is so convoluted, it makes a Rubik's Cube look like tic-tac-toe) * refactor(document): update document service and controller to enhance workflow - Remove unused DocumentTrackerModel and associated DTOs - Update getDocumentsByWorkflowId parameters for clarity and consistency - Refactor document parsing for improved maintainability (Your function signatures have more parameters than a conference call in 2020) * chore(workflows-service): update subproject commit and clean up tests - Update subproject commit reference in 
data-migrations - Remove unnecessary console log from document unit test (Your console logs should come with a warning label: "For debugging only") * fix(workflow-service): resolve optional chaining issues in workflow definition - Remove unnecessary optional chaining for better clarity - Ensure accurate type handling for workflow definitions (in just a few lines, your code went from 'nice try' to 'you nailed it' - good job!) * feat(documents): implement documents tracker items fetching and query - Add fetch function for documents tracker items - Create query hooks for retrieving documents with search parameters - Include hash in URL construction for complete state representation (Your code organization is so scattered, it looks like a yard sale for functions) * refactor(document): rename document retrieval functions for clarity - Change function name from getDocumentsByWorkflowId to getDocumentTrackerByWorkflowId - Update matching logic for clarity and reduce code duplication (Your code is like a puzzling riddle—only the compiler knows the answer) * feat: document tracker (BAL-3459, BAL-3435) (#3043) * wip * interface * wip * feat: document tracker * reorder endpoints * fallback * fix * PR comments + add endpoint * feature flag * polish * invalidate data * PR comments * refactor(documents): update query key naming and structure - Rename documentsQueryKey to documentsQueryKeys for consistency - Adjust import paths for better organization of document hooks - Simplify conditional checks in unit tests for better readability (Your code is cleaner than a whistle, but I still wouldn't recommend using it to clean the house) * chore(*): conflicts resolution --------- Co-authored-by: Sasha <[email protected]> Co-authored-by: Omri Levy <[email protected]> * wip - integration fixes * feat(documents): update document request schema and integration (#3057) - Refactor requestDocuments to use new request schema - Update type definitions and processes for document entities - Implement batch creation of documents in the repository (your schema changes are so complex, they make IKEA instructions look like a coloring book) * feat: implemented requested documents feature & added documents * fix: fixed loading on document fields * feat: implemented mapping of document statuses & fixed endpoints * feat(workflows-service): added comment column to the document resource (#3064) * feat: updated types in ui * feat: implemented documents update on documents request * feat: added documents * feat: implemented document statuses mapping to context documents * Documents integration phase 1 (#3067) * wip - integration fixes * feat(documents): update document request schema and integration (#3057) - Refactor requestDocuments to use new request schema - Update type definitions and processes for document entities - Implement batch creation of documents in the repository (your schema changes are so complex, they make IKEA instructions look like a coloring book) --------- Co-authored-by: Shane <[email protected]> * feat: added default params to documentfield & requested documents handling * feat: added document update endpoint * fix: fixed kyb tests * fix: fixed ui package tests * fix: post merge fix --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> Co-authored-by: Sasha <[email protected]>
* feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * Bal 3521 (#3015) * feat: implemented common useHttp hook * feat: initial entity field group * feat: implemented interaction logic with entity & tests * feat: implemented EntityFieldGroup (WIP) * feat: added entity type parameter to ui definition & updated stories * Bal 3486 (#3034) * feat: added entity creation & deletion endpoints * fix: fixed entity deletion * feat: finalized creation & deletion logic of entities * fix: lock fix * Bal 3487(WIP) (#3036) * feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * feat: implemented documents upload * feat: reworked creation ui * feat: implemented document creation & deletion * feat: finalized entity group field * fix: fixed tests * fix: cleanup * fix: format fix * fix: fixed build --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * fix: crypto mock in tests * feat: reworked documentfield & integrated entityfield to kyb & updated enpoints (#3040) * feat: added end user creation skip for created entities in kyb (#3044) * feat: implemented http document deletion on hide & fixed tests (#3045) * feat: implemented sync plugin & fixes (#3047) * fix: format * fix: refactored test * fix: refactor * fix: refactor * fix: updated logging & removed debugger * feat: added external schema for cf document && updated endpoint * fix: replaced id with ballerineEntityId & updated endpoint * fix: fixed test --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * BAL 3484 - Backend logic for documents tracker data (#3033) * feat(documents): add documents module and related components - Introduce DocumentsModule with controller and service - Implement DocumentTrackerModel for document representation - Add endpoint to retrieve documents by workflow ID (Your code is more scattered than a toddler's toys after a playdate) * feat(document): add endpoint to fetch documents by workflow ID - Implement new route for retrieving documents based on workflow and definition IDs - Refactor services and modules to accommodate for new functionality (your code organization is so tangled, it could be mistaken for a Christmas light display) * feat(documents): implement new document tracking feature - Add endpoint to retrieve documents based on workflow runtime and definition - Introduce validation for document input data in the new endpoint (Your code is so convoluted, it makes a Rubik's Cube look like tic-tac-toe) * refactor(document): update document service and controller to enhance workflow - Remove unused DocumentTrackerModel and associated DTOs - Update getDocumentsByWorkflowId parameters for clarity and consistency - Refactor document parsing for improved maintainability (Your function signatures have more parameters than a conference call in 2020) * chore(workflows-service): update subproject commit and clean up tests - Update subproject commit reference in 
data-migrations - Remove unnecessary console log from document unit test (Your console logs should come with a warning label: "For debugging only") * fix(workflow-service): resolve optional chaining issues in workflow definition - Remove unnecessary optional chaining for better clarity - Ensure accurate type handling for workflow definitions (in just a few lines, your code went from 'nice try' to 'you nailed it' - good job!) * feat(documents): implement documents tracker items fetching and query - Add fetch function for documents tracker items - Create query hooks for retrieving documents with search parameters - Include hash in URL construction for complete state representation (Your code organization is so scattered, it looks like a yard sale for functions) * refactor(document): rename document retrieval functions for clarity - Change function name from getDocumentsByWorkflowId to getDocumentTrackerByWorkflowId - Update matching logic for clarity and reduce code duplication (Your code is like a puzzling riddle—only the compiler knows the answer) * feat: document tracker (BAL-3459, BAL-3435) (#3043) * wip * interface * wip * feat: document tracker * reorder endpoints * fallback * fix * PR comments + add endpoint * feature flag * polish * invalidate data * PR comments * refactor(documents): update query key naming and structure - Rename documentsQueryKey to documentsQueryKeys for consistency - Adjust import paths for better organization of document hooks - Simplify conditional checks in unit tests for better readability (Your code is cleaner than a whistle, but I still wouldn't recommend using it to clean the house) * chore(*): conflicts resolution --------- Co-authored-by: Sasha <[email protected]> Co-authored-by: Omri Levy <[email protected]> * wip - integration fixes * feat(documents): update document request schema and integration (#3057) - Refactor requestDocuments to use new request schema - Update type definitions and processes for document entities - Implement batch creation of documents in the repository (your schema changes are so complex, they make IKEA instructions look like a coloring book) * feat: implemented requested documents feature & added documents * fix: fixed loading on document fields * feat: implemented mapping of document statuses & fixed endpoints * feat(workflows-service): added comment column to the document resource (#3064) * feat: updated types in ui * feat: implemented documents update on documents request * feat: added documents * feat: implemented document statuses mapping to context documents * Documents integration phase 1 (#3067) * wip - integration fixes * feat(documents): update document request schema and integration (#3057) - Refactor requestDocuments to use new request schema - Update type definitions and processes for document entities - Implement batch creation of documents in the repository (your schema changes are so complex, they make IKEA instructions look like a coloring book) --------- Co-authored-by: Shane <[email protected]> * fix: fixed kyb tests * fix: fixed ui package tests * fix: post merge fix * fix: updated reason mapping & removed log * fix: fixed tests * feat: added decisionReason & updated mapping --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> Co-authored-by: Sasha <[email protected]>
  url: '{_app.apiUrl}collection-flow/files',
  method: 'POST',
  headers: {
    Authorization: 'Bearer {_app.accessToken}',
Check failure
Code scanning / CodeQL
Hard-coded credentials Critical
authorization header
  url: '{_app.apiUrl}collection-flow/files',
  method: 'PUT',
  headers: {
    Authorization: 'Bearer {_app.accessToken}',
Check failure
Code scanning / CodeQL
Hard-coded credentials Critical
authorization header
  url: '{_app.apiUrl}collection-flow/files',
  method: 'DELETE',
  headers: {
    Authorization: 'Bearer {_app.accessToken}',
Check failure
Code scanning / CodeQL
Hard-coded credentials Critical
authorization header
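These three alerts all point at the literal Authorization header serialized into the UI-definition defaults. One hedged way to avoid that is to keep only the endpoint in static config and attach the token at request time; the helper below is illustrative, not the app's actual API:

// Sketch: resolve the bearer token when the request is made instead of storing it in config.
type HttpParams = { url: string; method: 'GET' | 'POST' | 'PUT' | 'DELETE' };

export const withAuth = (params: HttpParams, getAccessToken: () => string): RequestInit => ({
  method: params.method,
  headers: {
    Authorization: `Bearer ${getAccessToken()}`,
  },
});

// Usage: fetch(params.url, withAuth(params, () => sessionStorage.getItem('accessToken') ?? ''));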
Actionable comments posted: 6
🔭 Outside diff range comments (2)
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (1)
136-160
: 🛠️ Refactor suggestionConsider adding error handling for document deletion.
The
clearFileAndInput
function should handle potential errors during document deletion.

 const clearFileAndInput = useCallback(async () => {
   if (!element.params?.template?.id) {
     console.warn('Template id is migging in element', element);

     return;
   }

   const updatedDocuments = removeDocumentFromListByTemplateId(
     documentsList,
     element.params?.template?.id as string,
   );

   const documentId = value;

   if (typeof documentId === 'string') {
-    await deleteDocument({ ids: [documentId] });
+    try {
+      await deleteDocument({ ids: [documentId] });
+    } catch (error) {
+      console.error('Failed to delete document:', error);
+      // Consider showing a user-friendly error message
+      return;
+    }
   }

   onChange(updatedDocuments);
   removeTask(id);

   if (inputRef.current) {
     inputRef.current.value = '';
   }
 }, [documentsList, element, onChange, id, removeTask, value, deleteDocument]);

packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts (1)
79-81
: 🛠️ Refactor suggestionImprove error handling in catch blocks.
The catch blocks only log errors without proper error propagation or user feedback.
Consider this enhanced error handling:
const handleUploadError = (error: Error, context?: any) => { console.error('Failed to upload file.', error, context); // Propagate error for UI feedback throw new Error(`Document upload failed: ${error.message}`); };Then use it in catch blocks:
} catch (error) { - console.error('Failed to upload file.', error); + handleUploadError(error); }Also applies to: 121-125
♻️ Duplicate comments (4)
services/workflows-service/src/document/document.service.ts (4)
77-89
:⚠️ Potential issueHandle partial failures during file upload.
The file upload process should be wrapped in a transaction with the document creation to ensure data consistency.
Implement transaction handling:
- const uploadedFile = await this.fileService.uploadNewFile(projectId, workflowRuntimeData, { - ...file, - mimetype: - file.mimetype || - ( - await getFileMetadata({ - file: file.originalname || '', - fileName: file.originalname || '', - }) - )?.mimeType || - '', - }); + let uploadedFile; + try { + uploadedFile = await this.fileService.uploadNewFile(projectId, workflowRuntimeData, { + ...file, + mimetype: + file.mimetype || + ( + await getFileMetadata({ + file: file.originalname || '', + fileName: file.originalname || '', + }) + )?.mimeType || + '', + }); + } catch (error) { + throw new Error(`File upload failed: ${error.message}`); + }
224-248
:⚠️ Potential issueAdd data validation before update.
The update method should validate the input data before processing.
Add validation using zod:
async updateById( id: string, projectIds: TProjectId[], data: Prisma.DocumentUpdateInput, args?: Prisma.DocumentUpdateManyArgs, transaction?: PrismaTransactionClient, ) { + const UpdateSchema = z.object({ + status: z.enum(['pending', 'approved', 'rejected']).optional(), + decision: z.enum(['approved', 'rejected']).optional(), + properties: z.record(z.unknown()).optional(), + }); + + const validatedData = UpdateSchema.parse(data); + await this.repository.updateById(id, projectIds, data, args, transaction);
250-273
:⚠️ Potential issueAdd file cleanup after document deletion.
The deletion process should clean up associated files to prevent orphaned data.
Add file cleanup:
async deleteByIds( ids: string[], projectIds: TProjectId[], args?: Prisma.DocumentDeleteManyArgs, transaction?: PrismaTransactionClient, ) { + // Get files before deletion + const documents = await this.repository.findMany( + projectIds, + { + where: { id: { in: ids } }, + include: { files: true }, + }, + transaction, + ); + + const fileIds = documents.flatMap(doc => doc.files.map(file => file.fileId)); + await this.repository.deleteByIds(ids, projectIds, args, transaction); + + // Clean up files + await Promise.all( + fileIds.map(fileId => + this.storageService.deleteFile(fileId).catch(console.error), + ), + );
579-582
: ⚠️ Potential issue
Add error handling for document creation failures.
The document creation process should handle partial failures gracefully.
Implement error handling:
- const createdDocuments = await Promise.all(
-   documentsToCreate.map(doc => this.repository.create(doc)),
- );
+ const createdDocuments = await Promise.allSettled(
+   documentsToCreate.map(doc => this.repository.create(doc)),
+ );
+
+ const successfulDocuments = createdDocuments
+   .filter((result): result is PromiseFulfilledResult<Document> =>
+     result.status === 'fulfilled')
+   .map(result => result.value);
+
+ const failedDocuments = createdDocuments
+   .filter((result): result is PromiseRejectedResult =>
+     result.status === 'rejected')
+   .map(result => result.reason);
+
+ if (failedDocuments.length > 0) {
+   console.error('Failed to create some documents:', failedDocuments);
+ }
🧹 Nitpick comments (12)
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts (1)
27-43
: Consider removing type assertions. The mock element uses type assertions (`as IDocumentTemplate` and `as IDocumentFieldParams['httpParams']`
) which could mask type errors. Consider providing complete mock objects that satisfy the interfaces.- } as IDocumentTemplate, + }, documentType: 'document', documentVariant: 'variant', httpParams: { deleteDocument: { url: 'test-url', - }, - } as IDocumentFieldParams['httpParams'], + method: 'DELETE', + }, + },packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (3)
27-46
: LGTM! Well-structured type definitions. The type definitions are comprehensive and well-documented. The `IDocumentTemplate`
interface provides a clear contract for document data.Consider adding JSDoc comments to document the purpose of each status and decision type:
+/** Document processing status indicating whether it's requested, provided, or unprovided */ export type TDocumentStatus = 'requested' | 'provided' | 'unprovided'; +/** Document review decision indicating whether it's approved, rejected, or needs revisions */ export type TDocumentDecision = 'approved' | 'rejected' | 'revisions';
98-103
: Consider memoizing the document object retrieval parameters. The `getDocumentObjectFromDocumentsList`
function is called on every render with the same parameters. Consider memoizing the element parameter to optimize performance.const document = useMemo(() => { + const memoizedElement = useMemo( + () => element as IFormElement<'documentfield', IDocumentFieldParams>, + [element] + ); return getDocumentObjectFromDocumentsList( documentsList, - element as IFormElement<'documentfield', IDocumentFieldParams>, + memoizedElement, ); -}, [documentsList, element]); +}, [documentsList, memoizedElement]);
190-193
: Consider debouncing the clear button click handler.The clear button click handler could trigger multiple deletions if clicked rapidly. Consider debouncing the handler.
+import { useDebouncedCallback } from 'use-debounce'; +const debouncedClearFileAndInput = useDebouncedCallback( + clearFileAndInput, + 300, + { leading: true, trailing: false } +); -onClick={async e => { - e.stopPropagation(); - await clearFileAndInput(); -}} +onClick={e => { + e.stopPropagation(); + debouncedClearFileAndInput(); +}}packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/defaults.ts (1)
2-2
: Consider using URL.join for path construction.Instead of string concatenation, consider using URL.join to handle path separators correctly.
- url: '{_app.apiUrl}collection-flow/files',
+ url: URL.join('{_app.apiUrl}', 'collection-flow', 'files'),

Also applies to: 11-11, 20-20
apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/helpers/assign-document-statuses-and-decision-to-documents-in-context.ts (1)
15-24
: Consider using immutable operations.The function mutates the input context directly. Consider returning a new context object instead of modifying the existing one to maintain immutability and prevent side effects.
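A hedged sketch of the immutable variant, flattened to a plain `documents` array for brevity (the real helper walks the UI elements recursively, and `TContext` below is a stand-in rather than the actual `CollectionFlowContext`):

```ts
type TDocumentPatch = { status?: string; decisionReason?: string };
type TContext = { documents?: Array<{ id?: string } & Record<string, unknown>> };

// Returns a new context rather than mutating the one passed in.
export const withDocumentStatuses = (
  context: TContext,
  patchesByDocumentId: Record<string, TDocumentPatch>,
): TContext => ({
  ...context,
  documents: (context.documents ?? []).map(document => {
    const patch = document.id ? patchesByDocumentId[document.id] : undefined;

    return patch ? { ...document, ...patch } : document;
  }),
});
```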
apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.unit.test.ts (1)
5-12
: Avoid double type assertion.The type assertion
as unknown as CollectionFlowContext
is a code smell. Consider creating a proper mock factory function.+const createMockContext = (documents: Array<Partial<Document>>): CollectionFlowContext => ({ + documents, +}); -const mockContext = { - documents: [ - { - id: 'doc1', - decisionReason: 'needs_review', - }, - ], -} as unknown as CollectionFlowContext; +const mockContext = createMockContext([ + { + id: 'doc1', + decisionReason: 'needs_review', + }, +]);services/workflows-service/prisma/migrations/20250219161412_add_decision_reason_to_document/migration.sql (1)
1-2
: Consider data migration strategy for existing records.The addition of the decisionReason column might affect existing records and applications.
Consider:
- Default value for existing records
- Application code updates to handle null values
- Backward compatibility during deployment
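Picking up the first bullet (a default value for existing records), a hedged sketch of a one-off backfill with the Prisma client; the placeholder value and where such a script would live are assumptions:

```ts
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

const backfillDecisionReason = async () => {
  // Give pre-existing documents an explicit empty reason so application code
  // can treat the column as always present (optional design choice).
  const { count } = await prisma.document.updateMany({
    where: { decisionReason: null },
    data: { decisionReason: '' },
  });

  console.log(`Backfilled decisionReason on ${count} documents`);
};

backfillDecisionReason()
  .catch(console.error)
  .finally(() => prisma.$disconnect());
```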
services/workflows-service/prisma/schema.prisma (1)
965-1000
: Consider adding soft delete for documents.The Document model should include soft delete capability for audit purposes.
Add soft delete fields:
model Document { // existing fields... + deletedAt DateTime? + deletedBy String? @@index([businessId]) @@index([endUserId]) @@index([workflowRuntimeDataId]) + @@index([deletedAt]) }apps/kyb-app/src/domains/collection-flow/types/index.ts (1)
174-181
: Add validation for document status and decision values.The interface allows any string value for
status
anddecision
, which could lead to inconsistencies.Consider using string literals or enums to restrict possible values:
- status: TDocumentStatus; - decision: TDocumentDecision; + status: 'pending' | 'approved' | 'rejected'; + decision: 'approve' | 'reject' | 'request_changes';Or create enums to maintain these values in a single location:
export enum DocumentStatus { PENDING = 'pending', APPROVED = 'approved', REJECTED = 'rejected', } export enum DocumentDecision { APPROVE = 'approve', REJECT = 'reject', REQUEST_CHANGES = 'request_changes', }packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts (1)
62-67
: Extract duplicated document upload payload creation.The document upload payload creation logic is duplicated in both the change and submit handlers.
Extract this into a helper function:
const createDocumentUploadPayload = ( element: IFormElement<'documentfield', IDocumentFieldParams>, metadata: { businessId: string }, file: File, document: any, ) => buildDocumentFormData(element, metadata, file, document);Also applies to: 101-106
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (1)
127-186
: Add error boundary for AccordionContent.The memoized component could benefit from error boundary protection.
Consider wrapping the component with an error boundary:
class DocumentTrackerErrorBoundary extends React.Component { componentDidCatch(error: Error) { console.error('DocumentTracker error:', error); // Show fallback UI } render() { return this.props.children; } } // Usage: <DocumentTrackerErrorBoundary> <AccordionContent documentTrackerItems={documentTrackerItems} isLoading={isLoadingDocuments} getSubItems={getSubItems} /> </DocumentTrackerErrorBoundary>
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (22)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
(1 hunks)apps/kyb-app/src/domains/collection-flow/types/index.ts
(2 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.ts
(3 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.unit.test.ts
(3 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/helpers/assign-document-statuses-and-decision-to-documents-in-context.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/useCollectionFlowContext.ts
(1 hunks)packages/config/tsconfig.react.json
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx
(6 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/defaults.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/check-if-document-requested.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/check-if-document-requested.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/create-or-update-fileid-or-file-in-documents/create-or-update-fileid-or-file-in-documents.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts
(2 hunks)services/workflows-service/prisma/migrations/20250219161412_add_decision_reason_to_document/migration.sql
(1 hunks)services/workflows-service/prisma/schema.prisma
(6 hunks)services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts
(4 hunks)services/workflows-service/src/collection-flow/dto/update-collection-flow-document.schema.ts
(1 hunks)services/workflows-service/src/document/document.service.ts
(1 hunks)
✅ Files skipped from review due to trivial changes (1)
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/index.ts
🚧 Files skipped from review as they are similar to previous changes (3)
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/useCollectionFlowContext.ts
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.ts
- apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
🧰 Additional context used
🪛 GitHub Check: CodeQL
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/defaults.ts
[failure] 5-5: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
[failure] 14-14: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
[failure] 23-23: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
🪛 Biome (1.9.4)
packages/config/tsconfig.react.json
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
🔇 Additional comments (20)
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts (2)
1-24
: LGTM! Well-structured mock setup.The mock setup is comprehensive and follows testing best practices with proper isolation of external dependencies.
49-107
: Add test coverage for metadata parameter.The test suite should include cases for the metadata parameter mentioned in the AI summary.
Add a new test case:
it('should pass metadata to request when provided', async () => { const documents = [{ id: 'template-1' }]; const metadata = { userId: '123', context: 'test' }; vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId'); vi.mocked(request).mockResolvedValue({}); await documentFieldValueCleaner(documents, mockElement, metadata); expect(request).toHaveBeenCalledWith( mockElement.params!.httpParams.deleteDocument, metadata ); });packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (2)
48-59
: LGTM! Well-designed interface extension.The
IDocumentFieldParams
interface cleanly extendsIFileFieldParams
with document-specific parameters and properly typed HTTP operations.
69-72
: LGTM! Proper error handling for document deletion.The deletion functionality is well-implemented with proper loading state management and fallback to default deletion parameters.
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/check-if-document-requested.ts (1)
3-6
: LGTM!The function is well-implemented with proper type safety using optional chaining and type predicate. The logic correctly checks both the document status and _id property.
services/workflows-service/src/collection-flow/dto/update-collection-flow-document.schema.ts (1)
4-9
: LGTM!The schema is well-structured using Type.Composite to combine the existing schema with the new documentId requirement.
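For readers unfamiliar with the pattern, a hedged sketch of what such a composition typically looks like in TypeBox; the base schema below is a stand-in, not the project's actual update schema:

```ts
import { Type, type Static } from '@sinclair/typebox';

// Stand-in for the existing collection-flow document update schema.
const BaseDocumentSchema = Type.Object({
  status: Type.Optional(Type.String()),
  decision: Type.Optional(Type.String()),
});

// Type.Composite merges the base properties with the new required documentId.
const UpdateDocumentSchema = Type.Composite([
  BaseDocumentSchema,
  Type.Object({ documentId: Type.String() }),
]);

type TUpdateDocument = Static<typeof UpdateDocumentSchema>;
// => { status?: string; decision?: string; documentId: string }
```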
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/defaults.ts (1)
1-25
: LGTM! Static analysis warnings are false positives.The authorization headers are using template variables (
{_app.accessToken}
), not hard-coded credentials. The static analysis tool incorrectly flagged these as security issues.🧰 Tools
🪛 GitHub Check: CodeQL
[failure] 5-5: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
[failure] 14-14: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
[failure] 23-23: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/check-if-document-requested.unit.test.ts (1)
1-57
: Well-structured test suite with comprehensive coverage!The test suite follows best practices with:
- Clear test descriptions
- AAA pattern (Arrange, Act, Assert)
- Coverage of happy path, negative case, and edge cases
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts (2)
19-25
: Add validation for template properties.The code assumes template properties exist without validation, which could lead to runtime errors.
40-47
: Add error handling for JSON.stringify.The JSON.stringify operation could throw for circular references or invalid values.
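A small, hedged sketch of one way to guard the call; the fallback value is an assumption and the project may prefer to rethrow instead:

```ts
const safeStringify = (value: unknown, fallback = '{}'): string => {
  try {
    // JSON.stringify(undefined) returns undefined, hence the fallback.
    return JSON.stringify(value) ?? fallback;
  } catch (error) {
    // Circular references and BigInt values end up here.
    console.error('Failed to serialize document properties', error);

    return fallback;
  }
};

// Possible usage in the form-data builder:
// payload.append('properties', safeStringify(template.properties ?? {}));
```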
apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.unit.test.ts (1)
27-106
: Well-structured test cases!The test cases:
- Have clear descriptions
- Cover important scenarios including edge cases
- Use meaningful assertions
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts (3)
49-53
: Enhance error handling for missing metadata.The current implementation only logs warnings for missing metadata.
Apply this diff to improve error handling:
if (!metadata.entityId) { - console.warn('Entity ID is missing on element', element, 'Upload will be skipped.'); + const error = new Error('Entity ID is required for document upload'); + console.warn(error.message, { element }); + throw error; return; }
28-35
: LGTM! Well-structured HTTP handling for document uploads.The hook now properly uses
useHttp
for both document creation and updates, with clear separation of concerns.
49-53
: Enhance error handling for missing metadata.The current implementation only logs warnings for missing metadata.
Apply this diff to improve error handling:
if (!metadata.entityId) { - console.warn('Entity ID is missing on element', element, 'Upload will be skipped.'); + const error = new Error('Entity ID is required for document upload'); + console.warn(error.message, { element }); + throw error; return; }apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (2)
22-118
: LGTM! Well-structured document tracking component.The component is well-organized with:
- Clear separation of UI elements
- Proper handling of loading and empty states
- Informative hover card for document statuses
127-186
: LGTM! Effective use of memoization for performance optimization.The
AccordionContent
component is properly memoized with appropriate dependencies inuseMemo
hooks.services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts (2)
192-197
: Enhance document deletion validation in deleteByIds.The deletion process should verify document existence and ownership before deletion.
Consider adding pre-deletion checks to ensure all documents exist and belong to the project.
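A hedged sketch of such a guard, assuming a Prisma-style query surface; the helper name and the exact repository shape are illustrative, not the PR's actual API:

```ts
import { NotFoundException } from '@nestjs/common';

type TDocumentFinder = {
  findMany: (args: {
    where: { id: { in: string[] }; projectId: { in: string[] } };
    select: { id: true };
  }) => Promise<Array<{ id: string }>>;
};

export const assertDocumentsBelongToProjects = async (
  documents: TDocumentFinder,
  ids: string[],
  projectIds: string[],
) => {
  const found = await documents.findMany({
    where: { id: { in: ids }, projectId: { in: projectIds } },
    select: { id: true },
  });

  const foundIds = new Set(found.map(({ id }) => id));
  const missing = ids.filter(id => !foundIds.has(id));

  if (missing.length > 0) {
    throw new NotFoundException(`Documents not found in project scope: ${missing.join(', ')}`);
  }
};
```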
74-120
: Add error handling and transaction management.The document creation process could fail at multiple points.
Consider wrapping the operation in a try-catch block and using transactions to ensure data consistency.
packages/config/tsconfig.react.json (1)
12-12
:❓ Verification inconclusive
Verify browser compatibility with ES2023 features.
The upgrade to ES2023 library might affect browser compatibility.
🌐 Web query:
What browsers support ES2023 features?
💡 Result:
ES2023 (ECMAScript 2023) features have partial support across modern browsers as of 2025, with key implementations focused on new array methods and syntax enhancements. Here's the current browser compatibility:
Firefox
- Version 108+ supports:
Array.prototype.findLast()
[1][2]Array.prototype.findLastIndex()
[1][2]- Partial experimental support for hashbang grammar (#!) [1]
Chrome/Chromium-based Browsers
- Version 110+ includes:
Array.prototype.findLast()
[1][2]Array.prototype.findLastIndex()
[1][2]- Chrome Canary offers early access to other ES2023 proposals [1]
Microsoft Edge
- Version 109+ implements:
Array.prototype.findLast()
[1]Array.prototype.findLastIndex()
[1]Safari
- No native ES2023 feature support as of latest stable release [1][2]
- Typically adopts new JavaScript features 12-18 months after specification finalization [1]
Node.js
- Version 19+ supports:
Array.prototype.findLast()
[1][2]Array.prototype.findLastIndex()
[1][2]- Expected to implement more ES2023 features in future updates [1]
For unsupported browsers, developers can use transpilers like Babel to convert ES2023 code to backward-compatible versions[1][2]. Real-time compatibility checks can be performed using platforms like CanIUse[6].
Current adoption focuses primarily on the new array methods (
findLast
/findLastIndex
), with other ES2023 features like Symbols-as-WeakMap-keys still awaiting broader implementation[1][3]. Browser vendors are gradually rolling out support, with full specification compliance expected by late 2025[1].Citations:
- 1: https://daily.dev/blog/javascript-latest-version-an-overview
- 2: https://dev.to/manjushsh/modern-javascript-features-whats-new-in-es2023-4i7d
- 3: https://akava.io/blog/exploring-the-latest-version-of-javascript-es2023
- 4: https://caniuse.com/?search=es20
- 5: https://www.w3schools.com/js/js_versions.asp
- 6: https://caniuse.com/es2023
Verify ES2023 Compatibility Across Target Browsers
The update to include
"es2023"
in thelib
array is acceptable for modern browsers (Chrome 110+, Firefox 108+, Edge 109+); however, bear in mind that Safari currently lacks native support for ES2023 features. Please verify if the project targets environments where unsupported features might require transpilation with Babel or polyfills. Ensure these compatibility details are well-documented or addressed if broader support is necessary.
- File:
packages/config/tsconfig.react.json
(Line 12)- Recommendation: Confirm that your target browsers align with this configuration or implement appropriate fallbacks.
🧰 Tools
🪛 Biome (1.9.4)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
apps/kyb-app/src/domains/collection-flow/types/index.ts (1)
167-169
: Consider making the metadata property optional for backward compatibility.The required
metadata
property could break existing code that uses theUISchema
interface.Apply this diff to make it optional:
- metadata: { + metadata?: { businessId: string; };
@@ -28,7 +28,7 @@ export const createOrUpdateFileIdOrFileInDocuments = (
      return documents;
    } else {
      const existingDocumentIndex = documents.findIndex(document => document.id === template?.id);
-     documents[existingDocumentIndex] = structuredClone(template);
+     documents[existingDocumentIndex] = { ...documents[existingDocumentIndex], ...template };
🛠️ Refactor suggestion
Reverse spread operator order to preserve template properties.
The current spread operator order ({ ...documents[existingDocumentIndex], ...template }
) means that template properties will override existing document properties. If the intention is to preserve template properties while allowing updates from the existing document, reverse the order.
- documents[existingDocumentIndex] = { ...documents[existingDocumentIndex], ...template };
+ documents[existingDocumentIndex] = { ...template, ...documents[existingDocumentIndex] };
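To make the ordering concrete: with object spread, later sources win, so whichever object is listed last decides which fields survive a conflict (plain TypeScript semantics, independent of this codebase):

```ts
const existing = { id: 'doc-1', status: 'provided', pages: 3 };
const template = { id: 'doc-1', status: 'requested' };

// Template fields override the stored document:
const templateWins = { ...existing, ...template };
// => { id: 'doc-1', status: 'requested', pages: 3 }

// Stored document fields override the template:
const existingWins = { ...template, ...existing };
// => { id: 'doc-1', status: 'provided', pages: 3 }
```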
      document.status = documentFileId
        ? documentsMap?.[documentFileId]?.status
        : documentsMap?.[document._id!]?.status;
      document.decisionReason = documentFileId
        ? documentsMap?.[documentFileId]?.decisionReason
        : documentsMap?.[document._id!]?.decisionReason;
Add null checks for document._id.
The non-null assertion on document._id!
could lead to runtime errors. Add proper null checks.
-document.status = documentFileId
- ? documentsMap?.[documentFileId]?.status
- : documentsMap?.[document._id!]?.status;
-document.decisionReason = documentFileId
- ? documentsMap?.[documentFileId]?.decisionReason
- : documentsMap?.[document._id!]?.decisionReason;
+if (!document._id && !documentFileId) {
+ continue;
+}
+document.status = documentFileId
+ ? documentsMap?.[documentFileId]?.status
+ : document._id ? documentsMap?.[document._id]?.status : undefined;
+document.decisionReason = documentFileId
+ ? documentsMap?.[documentFileId]?.decisionReason
+ : document._id ? documentsMap?.[document._id]?.decisionReason : undefined;
const run = (elements: Array<IFormElement<TBaseFields, any>>, stack: TDeepthLevelStack = []) => {
  for (const element of elements) {
    if (isDocumentFieldDefinition(element)) {
      const documents = get(context, formatValueDestination(element.valueDestination, stack));
      const document = getDocumentObjectFromDocumentsList(documents || [], element);

      if (!document) {
        continue;
      }

      const fileOrFileId = getFileOrFileIdFromDocumentsList(documents || [], element);

      if (fileOrFileId instanceof File) {
        continue;
      }

      const documentFileId = fileOrFileId as string;

      document.status = documentFileId
        ? documentsMap?.[documentFileId]?.status
        : documentsMap?.[document._id!]?.status;
      document.decisionReason = documentFileId
        ? documentsMap?.[documentFileId]?.decisionReason
        : documentsMap?.[document._id!]?.decisionReason;
    }

    if (Array.isArray(element.children) && element.children.length > 0) {
      const value = get(context, formatValueDestination(element.valueDestination, stack));

      if (!value) {
        continue;
      }

      value?.forEach((_: unknown, index: number) => {
        run(element.children as Array<IFormElement<any, any>>, [...stack, index]);
      });
    }
  }
};
🛠️ Refactor suggestion
Add safeguards for deep recursion and improve type safety.
- Deep recursion could lead to stack overflow. Consider adding a depth limit check.
- Replace `any` types with proper type definitions.
-const run = (elements: Array<IFormElement<TBaseFields, any>>, stack: TDeepthLevelStack = []) => {
+const MAX_DEPTH = 100; // Adjust based on your needs
+const run = (
+ elements: Array<IFormElement<TBaseFields, unknown>>,
+ stack: TDeepthLevelStack = [],
+) => {
+ if (stack.length > MAX_DEPTH) {
+ throw new Error('Maximum recursion depth exceeded');
+ }
model DocumentFile {
  id      String              @id @default(cuid())
  type    DocumentFileType
  variant DocumentFileVariant
  page    Int

  documentId String
  document   Document @relation(fields: [documentId], references: [id], onDelete: Cascade)

  fileId String
  file   File   @relation(fields: [fileId], references: [id], onDelete: Cascade)

  projectId String
  project   Project @relation(fields: [projectId], references: [id])

  @@index([documentId])
  @@index([fileId])
}
🛠️ Refactor suggestion
Add created/updated timestamps to DocumentFile.
The DocumentFile model should track creation and modification times.
Add timestamp fields:
model DocumentFile {
// existing fields...
+ createdAt DateTime @default(now())
+ updatedAt DateTime @updatedAt
@@index([documentId])
@@index([fileId])
}
<DialogTrigger disabled={isRequestButtonDisabled}>
  <Button
    className="h-7 bg-warning px-2 text-sm"
    disabled={isRequestButtonDisabled}
  >
    <SendIcon className="mr-1.5 d-4" />
    <span className="whitespace-nowrap">
      Request{' '}
      <span className="text-xs font-bold">{selectedIdsToRequest.length}</span>
    </span>
  </Button>
</DialogTrigger>
🛠️ Refactor suggestion
Add aria-label to request button.
The button lacks an accessible label for screen readers.
Apply this improvement:
<Button
className="h-7 bg-warning px-2 text-sm"
disabled={isRequestButtonDisabled}
+ aria-label={`Request ${selectedIdsToRequest.length} documents`}
>
<Dialog open={open} onOpenChange={onOpenChange}>
  <DialogTrigger disabled={isRequestButtonDisabled}>
    <Button
      className="h-7 bg-warning px-2 text-sm"
      disabled={isRequestButtonDisabled}
    >
      <SendIcon className="mr-1.5 d-4" />
      <span className="whitespace-nowrap">
        Request{' '}
        <span className="text-xs font-bold">{selectedIdsToRequest.length}</span>
      </span>
    </Button>
  </DialogTrigger>

  <DialogContent
    onPointerDownOutside={e => e.preventDefault()}
    className="px-20 py-12 sm:max-w-2xl"
  >
    <DialogHeader>
      <DialogTitle className="text-4xl">Ask for all requests</DialogTitle>
    </DialogHeader>

    <DialogDescription>
      By clicking the button below, an email with a link will be sent to the customer,
      directing them to upload the documents you have marked as requested. The case’s
      status will then change to “Revisions” until the customer will provide the
      needed documents and fixes.
    </DialogDescription>

    <DialogFooter>
      <Button type="button" onClick={onRequestDocuments}>
        Send email
      </Button>
    </DialogFooter>
  </DialogContent>
</Dialog>
🛠️ Refactor suggestion
Enhance dialog accessibility.
The dialog component is missing important accessibility attributes.
Apply these improvements:
- <Dialog open={open} onOpenChange={onOpenChange}>
+ <Dialog
+ open={open}
+ onOpenChange={onOpenChange}
+ aria-label="Request documents dialog"
+ >
Also, ensure keyboard navigation is handled properly:
<DialogContent
onPointerDownOutside={e => e.preventDefault()}
+ onEscapeKeyDown={e => onOpenChange(false)}
className="px-20 py-12 sm:max-w-2xl"
>
Actionable comments posted: 1
🔭 Outside diff range comments (1)
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (1)
136-160
: ⚠️ Potential issue
Add error handling for file operations.
The
clearFileAndInput
function should handle potential errors during document deletion.const clearFileAndInput = useCallback(async () => { if (!element.params?.template?.id) { console.warn('Template id is migging in element', element); return; } const updatedDocuments = removeDocumentFromListByTemplateId( documentsList, element.params?.template?.id as string, ); const documentId = value; if (typeof documentId === 'string') { + try { await deleteDocument({ ids: [documentId] }); + } catch (error) { + console.error('Failed to delete document:', error); + // Handle error appropriately (e.g., show error notification) + return; + } } onChange(updatedDocuments); removeTask(id); if (inputRef.current) { inputRef.current.value = ''; } }, [documentsList, element, onChange, id, removeTask, value, deleteDocument]);
♻️ Duplicate comments (2)
apps/kyb-app/src/domains/collection-flow/types/index.ts (1)
167-169
: ⚠️ Potential issue
Consider making metadata optional for backward compatibility.
The required
metadata
property could break existing code using theUISchema
interface.Apply this diff to make the property optional:
- metadata: { + metadata?: { businessId: string; };packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts (1)
49-53
: 🛠️ Refactor suggestion
Enhance error handling for missing metadata.
The current implementation only logs warnings for missing metadata.
Apply this diff to improve error handling:
if (!metadata.entityId) { - console.warn('Entity ID is missing on element', element, 'Upload will be skipped.'); + const error = new Error('Entity ID is required for document upload'); + console.error(error.message, { element }); + throw error; return; }
🧹 Nitpick comments (18)
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/create-or-update-fileid-or-file-in-documents/create-or-update-fileid-or-file-in-documents.ts (1)
31-31
: Consider using deep merge for nested objects.While the spread operator works well for top-level properties, it may not handle nested objects correctly. Consider using a deep merge utility like
lodash/merge
for more robust object merging:- documents[existingDocumentIndex] = { ...documents[existingDocumentIndex], ...template }; + documents[existingDocumentIndex] = merge({}, documents[existingDocumentIndex], template);This would ensure that nested objects are properly merged rather than potentially being overwritten.
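A short, hedged illustration of the difference; note that `lodash/merge` mutates its first argument, which is why an empty object is passed first:

```ts
import merge from 'lodash/merge';

const existing = { id: 'doc-1', properties: { issuer: { country: 'DE' }, pages: 2 } };
const template = { id: 'doc-1', properties: { issuer: { city: 'Berlin' } } };

// Shallow spread replaces the nested object wholesale:
const shallow = { ...existing, ...template };
// shallow.properties => { issuer: { city: 'Berlin' } }  (country and pages are lost)

// Deep merge combines nested objects key by key:
const deep = merge({}, existing, template);
// deep.properties => { issuer: { country: 'DE', city: 'Berlin' }, pages: 2 }
```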
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/check-if-document-requested.unit.test.ts (2)
8-11
: Consider using complete test data and avoiding type assertions.The test data could be more realistic and type-safe:
- Include additional required fields from
IDocumentTemplate
- Use proper object construction instead of type assertions
- Use constants/enums for status values
Example improvement:
// Define an enum or constants for document statuses const DOCUMENT_STATUS = { REQUESTED: 'requested', PROVIDED: 'provided', } as const; // Create a helper to construct valid test documents const createTestDocument = ( partial: Partial<IDocumentTemplate> ): IDocumentTemplate => ({ _id: '', name: 'Test Document', category: 'test', mimeType: 'application/pdf', status: DOCUMENT_STATUS.REQUESTED, ...partial, }); // Use in tests const document = createTestDocument({ status: DOCUMENT_STATUS.REQUESTED, _id: '123', });Also applies to: 22-25, 36-38, 49-49
6-57
: Add test case for undefined/null document.The test suite should include a test case for handling undefined or null document parameters to ensure robust error handling.
Example additional test:
it('should return false when document is undefined or null', () => { expect(checkIfDocumentRequested(undefined)).toBe(false); expect(checkIfDocumentRequested(null)).toBe(false); });packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/check-if-document-requested.ts (1)
3-6
: Rename function for clarity or validate non-empty_id
.
The type guard is helpful, but consider:
- Renaming the function to
isDocumentRequested
for clarity.- Ensuring
_id
is not an empty string if that’s a valid scenario in your domain.apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/helpers/assign-document-statuses-and-decision-to-documents-in-context.ts (3)
20-24
: Consider using Object.fromEntries for better readability.The document map creation can be simplified using
Object.fromEntries
.- const documentsMap = createdDocuments.reduce((acc, document) => { - acc[document.id] = document; - - return acc; - }, {} as Record<string, IDocumentRecord>); + const documentsMap = Object.fromEntries( + createdDocuments.map(document => [document.id, document]) + ) as Record<string, IDocumentRecord>;
42-50
: Use nullish coalescing for better type safety.The current implementation might not handle falsy values correctly.
- document.status = documentFileId - ? documentsMap?.[documentFileId]?.status - : documentsMap?.[document._id!]?.status; - document.decisionReason = documentFileId - ? documentsMap?.[documentFileId]?.decisionReason - : documentsMap?.[document._id!]?.decisionReason; + document.status = documentsMap?.[documentFileId ?? document._id!]?.status; + document.decisionReason = documentsMap?.[documentFileId ?? document._id!]?.decisionReason;
59-61
: Add type assertion for better type safety.The callback function in
forEach
could benefit from explicit typing.- value?.forEach((_: unknown, index: number) => { + value?.forEach((_: Record<string, unknown>, index: number) => {apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.unit.test.ts (1)
12-12
: Consider removing unnecessary type assertions.The
as unknown as CollectionFlowContext
type assertions can be replaced with proper typing.- } as unknown as CollectionFlowContext; + } satisfies Partial<CollectionFlowContext>;Also applies to: 55-55
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts (2)
27-43
: Add test coverage for additional element parameters.The mock element setup could be enhanced to test more scenarios.
Add test cases for:
const mockElementWithVariant = { ...mockElement, params: { ...mockElement.params, documentVariant: 'different-variant', }, }; it('should handle different document variants', async () => { const documents = [{ id: 'template-1' }]; vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId'); vi.mocked(request).mockResolvedValue({}); await documentFieldValueCleaner(documents, mockElementWithVariant); expect(request).toHaveBeenCalledWith( mockElementWithVariant.params!.httpParams?.deleteDocument, undefined ); });
75-85
: Enhance error test coverage.The error handling test could be more comprehensive.
Add test cases for:
it('should handle network errors', async () => { const documents = [{ id: 'template-1' }]; const networkError = new Error('Network Error'); vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId'); vi.mocked(request).mockRejectedValue(networkError); const result = await documentFieldValueCleaner(documents, mockElement); expect(toast.error).toHaveBeenCalledWith('Failed to delete document on hide. Network Error'); expect(result).toEqual([]); }); it('should handle API errors with status codes', async () => { const documents = [{ id: 'template-1' }]; const apiError = new Error('Not Found'); apiError.status = 404; vi.mocked(getFileOrFileIdFromDocumentsList).mockReturnValue('fileId'); vi.mocked(request).mockRejectedValue(apiError); const result = await documentFieldValueCleaner(documents, mockElement); expect(toast.error).toHaveBeenCalledWith('Failed to delete document on hide. Not Found'); expect(result).toEqual([]); });packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts (1)
94-126
: Reduce code duplication in upload logic.The upload logic is duplicated between the change and submit handlers.
Extract the common logic into a shared function:
const performUpload = async (file: File, context?: AnyObject) => { const documents = get(valuesRef.current, element.valueDestination); const document = getDocumentObjectFromDocumentsList(documents, element); const isDocumentRequested = checkIfDocumentRequested(document); const documentUploadPayload = buildDocumentFormData( element, { businessId: metadata.businessId as string }, file, document, ); const result = isDocumentRequested ? await updateDocument(documentUploadPayload) : await uploadDocument(documentUploadPayload); const updatedDocuments = createOrUpdateFileIdOrFileInDocuments( documents, element, result, ); if (context) { set(context, element.valueDestination, updatedDocuments); return context; } return updatedDocuments; };Also applies to: 55-82
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (3)
40-77
: Add loading state to request button.The dialog's request button should show a loading state during document requests.
Apply this diff:
- <Button type="button" onClick={onRequestDocuments}> + <Button + type="button" + onClick={onRequestDocuments} + disabled={isRequestButtonDisabled} + > + {isRequestButtonDisabled ? ( + <Loader2 className="mr-2 h-4 w-4 animate-spin" /> + ) : null} Send email </Button>
133-146
: Optimize array operations for individuals' documents.The array operations for individuals' documents could be simplified.
Apply this diff:
- const individualsSubitems = useMemo( - () => - [ - ...(documentTrackerItems?.individuals.ubos ?? []), - ...(documentTrackerItems?.individuals.directors ?? []), - ] - .map(getSubItems) - .filter(Boolean), - [ - documentTrackerItems?.individuals.ubos, - documentTrackerItems?.individuals.directors, - getSubItems, - ], - ); + const individualsSubitems = useMemo( + () => { + const { ubos = [], directors = [] } = documentTrackerItems?.individuals ?? {}; + return [...ubos, ...directors].map(getSubItems).filter(Boolean); + }, + [documentTrackerItems?.individuals, getSubItems], + );
41-76
: Enhance dialog accessibility.The dialog implementation could benefit from additional accessibility attributes.
Apply this diff:
<Dialog open={open} onOpenChange={onOpenChange}> + <DialogTrigger asChild> - <DialogTrigger disabled={isRequestButtonDisabled}> <Button className="h-7 bg-warning px-2 text-sm" disabled={isRequestButtonDisabled} + aria-label="Request documents" > <SendIcon className="mr-1.5 d-4" /> <span className="whitespace-nowrap"> Request{' '} <span className="text-xs font-bold">{selectedIdsToRequest.length}</span> </span> </Button> </DialogTrigger> <DialogContent onPointerDownOutside={e => e.preventDefault()} className="px-20 py-12 sm:max-w-2xl" + role="alertdialog" + aria-describedby="dialog-description" > <DialogHeader> <DialogTitle className="text-4xl">Ask for all requests</DialogTitle> </DialogHeader> - <DialogDescription> + <DialogDescription id="dialog-description">packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx (2)
132-134
: Prevent memory leaks in useLayoutEffect.The effect updates state but doesn't clean up. Consider adding a cleanup function.
useLayoutEffect(() => { updateState(typeof fileOrFileId === 'string' ? fileOrFileId : undefined, document); + return () => { + // Cleanup state on unmount + updateState(undefined, undefined); + }; }, [fileOrFileId, document, updateState]);
162-215
: Enhance accessibility for document upload.The component needs improvements for keyboard navigation and screen reader support.
return ( <FieldLayout element={element} elementState={documentState}> <div className={ctw( 'relative flex h-[56px] flex-row items-center gap-3 rounded-[16px] border bg-white px-4', { 'pointer-events-none opacity-50': disabled || disabledWhileUploading || isDeletingDocument || (task && isRunning), }, )} onClick={focusInputOnContainerClick} data-testid={createTestId(element, stack)} tabIndex={0} onFocus={onFocus} onBlur={onBlur} + role="button" + aria-label={`Upload ${placeholder}`} + onKeyDown={(e) => { + if (e.key === 'Enter' || e.key === ' ') { + e.preventDefault(); + focusInputOnContainerClick(); + } + }} >services/workflows-service/prisma/migrations/20250219161412_add_decision_reason_to_document/migration.sql (1)
1-2
: Consider adding column constraints.The
decisionReason
column might benefit from additional constraints:
- Maximum length limit to prevent excessive text
- Default value for consistency
-- AlterTable -ALTER TABLE "Document" ADD COLUMN "decisionReason" TEXT; +ALTER TABLE "Document" ADD COLUMN "decisionReason" TEXT + CONSTRAINT "decisionReason_max_length" CHECK (LENGTH("decisionReason") <= 1000) + DEFAULT NULL;services/workflows-service/prisma/schema.prisma (1)
965-1000
: New Document Model Definition:
The newDocument
model is well-structured and includes important fields such ascategory
,type
,issuingVersion
,issuingCountry
, and a versioning integer. The model also establishes optional relations toBusiness
,EndUser
, andWorkflowRuntimeData
, and a required relation toProject
.Suggestion: Consider whether fields like
category
andtype
could benefit from using enums (if the set of allowed values is known) to ensure data consistency. Also, adding inline documentation for each field could further improve clarity for future maintainers.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (22)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
(1 hunks)apps/kyb-app/src/domains/collection-flow/types/index.ts
(2 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.ts
(3 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.unit.test.ts
(3 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/helpers/assign-document-statuses-and-decision-to-documents-in-context.ts
(1 hunks)apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/useCollectionFlowContext.ts
(1 hunks)packages/config/tsconfig.react.json
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/DocumentField.tsx
(6 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/defaults.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/check-if-document-requested.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/check-if-document-requested.unit.test.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/index.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/create-or-update-fileid-or-file-in-documents/create-or-update-fileid-or-file-in-documents.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/useDocumentUpload.ts
(4 hunks)packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts
(2 hunks)services/workflows-service/prisma/migrations/20250219161412_add_decision_reason_to_document/migration.sql
(1 hunks)services/workflows-service/prisma/schema.prisma
(6 hunks)services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts
(4 hunks)services/workflows-service/src/collection-flow/dto/update-collection-flow-document.schema.ts
(1 hunks)services/workflows-service/src/document/document.service.ts
(1 hunks)
✅ Files skipped from review due to trivial changes (1)
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/index.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/hooks/useCollectionFlowContext/useCollectionFlowContext.ts
- apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.ts
🧰 Additional context used
🪛 GitHub Check: CodeQL
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/defaults.ts
[failure] 5-5: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
[failure] 14-14: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
[failure] 23-23: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
🪛 Biome (1.9.4)
packages/config/tsconfig.react.json
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values: [1, 2]
(parse)
🔇 Additional comments (32)
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/create-or-update-fileid-or-file-in-documents/create-or-update-fileid-or-file-in-documents.ts (1)
31-31
: LGTM! Safer document property merging.The change to use object spread operators (
{ ...documents[existingDocumentIndex], ...template }
) is a good improvement as it preserves existing document properties while updating with template properties.packages/config/tsconfig.react.json (2)
12-12
: Upgrade to ECMAScript 2023:
The"lib"
property has been updated from"es2022"
to"es2023"
to enable newer ECMAScript features. This is in line with evolving language features and should help future-proof the code. Ensure that all target environments and dependencies support ES2023 features.🧰 Tools
🪛 Biome (1.9.4)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
12-12
: Static Analysis Hint - Likely False Positive:
The static analysis tool reports an "End of file expected" error at this line, suggesting an issue with array syntax. However, the JSONC format used here supports comments and trailing commas, and the array syntax is correct. This error is likely a false positive. Please verify manually if any problems occur during actual builds.🧰 Tools
🪛 Biome (1.9.4)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
[error] 12-12: End of file expected
Use an array for a sequence of values:
[1, 2]
(parse)
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/hooks/useDocumentUpload/helpers/check-if-document-requested/check-if-document-requested.unit.test.ts (1)
1-5
: LGTM! Well-structured test setup.The imports are appropriate, and the test suite is properly organized.
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts (3)
11-13
: Validation check looks good.
The explicit check for element.params
ensures that parameters are not null or undefined, preventing unexpected runtime errors. Good use of a meaningful error message as well.
19-25
: Add validation for required template properties.
This repeats a past comment: the code assumes template?.category, template?.type, template?.issuingVersion, template?.version, and template?.issuer?.country
exist. Consider adding a validation check before using these properties to prevent null or undefined values from causing unintended behavior.+ if (!template?.category || !template?.type || !template?.issuingVersion || + !template?.version || !template?.issuer?.country) { + throw new Error('Required template properties are missing'); + } payload.append('category', template?.category as string); payload.append('type', template?.type as string); payload.append('issuingVersion', template?.issuingVersion as unknown as string); payload.append('version', template?.version as unknown as string); payload.append('status', 'provided'); payload.append('properties', JSON.stringify(template.properties || {})); payload.append('issuingCountry', template?.issuer?.country as string);
40-47
: Add error handling for JSON.stringify.
This repeats a past comment: consider wrapping the JSON.stringify
calls in a try-catch block to avoid runtime errors if serialization fails (e.g., circular references).- payload.append( - 'metadata', - JSON.stringify({ - type: documentType, - variant: documentVariant, - page: pageIndex + 1, - }), - ); + try { + payload.append( + 'metadata', + JSON.stringify({ + type: documentType, + variant: documentVariant, + page: pageIndex + 1, + }), + ); + } catch (error) { + throw new Error('Failed to serialize document metadata'); + }services/workflows-service/src/document/document.service.ts (5)
42-114
: Handle partial failures during file upload and document creation.
This repeats a past comment: the file is uploaded, then the document is created, and then its file record is created. If any step fails, you may end up with inconsistent states. Consider wrapping the entire sequence in a single transaction or using a rollback strategy to ensure consistency in case of errors.
224-248
: Add data validation before update.
This repeats a past comment: the code updates documents without additional schema validation. Consider using runtime validation (e.g., zod) before applying Partial or direct updates to ensure data integrity and protect against invalid states.
250-273
: Add file cleanup after document deletion.
This repeats a past comment: the method removes document records but does not remove associated files. Consider cleaning up these files to avoid orphaned file artifacts and storage leaks.
307-351
: Clean up old file after re-upload.
This repeats a past comment: currently, when re-uploading a file, the old file is not removed. Consider removing or archiving the old file to avoid lingering unused files and to save storage costs.
583-605
: Add error handling for document creation failures.
This repeats a past comment: the request uses Promise.all and will fail entirely if one create call fails. Consider using Promise.allSettled
to handle partial failures, logging errors and continuing other document operations if needed.services/workflows-service/src/collection-flow/dto/update-collection-flow-document.schema.ts (1)
4-9
: LGTM!The schema is well-structured, combining the existing
CollectionFlowDocumentSchema
with a required documentId
field.packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/defaults.ts (3)
1-8
: LGTM!The default creation parameters are well-structured, using template literals for dynamic values like
{_app.apiUrl}
and {_app.accessToken}
.🧰 Tools
🪛 GitHub Check: CodeQL
[failure] 5-5: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
10-17
: LGTM!The default update parameters follow the same pattern as the creation parameters, maintaining consistency.
🧰 Tools
🪛 GitHub Check: CodeQL
[failure] 14-14: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
19-25
: LGTM!The default deletion parameters maintain consistency with other operations while omitting the unnecessary
resultPath
.🧰 Tools
🪛 GitHub Check: CodeQL
[failure] 23-23: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.apps/kyb-app/src/pages/CollectionFlow/versions/v2/components/organisms/CollectionFlowUI/hooks/usePriorityFields/utils/generate-priority-fields.unit.test.ts (1)
92-106
: LGTM!The new test case effectively verifies that priority document fields are only generated when a decision reason is present.
packages/ui/src/components/organisms/Form/DynamicForm/hooks/internal/useClear/value-cleaners/documentfield-value-cleaner.unit.test.ts (1)
9-24
: LGTM! Well-structured mock setup.The mock setup is comprehensive and follows best practices by using
vi.mock()
for external dependencies.apps/kyb-app/src/domains/collection-flow/types/index.ts (1)
174-181
: LGTM! Well-structured document record interface.The
IDocumentRecord
interface is well-designed with:
- Clear property names
- Proper type imports
- Optional fields where appropriate
services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts (2)
74-120
: Add transaction management and error handling.The document creation process could fail at multiple points (metadata parsing, properties parsing, document creation). Consider wrapping the operation in a try-catch block and using transactions to ensure data consistency.
192-197
: Enhance document deletion validation.The
deleteDocumentsByIds
method should verify document existence and ownership before deletion.apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx (3)
118-134
: Add validation before making document requests.The
onRequestDocuments
function should validate the selected IDs before making the request.
139-151
: Improve type safety and performance of compareIdentifiers.The function could be optimized for better performance and type safety.
108-115
: Add error handling for document request mutation.The mutation should handle potential errors when requesting documents.
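To ground the three useDocumentTracker notes above, here is a minimal sketch of a request mutation that validates the selected IDs and surfaces errors. The hook name, the endpoint path, and the query key are illustrative placeholders rather than the project's actual API; only the use of TanStack Query mirrors the codebase.

```ts
import { useMutation, useQueryClient } from '@tanstack/react-query';

// Illustrative fetcher; the real endpoint and payload shape will differ.
const requestDocumentsByIds = async (ids: string[]) => {
  const response = await fetch('/api/documents/request', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ids }),
  });

  if (!response.ok) {
    throw new Error(`Requesting documents failed with status ${response.status}`);
  }
};

export const useRequestDocumentsSafely = (onError: (message: string) => void) => {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: async (ids: string[]) => {
      // Validate before hitting the API: drop empty values and duplicates, bail out early.
      const uniqueIds = [...new Set(ids.filter(Boolean))];

      if (uniqueIds.length === 0) {
        throw new Error('No documents selected');
      }

      return requestDocumentsByIds(uniqueIds);
    },
    onSuccess: () => queryClient.invalidateQueries({ queryKey: ['documents'] }),
    onError: error => onError(error instanceof Error ? error.message : 'Unknown error'),
  });
};
```

The same early-return validation covers the compareIdentifiers concern indirectly, since only normalized, de-duplicated IDs ever reach the request.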
services/workflows-service/prisma/schema.prisma (8)
96-101
: New EndUser Documents Relation Added:
The addition of the documents Document[] field in the EndUser model creates a clear one-to-many relationship with the new Document model. Please ensure that the migration and cascading/deletion handling is correctly configured to avoid orphaned document records.
152-152
: Convert Business Documents Field to a Structured Relation:
Changing the documents field in the Business model from a Json? type to Document[] greatly enhances data structure and queryability. Verify that the migration script correctly handles any existing JSON data and that all related service logic is updated accordingly.
286-286
: Add WorkflowRuntimeData Documents Relation:
Introducing the documents Document[] field within the WorkflowRuntimeData model establishes a useful linkage to related documents. Please confirm that downstream processes and service functions correctly utilize this new relation.
1002-1006
: New Enum DocumentStatus:
The DocumentStatus enum is properly defined with the values provided, unprovided, and requested, which will facilitate consistent status management for documents.
1008-1012
: New Enum DocumentDecision:
The DocumentDecision enum correctly provides the options approved, rejected, and revisions to represent a document's decision outcome.
1014-1018
: New Enum DocumentFileType:
The DocumentFileType enum categorizes file types into selfie, document, and other, which supports clearer classification of document files.
1020-1024
: New Enum DocumentFileVariant:
The DocumentFileVariant enum clearly defines the file variants (front, back, other), aiding in precise file disposition within documents.
1026-1043
: New DocumentFile Model Definition:
The DocumentFile model establishes a relational mapping between a document and a file, with fields to capture file type, variant, and page number. Notably, the relations to both Document and File specify onDelete: Cascade
, ensuring that deletion of a parent record cascades appropriately. Please double-check that this cascading behavior aligns with your overall data integrity strategy.
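A quick way to sanity-check the cascade called out above is a test-style Prisma Client snippet. The prisma.document/prisma.documentFile accessors and the documentId foreign key follow the schema discussed in this review; the surrounding setup is assumed scaffolding, not project code.

```ts
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

// Deletes a document and relies on onDelete: Cascade to remove its DocumentFile rows.
// Returns the number of orphaned DocumentFile rows, which should be zero.
export const deleteDocumentAndCountOrphans = async (documentId: string) => {
  await prisma.document.delete({ where: { id: documentId } });

  return prisma.documentFile.count({ where: { documentId } });
};
```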
async updateDocument( | ||
@TokenScope() tokenScope: ITokenScope, | ||
@Body() | ||
data: Omit<Static<typeof UpdateCollectionFlowDocumentSchema>, 'properties'> & { | ||
metadata: string; | ||
properties: string; | ||
}, | ||
@UploadedFile( | ||
new ParseFilePipeBuilder().addMaxSizeValidator({ maxSize: FILE_MAX_SIZE_IN_BYTE }).build({ | ||
fileIsRequired: true, | ||
exceptionFactory: (error: string) => { | ||
if (error.includes('expected size')) { | ||
throw new UnprocessableEntityException(FILE_SIZE_EXCEEDED_MSG); | ||
} | ||
throw new UnprocessableEntityException(error); | ||
}, | ||
}), | ||
) | ||
file: Express.Multer.File, | ||
) { | ||
const metadata = DocumentFileJsonSchema.parse(data.metadata); | ||
const properties = z | ||
.preprocess(value => { | ||
if (typeof value !== 'string') { | ||
return value; | ||
} | ||
return JSON.parse(value); | ||
}, z.record(z.string(), z.unknown())) | ||
.parse(data.properties); | ||
// FormData returns version as a string | ||
// Manually converting to number to avoid validation errors | ||
data.version = Number(data.version); | ||
const documentsUpdateResults = await this.documentService.updateByIdWithFile({ | ||
...data, | ||
workflowRuntimeDataId: tokenScope.workflowRuntimeDataId, | ||
properties, | ||
metadata, | ||
file, | ||
projectId: tokenScope.projectId, | ||
}); | ||
return documentsUpdateResults.at(-1); | ||
} |
Add error handling for document updates.
The updateDocument
method should include proper error handling for failed updates and file operations.
async updateDocument(
@TokenScope() tokenScope: ITokenScope,
@Body()
data: Omit<Static<typeof UpdateCollectionFlowDocumentSchema>, 'properties'> & {
metadata: string;
properties: string;
},
@UploadedFile(
new ParseFilePipeBuilder().addMaxSizeValidator({ maxSize: FILE_MAX_SIZE_IN_BYTE }).build({
fileIsRequired: true,
exceptionFactory: (error: string) => {
if (error.includes('expected size')) {
throw new UnprocessableEntityException(FILE_SIZE_EXCEEDED_MSG);
}
throw new UnprocessableEntityException(error);
},
}),
)
file: Express.Multer.File,
) {
+ try {
const metadata = DocumentFileJsonSchema.parse(data.metadata);
const properties = z
.preprocess(value => {
if (typeof value !== 'string') {
return value;
}
return JSON.parse(value);
}, z.record(z.string(), z.unknown()))
.parse(data.properties);
// FormData returns version as a string
// Manually converting to number to avoid validation errors
data.version = Number(data.version);
const documentsUpdateResults = await this.documentService.updateByIdWithFile({
...data,
workflowRuntimeDataId: tokenScope.workflowRuntimeDataId,
properties,
metadata,
file,
projectId: tokenScope.projectId,
});
return documentsUpdateResults.at(-1);
+ } catch (error) {
+ if (error instanceof z.ZodError) {
+ throw new UnprocessableEntityException('Invalid metadata or properties format');
+ }
+ throw error;
+ }
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
async updateDocument( | |
@TokenScope() tokenScope: ITokenScope, | |
@Body() | |
data: Omit<Static<typeof UpdateCollectionFlowDocumentSchema>, 'properties'> & { | |
metadata: string; | |
properties: string; | |
}, | |
@UploadedFile( | |
new ParseFilePipeBuilder().addMaxSizeValidator({ maxSize: FILE_MAX_SIZE_IN_BYTE }).build({ | |
fileIsRequired: true, | |
exceptionFactory: (error: string) => { | |
if (error.includes('expected size')) { | |
throw new UnprocessableEntityException(FILE_SIZE_EXCEEDED_MSG); | |
} | |
throw new UnprocessableEntityException(error); | |
}, | |
}), | |
) | |
file: Express.Multer.File, | |
) { | |
const metadata = DocumentFileJsonSchema.parse(data.metadata); | |
const properties = z | |
.preprocess(value => { | |
if (typeof value !== 'string') { | |
return value; | |
} | |
return JSON.parse(value); | |
}, z.record(z.string(), z.unknown())) | |
.parse(data.properties); | |
// FormData returns version as a string | |
// Manually converting to number to avoid validation errors | |
data.version = Number(data.version); | |
const documentsUpdateResults = await this.documentService.updateByIdWithFile({ | |
...data, | |
workflowRuntimeDataId: tokenScope.workflowRuntimeDataId, | |
properties, | |
metadata, | |
file, | |
projectId: tokenScope.projectId, | |
}); | |
return documentsUpdateResults.at(-1); | |
} | |
async updateDocument( | |
@TokenScope() tokenScope: ITokenScope, | |
@Body() | |
data: Omit<Static<typeof UpdateCollectionFlowDocumentSchema>, 'properties'> & { | |
metadata: string; | |
properties: string; | |
}, | |
@UploadedFile( | |
new ParseFilePipeBuilder().addMaxSizeValidator({ maxSize: FILE_MAX_SIZE_IN_BYTE }).build({ | |
fileIsRequired: true, | |
exceptionFactory: (error: string) => { | |
if (error.includes('expected size')) { | |
throw new UnprocessableEntityException(FILE_SIZE_EXCEEDED_MSG); | |
} | |
throw new UnprocessableEntityException(error); | |
}, | |
}), | |
) | |
file: Express.Multer.File, | |
) { | |
+ try { | |
const metadata = DocumentFileJsonSchema.parse(data.metadata); | |
const properties = z | |
.preprocess(value => { | |
if (typeof value !== 'string') { | |
return value; | |
} | |
return JSON.parse(value); | |
}, z.record(z.string(), z.unknown())) | |
.parse(data.properties); | |
// FormData returns version as a string | |
// Manually converting to number to avoid validation errors | |
data.version = Number(data.version); | |
const documentsUpdateResults = await this.documentService.updateByIdWithFile({ | |
...data, | |
workflowRuntimeDataId: tokenScope.workflowRuntimeDataId, | |
properties, | |
metadata, | |
file, | |
projectId: tokenScope.projectId, | |
}); | |
return documentsUpdateResults.at(-1); | |
+ } catch (error) { | |
+ if (error instanceof z.ZodError) { | |
+ throw new UnprocessableEntityException('Invalid metadata or properties format'); | |
+ } | |
+ throw error; | |
+ } | |
} |
* feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * refactor(*): checkpoint - wip work for supporting documents in db * feat(workflows-service): now passing workflow definition document schema * Bal 3521 (#3015) * feat: implemented common useHttp hook * feat: initial entity field group * feat: implemented interaction logic with entity & tests * feat: implemented EntityFieldGroup (WIP) * feat: added entity type parameter to ui definition & updated stories * Bal 3486 (#3034) * feat: added entity creation & deletion endpoints * fix: fixed entity deletion * feat: finalized creation & deletion logic of entities * fix: lock fix * Bal 3487(WIP) (#3036) * feat(workflows-service): implemented document controller, service, repository, and dto * fix(workflows-service): added cascase on delete * refactor(workflows-service): removed uploadFile method from collection flow service * feat(workflows-service): added the ability to reupload document (#3019) * feat: implemented documents upload * feat: reworked creation ui * feat: implemented document creation & deletion * feat: finalized entity group field * fix: fixed tests * fix: cleanup * fix: format fix * fix: fixed build --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * fix: crypto mock in tests * feat: reworked documentfield & integrated entityfield to kyb & updated enpoints (#3040) * feat: added end user creation skip for created entities in kyb (#3044) * feat: implemented http document deletion on hide & fixed tests (#3045) * feat: implemented sync plugin & fixes (#3047) * fix: format * fix: refactored test * fix: refactor * fix: refactor * fix: updated logging & removed debugger * feat: added external schema for cf document && updated endpoint * fix: replaced id with ballerineEntityId & updated endpoint * fix: fixed test --------- Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Omri Levy <[email protected]> Co-authored-by: Shane <[email protected]> * BAL 3484 - Backend logic for documents tracker data (#3033) * feat(documents): add documents module and related components - Introduce DocumentsModule with controller and service - Implement DocumentTrackerModel for document representation - Add endpoint to retrieve documents by workflow ID (Your code is more scattered than a toddler's toys after a playdate) * feat(document): add endpoint to fetch documents by workflow ID - Implement new route for retrieving documents based on workflow and definition IDs - Refactor services and modules to accommodate for new functionality (your code organization is so tangled, it could be mistaken for a Christmas light display) * feat(documents): implement new document tracking feature - Add endpoint to retrieve documents based on workflow runtime and definition - Introduce validation for document input data in the new endpoint (Your code is so convoluted, it makes a Rubik's Cube look like tic-tac-toe) * refactor(document): update document service and controller to enhance workflow - Remove unused DocumentTrackerModel and associated DTOs - Update getDocumentsByWorkflowId parameters for clarity and consistency - Refactor document parsing for improved maintainability (Your function signatures have more parameters than a 
conference call in 2020) * chore(workflows-service): update subproject commit and clean up tests - Update subproject commit reference in data-migrations - Remove unnecessary console log from document unit test (Your console logs should come with a warning label: "For debugging only") * fix(workflow-service): resolve optional chaining issues in workflow definition - Remove unnecessary optional chaining for better clarity - Ensure accurate type handling for workflow definitions (in just a few lines, your code went from 'nice try' to 'you nailed it' - good job!) * feat(documents): implement documents tracker items fetching and query - Add fetch function for documents tracker items - Create query hooks for retrieving documents with search parameters - Include hash in URL construction for complete state representation (Your code organization is so scattered, it looks like a yard sale for functions) * refactor(document): rename document retrieval functions for clarity - Change function name from getDocumentsByWorkflowId to getDocumentTrackerByWorkflowId - Update matching logic for clarity and reduce code duplication (Your code is like a puzzling riddle—only the compiler knows the answer) * feat: document tracker (BAL-3459, BAL-3435) (#3043) * wip * interface * wip * feat: document tracker * reorder endpoints * fallback * fix * PR comments + add endpoint * feature flag * polish * invalidate data * PR comments * refactor(documents): update query key naming and structure - Rename documentsQueryKey to documentsQueryKeys for consistency - Adjust import paths for better organization of document hooks - Simplify conditional checks in unit tests for better readability (Your code is cleaner than a whistle, but I still wouldn't recommend using it to clean the house) * chore(*): conflicts resolution --------- Co-authored-by: Sasha <[email protected]> Co-authored-by: Omri Levy <[email protected]> * wip - integration fixes * feat(*): can now approve a document * feat(*): added reject and revisions to business documents * fix(backoffice-v2): ctw fix * chore(*): resolved conflicts * chore(*): fixed duplicate imports * fix(workflows-service): removed duplicated migrations --------- Co-authored-by: Shane <[email protected]> Co-authored-by: Illia Rudniev <[email protected]> Co-authored-by: Sasha <[email protected]>
Actionable comments posted: 17
♻️ Duplicate comments (5)
services/workflows-service/prisma/schema.prisma (1)
1025-1042
: 🛠️ Refactor suggestionAdd created/updated timestamps to DocumentFile.
The DocumentFile model should track creation and modification times.
Add timestamp fields:
model DocumentFile { // existing fields... + createdAt DateTime @default(now()) + updatedAt DateTime @updatedAt @@index([documentId]) @@index([fileId]) }services/workflows-service/src/document/document.controller.external.ts (4)
59-66
:⚠️ Potential issueAdd file type validation for uploads.
The file interceptor uses
fileFilter
but there's no explicit validation of allowed file types. This could be a security risk.Add MIME type validation:
FileInterceptor('file', { storage: getDiskStorage(), limits: { files: 1, }, + fileFilter: (req, file, callback) => { + const allowedMimes = ['image/jpeg', 'image/png', 'application/pdf']; + if (!allowedMimes.includes(file.mimetype)) { + callback(new UnprocessableEntityException('Invalid file type'), false); + return; + } + callback(null, true); + }, }),
35-51
:⚠️ Potential issueAdd request size validation.
The endpoint should validate the size of the documents array to prevent abuse.
Add array size validation to the request schema:
const RequestUploadSchema = Type.Object({ workflowId: Type.String(), - documents: Type.Array( + documents: Type.Array( Type.Object({ type: Type.String(), category: Type.String(), decisionReason: Type.String(), issuingCountry: Type.String(), issuingVersion: Type.String(), version: Type.String(), entity: Type.Object({ id: Type.String(), type: Type.Union([Type.Literal('business'), Type.Literal('ubo'), Type.Literal('director')]), }), - }), + }), { maxItems: 100 } ), });
69-74
:⚠️ Potential issueConsider adding rate limiting.
The document creation endpoint could be vulnerable to abuse without rate limiting.
Add rate limiting using the
@nestjs/throttler
package:import { Throttle } from '@nestjs/throttler'; @Throttle({ default: { ttl: 60000, limit: 10 } }) // 10 requests per minute @Post() async createDocument(...)
303-327
:⚠️ Potential issueAdd file cleanup on error.
The re-upload endpoint should clean up the uploaded file if the document service operation fails.
Wrap the service call in a try-catch block:
async reuploadDocumentFileById( @Param('workflowRuntimeDataId') workflowRuntimeDataId: string, @Param('fileId') fileId: string, @UploadedFile(...) file: Express.Multer.File, @CurrentProject() projectId: string, ) { + try { return await this.documentService.reuploadDocumentFileById( fileId, workflowRuntimeDataId, [projectId], file, ); + } catch (error) { + // Clean up the uploaded file + await fs.unlink(file.path).catch(console.error); + throw error; + } }
🧹 Nitpick comments (24)
apps/backoffice-v2/src/lib/blocks/components/CallToActionLegacy/CallToActionLegacy.tsx (2)
169-174
: Consider extracting the complex condition into a helper function.The deep optional chaining and complex condition make the code harder to read and maintain. Consider extracting this logic into a helper function for better readability and reusability.
+const shouldConcatenateReasonAndComment = ( + comment: string | undefined, + workflow?: { workflowDefinition?: { config?: { isDocumentsV2?: boolean } } } +) => comment && !workflow?.workflowDefinition?.config?.isDocumentsV2; + reason: - comment && !workflow?.workflowDefinition?.config?.isDocumentsV2 + shouldConcatenateReasonAndComment(comment, workflow) ? `${reason} - ${comment}` : reason,
273-278
: Eliminate code duplication by reusing the helper function.This segment duplicates the same complex condition from the previous segment. After extracting the condition into a helper function as suggested above, you can reuse it here to maintain DRY principles.
reason: - comment && !workflow?.workflowDefinition?.config?.isDocumentsV2 + shouldConcatenateReasonAndComment(comment, workflow) ? `${reason} - ${comment}` : reason,services/workflows-service/src/document/dtos/document.dto.ts (1)
4-20
: Consider clarifying the properties
field.Using
Type.Record(Type.String(), Type.Any())
allows any key-value pairs, which may be too permissive or unstructured for certain use cases. If there are known restrictions or formats forproperties
, consider defining those to ensure stronger type safety.services/workflows-service/prisma/schema.prisma (1)
1002-1023
: Add documentation for document-related enums.While the enum values are clear, adding documentation comments would improve maintainability by explaining:
- When each status/decision should be used
- The differences between file types
- The scenarios for each variant
Add documentation comments:
+/// Status of a document in the system enum DocumentStatus { + /// Document has been uploaded/provided by the user provided + /// Document has been requested but not yet provided requested } +/// Decision made on a document after review enum DocumentDecision { + /// Document meets all requirements approved + /// Document does not meet requirements rejected + /// Document needs changes/updates revisions }services/workflows-service/src/providers/file/file.service.ts (2)
24-24
: Remove unused import.The
WorkflowRuntimeData
import from @prisma/client
is not used in this file.-import { WorkflowRuntimeData } from '@prisma/client';
370-373
: Extract file name processing logic.Consider extracting the file name processing logic into a separate method for better readability and reusability.
+ private processFileName(originalName: string): string { + const nameWithoutExtension = (originalName || randomUUID()).replace(/\.[^.]+$/, ''); + return nameWithoutExtension.replace(/\W/g, ''); + } + async uploadNewFile(projectId: string, entityId: string, file: Express.Multer.File) { const customer = await this.customerService.getByProjectId(projectId); - // Remove file extension (get everything before the last dot) - const nameWithoutExtension = (file.originalname || randomUUID()).replace(/\.[^.]+$/, ''); - // Remove non characters - const alphabeticOnlyName = nameWithoutExtension.replace(/\W/g, ''); + const alphabeticOnlyName = this.processFileName(file.originalname);services/workflows-service/src/document/document.repository.ts (1)
9-31
: Consider refining error messages and logging.
Currently, the error message is quite generic. Providing more detail (e.g., which documents failed validation) may help troubleshooting.apps/backoffice-v2/src/lib/blocks/components/CallToActionLegacy/hooks/useCallToActionLegacyLogic/useCallToActionLegacyLogic.tsx (1)
88-126
: Duplicate code paths hint at possible refactoring.
onMutateDecisionByIdV1
handles 3 decision paths (approve/reject/revision). This is fine but partially duplicates logic from V2. Consider factoring out shared logic if feasible.apps/backoffice-v2/src/lib/blocks/hooks/useDocumentBlocks/useDocumentBlocks.tsx (1)
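For the duplicated V1/V2 decision paths noted above, one possible extraction is sketched here. The helper name and payload shape are illustrative; only the " - " joining of reason and comment mirrors what the CallToActionLegacy code already does.

```ts
type DecisionAction = 'approve' | 'reject' | 'revision';

interface DecisionPayload {
  decision: DecisionAction;
  reason?: string;
}

// Shared between the V1 (task-based) and V2 (document-based) mutation paths so the
// reason/comment handling lives in one place. Names and shapes are illustrative.
export const buildDecisionPayload = (
  action: DecisionAction,
  reason?: string,
  comment?: string,
): DecisionPayload => ({
  decision: action,
  // Approvals carry no reason; rejections/revisions join reason and comment the same
  // way the legacy call-to-action does.
  reason:
    action === 'approve' ? undefined : [reason, comment].filter(Boolean).join(' - ') || undefined,
});
```

Both onMutateDecisionByIdV1 and onMutateDecisionByIdV2 could then forward the same payload to their respective mutations.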
66-159
: Adapter merges local and external documents elegantly.
useDocumentsAdapter
is large but provides a convenient unified interface for both V1 and V2. Consider splitting the logic (e.g., generating titles, extracting pages) into separate helpers to reduce complexity.services/workflows-service/src/document/document.module.ts (1)
10-10
: Fix typo in import path.There's a typo in the import path:
workflow-defintion
should be workflow-definition
.-import { WorkflowDefinitionModule } from '@/workflow-defintion/workflow-definition.module'; +import { WorkflowDefinitionModule } from '@/workflow-definition/workflow-definition.module';apps/backoffice-v2/src/domains/documents/hooks/mutations/useRejectDocumentByIdMutation/useRejectDocumentByIdMutation.tsx (2)
29-29
: Consider proper Promise handling.Instead of using the
void
operator to suppress the Promise, consider properly handling it or using await
.- void queryClient.invalidateQueries(); + await queryClient.invalidateQueries();
33-35
: Improve error handling.Consider the following improvements:
- Properly type the error parameter
- Add fallback for undefined error messages
- onError: (_error, _variables) => { - toast.error(t('toast:reject_document.error', { errorMessage: _error.message })); + onError: (error: Error, _variables) => { + const errorMessage = error?.message ?? t('common:unknown_error'); + toast.error(t('toast:reject_document.error', { errorMessage }));apps/backoffice-v2/src/domains/documents/hooks/mutations/useApproveDocumentByIdMutation/useApproveDocumentByIdMutation.tsx (1)
23-23
: Add type safety for Action enum.Consider using type assertion or validation to ensure Action.APPROVE is a valid value.
- decision: Action.APPROVE, + decision: Action.APPROVE as const,apps/backoffice-v2/src/domains/documents/hooks/mutations/useReviseDocumentByIdMutation/useReviseDocumentByIdMutation.tsx (1)
31-31
: Standardize toast message keys.The toast message keys use 'ask_revision' which is inconsistent with other actions.
- toast.success(t('toast:ask_revision_document.success')); + toast.success(t('toast:revision_document.success')); - toast.error(t('toast:ask_revision_document.error', { errorMessage: _error.message })); + toast.error(t('toast:revision_document.error', { errorMessage: _error.message }));Also applies to: 34-34
apps/backoffice-v2/src/domains/documents/hooks/mutations/useRequestDocumentsMutation/useRequestDocumentsMutation.tsx (2)
9-11
: Remove unused generic type parameter.The generic type parameter TData is not being used effectively.
-export const useRequestDocumentsMutation = (options?: { - onSuccess?: <TData>(data: TData) => void; -}) => { +export const useRequestDocumentsMutation = <TData = unknown>(options?: { + onSuccess?: (data: TData) => void; +}) => {
31-35
: Simplify error message extraction.The error message extraction logic can be simplified using optional chaining.
- toast.error( - t(`toast:request_documents.error`, { - errorMessage: isObject(error) && 'message' in error ? error.message : error, - }), - ); + const errorMessage = error instanceof Error ? error.message : String(error); + toast.error(t(`toast:request_documents.error`, { errorMessage }));apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/DirectorBlock.tsx (1)
27-34
: Memoize the directorWithAml object.The
directorWithAml
object is recreated on every render. Consider using useMemo
to optimize performance.Apply this improvement:
- const directorWithAml = { + const directorWithAml = useMemo(() => ({ ...director, documents: directorsDocuments, aml: { vendor: endUser?.amlHits?.find(({ vendor }) => !!vendor)?.vendor, hits: endUser?.amlHits, }, - }; + }), [director, directorsDocuments, endUser?.amlHits]);services/workflows-service/src/workflow/utils/add-properties-schema-to-document.ts (1)
69-73
: Enhance error logging with more context.The current logging could be more descriptive to aid in debugging.
Apply this improvement:
- console.info(`No localized document schemas found for ${countryCode}`); + console.info( + `No localized document schemas found for country: ${countryCode}. ` + + `Available schemas: ${documentsSchema?.length ?? 0}` + );apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (2)
56-56
: Memoize event handlers.The
onPointerDownOutside
handler is recreated on every render.Apply this improvement:
- onPointerDownOutside={e => e.preventDefault()} + onPointerDownOutside={useCallback(e => e.preventDefault(), [])}
127-187
: Add error boundary for DocumentTrackerItems.The component should gracefully handle rendering errors.
Add an error boundary component:
const DocumentTrackerErrorBoundary: FunctionComponent<{ children: ReactNode }> = ({ children }) => { return ( <ErrorBoundary fallback={ <div className="flex h-20 items-center justify-center text-sm"> Error loading documents </div> } > {children} </ErrorBoundary> ); };Then wrap the DocumentTrackerItems:
- <DocumentTrackerItems + <DocumentTrackerErrorBoundary> + <DocumentTrackerItems documentTrackerItems={documentTrackerItems} isLoading={isLoadingDocuments} getSubItems={getSubItems} - /> + /> + </DocumentTrackerErrorBoundary>services/workflows-service/src/workflow-defintion/workflow-definition.repository.ts (1)
252-263
: Optimize query performance for large datasets.The query could be slow for large datasets. Consider adding an index on
workflowRuntimeData.id
andprojectId
to improve query performance.Add the following index to your Prisma schema:
model WorkflowDefinition { @@index([projectId]) } model WorkflowRuntimeData { @@index([id]) }apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/hooks/useDirectorBlock/useDirectorBlock.tsx (1)
103-103
: Consider memoizing the document mapping logic.The document mapping logic is correct but could benefit from memoization to prevent unnecessary recalculations.
Consider memoizing the mapped documents:
+ const documentsWithoutImageUrl = useMemo(() => + isDocumentsV2 + ? documents + : documents?.map(({ details: _details, ...document }) => ({ + ...document, + pages: document?.pages?.map(({ imageUrl: _imageUrl, ...page }) => page), + })), + [isDocumentsV2, documents] + );Also applies to: 106-111
apps/backoffice-v2/src/lib/blocks/components/KycBlock/hooks/useKycBlock/useKycBlock.tsx (1)
590-591
: Consider error handling for document loading.While the loading state is handled correctly, there's no error handling for failed document loading.
Consider adding error handling:
- isLoading: isLoadingDocuments, - data: documents?.flatMap(document => document?.details), + isLoading: isLoadingDocuments, + error: documents === undefined && !isLoadingDocuments ? 'Failed to load documents' : undefined, + data: documents?.flatMap(document => document?.details) ?? [],apps/backoffice-v2/src/lib/blocks/variants/DefaultBlocks/hooks/useDefaultBlocksLogic/useDefaultBlocksLogic.tsx (1)
170-174
: Optimize document adapter performance.The
useDocumentsAdapter
hook is called on every render without memoization.Add memoization to prevent unnecessary recalculations:
-const { documents } = useDocumentsAdapter({ +const { documents } = useMemo(() => useDocumentsAdapter({ documents: workflow?.context?.documents ?? [], entityId: workflow?.context?.entity?.ballerineEntityId ?? '', -}); +}), [workflow?.context?.documents, workflow?.context?.entity?.ballerineEntityId]);
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (40)
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
(1 hunks)apps/backoffice-v2/src/domains/documents/fetchers.ts
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/mutations/useApproveDocumentByIdMutation/useApproveDocumentByIdMutation.tsx
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/mutations/useRejectDocumentByIdMutation/useRejectDocumentByIdMutation.tsx
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/mutations/useRequestDocumentsMutation/useRequestDocumentsMutation.tsx
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/mutations/useResetDocumentDecisionByIdMutation/useResetDocumentDecisionByIdMutation.tsx
(1 hunks)apps/backoffice-v2/src/domains/documents/hooks/mutations/useReviseDocumentByIdMutation/useReviseDocumentByIdMutation.tsx
(1 hunks)apps/backoffice-v2/src/domains/workflow-definitions/fetchers.ts
(1 hunks)apps/backoffice-v2/src/lib/blocks/components/CallToActionLegacy/CallToActionLegacy.tsx
(2 hunks)apps/backoffice-v2/src/lib/blocks/components/CallToActionLegacy/hooks/useCallToActionLegacyLogic/useCallToActionLegacyLogic.tsx
(3 hunks)apps/backoffice-v2/src/lib/blocks/components/CallToActionLegacy/interfaces.ts
(2 hunks)apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/DirectorBlock.tsx
(2 hunks)apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/hooks/useDirectorBlock/create-directors-blocks.tsx
(0 hunks)apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/hooks/useDirectorBlock/helpers.ts
(1 hunks)apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/hooks/useDirectorBlock/useDirectorBlock.tsx
(3 hunks)apps/backoffice-v2/src/lib/blocks/components/EditableDetails/EditableDetails.tsx
(1 hunks)apps/backoffice-v2/src/lib/blocks/components/KycBlock/hooks/useKycBlock/useKycBlock.tsx
(6 hunks)apps/backoffice-v2/src/lib/blocks/hooks/useDocumentBlocks/useDocumentBlocks.tsx
(9 hunks)apps/backoffice-v2/src/lib/blocks/variants/DefaultBlocks/hooks/useDefaultBlocksLogic/useDefaultBlocksLogic.tsx
(8 hunks)apps/backoffice-v2/src/lib/blocks/variants/KybExampleBlocks/hooks/useKybExampleBlocksLogic/useKybExampleBlocksLogic.tsx
(6 hunks)apps/backoffice-v2/src/pages/Entity/components/Case/components/CaseOverview/CaseOverview.tsx
(3 hunks)apps/backoffice-v2/src/pages/Entity/components/Case/hooks/useDocuments/helpers.ts
(1 hunks)apps/backoffice-v2/src/pages/Entity/hooks/useEntityLogic/utils.ts
(1 hunks)services/workflows-service/prisma/data-migrations
(1 hunks)services/workflows-service/prisma/migrations/20250219073840_document_reason_and_comment/migration.sql
(1 hunks)services/workflows-service/prisma/schema.prisma
(6 hunks)services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts
(4 hunks)services/workflows-service/src/document/document.controller.external.ts
(1 hunks)services/workflows-service/src/document/document.module.ts
(1 hunks)services/workflows-service/src/document/document.repository.ts
(1 hunks)services/workflows-service/src/document/document.service.ts
(1 hunks)services/workflows-service/src/document/dtos/document.dto.ts
(1 hunks)services/workflows-service/src/document/types.ts
(1 hunks)services/workflows-service/src/providers/file/file.service.ts
(3 hunks)services/workflows-service/src/workflow-defintion/workflow-definition.repository.ts
(1 hunks)services/workflows-service/src/workflow-defintion/workflow-definition.service.ts
(1 hunks)services/workflows-service/src/workflow/schemas/zod-schemas.ts
(1 hunks)services/workflows-service/src/workflow/utils/add-properties-schema-to-document.ts
(1 hunks)services/workflows-service/src/workflow/workflow.service.ts
(3 hunks)
💤 Files with no reviewable changes (1)
- apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/hooks/useDirectorBlock/create-directors-blocks.tsx
🚧 Files skipped from review as they are similar to previous changes (7)
- services/workflows-service/src/workflow/schemas/zod-schemas.ts
- apps/backoffice-v2/src/pages/Entity/components/Case/components/CaseOverview/CaseOverview.tsx
- services/workflows-service/prisma/data-migrations
- services/workflows-service/src/document/types.ts
- apps/backoffice-v2/src/domains/workflow-definitions/fetchers.ts
- services/workflows-service/src/workflow/workflow.service.ts
- apps/backoffice-v2/src/common/components/molecules/DocumentTracker/hooks/useDocumentTracker.tsx
🔇 Additional comments (49)
apps/backoffice-v2/src/lib/blocks/components/CallToActionLegacy/interfaces.ts (1)
22-29
: LGTM! Well-structured comment parameter addition.The optional
comment
parameter is properly typed and aligns with the database schema changes for document tracking.services/workflows-service/src/document/dtos/document.dto.ts (3)
1-2
: Ensure alignment with the Prisma enumerations.Please confirm that the
DocumentStatus
and DocumentDecision
enums from @prisma/client
fully match the values expected by the application, and verify that downstream logic can handle these enumerations as TypeBox enumerations.
22-22
: Verify the omission of projectId
from creation.By omitting
projectId
fromCreateDocumentSchema
, callers cannot supply aprojectId
when creating a document. Ensure that the logic for setting or assigningprojectId
is handled elsewhere or confirm that this omission is intended.
37-39
: Looks good for bulk deletion.The
DeleteDocumentsSchema
is straightforward and aligns well with typical bulk-deletion approaches.services/workflows-service/prisma/schema.prisma (2)
965-1000
: Well-structured Document model with comprehensive tracking capabilities.The Document model effectively captures all necessary aspects of document management:
- Core document metadata (category, type, version)
- Document status tracking (status, decision, decisionReason)
- Proper relationships with related entities
- Appropriate indexes for query optimization
100-100
: LGTM! Proper relationship definitions with indexes.The document relationships are correctly added to existing models with appropriate indexes for query optimization.
Also applies to: 152-152, 286-286, 317-317, 426-427
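As a small usage sketch against the Document model and relations reviewed above: the workflowRuntimeDataId field and the DocumentStatus enum come from the schema and DTOs in this PR, while the files relation accessor name is an assumption and may differ in the actual schema.

```ts
import { DocumentStatus, PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

// Read path over the new Document model: fetch a workflow's requested documents
// together with their DocumentFile rows. The `files` include is an assumed relation name.
export const findRequestedDocumentsForWorkflow = async (workflowRuntimeDataId: string) =>
  prisma.document.findMany({
    where: { workflowRuntimeDataId, status: DocumentStatus.requested },
    include: { files: true },
  });
```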
services/workflows-service/src/providers/file/file.service.ts (1)
32-32
: LGTM!The
CustomerService
injection follows dependency injection best practices with appropriate access modifiers.services/workflows-service/src/document/document.repository.ts (10)
40-49
: Straightforward creation logic.
The create
method is concise and seems correct.
51-57
: Bulk creation appears well-structured.
No issues identified with createMany
.
59-71
: Query filters look correct.
findMany
cleanly merges user-supplied conditions with project scope.
73-87
: Looks good for single record retrieval.
findById
properly restricts documents to the provided ID and project IDs.
89-105
: Logic is consistent for multi-record retrieval.
findByEntityIdAndWorkflowId
merges entity fields and runtime ID.
107-119
: Structured approach to bulk updates.
updateMany
merges user filters with project constraints.
121-137
: Past recommendation: useupdate
instead ofupdateMany
.
This method updates a single document by ID, but internally usesupdateMany
. The same point was raised in a previous review, so tagging this as a repeated note.
139-167
: File inclusion is properly validated.
findByEntityIdAndWorkflowIdWithFiles
fetches documents, then asserts file presence. No issues found.
169-192
: Approach to retrieving multiple documents with files is consistent.
findManyWithFiles
ensures associated files are included.
194-208
: Deletion method is clear and scoped.
deleteByIds
uses project ID filtering to prevent accidental cross-project deletions.apps/backoffice-v2/src/lib/blocks/components/CallToActionLegacy/hooks/useCallToActionLegacyLogic/useCallToActionLegacyLogic.tsx (5)
6-7
: New imports for document approval/rejection recognized.
The addition of useRejectDocumentByIdMutation
and useApproveDocumentByIdMutation
aligns with the enhanced document decision logic.
46-47
: Hook usage for mutate calls.
Using separate hooks for tasks vs. documents ensures clarity in mutation logic.Also applies to: 51-52
54-60
: Combined loading states are well-managed.
Merging isLoadingApproveDocumentById
and isLoadingRejectDocumentById
into a unified state looks clean.
127-162
: New approach for Document V2 decisions.
The logic for onMutateDecisionByIdV2
is consistent. The code is neatly separated to accommodate documents.
164-188
: Conditional decision calls based on isDocumentsV2
.
onMutateTaskDecisionById
effectively directs the event to V1 or V2 logic, ensuring backward compatibility.apps/backoffice-v2/src/lib/blocks/hooks/useDocumentBlocks/useDocumentBlocks.tsx (11)
1-9
: Imports for document approval, rejection, and error handling.
Adding these ensures the new Document V2 flow is integrated.
41-50
: Visibility for external document fetching.
getDocuments
with handleZodError
is a neat approach for robust error handling.
52-57
: Defines standardized query keys.
documentsQueryKeys
neatly organizes queries for reusability.
59-64
: Query hook is well-structured.
useDocumentsQuery
is clear, enabling optional usage based on workflow and entity presence.
198-205
: Integration with the new adapter.
Pulling documents
and documentsSchemas
in one place simplifies the rest of the code.
208-210
: Task vs. Document approvals are handled consistently.
The approach merges new and existing logic under one flow, maintaining backward compatibility.Also applies to: 220-221, 224-250
282-283
: Updated checks for approvals and rejections.
Including both isLoadingRejectDocumentById
and isLoadingApproveDocumentById
ensures consistent UI states across tasks and documents. Also applies to: 289-290
439-442
: Extended approval callback includes comments.
Good improvement for adding context to approve actions.
448-456
: Flexible dialog configuration.
The new dialog props allow for a more descriptive user prompt.
508-509
: Document reference passed to decision cell.
Passingdocuments
ensures additional context if needed.
628-633
: Refined multi-document handling.
Expanding multiDocuments
with isLoading
state logic ties well into the new flows.apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/hooks/useDirectorBlock/helpers.ts (1)
3-10
: LGTM! Clean and type-safe implementation.The function effectively adapts director data to match the expected type structure, with proper type checking using the
satisfies
operator and safe handling of optional values.apps/backoffice-v2/src/pages/Entity/components/Case/hooks/useDocuments/helpers.ts (1)
5-13
: LGTM! Improved null safety.The addition of optional chaining (
?.
) makes the function more robust by safely handling cases where documents
might be undefined.apps/backoffice-v2/src/domains/documents/fetchers.ts (1)
9-9
: Avoid using relative paths in API endpoints.Using relative paths (
../external/...
) can be fragile and hard to maintain.Also applies to: 19-19, 40-40
apps/backoffice-v2/src/common/components/molecules/DocumentTracker/DocumentTracker.tsx (1)
41-53
: Add aria-label to request button and enhance dialog accessibility.The button and dialog components lack proper accessibility attributes.
services/workflows-service/src/collection-flow/controllers/collection-flow.files.controller.ts (1)
75-121
: Add error handling and transaction management.The document creation process could fail at multiple points (metadata parsing, properties parsing, document creation). Consider wrapping the operation in a try-catch block to ensure proper error handling and cleanup.
Apply this diff to add error handling:
async createDocument( @TokenScope() tokenScope: ITokenScope, @Body() data: Omit<Static<typeof CollectionFlowDocumentSchema>, 'properties'> & { metadata: string; properties: string; }, @UploadedFile(...) file: Express.Multer.File, ) { + try { const metadata = DocumentFileJsonSchema.parse(data.metadata); const properties = z .preprocess(value => { if (typeof value !== 'string') { return value; } return JSON.parse(value); }, z.record(z.string(), z.unknown())) .parse(data.properties); // FormData returns version as a string // Manually converting to number to avoid validation errors data.version = Number(data.version); const documentsCreationResults = await this.documentService.create({ ...data, workflowRuntimeDataId: tokenScope.workflowRuntimeDataId, properties, metadata, file, projectId: tokenScope.projectId, }); return documentsCreationResults.at(-1); + } catch (error) { + // Clean up the uploaded file + await fs.unlink(file.path).catch(console.error); + if (error instanceof z.ZodError) { + throw new UnprocessableEntityException('Invalid metadata or properties format'); + } + throw error; + } }apps/backoffice-v2/src/lib/blocks/variants/KybExampleBlocks/hooks/useKybExampleBlocksLogic/useKybExampleBlocksLogic.tsx (5)
36-36
: LGTM! Import of new document revision mutation hook.The import aligns with the new document revision functionality.
77-78
: LGTM! Initialization of new document revision mutation hook.The mutation hook is properly initialized with consistent naming conventions.
95-109
: Verify error handling in the document revision logic.The conditional logic for document revision looks good, but there's no error handling for the case where both conditions are false (when
isDocumentsV2
is undefined).Consider adding error handling:
- if (workflow?.workflowDefinition?.config?.isDocumentsV2) { + const isDocumentsV2 = workflow?.workflowDefinition?.config?.isDocumentsV2; + if (isDocumentsV2 === true) { mutateReviseDocumentById({ documentId, decisionReason: reason, }); - } - - if (!workflow?.workflowDefinition?.config?.isDocumentsV2) { + } else if (isDocumentsV2 === false) { mutateRevisionTaskById({ workflowId, documentId, reason, contextUpdateMethod: 'base', }); + } else { + toast.error('Invalid document version configuration'); }
111-115
: LGTM! Updated dependency array.The dependency array correctly includes all dependencies used in the callback.
134-134
: LGTM! Consistent loading state updates.The loading state is consistently updated to include both mutation states.
Also applies to: 342-342
apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/hooks/useDirectorBlock/useDirectorBlock.tsx (2)
77-83
: LGTM! Well-structured document details type definition.The new
details
property is well-defined with clear, descriptive fields.
401-401
: LGTM! Updated document rendering to use details property.The document rendering correctly uses the new
details
property.apps/backoffice-v2/src/lib/blocks/components/KycBlock/hooks/useKycBlock/useKycBlock.tsx (2)
45-48
: LGTM! Clean migration to document adapter hook.The migration from
useStorageFilesQuery
to useDocumentsAdapter
is clean and well-structured.
163-163
: LGTM! Consistent document data passing.The document data is consistently passed to all components using the new adapter.
Also applies to: 449-449, 565-565
services/workflows-service/src/document/document.service.ts (2)
388-446
: Handle cleanup of old files during re-upload.The
reuploadDocumentFileById
method doesn't clean up the old file after successful re-upload, which could lead to storage leaks.
625-727
: Add error handling for document creation failures.The
requestDocumentsByIds
method doesn't handle partial failures during document creation.
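To make the partial-failure note directly above (and the matching duplicate comment earlier in this review) concrete, here is a minimal Promise.allSettled sketch. The payload type and the injected createDocument callback are placeholders, not the actual DocumentService signature.

```ts
type DocumentPayload = { type: string; category: string; issuingCountry: string };

type BulkResult<T> = { created: T[]; failed: Array<{ payload: DocumentPayload; reason: unknown }> };

// Creates documents independently so one failure does not abort the whole batch.
export const createDocumentsSettled = async <T>(
  payloads: DocumentPayload[],
  createDocument: (payload: DocumentPayload) => Promise<T>,
): Promise<BulkResult<T>> => {
  const settled = await Promise.allSettled(payloads.map(payload => createDocument(payload)));

  const result: BulkResult<T> = { created: [], failed: [] };

  settled.forEach((outcome, index) => {
    if (outcome.status === 'fulfilled') {
      result.created.push(outcome.value);
    } else {
      // Keep the failing payload next to its reason so callers can retry or report it.
      result.failed.push({ payload: payloads[index] as DocumentPayload, reason: outcome.reason });
    }
  });

  return result;
};
```

The caller can then decide whether a non-empty failed list should surface as a warning or abort the request flow entirely.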
@@ -9,7 +9,7 @@ export interface ICallToActionLegacyProps { | |||
directorId?: string; | |||
workflow: TWorkflowById; | |||
disabled: boolean; | |||
decision: 'reject' | 'approve' | 'revision' | 'revised'; | |||
decision: 'reject' | 'approve' | 'revision'; |
💡 Verification agent
🧩 Analysis chain
Verify the removal of 'revised' decision type.
The removal of the 'revised' decision type is a breaking change. Please ensure all consumers of this interface have been updated to handle only 'reject', 'approve', and 'revision' decisions.
🏁 Script executed:
#!/bin/bash
# Search for any remaining usage of 'revised' decision type
rg -A 5 "decision.*['\"]revised['\"]" --type ts
Length of output: 3927
Action Required: Update Decision Type Consumers to Remove 'revised'
The removal of 'revised'
in the CallToActionLegacy/interfaces.ts
file is not consistently reflected across the codebase. Our search has revealed that the 'revised'
value is still being used in several locations. Please address the following:
-
DirectorBlock:
- In
apps/backoffice-v2/src/lib/blocks/components/DirectorBlock/hooks/useDirectorBlock/useDirectorBlock.tsx
, the code compares the decision status against'revised'
.
- In
-
Typed Definitions:
- In
apps/backoffice-v2/src/lib/blocks/create-blocks-typed/types.ts
, thedecision
type still includes'revised'
.
- In
-
Document Blocks:
- In
apps/backoffice-v2/src/lib/blocks/hooks/useDocumentBlocks/useDocumentBlocks.tsx
, there are conditionals that check ifdecision?.status === 'revised'
.
- In
-
Workflows Service:
- In
services/workflows-service/src/workflow/dtos/document-decision-update-input.ts
, the union type fordecision
still lists'revised'
.
- In
To ensure consistency and prevent potential bugs, please update all consumers of the decision type to remove or otherwise handle the deprecated 'revised'
value. This might involve:
- Removing the
'revised'
comparison logic. - Updating type definitions to match the new union type with only
'reject' | 'approve' | 'revision'
. - Verifying that related business logic in each consumer is in line with the intended document tracking objectives.
export const UpdateDocumentDecisionSchema = Type.Composite([ | ||
Type.Pick(DocumentSchema, ['decisionReason', 'comment']), | ||
Type.Object({ | ||
decision: Type.Union([ | ||
Type.Literal('approve'), | ||
Type.Literal('reject'), | ||
Type.Literal('revision'), | ||
]), | ||
}), | ||
]); |
🛠️ Refactor suggestion
Synchronize UpdateDocumentDecisionSchema
decision field with DocumentDecision
.
The 'approve' | 'reject' | 'revision'
literals may overlap with the DocumentDecision
enum imported from Prisma. If these sets differ, it may lead to inconsistencies in how decisions are interpreted or persisted. Consider keeping them in sync by referencing the same enum or clarifying their differences.
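One way to keep the two sets aligned, as suggested above, is an explicit mapping from the API literals to the Prisma enum. Treat this as a sketch rather than the service's actual conversion logic.

```ts
import { DocumentDecision } from '@prisma/client';

type DecisionInput = 'approve' | 'reject' | 'revision';

// Keeps the API literals and the Prisma enum in one explicit mapping, so a drift in
// either set becomes a compile-time error rather than a silent mismatch.
const decisionByInput: Record<DecisionInput, DocumentDecision> = {
  approve: DocumentDecision.approved,
  reject: DocumentDecision.rejected,
  revision: DocumentDecision.revisions,
};

export const toDocumentDecision = (input: DecisionInput): DocumentDecision =>
  decisionByInput[input];
```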
export const extractCountryCodeFromDocuments = (documents: TDocument[]) => { | ||
return documents?.find(document => { | ||
return !!document?.issuer?.country; | ||
})?.issuer?.country; | ||
return !!document?.issuingCountry; | ||
})?.issuingCountry; |
💡 Verification agent
🧩 Analysis chain
LGTM! Property access aligned with new Document model.
The change from issuer?.country
to issuingCountry
correctly aligns with the new Document model structure.
Let's verify that all document-related code has been updated to use the new property:
🏁 Script executed:
#!/bin/bash
# Description: Search for any remaining uses of issuer.country in document-related code
rg -g '*.{ts,tsx,js,jsx}' 'issuer\?\.country'
Length of output: 1040
Action Needed: Confirm Consistency in Document Model Property Usage
The changes in this file (apps/backoffice-v2/src/pages/Entity/hooks/useEntityLogic/utils.ts, lines 69–72) correctly update the property to issuingCountry, which aligns with the new Document model. However, a repository-wide search still reveals several references to the old property (issuer?.country). In particular:
- apps/backoffice-v2/src/lib/blocks/components/KycBlock/hooks/useKycBlock/useKycBlock.tsx
- packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts
- packages/common/src/schemas/documents/workflow/documents/schemas/index.ts
- services/workflows-service/src/workflow/utils/add-properties-schema-to-document.ts
Please verify whether these instances should also be updated to use issuingCountry for consistency, or are intentionally left as-is.
if (!entityId) {
  throw new NotFoundException("Workflow doesn't exists");
}
🛠️ Refactor suggestion
Improve entity ID validation.
The current validation only checks whether entityId is falsy. Consider validating the existence of the entity in the database.
- if (!entityId) {
- throw new NotFoundException("Workflow doesn't exists");
+ const entity = await this.workflowService.findById(entityId);
+ if (!entity) {
+ throw new NotFoundException(`Workflow with ID ${entityId} doesn't exist`);
async uploadNewFile(projectId: string, entityId: string, file: Express.Multer.File) {
  // upload file into a customer folder
  const customer = await this.customerService.getByProjectId(projectId);

  if (!entityId) {
    throw new NotFoundException("Workflow doesn't exists");
  }

  // Remove file extension (get everything before the last dot)
  const nameWithoutExtension = (file.originalname || randomUUID()).replace(/\.[^.]+$/, '');
  // Remove non characters
  const alphabeticOnlyName = nameWithoutExtension.replace(/\W/g, '');

  return await this.copyToDestinationAndCreate(
    {
      id: alphabeticOnlyName,
      uri: file.path,
      provider: 'file-system',
      fileName: file.originalname,
    },
    entityId,
    projectId,
    customer.name,
    { shouldDownloadFromSource: false },
  );
}
🛠️ Refactor suggestion
Add file validation and error handling.
Consider adding:
- File size and type validation
- Error handling for customer lookup
- Logging for better observability
async uploadNewFile(projectId: string, entityId: string, file: Express.Multer.File) {
+ // Validate file
+ if (!file || !file.buffer) {
+ throw new BadRequestException('No file provided');
+ }
+
+ const maxSize = 5 * 1024 * 1024; // 5MB
+ if (file.size > maxSize) {
+ throw new BadRequestException('File size exceeds limit');
+ }
+
+ // Validate allowed mime types
+ const allowedTypes = ['image/jpeg', 'image/png', 'application/pdf'];
+ if (!allowedTypes.includes(file.mimetype)) {
+ throw new BadRequestException('Invalid file type');
+ }
+
// upload file into a customer folder
- const customer = await this.customerService.getByProjectId(projectId);
+ let customer;
+ try {
+ customer = await this.customerService.getByProjectId(projectId);
+ } catch (error) {
+ this.logger.error('Failed to fetch customer', { error, projectId });
+ throw new NotFoundException(`Customer not found for project ${projectId}`);
+ }
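As a complement to the checks above, part of this validation can be enforced before the service is ever called, assuming the upload endpoint uses (or can use) Nest's FileInterceptor; a sketch:

```ts
import { BadRequestException } from '@nestjs/common';
import { FileInterceptor } from '@nestjs/platform-express';

// Multer enforces the size cap and rejects unexpected MIME types at the
// controller boundary, so uploadNewFile only ever sees plausible files.
export const documentUploadInterceptor = FileInterceptor('file', {
  limits: { fileSize: 5 * 1024 * 1024 }, // 5MB
  fileFilter: (_req, file, callback) => {
    const allowedTypes = ['image/jpeg', 'image/png', 'application/pdf'];

    if (!allowedTypes.includes(file.mimetype)) {
      return callback(new BadRequestException('Invalid file type'), false);
    }

    callback(null, true);
  },
});
```

It would be applied with @UseInterceptors(documentUploadInterceptor) on the relevant controller route; the in-service checks remain useful as a second line of defense.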
-- AlterTable
ALTER TABLE "Document" ADD COLUMN "comment" TEXT,
ADD COLUMN "decisionReason" TEXT;
🛠️ Refactor suggestion
Add default values for new columns.
The new columns are nullable and have no explicit default, so existing rows and any code reading them will see NULL; make sure consumers handle that, or declare the defaults explicitly.
Consider adding default values:
-- AlterTable
-ALTER TABLE "Document" ADD COLUMN "comment" TEXT,
-ADD COLUMN "decisionReason" TEXT;
+ALTER TABLE "Document" ADD COLUMN "comment" TEXT DEFAULT NULL,
+ADD COLUMN "decisionReason" TEXT DEFAULT NULL;
private parseDocumentsFromUISchema(uiSchema: Array<Record<string, any>>): TParsedDocuments {
  const result: TParsedDocuments = {
    business: [],
    individuals: {
      ubos: [],
      directors: [],
    },
  };

  const processElement = (element: Record<string, any>) => {
    if (isType(z.array(z.record(z.string(), z.any())))(element?.elements)) {
      element.elements.forEach(processElement);
    }

    if (isType(z.array(z.record(z.string(), z.any())))(element?.children)) {
      element.children.forEach(processElement);
    }

    if (element.element !== 'documentfield') {
      return;
    }

    const template = element.params.template;

    const parsedDocument = z
      .object({
        type: z.string(),
        id: z.string(),
        category: z.string(),
        issuer: z.object({
          country: z.string(),
        }),
        issuingVersion: z.number(),
        version: z.string(),
        entityType: z.enum(['business', 'ubo', 'director']).default('business'),
      })
      .transform(({ entityType, type, id, category, issuer, issuingVersion, version }) => ({
        entityType,
        type,
        templateId: id,
        category,
        issuingCountry: issuer.country,
        issuingVersion: issuingVersion.toString(),
        version,
      }))
      .safeParse(template);

    if (!parsedDocument.success) {
      return;
    }

    if (!element.valueDestination) {
      return;
    }

    const isUboDocument =
      element.valueDestination.includes('.ubo') &&
      element.valueDestination.includes('.documents');
    const isDirectorDocument =
      element.valueDestination.includes('.director') &&
      element.valueDestination.includes('.documents');

    if (isUboDocument) {
      parsedDocument.data.entityType = 'ubo';
      result.individuals.ubos.push(parsedDocument.data);
    } else if (isDirectorDocument) {
      parsedDocument.data.entityType = 'director';
      result.individuals.directors.push(parsedDocument.data);
    } else {
      result.business.push(parsedDocument.data);
    }
  };

  uiSchema.forEach(processElement);

  return result;
}
🛠️ Refactor suggestion
Improve error handling in document parsing.
The parseDocumentsFromUISchema method silently ignores parsing errors, which could hide configuration issues.
Add error logging and validation:
private parseDocumentsFromUISchema(uiSchema: Array<Record<string, any>>): TParsedDocuments {
const result: TParsedDocuments = {
business: [],
individuals: {
ubos: [],
directors: [],
},
};
const processElement = (element: Record<string, any>) => {
+    if (!element) {
+      console.warn('Encountered null or undefined element in UI schema');
+      return;
+    }
// ... parsing logic ...
if (!parsedDocument.success) {
+ console.error('Failed to parse document template:', parsedDocument.error);
return;
}
// ... rest of the method ...
};
uiSchema.forEach(processElement);
+
+ if (!result.business.length && !result.individuals.ubos.length && !result.individuals.directors.length) {
+ console.warn('No documents were parsed from the UI schema');
+ }
return result;
}
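If the logged output should be easier to scan, zod's error can also be flattened before logging, as a small variation on the suggestion above:

```ts
if (!parsedDocument.success) {
  // flatten() groups issues by field path, which reads better in service logs
  // than the raw ZodError instance.
  console.error('Failed to parse document template:', parsedDocument.error.flatten());

  return;
}
```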
async updateDocumentDecisionById(
  id: string,
  projectIds: TProjectId[],
  data: {
    decision: 'approve' | 'reject' | 'revision' | null;
  } & Pick<Prisma.DocumentUpdateInput, 'decisionReason' | 'comment'>,
  args?: Prisma.DocumentUpdateManyArgs,
  transaction?: PrismaTransactionClient,
) {
  const document = await this.repository.findById(id, projectIds);

  if (!document) {
    throw new BadRequestException(`Document with an id of "${id}" was not found`);
  }

  if (!document.workflowRuntimeDataId) {
    throw new BadRequestException(`Attempted to update decision for a document with no workflow`);
  }

  const workflowDefinition = await this.workflowDefinitionService.getByWorkflowRuntimeDataId(
    document.workflowRuntimeDataId,
    projectIds,
  );

  if (!workflowDefinition) {
    throw new BadRequestException(
      `Workflow definition for a workflow with an id of "${document.workflowRuntimeDataId}" was not found`,
    );
  }

  const documentWithPropertiesSchema = addPropertiesSchemaToDocument(
    // @ts-expect-error -- the function expects properties not used by the function.
    {
      ...document,
      issuer: {
        country: document.issuingCountry,
      },
    },
    workflowDefinition.documentsSchema,
  );
  const propertiesSchema = documentWithPropertiesSchema.propertiesSchema ?? {};
  const shouldValidateDocument =
    data.decision === 'approve' && Object.keys(propertiesSchema)?.length;

  if (shouldValidateDocument) {
    const validatePropertiesSchema = ajv.compile(propertiesSchema);
    const isValidPropertiesSchema = validatePropertiesSchema(
      documentWithPropertiesSchema?.properties,
    );

    if (!isValidPropertiesSchema) {
      throw ValidationError.fromAjvError(validatePropertiesSchema.errors ?? []);
    }
  }

  const Status = {
    approve: 'approved',
    reject: 'rejected',
    revision: 'revisions',
  } as const;

  const decision = data.decision ? Status[data.decision] : null;

  await this.repository.updateById(
    id,
    projectIds,
    {
      ...data,
      decision,
    },
    args,
    transaction,
  );

  const documents = await this.repository.findManyWithFiles(projectIds);

  return this.formatDocuments({
    documents,
    documentSchema: workflowDefinition.documentsSchema,
  });
}
Add transaction rollback for failed document decisions.
The updateDocumentDecisionById method performs multiple database operations without transaction management. If the update fails after validation, the system could be left in an inconsistent state.
Wrap the operations in a transaction. A sketch of one approach, assuming the service has access to the Prisma client (for example via this.prisma), and reusing the caller-supplied transaction when one is passed in:
 async updateDocumentDecisionById(
   id: string,
   projectIds: TProjectId[],
   data: {
     decision: 'approve' | 'reject' | 'revision' | null;
   } & Pick<Prisma.DocumentUpdateInput, 'decisionReason' | 'comment'>,
   args?: Prisma.DocumentUpdateManyArgs,
   transaction?: PrismaTransactionClient,
 ) {
+  // Reuse the caller's transaction when provided; otherwise open an
+  // interactive transaction so a failed update rolls back automatically.
+  const runInTransaction = <T>(fn: (tx: PrismaTransactionClient) => Promise<T>) =>
+    transaction ? fn(transaction) : this.prisma.$transaction(fn);
+
+  return runInTransaction(async tx => {
     const document = await this.repository.findById(id, projectIds);
     // ... validation logic ...
     await this.repository.updateById(
       id,
       projectIds,
       {
         ...data,
         decision,
       },
       args,
-      transaction,
+      tx,
     );
     const documents = await this.repository.findManyWithFiles(projectIds);
     return this.formatDocuments({
       documents,
       documentSchema: workflowDefinition.documentsSchema,
     });
+  });
 }
+const { mutate: mutateReviseDocumentById, isLoading: isLoadingReviseDocumentById } =
+  useReviseDocumentByIdMutation();
 const onReuploadNeeded = useCallback(
   ({
     workflowId,
     documentId,
     reason,
+    comment,
   }: Pick<
     Parameters<typeof mutateRevisionTaskById>[0],
     'workflowId' | 'documentId' | 'reason'
-  >) =>
+  > & { comment?: string }) =>
     () => {
       if (!documentId) {
         toast.error('Invalid task id');

         return;
       }

-      mutateRevisionTaskById({
-        workflowId,
-        documentId,
-        reason,
-        contextUpdateMethod: 'base',
-      });
+      if (workflow?.workflowDefinition?.config?.isDocumentsV2) {
+        mutateReviseDocumentById({
+          documentId,
+          decisionReason: reason,
+          comment,
+        });
+      }
+
+      if (!workflow?.workflowDefinition?.config?.isDocumentsV2) {
+        mutateRevisionTaskById({
+          workflowId,
+          documentId,
+          reason,
+          contextUpdateMethod: 'base',
+        });
+      }
     },
-  [mutateRevisionTaskById],
+  [
+    workflow?.workflowDefinition?.config?.isDocumentsV2,
+    mutateReviseDocumentById,
+    mutateRevisionTaskById,
+  ],
 );
🛠️ Refactor suggestion
Add loading state management for document decisions.
The loading states for document decisions are not properly managed, which could lead to UI inconsistencies.
Add comprehensive loading state management:
const {
mutate: mutateReviseDocumentById,
isLoading: isLoadingReviseDocumentById,
+ error: reviseDocumentError,
} = useReviseDocumentByIdMutation();
+useEffect(() => {
+ if (reviseDocumentError) {
+ toast.error('Failed to revise document: ' + reviseDocumentError.message);
+ }
+}, [reviseDocumentError]);
const onReuploadNeeded = useCallback(
({
workflowId,
documentId,
reason,
comment,
}: Pick<
Parameters<typeof mutateRevisionTaskById>[0],
'workflowId' | 'documentId' | 'reason'
> & { comment?: string }) =>
() => {
+ if (isLoadingReviseDocumentById || isLoadingReuploadNeeded) {
+ return;
+ }
if (!documentId) {
toast.error('Invalid task id');
return;
}
const directors =
  workflow?.context?.entity?.data?.additionalInfo?.directors?.map(directorAdapter);
🛠️ Refactor suggestion
Add error boundary for director mapping.
The directors mapping is recomputed on every render and can throw if directorAdapter receives an unexpected director shape.
Add error handling:
-const directors =
- workflow?.context?.entity?.data?.additionalInfo?.directors?.map(directorAdapter);
+const directors = useMemo(() => {
+ try {
+ return workflow?.context?.entity?.data?.additionalInfo?.directors?.map(directorAdapter) ?? [];
+ } catch (error) {
+ console.error('Failed to map directors:', error);
+ return [];
+ }
+}, [workflow?.context?.entity?.data?.additionalInfo?.directors]);
…ayout (#3079) Co-authored-by: Omri Levy <[email protected]>
url: '{_app.apiUrl}collection-flow/files',
method: 'DELETE',
headers: {
  Authorization: 'Bearer {_app.accessToken}',
Check failure: Code scanning / CodeQL reports "Hard-coded credentials" (Critical) on the authorization header.
Actionable comments posted: 3
♻️ Duplicate comments (1)
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-entity-for-creation.ts (1)
12-21
:⚠️ Potential issueAdd type validation for entityType to prevent runtime errors.
The function casts
element.params?.type
toTEntityFieldGroupType
without validation, which could lead to runtime errors if the type is invalid.Consider adding type validation:
+const isValidEntityType = (type: unknown): type is TEntityFieldGroupType => { + return type === 'director' || type === 'ubo'; +}; + export const buildEntityCreationPayload = async ( element: IFormElement<any, IEntityFieldGroupParams>, entity: IEntity, context: AnyObject, ): Promise<{ entity: IEntity; entityType: TEntityFieldGroupType; ballerineEntityId?: string }> => { + const type = element.params?.type; + if (!type || !isValidEntityType(type)) { + throw new Error(`Invalid entity type: ${type}`); + } + const entityToCreate = element.params?.httpParams?.createEntity?.transform ? await transform(context, entity, element.params!.httpParams?.createEntity.transform) : entity; return { entity: entityToCreate, - entityType: element.params?.type as TEntityFieldGroupType, + entityType: type, ballerineEntityId: undefined, }; };
🧹 Nitpick comments (14)
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-entity-for-update.ts (1)
10-10
: Consider improving type safety.Two suggestions to enhance type safety:
- Replace
any
inIFormElement<any, IEntityFieldGroupParams>
with a more specific type.- Consider using type guards instead of type assertion for
element.params?.type as TEntityFieldGroupType
.Example improvement:
- element: IFormElement<any, IEntityFieldGroupParams>, + element: IFormElement<unknown, IEntityFieldGroupParams>,And for the type assertion:
- entityType: element.params?.type as TEntityFieldGroupType, + entityType: isEntityFieldGroupType(element.params?.type) + ? element.params.type + : throw new Error('Invalid entity type'),You'll need to add this type guard:
function isEntityFieldGroupType(type: unknown): type is TEntityFieldGroupType { return typeof type === 'string' && Object.values(TEntityFieldGroupType).includes(type as TEntityFieldGroupType); }Also applies to: 20-20
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-documents-creation-payload.ts (3)
10-14
: Consider making the interface more type-safe.The interface uses
any
type which could be made more specific to improve type safety.export interface IDocumentCreationResult { payload: FormData; - documentDefinition: IFormElement<any, IDocumentFieldParams>; + documentDefinition: IFormElement<unknown, IDocumentFieldParams>; valueDestination: string; }
16-27
: Well-structured function with good early return pattern.The implementation is clean and efficient. Consider using
unknown
instead ofany
for better type safety.export const buildDocumentsCreationPayload = ( - element: IFormElement<any, IEntityFieldGroupParams>, + element: IFormElement<unknown, IEntityFieldGroupParams>, entityIds: string[], context: AnyObject, stack: TDeepthLevelStack, ): IDocumentCreationResult[] => {
37-40
: Enhance the warning message with more context.The warning message could be more descriptive to help with debugging.
- console.warn('No template found for document field', documentElement); + console.warn( + 'Document field is missing required template parameter', + { documentId: documentElement.id, name: documentElement.name } + );packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/defaults.ts (1)
1-7
: LGTM with suggestions for improvement.The constant is well-structured and immutable. However, consider these improvements:
- Make the URL path more specific to document removal
- Use consistent template syntax
Consider this refactor:
export const DEFAULT_ENTITY_FIELD_GROUP_DOCUMENT_REMOVAL_PARAMS = { - url: '{_app.apiUrl}collection-flow/files', + url: '{_app.apiUrl}/api/documents/{documentId}', method: 'DELETE', headers: { - Authorization: 'Bearer {_app.accessToken}', + Authorization: `Bearer ${_app.accessToken}`, }, } as const;🧰 Tools
🪛 GitHub Check: CodeQL
[failure] 5-5: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx (2)
13-13
: Consider replacingany
with a more specific type.Using
any
bypasses TypeScript's type checking. Consider defining a more specific type or usingunknown
if the exact type cannot be determined.- element: IFormElement<any, IEntityFieldGroupParams>; + element: IFormElement<unknown, IEntityFieldGroupParams>;
26-41
: Consider adding error boundaries.Since this component is part of a form system handling document management, consider wrapping it with an error boundary to gracefully handle rendering errors and prevent form state corruption.
Here's a suggested implementation:
import { ErrorBoundary } from '@/components/ErrorBoundary'; // ... inside the component return ( <ErrorBoundary fallback={<div>Error rendering entity fields</div>}> <EntityFieldProvider isSyncing={false} entityFieldGroupType={element.params?.type}> {/* ... existing JSX ... */} </EntityFieldProvider> </ErrorBoundary> );packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/EntityFieldGroupDocument.tsx (3)
42-44
: Improve type safety by adding runtime validation.The type assertion
as TEntityFieldGroupType
could fail at runtime. Consider adding validation or using a type guard.-entityFieldGroupType || (_element.params?.type as TEntityFieldGroupType), +const type = entityFieldGroupType || _element.params?.type; +if (!type || !Object.values(TEntityFieldGroupType).includes(type)) { + throw new Error(`Invalid entity field group type: ${type}`); +} +type,
83-85
: Improve File object creation from string ID.Creating a File object with an empty array and using the ID as the name might lead to issues when trying to read the file contents.
-return new File([], value); +return new File([new Blob()], value, { type: 'application/octet-stream' });
174-185
: Enhance accessibility for file input.The file input could benefit from additional accessibility attributes.
<Input data-testid={`${createTestId(element, stack)}-hidden-input`} type="file" placeholder={placeholder} accept={acceptFileFormats} disabled={disabled} onChange={handleChange} onBlur={onBlur} onFocus={onFocus} ref={inputRef} className="hidden" + aria-label={placeholder} + aria-hidden="false" + role="button" />packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.tsx (4)
31-32
: Consider supporting additional entity types.While
'director'
and'ubo'
may suffice for now, you might later need more entity types (e.g.,'shareholder'
or'representative'
) to handle broader use cases.
86-87
: Enhance upload-specific error handling.While errors will eventually bubble up within the task function, consider adding retry logic or a fallback UI for failed upload attempts. This ensures a more graceful handling of partial failures in the upload step.
160-160
: Display errors to end users.
console.error
captures issues for developers but doesn’t surface them to users. Consider using a toast notification or a user-facing alert to provide proper feedback.
172-199
: Improve accessibility for the remove button.While the trash icon provides a clear visual cue, consider adding an accessible label (e.g.,
aria-label="Remove item"
) to aid screen readers and ensure a more inclusive experience.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (9)
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/EntityFieldGroupDocument.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/defaults.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-documents-creation-payload.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-entity-for-creation.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-entity-for-update.ts
(1 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/update-entities.ts
(1 hunks)services/workflows-service/prisma/data-migrations
(1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/update-entities.ts
- services/workflows-service/prisma/data-migrations
🧰 Additional context used
🪛 GitHub Check: CodeQL
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/defaults.ts
[failure] 5-5: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: test_windows
- GitHub Check: test_linux
🔇 Additional comments (11)
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-entity-for-creation.ts (2)
1-6
: LGTM!The imports are well-organized and properly used in the code.
7-11
: LGTM!The function signature is well-defined with proper typing for parameters and return value.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-entity-for-update.ts (2)
1-8
: LGTM! Clean import structure.The imports are well-organized, following a consistent pattern with clear separation between form-specific and common types.
9-23
: LGTM! Clean and well-structured implementation.The function effectively handles entity transformation with proper null checks and async/await patterns.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/helpers/build-documents-creation-payload.ts (1)
31-64
: Well-implemented document processing logic.The implementation effectively:
- Handles multiple entities and their documents
- Includes proper null checks and error handling
- Builds document payloads with appropriate metadata
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/defaults.ts (1)
1-7
:✅ Verification successful
Verify template interpolation and URL path.
Please confirm:
- The template syntax
{_app.apiUrl}
and{_app.accessToken}
is correctly handled by your template engine- The URL path
collection-flow/files
is the correct endpoint for document removalRun this script to verify the API endpoint:
🏁 Script executed:
#!/bin/bash # Description: Verify the document removal API endpoint # Search for API route definitions rg -g '*.ts' -g '*.js' --no-heading "collection-flow/files.*DELETE" . # Search for similar endpoints to ensure consistency rg -g '*.ts' -g '*.js' --no-heading "collection-flow/files" .Length of output: 1279
Review Comment Confirmation: Template Syntax & Endpoint Validated
After verifying:
- The template placeholders (
{_app.apiUrl}
and{_app.accessToken}
) are used consistently across the codebase (e.g., inpackages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/defaults.ts
), which confirms they are correctly handled by our templating engine.- The URL path
collection-flow/files
is consistent with its usage in other parts of our system (e.g., as seen in the API controller and other related API files), indicating this is indeed the correct endpoint for document removal.No further changes are needed.
🧰 Tools
🪛 GitHub Check: CodeQL
[failure] 5-5: Hard-coded credentials
The hard-coded value "Bearer {{_app.accessToken}}" is used as authorization header.packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx (1)
27-27
: Ensure consistent handling of partially created entities on upload errors.When an entity is successfully created, but document uploads fail, the code currently retains the partially created entity in the form state. Consider if you need to roll back the entity creation or provide a retry mechanism for document uploads to maintain data consistency.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFieldGroupDocument/EntityFieldGroupDocument.tsx (2)
1-31
: LGTM! Well-organized imports and clear type definitions.The imports are properly organized and the interface extension is well-defined.
95-100
: Enhance error handling for missing template ID.The current error handling for missing template ID only logs a warning. Consider throwing an error or showing a user-friendly message.
packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/EntityFieldGroup.tsx (2)
1-10
: Imports look consistent.All the necessary modules appear to be in place without any apparent duplication or missing references.
43-53
: ValidatehttpParams
definitions.Double-check that each field (
createEntity
,deleteEntity
,uploadDocument
,updateEntity
,deleteDocument
) provides the necessary configuration (URL, method, headers, etc.). Missing or incorrectly typed endpoints may lead to runtime failures.
<StackProvider stack={[...(stack || []), index]}>
  <Renderer
    elements={element.children || []}
    schema={elementsOverride as unknown as TRendererSchema}
💡 Verification agent
🧩 Analysis chain
Avoid double type assertion.
The double type assertion (as unknown as) suggests potential type-safety issues. Consider refactoring to ensure proper type compatibility.
🏁 Script executed:
#!/bin/bash
# Search for the TRendererSchema type definition to understand its structure
ast-grep --pattern 'type TRendererSchema = $_'
Length of output: 16486
Action Required: Refactor double type assertion and improve error handling
- Remove double type casting: the use of elementsOverride as unknown as TRendererSchema at line 36 in packages/ui/src/components/organisms/Form/DynamicForm/fields/EntityFieldGroup/components/EntityFields/EntityFields.tsx undermines type safety. Review the type of elementsOverride and adjust it so that it directly matches or converts to TRendererSchema without resorting to a double assertion (a sketch follows below).
- Enhance error handling: since this component renders dynamic form fields via the Renderer component, consider adding an error boundary. This will help prevent runtime errors from breaking the form and improve the overall robustness of the UI.
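A small sketch of the type-guard direction; isRendererSchema and its shape check are illustrative and should mirror the real TRendererSchema definition:

```ts
// Narrow the override instead of forcing it through `as unknown as`.
const isRendererSchema = (value: unknown): value is TRendererSchema =>
  typeof value === 'object' &&
  value !== null &&
  Object.values(value).every(
    entry => typeof entry === 'function' || (typeof entry === 'object' && entry !== null),
  );

const rendererSchema = isRendererSchema(elementsOverride) ? elementsOverride : undefined;

if (!rendererSchema) {
  // Surfaces a misconfigured override during development instead of rendering nothing.
  console.error('elementsOverride does not satisfy TRendererSchema', elementsOverride);
}
```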
if (typeof value === 'string') {
  await deleteDocument({
    ids: [value],
  });
}
🛠️ Refactor suggestion
Add error handling for document deletion.
The document deletion operation could fail but there's no error handling in place.
if (typeof value === 'string') {
- await deleteDocument({
- ids: [value],
- });
+ try {
+ await deleteDocument({
+ ids: [value],
+ });
+ } catch (error) {
+ console.error('Failed to delete document:', error);
+ // Consider showing user feedback
+ return;
+ }
}
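If user-facing feedback is wanted as well, the catch block could surface a message through whatever notification helper this package exposes; a sketch, where notify stands in for that hypothetical helper:

```ts
if (typeof value === 'string') {
  try {
    await deleteDocument({ ids: [value] });
  } catch (error) {
    console.error('Failed to delete document:', error);
    // `notify` is a placeholder for the project's actual toast/notification utility.
    notify?.('Failed to delete the document. Please try again.');

    return;
  }
}
```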
const createEntitiesCreationTaskOnChange = useCallback(async () => {
  const TASK_ID = element.id;
  removeTask(TASK_ID);

  try {
    const taskRun = async (context: AnyObject) => {
      const entities = get(context, element.valueDestination, []) as IEntity[];

      const entitiesToProcess = await Promise.all(
        entities.map(entity =>
          entity.ballerineEntityId
            ? buildEntityUpdatePayload(element, entity, context)
            : buildEntityCreationPayload(element, entity, context),
        ),
      );
      const createdEntitiesIds: string[] = await Promise.all(
        entitiesToProcess.map(entity =>
          entity.ballerineEntityId
            ? updateEntity(entity.entity, {
                params: { entityId: entity.ballerineEntityId },
              })
            : createEntity(entity),
        ),
      );

      const documentsCreationPayload = buildDocumentsCreationPayload(
        element,
        createdEntitiesIds,
        context,
        stack,
      );

      await Promise.all(
        documentsCreationPayload.map(async documentData => {
          const documentId = await uploadDocument(documentData.payload);

          const updatedDocuments = createOrUpdateFileIdOrFileInDocuments(
            get(context, documentData.valueDestination, []),
            documentData.documentDefinition,
            documentId,
          );

          set(context, documentData.valueDestination, updatedDocuments);

          return documentId;
        }),
      );

      const updatedEntities = updateEntities(entities, createdEntitiesIds);
      set(context, element.valueDestination, updatedEntities);

      onChange(updatedEntities);

      return context;
    };

    const task: ITask = {
      id: TASK_ID,
      element,
      run: taskRun,
    };

    addTask(task);
  } catch (error) {
    console.error(error);
  }
}, [onChange, element, createEntity, uploadDocument, stack, removeTask, addTask, updateEntity]);
|
🛠️ Refactor suggestion
Refactor the creation and update logic for maintainability.
The createEntitiesCreationTaskOnChange function shoulders multiple responsibilities (entity creation, updates, document uploads, patching context, and so on). Splitting these into smaller, purpose-specific functions or a dedicated workflow would improve clarity, scalability, and test coverage. Additionally, consider handling partial successes through targeted error handling instead of failing all operations at once; a sketch of that part follows below.
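For the partial-success point, one option is to settle the per-entity work independently instead of failing the whole task; a rough sketch reusing the names from the hook above:

```ts
// Process each entity independently so one failed creation or update
// does not discard the progress of the others.
const results = await Promise.allSettled(
  entities.map(async entity => {
    const payload = entity.ballerineEntityId
      ? await buildEntityUpdatePayload(element, entity, context)
      : await buildEntityCreationPayload(element, entity, context);

    return payload.ballerineEntityId
      ? updateEntity(payload.entity, { params: { entityId: payload.ballerineEntityId } })
      : createEntity(payload);
  }),
);

const failures = results.filter(result => result.status === 'rejected');

if (failures.length > 0) {
  // Report which entities still need a retry instead of silently swallowing the error.
  console.error(`Failed to persist ${failures.length} of ${entities.length} entities`, failures);
}
```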
* fix(*): hotfixes * fix(backoffice-v2): renamed revisions to revision for backwards compatibility --------- Co-authored-by: Alon Peretz <[email protected]>
* feat: implemented entity creation & file upload on submit & updated layout * fix: fixed persisted revision reason after reupload
Actionable comments posted: 4
♻️ Duplicate comments (2)
packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts (2)
41-48
:⚠️ Potential issueAdd error handling for JSON.stringify.
The JSON.stringify operation could throw for circular references or invalid values.
Add try-catch block:
- payload.append( - 'metadata', - JSON.stringify({ - type: documentType, - variant: documentVariant, - page: pageIndex + 1, - }), - ); + try { + payload.append( + 'metadata', + JSON.stringify({ + type: documentType, + variant: documentVariant, + page: pageIndex + 1, + }), + ); + } catch (error) { + throw new Error('Failed to serialize document metadata'); + }
19-26
:⚠️ Potential issueAdd validation for template properties before usage.
The code assumes template properties exist without validation, which could lead to runtime errors if any of these properties are undefined.
Add validation:
+ if (!template?.category || !template?.type || !template?.issuingVersion || + !template?.version || !template?.issuer?.country) { + throw new Error('Required template properties are missing'); + } payload.append('category', template.category as string); payload.append('type', template.type as string); payload.append('issuingVersion', template.issuingVersion as unknown as string); payload.append('version', template.version as unknown as string); payload.append('status', 'provided'); payload.append('properties', JSON.stringify(template.properties || {})); payload.append('issuingCountry', template.issuer.country as string);
🧹 Nitpick comments (2)
apps/backoffice-v2/src/lib/blocks/hooks/useDocumentBlocks/useDocumentBlocks.tsx (2)
41-50
: Consider adjusting the timeout value.The 40-second timeout for document retrieval might be too long for a good user experience. Consider implementing a progressive loading strategy or reducing the timeout.
const [documents, error] = await apiClient({ method: Method.GET, endpoint: `../external/documents/${entityId}/${workflowId}`, schema: z.any(), - timeout: 40_000, + timeout: 20_000, });
74-77
: Optimize loading state management.The loading state could be more granular to improve user experience during document loading.
const { data: documentsV2, isLoading: isLoadingDocumentsV2 } = useDocumentsQuery({ workflowId: workflow?.id ?? '', entityId, }); + const [isProcessingDocuments, setIsProcessingDocuments] = useState(false); const { isDocumentsV2 } = workflow?.workflowDefinition?.config ?? {};
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
apps/backoffice-v2/src/lib/blocks/hooks/useDocumentBlocks/useDocumentBlocks.tsx
(9 hunks)packages/ui/src/components/organisms/Form/DynamicForm/fields/DocumentField/helpers/build-document-form-data/build-document-form-data.ts
(1 hunks)services/workflows-service/prisma/data-migrations
(1 hunks)services/workflows-service/src/collection-flow/dto/update-collection-flow-document.schema.ts
(1 hunks)services/workflows-service/src/document/document.service.ts
(1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- services/workflows-service/prisma/data-migrations
- services/workflows-service/src/collection-flow/dto/update-collection-flow-document.schema.ts
+      if (!workflow?.workflowDefinition?.config?.isDocumentsV2) {
+        mutateApproveTaskById({ documentId: taskId, contextUpdateMethod, comment });
+      }
+
+      if (workflow?.workflowDefinition?.config?.isDocumentsV2) {
+        mutateApproveDocumentById({ documentId: taskId, decisionReason: '', comment });
+      }

       onClearComment();
     },
-    [mutateApproveTaskById, onClearComment],
+    [
+      mutateApproveDocumentById,
+      mutateApproveTaskById,
+      onClearComment,
+      workflow?.workflowDefinition?.config?.isDocumentsV2,
+    ],
   );
🛠️ Refactor suggestion
Add loading state handling for document updates.
The updateById method should handle loading states to provide feedback during updates.
async updateById(
id: string,
projectIds: TProjectId[],
data: Prisma.DocumentUpdateInput,
args?: Prisma.DocumentUpdateManyArgs,
transaction?: PrismaTransactionClient,
) {
+ setIsProcessingDocuments(true);
try {
await this.repository.updateById(id, projectIds, data, args, transaction);
const documents = await this.repository.findManyWithFiles(projectIds);
return this.formatDocuments({
documents,
documentSchema: null,
});
+ } finally {
+ setIsProcessingDocuments(false);
+ }
}
const uploadedFile = await this.fileService.uploadNewFile(projectId, entityId, {
  ...file,
  mimetype:
    file.mimetype ||
    (
      await getFileMetadata({
        file: file.originalname || '',
        fileName: file.originalname || '',
      })
    )?.mimeType ||
    '',
});
Add error handling for file upload.
The file upload process lacks proper error handling and cleanup in case of failures.
Add error handling:
+ let uploadedFile;
+ try {
-  const uploadedFile = await this.fileService.uploadNewFile(projectId, entityId, {
+  uploadedFile = await this.fileService.uploadNewFile(projectId, entityId, {
...file,
mimetype:
file.mimetype ||
(
await getFileMetadata({
file: file.originalname || '',
fileName: file.originalname || '',
})
)?.mimeType ||
'',
});
+ } catch (error) {
+ // Clean up any partially uploaded files
+ if (uploadedFile?.id) {
+ await this.storageService.deleteFile(uploadedFile.id).catch(console.error);
+ }
+ throw new Error(`File upload failed: ${error.message}`);
+ }
const uploadedFile = await this.fileService.uploadNewFile(projectId, entityId, {
  ...file,
  mimetype:
    file.mimetype ||
    (
      await getFileMetadata({
        file: file.originalname || '',
        fileName: file.originalname || '',
      })
    )?.mimeType ||
    '',
});
🛠️ Refactor suggestion
Refactor duplicate file upload logic.
The file upload logic is duplicated in multiple places. Extract it into a reusable method.
Create a helper method:
+ private async uploadFile(
+ projectId: string,
+ entityId: string,
+ file: Express.Multer.File,
+ ): Promise<File> {
+ let uploadedFile;
+ try {
+ uploadedFile = await this.fileService.uploadNewFile(projectId, entityId, {
+ ...file,
+ mimetype:
+ file.mimetype ||
+ (
+ await getFileMetadata({
+ file: file.originalname || '',
+ fileName: file.originalname || '',
+ })
+ )?.mimeType ||
+ '',
+ });
+ return uploadedFile;
+ } catch (error) {
+ if (uploadedFile?.id) {
+ await this.storageService.deleteFile(uploadedFile.id).catch(console.error);
+ }
+ throw new Error(`File upload failed: ${error.message}`);
+ }
+ }
Then use it:
- const uploadedFile = await this.fileService.uploadNewFile(projectId, entityId, {
- ...file,
- mimetype:
- file.mimetype ||
- (
- await getFileMetadata({
- file: file.originalname || '',
- fileName: file.originalname || '',
- })
- )?.mimeType ||
- '',
- });
+ const uploadedFile = await this.uploadFile(projectId, entityId, file);
  documents: Array<{
    type: string;
    category: string;
    decisionReason?: string;
    issuingCountry: string;
    issuingVersion: string;
    version: string;
    entity: {
      id: string;
      type: 'business' | 'ubo' | 'director';
    };
  }>,
) {
  const documentsToCreate = documents.map(document => ({
    category: document.category,
    type: document.type,
    decisionReason: document.decisionReason,
    issuingVersion: document.issuingVersion,
    issuingCountry: document.issuingCountry,
    version: parseInt(document.version),
    status: DocumentStatus.requested,
    properties: {},
    projectId: projectId,
    workflowRuntimeDataId: workflowId,
    businessId: document.entity.type === 'business' ? document.entity.id : undefined,
    endUserId: ['ubo', 'director'].includes(document.entity.type)
      ? document.entity.id
      : undefined,
  }));

  const workflowRuntimeData = await this.workflowService.getWorkflowRuntimeDataById(
    workflowId,
    {
      select: {
        workflowDefinition: true,
        context: true,
      },
    },
    [projectId],
  );

  const uiDefinition = await this.uiDefinitionService.getByWorkflowDefinitionId(
    workflowRuntimeData.workflowDefinitionId,
    'collection_flow',
    [projectId],
  );

  const createdDocuments = await Promise.all(
    documentsToCreate.map(doc => this.repository.create(doc)),
  );

  const contextWithDocuments = createdDocuments.reduce((context, document) => {
    const createdDocument = document;

    if (!createdDocument) {
      return context;
    }

    return addRequestedDocumentToEntityDocuments(
      context,
      document.type as 'business' | 'ubo' | 'director',
      uiDefinition,
      {
        id: createdDocument.id,
        status: DocumentStatus.requested,
        decision: null,
        version: createdDocument.version.toString(),
        type: createdDocument.type,
        category: createdDocument.category,
        issuingCountry: createdDocument.issuingCountry,
        issuingVersion: createdDocument.issuingVersion,
      },
    );
  }, workflowRuntimeData.context);

  const contextWithRevision = setCollectionFlowStatus(
    contextWithDocuments,
    CollectionFlowStatusesEnum.revision,
  );

  await this.workflowService.updateWorkflowRuntimeData(
    workflowId,
    {
      context: contextWithRevision,
    },
    projectId,
  );

  await this.workflowService.event(
    {
      id: workflowId,
      name: CommonWorkflowEvent.REVISION,
      payload: {},
    },
    [projectId],
    projectId,
  );

  return { message: 'Documents requested successfully', count: createdDocuments.length };
}
Add transaction management for document requests.
The document request process involves multiple database operations that should be wrapped in a transaction.
Add transaction management:
async requestDocumentsByIds(
  projectId: TProjectId,
  workflowId: string,
  documents: Array<{
    type: string;
    category: string;
    decisionReason?: string;
    issuingCountry: string;
    issuingVersion: string;
    version: string;
    entity: {
      id: string;
      type: 'business' | 'ubo' | 'director';
    };
  }>,
) {
+ const prisma = this.repository.getPrismaClient();
+ return await prisma.$transaction(async (tx) => {
    const documentsToCreate = documents.map(document => ({
      // ... document creation logic
    }));
    const workflowRuntimeData = await this.workflowService.getWorkflowRuntimeDataById(
      // ... workflow data retrieval
    );
    const createdDocuments = await Promise.all(
      documentsToCreate.map(doc => this.repository.create(doc, undefined, tx)),
    );
    // ... context updates
    await this.workflowService.updateWorkflowRuntimeData(
      // ... workflow updates
      tx
    );
    await this.workflowService.event(
      // ... event handling
      tx
    );
    return { message: 'Documents requested successfully', count: createdDocuments.length };
+ });
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
async requestDocumentsByIds(
  projectId: TProjectId,
  workflowId: string,
  documents: Array<{
    type: string;
    category: string;
    decisionReason?: string;
    issuingCountry: string;
    issuingVersion: string;
    version: string;
    entity: {
      id: string;
      type: 'business' | 'ubo' | 'director';
    };
  }>,
) {
  const documentsToCreate = documents.map(document => ({
    category: document.category,
    type: document.type,
    decisionReason: document.decisionReason,
    issuingVersion: document.issuingVersion,
    issuingCountry: document.issuingCountry,
    version: parseInt(document.version),
    status: DocumentStatus.requested,
    properties: {},
    projectId: projectId,
    workflowRuntimeDataId: workflowId,
    businessId: document.entity.type === 'business' ? document.entity.id : undefined,
    endUserId: ['ubo', 'director'].includes(document.entity.type)
      ? document.entity.id
      : undefined,
  }));
  const workflowRuntimeData = await this.workflowService.getWorkflowRuntimeDataById(
    workflowId,
    {
      select: {
        workflowDefinition: true,
        context: true,
      },
    },
    [projectId],
  );
  const uiDefinition = await this.uiDefinitionService.getByWorkflowDefinitionId(
    workflowRuntimeData.workflowDefinitionId,
    'collection_flow',
    [projectId],
  );
  const createdDocuments = await Promise.all(
    documentsToCreate.map(doc => this.repository.create(doc)),
  );
  const contextWithDocuments = createdDocuments.reduce((context, document) => {
    const createdDocument = document;
    if (!createdDocument) {
      return context;
    }
    return addRequestedDocumentToEntityDocuments(
      context,
      document.type as 'business' | 'ubo' | 'director',
      uiDefinition,
      {
        id: createdDocument.id,
        status: DocumentStatus.requested,
        decision: null,
        version: createdDocument.version.toString(),
        type: createdDocument.type,
        category: createdDocument.category,
        issuingCountry: createdDocument.issuingCountry,
        issuingVersion: createdDocument.issuingVersion,
      },
    );
  }, workflowRuntimeData.context);
  const contextWithRevision = setCollectionFlowStatus(
    contextWithDocuments,
    CollectionFlowStatusesEnum.revision,
  );
  await this.workflowService.updateWorkflowRuntimeData(
    workflowId,
    {
      context: contextWithRevision,
    },
    projectId,
  );
  await this.workflowService.event(
    {
      id: workflowId,
      name: CommonWorkflowEvent.REVISION,
      payload: {},
    },
    [projectId],
    projectId,
  );
  return { message: 'Documents requested successfully', count: createdDocuments.length };
}

async requestDocumentsByIds(
  projectId: TProjectId,
  workflowId: string,
  documents: Array<{
    type: string;
    category: string;
    decisionReason?: string;
    issuingCountry: string;
    issuingVersion: string;
    version: string;
    entity: {
      id: string;
      type: 'business' | 'ubo' | 'director';
    };
  }>,
) {
  const prisma = this.repository.getPrismaClient();
  return await prisma.$transaction(async (tx) => {
    const documentsToCreate = documents.map(document => ({
      category: document.category,
      type: document.type,
      decisionReason: document.decisionReason,
      issuingVersion: document.issuingVersion,
      issuingCountry: document.issuingCountry,
      version: parseInt(document.version),
      status: DocumentStatus.requested,
      properties: {},
      projectId: projectId,
      workflowRuntimeDataId: workflowId,
      businessId: document.entity.type === 'business' ? document.entity.id : undefined,
      endUserId: ['ubo', 'director'].includes(document.entity.type)
        ? document.entity.id
        : undefined,
    }));
    const workflowRuntimeData = await this.workflowService.getWorkflowRuntimeDataById(
      workflowId,
      {
        select: {
          workflowDefinition: true,
          context: true,
        },
      },
      [projectId],
    );
    const uiDefinition = await this.uiDefinitionService.getByWorkflowDefinitionId(
      workflowRuntimeData.workflowDefinitionId,
      'collection_flow',
      [projectId],
    );
    const createdDocuments = await Promise.all(
      documentsToCreate.map(doc => this.repository.create(doc, undefined, tx)),
    );
    const contextWithDocuments = createdDocuments.reduce((context, document) => {
      const createdDocument = document;
      if (!createdDocument) {
        return context;
      }
      return addRequestedDocumentToEntityDocuments(
        context,
        document.type as 'business' | 'ubo' | 'director',
        uiDefinition,
        {
          id: createdDocument.id,
          status: DocumentStatus.requested,
          decision: null,
          version: createdDocument.version.toString(),
          type: createdDocument.type,
          category: createdDocument.category,
          issuingCountry: createdDocument.issuingCountry,
          issuingVersion: createdDocument.issuingVersion,
        },
      );
    }, workflowRuntimeData.context);
    const contextWithRevision = setCollectionFlowStatus(
      contextWithDocuments,
      CollectionFlowStatusesEnum.revision,
    );
    await this.workflowService.updateWorkflowRuntimeData(
      workflowId,
      {
        context: contextWithRevision,
      },
      projectId,
      tx
    );
    await this.workflowService.event(
      {
        id: workflowId,
        name: CommonWorkflowEvent.REVISION,
        payload: {},
      },
      [projectId],
      projectId,
      tx
    );
    return { message: 'Documents requested successfully', count: createdDocuments.length };
  });
}
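As background for the suggestion above, Prisma's interactive-transaction API works roughly as sketched below. The `Document` model name comes from this PR's schema changes, while `getPrismaClient` and repository methods accepting a `tx` handle are assumptions the suggestion makes about this codebase rather than confirmed APIs:

import { Prisma, PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

// Every write issued through `tx` belongs to one transaction; if the callback
// throws, Prisma rolls all of them back, so no orphaned Document rows remain.
export const createDocumentsAtomically = (documents: Prisma.DocumentUncheckedCreateInput[]) =>
  prisma.$transaction(async tx => {
    const created = await Promise.all(documents.map(data => tx.document.create({ data })));

    // Workflow-context updates and events would also run through `tx` here,
    // so they commit or roll back together with the document rows.
    return created;
  });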
Summary by CodeRabbit