fix: migrate affected datasets where size numberOfFiles packedSize and numberOfFilesArchived were overwritten #1854
base: master
Conversation
const datasetSizeSum = origDatablockSizeSum + datablockSizeSum;
Are you sure this is the way it is calculated?
I think the dataset total size is the sum of the sizes of the files listed in the origDatablock.
If you look at this line, for example, on the creation of an origDatablock: https://github.com/SciCatProject/scicat-backend-next/blob/master/src/datasets/datasets.controller.ts#L1945
Or this line in the update: https://github.com/SciCatProject/scicat-backend-next/blob/master/src/datasets/datasets.controller.ts#L2101
We can see that the `size` field is used, not the sum of the files listed.
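To make the two readings being compared concrete, here is a minimal sketch (the data is hypothetical; `size` stands for the aggregate field stored on each origDatablock, and `dataFileList` for its list of files, as in the SciCat schema):

```javascript
// Hypothetical origDatablocks for one dataset. Each block carries a stored
// aggregate `size` field alongside its list of files (`dataFileList`).
const origDatablocks = [
  { size: 300, dataFileList: [{ size: 100 }, { size: 200 }] },
  { size: 50, dataFileList: [{ size: 50 }] },
];

// Reading 1: trust the stored `size` field on each block
// (what the controller lines linked above do on create/update).
const sizeFromField = origDatablocks.reduce((sum, b) => sum + b.size, 0);

// Reading 2: recompute from the individual files listed in each block.
const sizeFromFiles = origDatablocks.reduce(
  (sum, b) => sum + b.dataFileList.reduce((s, f) => s + f.size, 0),
  0,
);

console.log(sizeFromField, sizeFromFiles); // 350 350 when the field is consistent
```

The two agree as long as each block's `size` field was written correctly; the migration follows the stored field, matching the controller's behavior.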
const datasetNumberOfFilesSum =
  origDatablockNumberOfFilesSum + datablockNumberOfFilesSum;
This should include only the origDatablock number of files.
If we check the datasets controller, you can see that even the createDatablock endpoint updates `numberOfFiles`: https://github.com/SciCatProject/scicat-backend-next/blob/master/src/datasets/datasets.controller.ts#L2230
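Following that controller behavior, the file count combines both block types. A minimal sketch (the block contents are illustrative, not taken from the PR):

```javascript
// Illustrative blocks: origDatablocks list the raw files, datablocks the
// archived copies. Per the controller, `numberOfFiles` counts both.
const origDatablocks = [{ dataFileList: [{}, {}, {}] }]; // 3 files
const datablocks = [{ dataFileList: [{}, {}, {}] }]; // 3 archived files

const countFiles = (blocks) =>
  blocks.reduce((sum, b) => sum + b.dataFileList.length, 0);

const datasetNumberOfFilesSum = countFiles(origDatablocks) + countFiles(datablocks);
console.log(datasetNumberOfFilesSum); // 6
```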
migrations/20250423114619-populate-overwritten-datasets-size.js
looks fine to me
@nitrosx let me know your final thoughts on this one. As far as I know, you left some comments earlier.
Description
This PR provides a migration script for dataset properties that were overwritten by a bug that has since been found and fixed in the code.
Motivation
`size`, `numberOfFiles`, `packedSize`, and `numberOfFilesArchived` were all overwritten by a bug in the code, and those values needed to be fixed.

Fixes
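The arithmetic behind the migration can be sketched as a pure helper that recomputes the four affected aggregates from a dataset's blocks. This is a sketch, not the actual script (which lives in migrations/20250423114619-populate-overwritten-datasets-size.js); the helper name and the field semantics assumed here (datablocks carry `size` and `packedSize`, both block types carry `dataFileList`) mirror the sums seen in the diff but are assumptions:

```javascript
// Hypothetical helper mirroring the migration's arithmetic: recompute a
// dataset's aggregate fields from its origDatablocks and datablocks.
function recomputeDatasetTotals(origDatablocks, datablocks) {
  const sum = (blocks, pick) => blocks.reduce((acc, b) => acc + pick(b), 0);
  return {
    // size sums both block types, as in the PR's datasetSizeSum.
    size: sum(origDatablocks, (b) => b.size || 0) + sum(datablocks, (b) => b.size || 0),
    // packedSize only exists on archived datablocks.
    packedSize: sum(datablocks, (b) => b.packedSize || 0),
    // numberOfFiles counts files from both block types, per the controller.
    numberOfFiles:
      sum(origDatablocks, (b) => (b.dataFileList || []).length) +
      sum(datablocks, (b) => (b.dataFileList || []).length),
    // numberOfFilesArchived counts only the archived (datablock) files.
    numberOfFilesArchived: sum(datablocks, (b) => (b.dataFileList || []).length),
  };
}

const totals = recomputeDatasetTotals(
  [{ size: 300, dataFileList: [{}, {}] }],
  [{ size: 100, packedSize: 120, dataFileList: [{}, {}] }],
);
console.log(totals); // { size: 400, packedSize: 120, numberOfFiles: 4, numberOfFilesArchived: 2 }
```

The real migration would apply these recomputed values back to each affected dataset document.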
Changes:
Tests included
Documentation