The tool rdfind gives higher priority to keeping files according to the order of the directories on the command line. This is very useful for removing duplicates from decades-old, unstructured manual backup folders. Consider the following Pictures folders, all containing roughly the same files:
Data/Pictures
Backup-2024/Pictures
Backup-2022/Pictures
Drive-D/Data/Pictures
Data/OldBackups/Drive-D/Data/Pictures
Imagine the above list is sorted from most to least recent.
Files in directory 1 should be kept and removed from all other folders.
If a file exists only in directories 2 and 3, it should be kept in 2 only.
For prior duplicate removal within directory 1 alone, something like --priority least-nested could be used.
From my experiments with fclones (with a dry run) on a structure similar to the above, the output of the group command would have to be sorted in this way; however, this does not currently seem to be the case.
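To make the requested behavior concrete, here is a minimal sketch of the keep rule described above. This is a hypothetical helper, not part of rdfind or fclones: for each group of duplicate paths, keep the copy whose root directory appears earliest in the priority list and mark the rest for removal.

```python
# Directory roots in keep-priority order (most recent first),
# matching the example folder list above.
PRIORITY = [
    "Data/Pictures",
    "Backup-2024/Pictures",
    "Backup-2022/Pictures",
    "Drive-D/Data/Pictures",
    "Data/OldBackups/Drive-D/Data/Pictures",
]

def rank(path: str) -> int:
    """Index of the first priority root containing the path."""
    for i, root in enumerate(PRIORITY):
        if path.startswith(root + "/"):
            return i
    return len(PRIORITY)  # paths outside all roots lose every tie

def split_group(paths):
    """For one duplicate group, return (keep, remove_list)."""
    keep = min(paths, key=rank)
    return keep, [p for p in paths if p is not keep]

# A file present only in directories 2 and 3 is kept in 2:
keep, remove = split_group([
    "Backup-2022/Pictures/a.jpg",
    "Backup-2024/Pictures/a.jpg",
])
print(keep)    # Backup-2024/Pictures/a.jpg
print(remove)  # ['Backup-2022/Pictures/a.jpg']
```

Sorting each group of the fclones group output by this kind of rank before removal would give the rdfind-style behavior requested here.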