I've got three folders, each containing some 20-30k photos. Searching for duplicates (with or without cache enabled) is a matter of seconds, during which it says
"Analyzed full hash of ... / 965 files."
It finds no duplicates (which is okay, if true). If I manually "create" a duplicate, it finds that as well.
What irks/confuses me is that it says "965 files" rather than ~100k files. How can I be sure that all files were checked/compared?
In the past, a full-hash dupe check on that many files took hours, not seconds or minutes. With the cache disabled in settings, it takes a few seconds longer. But time aside: if the three folders contain >100k files, why does it claim to check <1k of them?
PS: Corollary: Is $all_files below the count of all files checked and compared, or only a (random?) subset for which a hash was analysed:
progress_analyzed_full_hash = Analyzed full hash of {$file_checked}/{$all_files} files
from https://fossies.org/linux/czkawka/czkawka_gui/i18n/en/czkawka_gui.ftl
btw, fdupes (on the command line) says:
Progress [ ... / 74918] 25%
and finds no duplicates either. Still, I'd be happier seeing the correct count of 74918 rather than 965...
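For what it's worth, one plausible explanation for the small denominator (this is an assumption about the design, not a statement about czkawka's actual code): many duplicate finders first group files by size, since files with a unique size cannot have a duplicate, and only size-colliding files ever reach the hashing stage. The progress counter would then report only that smaller candidate set, while fdupes apparently reports the total scanned count. A minimal sketch of that prefiltering idea:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Return groups of byte-identical files under `root`.

    Illustrative sketch only -- not czkawka's implementation.
    """
    # Stage 1: group files by size. A file whose size is unique
    # cannot have a duplicate, so it is never hashed at all.
    by_size = defaultdict(list)
    for p in Path(root).rglob("*"):
        if p.is_file():
            by_size[p.stat().st_size].append(p)

    # Only size-colliding files reach the hashing stage. On a
    # mostly-unique photo collection this candidate set can be
    # far smaller than the total file count (e.g. 965 vs ~100k).
    candidates = [ps for ps in by_size.values() if len(ps) > 1]

    # Stage 2: full hash of the remaining candidates.
    by_hash = defaultdict(list)
    for group in candidates:
        for p in group:
            digest = hashlib.sha256(p.read_bytes()).hexdigest()
            by_hash[digest].append(p)
    return [ps for ps in by_hash.values() if len(ps) > 1]
```

Under that reading, "{$all_files}" would be the size of the candidate set, not the total number of scanned files, and all files would still have been checked, just mostly eliminated before hashing.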