
Save the progress of the duplicate search


Tony Krijnen

I have to remove a very large number of duplicate files from a 12TB disk. It's a lot, around 5 million files.
After the scanning process had been running non-stop for a week, it showed that 7TB was already duplicate files. Yesterday the scan had about 7k files left, and I went to bed. In the morning the PC had rebooted and all progress was gone :'(
I have to start all over... Is there any way you could persist the progress of a running job, so that if the application is killed or the PC crashes (power failure), the app finds the latest task on the next start, sees it wasn't finished, and asks me whether I want to continue where I left off?
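The resume behavior described above can be sketched roughly like this: hash files in a loop and periodically write the completed results to a checkpoint file that is replaced atomically, so a restart picks up from the last checkpoint instead of starting over. This is a minimal illustration, not how TreeSize actually works; the checkpoint file name and all helper functions are hypothetical.

```python
import hashlib
import json
import os

CHECKPOINT = "scan_checkpoint.json"  # hypothetical checkpoint file name


def file_hash(path, chunk_size=1 << 20):
    """MD5 of a file, read in chunks so large files don't exhaust memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def walk_files(root):
    """Yield every file path under root."""
    for dirpath, _, names in os.walk(root):
        for name in names:
            yield os.path.join(dirpath, name)


def load_checkpoint():
    """Resume from a previous run if a checkpoint file exists."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {"done": {}}


def save_checkpoint(state):
    """Write atomically: temp file + rename survives a crash mid-write."""
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
    os.replace(tmp, CHECKPOINT)


def scan(root, checkpoint_every=1000):
    """Hash all files under root, checkpointing every N files."""
    state = load_checkpoint()
    done = state["done"]  # path -> md5, already-hashed files are skipped
    pending = [p for p in walk_files(root) if p not in done]
    for i, path in enumerate(pending, 1):
        done[path] = file_hash(path)
        if i % checkpoint_every == 0:
            save_checkpoint(state)
    save_checkpoint(state)
    return done
```

After a crash, calling `scan()` again only hashes the files that are missing from the checkpoint, so a week of hashing is not lost.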




Team TreeSize

Hi Tony, we have a more advanced disk space manager called "SpaceObServer", which collects the metadata using a background service and stores all the collected data in a database. This optionally includes MD5 hashes. That way it can list duplicates within a few seconds from its database.
You can get a trial from: www.jam-software.com/spaceobserver/
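The database-backed approach described above can be illustrated with a small sketch (assuming a plain SQLite schema invented for this example; SpaceObServer's actual schema and background service are not shown): once path, size, and MD5 are stored in an indexed table, duplicate groups fall out of a single GROUP BY query instead of a fresh full-disk scan.

```python
import sqlite3


def build_index(records):
    """records: iterable of (path, size, md5) rows collected by a scanner.
    Stores them in SQLite with an index on md5 for fast duplicate lookup."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE files (path TEXT PRIMARY KEY, size INTEGER, md5 TEXT)")
    db.execute("CREATE INDEX idx_md5 ON files (md5)")
    db.executemany("INSERT INTO files VALUES (?, ?, ?)", records)
    return db


def duplicate_groups(db):
    """Hashes that occur more than once, with the bytes reclaimable per group
    (total size minus one copy that would be kept)."""
    return db.execute(
        "SELECT md5, COUNT(*) AS copies, SUM(size) - MIN(size) AS reclaimable "
        "FROM files GROUP BY md5 HAVING COUNT(*) > 1"
    ).fetchall()
```

Because the hashes are persisted, re-running the duplicate query is a database lookup rather than a week-long rescan.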