I believe that my first NAS (a DS411j) caused bitrot on some of my files. I did, however, have copies of some of the same data on other storage. Over the years, I moved this data to a couple of newer Synology NASes, and I suspect that some of it degraded along the way.
My question is: is there a built-in feature, a package, or software that can be run on the NAS in a container that could go through all of the files on the NAS and find the ones that are corrupted?
More importantly, can it find duplicate files and show which ones are good and which are corrupted?
And how to prevent this problem from happening in the future?
I do have a backup.
But how do I automatically replace all corrupted files without replacing all of the files (the backup may also contain some bitrot-affected files)?
Is there an app that can automatically identify corrupted files and replace them with good ones from another source?
It may be too late for that now. To attempt this, you would perform a file-by-file comparison using command-line tools such as diff (Linux) or fc (File Compare, Windows). If a difference is detected, only you know which file is correct (as it's possible that the "backup" copy is the corrupt one).
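A minimal sketch of that comparison on Linux, using throwaway example directories (the paths and filenames here are illustrative, not your actual shares):

```shell
# Demonstration: build two small trees, then use `diff -rq` to list
# files whose contents differ. Note that diff can only tell you THAT
# the copies diverge, not which copy is the good one.
mkdir -p /tmp/primary /tmp/backup
printf 'intact\n'    > /tmp/primary/doc.txt
printf 'intact\n'    > /tmp/backup/doc.txt
printf 'good data\n' > /tmp/primary/photo.raw
printf 'bitrot???\n' > /tmp/backup/photo.raw

# -r recurses into subdirectories, -q prints one line per mismatch
# instead of the full content diff:
diff -rq /tmp/primary /tmp/backup
# -> Files /tmp/primary/photo.raw and /tmp/backup/photo.raw differ
```

On large media collections this reads every byte of both trees, so expect it to take a long time over a network share; running it locally on the NAS (via SSH or a container) avoids pulling everything over the wire twice.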
It requires user action. Long story short: you must enable data checksums for each shared folder when it is created; you cannot enable them afterwards. Second, you periodically (once every six months or so) run a process called data scrubbing.
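On DSM, data scrubbing is normally scheduled from Storage Manager, but for reference the underlying Btrfs operation looks roughly like this from an SSH shell (the volume path is an example, and this is a sketch, not the officially supported route on Synology):

```shell
# Start a scrub: reads every block on the volume and verifies it
# against the stored checksums. On volumes with redundancy (e.g. SHR
# or RAID with data checksums), detected bad blocks can be repaired
# from a good copy. Requires root.
btrfs scrub start /volume1

# Check progress and any checksum errors found so far:
btrfs scrub status /volume1
```

Scrubbing only helps if the checksums exist, which is why the per-shared-folder checksum option has to be ticked at creation time.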
Besides using Btrfs and doing regular data scrubs (every six months), I also keep checksum files in each folder, which I regularly verify. This works great for data that does not change (if a file changes, you need to regenerate its checksum). Check out Diglloyd's "integritychecker java" …
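The per-folder checksum-file approach can be sketched with plain `sha256sum`, which ships with GNU coreutils and BusyBox and so is usually present on a NAS shell (the folder and filenames below are examples):

```shell
# Keep a checksum manifest next to static data, then verify it on a
# schedule; any silently flipped bit makes the matching line fail.
mkdir -p /tmp/archive && cd /tmp/archive
printf 'stable content\n' > file.bin

# Generate the manifest once, right after the data is written:
sha256sum file.bin > SHA256SUMS

# Re-run periodically to detect corruption:
sha256sum -c SHA256SUMS
# -> file.bin: OK
```

For a whole folder you would generate the manifest with something like `find . -type f ! -name SHA256SUMS -exec sha256sum {} + > SHA256SUMS`, and remember to regenerate it whenever files legitimately change.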
I wrote my own years ago (PHP code … running on Windows, macOS, and DSM). Yes: it discovered errors … not bitrot, but errors during transfers, I think.