What is Data Wiping?

Data wiping is the process of eliminating all of the data stored on a hard drive or other data storage device. Under normal circumstances, data that is deleted isn't actually gone for good. Instead, all the pointers and indexes that reference that file are simply removed. While this renders the file inaccessible by normal means, much of this data can still be recovered through advanced data recovery techniques.

But what happens if you want to erase the contents of a hard drive for good? What if you don't want the data to be recoverable at all? In that case, you'll need to wipe your data completely.

The Basics of Data Wiping
Most modern data wiping solutions function in a similar manner. Instead of deleting your data outright, which would result in data that could potentially be recovered, these solutions actually overwrite the entire contents of your drive or data storage device. By writing a series of binary 1s and 0s in a random fashion, these data wiping solutions effectively render your data unrecoverable.

Some programs take this process even further with multiple passes. For example, the first pass might involve overwriting your data with all 0s, while the next pass will follow that up by overwriting the 0s with all 1s. While a greater number of passes ultimately makes your original data that much harder to recover, a single pass is good enough in most scenarios.
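A multi-pass overwrite can be sketched in a few lines of Python. This is a simplified illustration of the technique described above, not a production wiping tool; the function name and pass patterns are our own. Note that on SSDs with wear leveling, a file-level overwrite may not actually touch the original physical blocks, so dedicated wiping software works at a lower level.

```python
import os

def wipe_file(path, passes=(b"\x00", b"\xFF", None)):
    """Overwrite a file in place, one pass per pattern.
    A pattern of None means random bytes (via os.urandom)."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for pattern in passes:
            f.seek(0)
            remaining = size
            while remaining > 0:
                chunk = min(remaining, 64 * 1024)
                data = os.urandom(chunk) if pattern is None else pattern * chunk
                f.write(data)
                remaining -= chunk
            f.flush()
            os.fsync(f.fileno())  # force this pass to disk before starting the next
    os.remove(path)
```

The default pass sequence (zeros, ones, random) mirrors the three-pass scheme discussed below; a single random pass would be `passes=(None,)`.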

Data Wiping Algorithms and Standards
While one pass with a modern data wiping solution is good enough for the average computer user, some organizations and institutions maintain stringent standards regarding data wiping, including how often data should be wiped, how many passes the software should perform, and other rules. Some of the most common algorithms and standards are listed below.

  • DOD 5220.22-M: This method is used by the Department of Defense (DoD) and the U.S. National Industrial Security Program (NISP). It involves writing a series of 0s in the first pass, 1s in the second pass, and then a random character in the third and final pass.
  • NCSC-TG-025: Published by the U.S. National Security Agency (NSA), this method uses the same overwrite pattern as DOD 5220.22-M. However, it gives the user the option to customize the process by adding more overwrite passes when needed.
  • HMG IS5: Published and maintained by the Communications Electronics Security Group (CESG), the baseline version of HMG IS5 writes 0s during the first pass and a random character in the second and final pass. The HMG IS5 Enhanced algorithm writes 0s in the first pass, 1s in the second pass, and a random character in the third.

Although this isn't meant as an exhaustive list of the current data wiping algorithms and standards, it does cover the most common ones in use today.
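The pass sequences described above can be summarized in a small lookup table. This is a sketch based only on the baseline descriptions given here, not the complete official specifications; `None` denotes a random-byte pass.

```python
# Pass patterns for the standards described above.
# 0x00 = all zeros, 0xFF = all ones, None = random bytes.
WIPE_STANDARDS = {
    "DOD 5220.22-M":    [0x00, 0xFF, None],
    "NCSC-TG-025":      [0x00, 0xFF, None],  # same pattern; extra passes optional
    "HMG IS5 Baseline": [0x00, None],
    "HMG IS5 Enhanced": [0x00, 0xFF, None],
}

for name, passes in WIPE_STANDARDS.items():
    desc = ", ".join("random" if p is None else f"0x{p:02X}" for p in passes)
    print(f"{name}: {len(passes)} passes ({desc})")
```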

Wiping Your Data
Remember, it isn't enough to simply delete your data. To truly render your data unrecoverable, it's essential that you perform at least one pass with a full-scale data wiping solution that follows one of the common data wiping algorithms.

You can read more about data wiping in the Wikipedia article Data erasure.
