For long-term data storage, conventional hard drives with magnetic platters remain common, simply because they are cheaper. But users who want to build a "cold archive" on a solid-state drive should understand that they risk losing some of that data within a few years.

Image source: Leven

At least, that is the conclusion blogger HTWingNut drew from an experiment in which four 128 GB Leven JS-600 solid-state drives based on TLC NAND were used to store 100 GB of data while disconnected from a PC. The experiment began two years ago; two of the four drives were new at the time, while the other two had already accumulated 280 TB of writes, far beyond the manufacturer's rated endurance of no more than 60 TB.

To check data integrity, every file was accompanied by a checksum, and the experimenter re-verified the disconnected SSDs roughly once a year. After the first year of storage everything looked intact even on the "worn-out" drives, but verification took progressively longer: by the second year the time required for the operation had grown severalfold. In addition, the number of blocks recovered via ECC rose noticeably, although this did not always lead to data loss.
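The article does not say which tool was used to generate the checksums, so the following is only a minimal sketch of the general approach: build a manifest of per-file hashes before putting the drive into storage, then re-run verification against that manifest each year. File names and the choice of SHA-256 are illustrative assumptions, not details from the experiment.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so large archives need not fit in RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its checksum."""
    return {str(p.relative_to(root)): sha256_of(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

def verify(root: Path, manifest: dict[str, str]) -> list[str]:
    """Return the relative paths whose contents no longer match the manifest."""
    return [rel for rel, digest in manifest.items()
            if sha256_of(root / rel) != digest]
```

In practice the manifest would be saved to a separate, trusted location (not only on the cold drive itself), so that a verification failure can be attributed to the archived copy.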

In addition, HD Sentinel reported three faulty sectors out of ten thousand on the worn-out drives, and after two years their transfer speeds began to fluctuate, which hurt the drives' benchmark performance. The experimenter concluded that even unused solid-state drives that have already had a large volume of data written to them carry an elevated risk of losing some data during long-term storage without being connected to a PC.
