Speed degradation after heavy testing
On this page I will look at how the SSD performs after heavy
testing and usage, and also how performance changes as the amount of data stored
on the SSD increases.
I now have a new policy as to how I go about testing an SSD.
In the past I would deliberately try and get an SSD into a “used state”, by
filling the drive several times before starting the tests. This seemed to work
quite well up until the SandForce based SSDs appeared, but because of the way
the SandForce controller works, it was near impossible to tell if deliberately
trying to get a SandForce based SSD into a “used state” had actually worked or
not. A new strategy was required. So now I begin the tests with
the SSD in a clean state and allow it to look after itself during the testing
period. I start off the tests by running AS SSD benchmark. This gives me the
“as new” reading and writing performance of the SSD.
Once all the tests have been completed, the drive is then
tested as a system drive, and just used normally for many days, which also
includes idle time (this is something I have always done with a review sample).
At the end of the period, the drive is filled to capacity and then all files
are deleted from the drive and then a “quick format” is performed.
The last test is a rerun of AS SSD benchmark, and the result
from the final test is compared with the first run, when the SSD was in an “as
new” state.
Let’s find out what happens.
New state 21/02/2013
Used state 3/03/2013
With 4.16 Terabytes of data already written to the drive
during a testing period of 11 days, one would have expected the performance to
have dropped off slightly. This wasn’t the case with the Samsung 840, which is
rather strange: it actually increased in speed when this test was run. That
really sums up how the Samsung 840 performs, and I’ll elaborate on this on
the conclusion page of this article.
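For context, the write volume quoted above works out to the following average daily rate. This is just a back-of-the-envelope calculation using the figures from the review, with decimal units assumed:

```python
# Figures from the review: 4.16 TB written over an 11-day testing period.
tb_written = 4.16
days = 11

tb_per_day = tb_written / days
gb_per_day = tb_per_day * 1000  # decimal GB assumed
print(round(gb_per_day, 1), "GB per day")  # ~378.2 GB per day
```

That is far more than a typical desktop user would write in a day, which puts the drive's steady performance into perspective.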
Filling up the SSD with data
For obvious reasons, when an SSD is tested, the drive is
always tested as a spare drive, and is empty (no data on the
drive) during the synthetic benchmarks. There is no other way of having a level
playing field for all the SSDs under test. This of course changes during the real
world tests we conduct here at Myce.com.
Real users of course don’t buy an SSD for it to remain
empty, and how full the SSD will eventually become varies from one user to the
next. What I thought would be useful is to run tests on the SSDs with real data
on the drives, at different fill levels.
For these tests the SSD is connected as a spare, and I test
at three different levels.
- Level 1: There is an operating system installed on the
SSD, and all the applications that I use are also installed. In my case
that amounts to approximately 44GB of data on the SSD.
- Level 2: The SSD is filled to 60% of its formatted capacity.
- Level 3: The SSD is filled to 80% of its formatted capacity.
For the 60% and 80% tests, the type of data varies from
compressible to incompressible, and file sizes range from a few Kilobytes
to very large files of several Gigabytes. A single run of Anvil’s SSD
Benchmark (100% incompressible) is then performed.
It is also worth noting that the larger capacity SSDs will
tend to slow down less than their smaller counterparts, as the larger SSDs will
have more free NAND available to work with.
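The amount of data needed to reach each fill level is simply a percentage of the drive's formatted capacity. A minimal sketch, where the 232.9 GB formatted capacity for a 250 GB drive is my own assumption (the usual decimal-to-binary conversion) and not a figure quoted in the review:

```python
def fill_targets(formatted_capacity_gb):
    """Return the amount of data (in GB) needed to reach each test level.

    The 60% and 80% levels are those used in the article; the capacity
    passed in is whatever the OS reports as the formatted size.
    """
    return {
        "60%": formatted_capacity_gb * 0.60,
        "80%": formatted_capacity_gb * 0.80,
    }

# Assumed formatted capacity for a 250 GB drive, not a quoted figure.
targets = fill_targets(232.9)
print(round(targets["60%"], 1), round(targets["80%"], 1))  # 139.7 186.3
```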
Level 1: Operating system and applications installed.
Samsung 840 250GB SSD – Operating system and applications installed.
Level 2: SSD filled to 60% of its formatted capacity.
Samsung 840 250GB SSD – Filled to 60% of the drive’s formatted capacity.
Level 3: SSD filled to 80% of its formatted capacity.
Samsung 840 250GB SSD – Filled to 80% of the drive’s formatted capacity.
In the graph below, I present the results.
Filling up an SSD with data can certainly cause a slowdown
on some SSDs, as we can see from the graph above. Filling up the Samsung
840 with data caused no such problems, and it maintains its performance
extremely well in this short burst test.
Myce Sustainable Performance Test
Over the last four months I have been studying countless
analyzer traces of real computing workloads, and also developing a test that
would accurately emulate and measure how performance is sustained over a period
of time. For obvious reasons, it is not possible to test an SSD review sample
over several months before publishing a review. The solution was to condense
this down to a manageable test that doesn’t take too long to run.
I will make it clear right from the outset that this is not
a torture test. Bringing any SSD to its knees is not helpful in the least, as I
for one would not use any SSD that had slowed to a crawl, just to prove a
point. The Myce Sustainable Performance test is, I believe, a tough but
sensible test pattern for measuring how an SSD will behave once it’s pushed
hard over a period of time.
The test pattern is "workstation" based, and
closely emulates a typical video or graphics workstation environment. The
results are measured using the same hardware I use for the Myce Reality Suite
tests; however, the test data and the measurement method are different.
From the 80% full test listed above, I already have an SSD
with a lot of data on it. The "Sustainable Performance" test data is then
added to the data that is already there. This test data is
approximately 20GB in size, so once it is added the SSD is pretty full.
The test is then run for a period of 20 minutes. 60
performance measurements are taken for every minute of the test, and an average
performance figure is generated after each minute. At the end of the test I
have 20 performance measurements, which are then used to generate the graph below.
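The averaging scheme just described can be sketched as follows. The throughput figures here are random placeholders standing in for the real per-second samples, purely to show the shape of the calculation:

```python
import random

def per_minute_averages(samples_per_min=60, minutes=20):
    """Reduce raw per-second throughput samples to one figure per minute.

    Mirrors the measurement method described above: 60 samples are taken
    each minute and averaged, giving 20 data points for a 20-minute run.
    The sample values are simulated, not real benchmark results.
    """
    averages = []
    for _ in range(minutes):
        # Simulated MB/s readings; a real run would record actual throughput.
        samples = [random.uniform(100, 300) for _ in range(samples_per_min)]
        averages.append(sum(samples) / samples_per_min)
    return averages

points = per_minute_averages()
print(len(points))  # 20 data points, one per minute
```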
The faster SSDs will obviously sustain more writes than the
slower SSDs. For the fastest SSD in this test, the test pattern generated 146GB
of writes, and 193GB of data was read from the SSD during the test.
When reading the graph, you should not pay too much
attention to which drive is the fastest, but instead look at the sustainable
performance curve of each SSD, as this is what this test is all about.
For the SSD that I am reviewing, I will also add a second
graph which looks at the result in more detail.
So let’s look at the results.
Sustainable Performance test
Detailed results for the review drive
We knew from the previous Anvil’s SSD benchmark tests that
the Samsung 840 could maintain performance in a short burst test when it was
pretty full of data. The Myce Sustainable Performance test is a much tougher
challenge for any SSD. The test pattern used for the test is workstation based,
and we already know from the IOMeter workstation test run that the Samsung 840
is not strong in a simulated workstation environment.
What this test does show is that garbage collection on the Samsung
840 can’t keep pace with the demands of the test, and the 840 slows down quite
considerably.
This concludes our review. To read the final thoughts and
conclusion, click the link below.