Filling up the SSD with data
For obvious reasons, when an SSD is benchmarked it is tested as a spare drive, and is generally empty (no data on the drive) during the synthetic benchmarks. There is no other way of providing a level playing field for all the SSDs under test. This of course changes during the real world tests we conduct here at Myce.com.
Real users of course don’t buy an SSD for it to remain empty, and how full the SSD eventually becomes varies from one user to the next. What I thought would be useful is to run tests on the SSDs with real data on the drives, and at different levels of fullness.
For these tests the SSD is connected as a spare, and I test
at three different levels.
- Level 1: There is an operating system installed on the SSD, along with all the applications that I use. In my case that amounts to approximately 44GB of data on the SSD.
- Level 2: The SSD is filled to 60% of its formatted capacity.
- Level 3: The SSD is filled to 80% of its formatted capacity.
For the 60% and 80% tests, the type of data varies from compressible to incompressible, and file sizes range from a few kilobytes to very large files of several gigabytes. A single run of Anvil’s SSD Benchmark (100% incompressible) is then carried out.
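The fill procedure can be sketched in a few lines of Python. This is not the tooling used for the review, just a hypothetical illustration of the idea: keep writing alternating compressible (repeating bytes) and incompressible (random bytes) files until the drive reaches the target percentage of its formatted capacity. All function names here are my own.

```python
import os
import shutil

def bytes_to_write(total_bytes, used_bytes, percent):
    """How many more bytes must be written to reach `percent` of capacity."""
    return max(0, int(total_bytes * percent / 100) - used_bytes)

def fill_to_percent(target_dir, percent, chunk_mb=64):
    """Alternate compressible (repeating) and incompressible (random) files
    until disk usage reaches the target percentage."""
    chunk = chunk_mb * 1024 * 1024
    i = 0
    while True:
        usage = shutil.disk_usage(target_dir)
        remaining = bytes_to_write(usage.total, usage.used, percent)
        if remaining <= 0:
            break
        size = min(chunk, remaining)
        # even-numbered files compress well, odd-numbered ones do not
        data = b"\x00" * size if i % 2 == 0 else os.urandom(size)
        with open(os.path.join(target_dir, f"fill_{i:06d}.bin"), "wb") as f:
            f.write(data)
        i += 1
```

Mixing compressible and incompressible data matters because some controllers (notably SandForce-based ones) compress on the fly, so an all-compressible fill would flatter them.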
It is also worth noting that larger capacity SSDs will tend to slow down less than their smaller counterparts, as they have more free NAND available to work with. This is only a quick burst test, which all members will be able to run for themselves. The real test is the Myce Sustained Performance test, which you can find a little further down the page.
In the graph below, I present the results.
Filling up an SSD with data can certainly cause a slowdown on some SSDs. However, filling up the Samsung 840 EVO mSATA 1000GB SSD with data had no real effect on its performance. Of course, this is just a quick burst test.
Myce Sustainable Performance Test
Over the last few months I have been studying countless analyzer traces of real computing workloads, and developing a test that would accurately emulate and measure how performance is sustained over a period of time. For obvious reasons, it is not possible to test an SSD review sample over several months before publishing a review. The solution was to condense this down to a manageable test that doesn’t take too long to run.
I will make it clear right from the outset that this is not a torture test. Bringing any SSD to its knees is not helpful in the least, as I for one would not use an SSD that had slowed to a crawl just to prove a point. The Myce Sustainable Performance test is, I believe, a tough but sensible test pattern for measuring how an SSD will behave once it’s pushed hard over a period of time.
The test pattern is "workstation" based, and closely emulates a typical video or graphics workstation environment. The results are measured using the same hardware I use for the Myce Reality Suite tests; however, the test data and measuring system use a different method.
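A "workstation" pattern generally means a read-heavy mix of transfer sizes issued in a pseudo-random stream. The article does not publish the exact mix, so the read ratio and block sizes in the sketch below are placeholder assumptions of mine, loosely consistent with the read-heavy totals reported later in the review; it only illustrates the shape of such a generator.

```python
import random

READ_RATIO = 0.6                      # assumed fraction of reads
BLOCK_SIZES = [4096, 65536, 1048576]  # assumed 4K / 64K / 1M transfers

def next_op(rng):
    """Pick one I/O operation (kind, transfer size) for the synthetic stream."""
    kind = "read" if rng.random() < READ_RATIO else "write"
    return kind, rng.choice(BLOCK_SIZES)

# a reproducible stream of operations for one test run
rng = random.Random(42)
stream = [next_op(rng) for _ in range(1000)]
```

Real workload generators such as IOmeter let you specify this kind of mix (ratios, block sizes, queue depth) directly.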
From the 80% full test listed above, I already have an SSD with a lot of data on it. The "Sustainable Performance" test data, approximately 20GB in size, is then added on top, so once this is done the SSD is pretty full.
The test is then run for a period of 20 minutes. Sixty performance measurements are taken for every minute of the test, and an average performance figure is generated after each minute. At the end of the test I have 20 performance measurements, which are then used to generate the graph below.
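The averaging step is simple enough to show as a sketch (the function name is my own, not from the review): 60 samples are taken each minute, each minute's samples are collapsed into a single average, and a 20-minute run therefore yields 20 data points.

```python
def per_minute_averages(samples, samples_per_minute=60):
    """Collapse a flat list of samples into one average per full minute."""
    return [
        sum(samples[i:i + samples_per_minute]) / samples_per_minute
        for i in range(0, len(samples), samples_per_minute)
    ]
```

Averaging per minute smooths out momentary spikes (cache flushes, garbage collection pauses) so the graph shows the sustained trend rather than second-by-second noise.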
The faster SSDs will obviously sustain more writes than the slower SSDs. For the fastest SSD in this test, the test pattern generated 146GB of writes, and 193GB of data was read from the SSD during the test.
When reading the graph, you should not pay too much
attention to which drive is the fastest, but instead look at the sustainable
performance curve of each SSD, as this is what this test is all about.
For the SSD that I am reviewing, I will also add a second
graph which looks at the result in more detail.
So let’s look at the results.
Sustainable Performance test
Detailed results for the review drive
We knew from the earlier Anvil’s SSD Benchmark tests that the Samsung 840 EVO mSATA 1000GB SSD could maintain performance well in a short burst test, even when it was nearly full of data. The Myce Sustained Performance test is a much tougher challenge for any SSD. The test pattern used for the test is workstation based, and from the IOMeter workstation test run we already know that the Samsung 840 EVO mSATA 1000GB SSD is a good performer in a simulated workstation environment.
What this test does show is that the Samsung 840 EVO mSATA SSD does slow down a bit once the emulated SLC cache is all used up. The speed drop isn’t a large one, and it isn’t likely that many normal PC users will ever push their SSD this hard.
This concludes our review. To read the final thoughts and conclusion, click the link below.