AnandTech DAS Suite - Benchmarking for Performance Consistency

Our testing methodology for DAS units takes into consideration the typical use-cases for such devices. The most common usage scenario is the transfer of large numbers of photos and videos to and from the unit. Other usage scenarios include the use of the DAS as a download or install location for games, and importing files directly off the DAS into a multimedia editing program such as Adobe Photoshop. Some users may even opt to boot an OS off an external storage device.

The AnandTech DAS Suite tackles the first use-case. The evaluation involves processing three different workloads:

  • Photos: 15.6 GB collection of 4320 photos (RAW as well as JPEGs) in 61 sub-folders
  • Videos: 16.1 GB collection of 244 videos (MP4 as well as MOVs) in 6 sub-folders
  • BR: 10.7 GB Blu-ray folder structure of the IDT Benchmark Blu-ray

Each workload's data set is first placed in a 25 GB RAM drive, and a robocopy command is issued to transfer it to the DAS under test (formatted in NTFS). Upon completion of the transfer (write test), the contents are read back from the DAS into the RAM drive (read test). This process is repeated three times for each workload. Read and write speeds, as well as the time taken to complete each pass, are recorded. Bandwidth for each data set is computed as the average of all three passes.
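The averaging step above can be sketched in a few lines. This is a minimal illustration, not the actual test harness: the function name and the pass timings are hypothetical, and only the arithmetic (per-pass bandwidth, averaged over three passes per direction) reflects the methodology described.

```python
# Sketch of the per-workload bandwidth figure: each data set is written to
# the DAS and read back three times, and the reported number is the average
# bandwidth across the passes. Timings below are made-up placeholders.

def average_bandwidth_mbps(data_set_gb: float, pass_times_s: list) -> float:
    """Average MB/s over all passes for one direction (read or write)."""
    data_set_mb = data_set_gb * 1024  # GB -> MB, binary units
    speeds = [data_set_mb / t for t in pass_times_s]
    return sum(speeds) / len(speeds)

# Example: the 15.6 GB photos suite written in three passes
write_times = [38.2, 39.0, 38.6]  # hypothetical seconds per pass
print(round(average_bandwidth_mbps(15.6, write_times), 1))
```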

Blu-ray Folder Read

It can be seen that there is no significant gulf in the numbers between the different units. For all practical purposes, the casual user will notice no difference between them in the course of normal usage. However, power users may want to dig deeper to understand the limits of each device. To address this concern, we also instrumented our evaluation scheme for determining performance consistency.

Performance Consistency

Aspects influencing performance consistency include SLC caching and thermal throttling / firmware caps on access rates to avoid overheating. This is important for power users, as the last thing they want to see when copying over hundreds of gigabytes of data is the transfer rate dropping to USB 2.0 speeds.

In addition to tracking the instantaneous read and write speeds of the DAS when processing the AnandTech DAS Suite, the temperature of the drive was also recorded at the beginning and end of the processing. In earlier reviews, we used to track the temperature throughout. However, we have observed that polling the SMART temperature read-outs of NVMe SSDs behind USB 3.2 Gen 2 bridge chips ends up negatively affecting the actual transfer rates. To avoid this problem, we have restricted ourselves to recording the temperature at either end of the actual workload set. The graphs below present the recorded data.
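The instrumentation decision above can be sketched as follows. This is a simplified illustration under stated assumptions: `read_smart_temperature` is a hypothetical stand-in for a SMART query (e.g. via smartmontools), not a real API, and the point shown is purely structural: the temperature is read once before and once after the workload set, never mid-transfer.

```python
def read_smart_temperature() -> float:
    """Hypothetical stand-in for a SMART temperature query (e.g. smartctl).
    Polling this mid-transfer was observed to depress bridge throughput."""
    return 41.0  # placeholder value in degrees Celsius

def run_workload_set(transfer_passes) -> tuple:
    """Run all passes, reading the drive temperature only at either end."""
    temp_start = read_smart_temperature()
    for do_pass in transfer_passes:
        do_pass()  # instantaneous speeds are sampled here, SMART is not
    temp_end = read_smart_temperature()
    return temp_start, temp_end
```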

Performance Consistency and Thermal Characteristics

The first three sets of writes and reads correspond to the photos suite. A small gap (for the transfer of the video suite from the internal SSD to the RAM drive) is followed by three sets for the video suite. Another small RAM-drive transfer gap is followed by three sets for the Blu-ray folder. An important point to note here is that each of the first three blue and green areas corresponds to 15.6 GB of writes and reads respectively. Of all the tested drives, only the Crucial Portable SSD X8 hiccups a little during certain writes. Temperature-wise, the OWC Envoy Pro EX USB-C and the Samsung T7 Touch both show only a 3°C increase in temperature on either side of the workload. On the other hand, the Lexar SL100 Pro shows a significant 11°C increase, pointing to possible thermal issues in the long run.


38 Comments


  • zebrax2 - Thursday, January 23, 2020 - link

    I don't know about the others but I'm not a fan of having the benchmarks hidden inside a drop down box unless selected. I also feel that for some of the benchmarks, 1 or 2 charts containing all the data, e.g. ATTO and CrystalDiskMark, would be better instead of the screenshots.
  • chaos215bar2 - Thursday, January 23, 2020 - link

    Agreed. Did I miss something, or are there no actual direct comparisons between the drives (aside from the feature table at the end)?

    This reads like 6 separate reviews, where I have to keep messing with drop downs to follow each one. It’s all but impossible to follow in a mobile browser. On desktop, I could at least open the screenshots side by side.
  • sonny73n - Friday, January 24, 2020 - link

    Who reads news on desktop anymore? Unless working on PC and want to have a peek at what’s on, nobody would give up the comfort of reading while lying down on the sofa or in bed with a mobile device. Therefore, drop down comparison is useless to most readers.
  • s.yu - Saturday, January 25, 2020 - link

    Sorry, I visit Anandtech almost exclusively on my PC.
  • dontlistentome - Sunday, February 2, 2020 - link

    This forum is increasingly populated by people with zero concept that other people may think or do things differently to them. Guess it's a microcosm of the wider no-platforming world.
    My advice? Spend a morning learning keyboard shortcuts and you'll understand why we oldies still prefer to browse on desktop rather than mobile when doing anything other than *really* casual browsing.
  • bigboxes - Monday, February 3, 2020 - link

    Yeah, reading this on mobile platform sucks. Desktop is way more comfortable. Then again, my workstation is fairly ergonomic (and badass!). Get off the couch if you want to live to old age.
  • Sivar - Thursday, February 6, 2020 - link

    I find mobile devices to be extremely annoying for web browsing. Small screen, slow CPU, extremely limited browser plugins, frustrating data entry, more difficult copy/paste.
    I have a tablet and smart phone, but my web browsing on them tends to be light and often only directs me on what to read later on my "real" device.
  • JanW1 - Monday, January 27, 2020 - link

    That was my first thought on this review as well. Plus, the scales of the charts hidden behind the dropdown menu are all different for no good reason. This almost looks like every effort was made to prevent readers from comparing the drives. Just let the throughput scale on all charts go to 1000 MB/s and the temperature scale to 70°C. This fits the data from all drives nicely and readers can see the differences in a glimpse.
  • alphasquadron - Thursday, January 23, 2020 - link

    Yeah I agree as well. Don't think it was like this previously, or maybe it was a different reviewer.
  • ganeshts - Thursday, January 23, 2020 - link

    Would it help you if I were to keep the 'Expand All' option as the default and allow readers to use the drop down to 'compress' it down to 1 graph / make the analysis text visible along with it?

    As for the ATTO / CDM 'graphs' instead of 'screenshots' - the aspect I need to trade off with is the number of data points. For example, CDM has 12 sets per drive (or 24 if you include the IOPS version also). ATTO has more than 20 sets * 2 (R/W). That would be 64 graphs. It doesn't make sense to have that many graphs for two synthetic benchmarks.
