System Performance

Although there are several processor options for the Clevo P870DM2, Mythlogic has outfitted the review unit with the highest performance model it offers: the Intel Core i7-6700K. For those familiar with Intel's processor lineup, the mobile parts span several tiers: the lowest power Y series at 4.5 Watts, the 15/28-Watt U series, and at the top the H and HQ series, with the latter being the quad-core mobile parts. Those top out at 45 Watts, and we see them in almost all gaming laptops. Clevo is marketing this machine as a desktop replacement, and as such, every processor option comes from the desktop lineup. The majority of the options are 65-Watt quad-core parts from Intel's S lineup with 6 MB of cache. The processor in the Phobos 8716 review unit is from the K series, with a 91-Watt TDP, a 4.0 GHz base and 4.2 GHz turbo frequency, 8 MB of cache, and an unlocked multiplier for overclocking. From a power perspective, the Core i7-6700K has just over twice the thermal headroom of a typical H series processor found in a gaming notebook. Twice the TDP doesn't mean twice the performance, of course, but there is certainly a lot more potential performance on tap than any H series part could offer.
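As a quick sanity check on that headroom claim, here is a back-of-the-envelope sketch using Intel's published TDP figures (nothing from our test suite, just the arithmetic):

```python
# Thermal headroom comparison: the 91 W Core i7-6700K versus a typical
# 45 W H-series mobile quad-core. TDP figures are Intel's published specs.
desktop_tdp_w = 91
mobile_h_tdp_w = 45

ratio = desktop_tdp_w / mobile_h_tdp_w
print(f"Thermal headroom: {ratio:.2f}x")  # ~2.02x, i.e. "just over twice"
```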

To see how much performance is available, the Clevo P870DM2 was run through our standard suite of tests and compared against several other systems. Of particular interest are comparisons against the last desktop replacement notebook we tested, the Clevo P750ZM, which was outfitted with the Devil's Canyon Core i7-4790K. That CPU actually holds a slight frequency advantage over the i7-6700K, topping out at 4.4 GHz versus 4.2 GHz, but the i7-6700K counters with the newer Skylake architecture versus Haswell in Devil's Canyon. As always, check out our notebook bench if you'd like to compare this laptop to any other we have tested.

PCMark

PCMark 8 - Home

PCMark 8 - Creative

PCMark 8 - Work

PCMark 7 (2013)

Our first tests are from Futuremark's PCMark benchmark. This suite runs through several real-world application workloads and is a complete system test, covering everything from the storage to the CPU to the GPU. The Phobos 8716 sets a new bar here for performance in a notebook. That's not surprising, since it has the fastest CPU, the fastest GPU, and, in the Samsung 950 Pro, the fastest SSD.

Cinebench

Cinebench R15 - Single-Threaded Benchmark

Cinebench R15 - Multi-Threaded Benchmark

Cinebench R11.5 - Single-Threaded Benchmark

Cinebench R11.5 - Multi-Threaded Benchmark

This test tends to be focused on pure CPU performance, and chips with more cores at higher frequencies tend to dominate it. We can see that here, with the Devil's Canyon chip in the P750ZM slightly edging out the Skylake i7-6700K thanks to its higher available turbo frequency, but only on the single-threaded tests. On the multi-threaded tests, Skylake pulls ahead with its more advanced multi-core turbo and benefits like Speed Shift.
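To illustrate why single- and multi-threaded results can diverge like this, here is a minimal, hypothetical sketch (a toy CPU-bound workload, not Cinebench) that times the same tasks run on one core and then spread across every core:

```python
import time
from concurrent.futures import ProcessPoolExecutor
from multiprocessing import cpu_count

def burn(n: int) -> int:
    """A simple CPU-bound task: sum of squares up to n."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    # One unit of work per logical core.
    work = [5_000_000] * cpu_count()

    # Run everything on a single core, one task after another.
    start = time.perf_counter()
    for n in work:
        burn(n)
    single = time.perf_counter() - start

    # Run the same tasks across all cores at once.
    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:
        list(pool.map(burn, work))
    multi = time.perf_counter() - start

    print(f"single-core: {single:.2f}s  all cores: {multi:.2f}s  scaling: {single / multi:.1f}x")
```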

x264

x264 HD 5.x

x264 HD 5.x

Skylake's stronger multi-threaded performance once again shows here, with the i7-6700K pulling ahead easily and performing significantly better than the Haswell-based Devil's Canyon part.
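For context, the x264 HD test measures two-pass H.264 encoding throughput. A rough sketch of that style of workload, driven from Python with placeholder file names and an assumed bitrate (the benchmark's own clip and settings differ), could look like this:

```python
import subprocess

# Illustrative two-pass x264 encode: pass 1 analyzes the clip and writes a
# stats file, and pass 2 uses those stats to produce the final bitstream.
src = "input.y4m"  # placeholder source clip
common = ["x264", "--preset", "slow", "--bitrate", "4000", "--stats", "x264_stats.log"]

subprocess.run(common + ["--pass", "1", "-o", "pass1.264", src], check=True)
subprocess.run(common + ["--pass", "2", "-o", "output.264", src], check=True)
```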

Web Tests

Finally, web performance: although it can be heavily influenced by the browser and platform, it is still something everyone does every day, so it remains a priority. When Windows 10 launched, we switched our testing to Edge, from Chrome on previous versions of Windows. Each browser is marked in the results.

Mozilla Kraken 1.1

Google Octane 2.0

WebXPRT 2015

WebXPRT 2013

It should come as no surprise after the previous benchmarks that the Clevo P870DM2 / Mythlogic Phobos 8716 easily trounces all previous computers on our web benchmarks as well. Packing a desktop CPU into a notebook has some drawbacks in terms of heat dissipation, but at the end of the day the performance is a good jump ahead of any of the H series processors.

Storage Performance

As with any of these boutique computers, there is quite a bit of customization available across all of the components, and storage is no exception. For the review unit, Mythlogic supplied the Samsung 950 Pro NVMe SSD, and if you saw our review of that drive, you'll be aware that it is one of the fastest consumer SSDs available today; in fact, it was only recently pipped by the introduction of the 960 Pro.

The 256 GB model in the review unit doesn't carry the maximum number of NAND dies, so it can't reach the drive's peak performance, but the results are still very, very good regardless. Read rates approaching 2000 MB/s are likely going to be enough for almost anyone. Mythlogic will outfit the Phobos 8716 in multiple ways: there are two M.2 slots if you need more SSD storage, plus two 2.5-inch SATA bays if you need a bit more bulk storage for game downloads.
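For a rough sense of what a sequential read-rate number like that reflects, here is a minimal, illustrative sketch (not the tool used for this review) that times large sequential reads from a pre-existing file. The path is a placeholder, and the result will largely reflect the OS cache unless the file is much bigger than system RAM:

```python
import time

# Illustrative sequential-read timing, not the benchmark used in the review.
# Assumes a large pre-existing test file on the drive being measured.
path = "testfile.bin"            # placeholder: several-GB file on the SSD
chunk_size = 8 * 1024 * 1024     # 8 MiB reads to approximate sequential access

total_bytes = 0
start = time.perf_counter()
with open(path, "rb", buffering=0) as f:
    while True:
        data = f.read(chunk_size)
        if not data:
            break
        total_bytes += len(data)
elapsed = time.perf_counter() - start

print(f"{total_bytes / 1e6:.0f} MB in {elapsed:.2f} s -> {total_bytes / (1e6 * elapsed):.0f} MB/s")
```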

Comments

  • BrokenCrayons - Thursday, October 27, 2016 - link

    Minor details...the MYTH Control Center shows an image of a different laptop. It struck me right away because of the pre-Pentium MMX Compaq Presario-esque style hinge design.

    As for Pascal, the performance is nice, but I continue to be disappointed by the cooling and power requirements. The number of heat pipes affixed to the GPU, the fact that it's still reaching thermal limits with such cooling, and the absurd PSU requirements for SLI make it pretty obvious the whole desktop-class GPU in a laptop isn't a consumer-friendly move on NV's part. Sure it cuts back on engineering, manufacturing, and part inventory costs and results in a leaner organization, but it's hardly nice to people who want a little more than iGPU performance, but aren't interested in running up to the other extreme end of the spectrum. It's interesting to see NV approach the cost-cutting measure of eliminating mobile GPU variants and turning it into a selling point. Kudos to them for keeping the wool up on that aspect at least.

    The Killer NIC is something I think is a poor decision. An Intel adapter would probably have been a better choice for the end user since the benefits of having one have yet to be proven AND the downsides of poor software support and no driver flexibility outweigh the dubious claims from Killer's manufacturer.
  • ImSpartacus - Thursday, October 27, 2016 - link

    Nvidia just named their mobile GPUs differently.

    Fundamentally, very little has changed.

    A couple generations ago, we had a 780MX that was based on an underclocked GK104. Nvidia could've branded it as the "laptop" 770 because it was effectively an underclocked 770, just like the laptop 1080 is an underclocked 1080.

    But the laptop variants are surely binned separately, and they are generally implemented on the MXM form factor, so there aren't any logistical improvements just from naming the laptop GPUs differently.
  • The_Assimilator - Thursday, October 27, 2016 - link

    "The number of heat pipes affixed to the GPU, the fact that it's still reaching thermal limits with such cooling, and the absurd PSU requirements for SLI make it pretty obvious the whole desktop-class GPU in a laptop isn't a consumer-friendly move on NV's part."

    nVIDIA is doing crazy things with perf/watt and all you can do is complain that it's not good enough? The fact that they can shoehorn not just one, but TWO of the highest-end consumer desktop GPUs you can buy into a bloody LAPTOP, is massively impressive and literally unthinkable until now. (I'd love to see AMD try to pull that off.) Volta is only going to be better.

    And it's not like you can't go for a lower-end discrete GPU if you want to consume less power, the article mentioned GTX 1070 and I'm sure the GTX 1060 and 1050 will eventually find their way into laptops. But this isn't just an ordinary laptop, it's 5.5kg of desktop replacement, and if you're in the market for one of these I very much doubt that you're looking at anything except the highest of the high-end.
  • BrokenCrayons - Thursday, October 27, 2016 - link

    Please calm down. I realize I'm not in the target market for this particular computer or the GPU it uses. I'm also not displaying disappointment in order to cleverly hide some sort of fangirl obsession for AMD's graphics processors either. What I'm pointing out are two things:

    1.) The GPU is forced to back off from its highest speeds due to thermal limitations despite the ample (almost excessive) cooling solution.

    2.) While performance per watt is great, NV elected to put all the gains realized from moving to a newer, more efficient process into higher performance (in some ways increasing TDP between Maxwell/Kepler/etc. and Pascal in the same price brackets, such as the 750 Ti @ 60W vs the 1050 Ti @ 75W), and my personal preference is that they would have backed off a bit from such an aggressive performance approach to slightly reduce power consumption in the same price/performance categories, even if it cost in framerates.

    It's a different perspective than a lot of computer enthusiasts might take, but I much prefer gaining less performance while reaping the benefits of reduced heat and power requirements. I realize that my thoughts on the matter aren't widely shared, so I have no delusion of pressing them on others since I'm fully aware I don't represent the majority of people.

    I guess in a lot of ways, the polarization of computer graphics into basically two distinct categories that consist of "iGPU - can't" and "dGPU - can" along with the associated power and heat issues that's brought to light has really spoiled the fun I used to find in it as a hobby. The middle ground has eroded away somewhat in recent years (or so it seems from my observations of industry trends) and when combined with excessive data mining across the board, I more or less want to just crawl in a hole and play board games after dumping my gaming computer off at the local thrift store's donation box. Too bad I'm screen addicted and can't escape just yet, but I'm working on it. :3
  • bji - Thursday, October 27, 2016 - link

    "Please calm down" is an insulting way to begin your response. Just saying.
  • BrokenCrayons - Thursday, October 27, 2016 - link

    I acknowledge your reply as an expression of your opinion. ;)
  • The_Assimilator - Friday, October 28, 2016 - link

    Yeah, but my response wasn't exactly calm and measured either, so it's all fair.
  • BrokenCrayons - Friday, October 28, 2016 - link

    "...so it's all fair."

    It's also important to point out that I was a bit inflammatory in my opening post. It wasn't directed at anyone in particular, but was/is more an expression of frustration with what I think is the industry's unintentional marginalization of the lower- and mid-tiers of home computer performance. Still, being generally grumpy about something in a comments box is unavoidably going to draw a little ire from other people so, in essence, I started it and it's my fault to begin with.
  • bvoigt - Thursday, October 27, 2016 - link

    "my personal preference is that they would have backed off a bit from such an aggressive performance approach to slightly reduce power consumption in the same price/performance categories even if it cost in framerates."

    They did one better: they now give you the same performance with reduced power consumption, and at a lower price (980 Ti -> 1070). Or if you prefer the combination of improved performance and slightly reduced power consumption, you can find that too, again at a reduced price (980 Ti -> 1080 or 980 -> 1070).

    Your only complaint seems to be that the price and category labelling (xx80) followed the power consumption. Which is true, but getting hung up on that is stupid because all the power&performance migration paths you wanted do exist, just with a different model number than you'd prefer.
  • BrokenCrayons - Thursday, October 27, 2016 - link

    You know, I never thought about it like that. Good point! Here's hoping there's a nice performance boost realized from a hypothetical GT 1030 GPU lurking in the product stack someplace. Though I can't see them giving us a 128-bit GDDR5 memory bus and sticking to the ~25W TDP of the GT 730; we'll probably end up stuck with a 64-bit memory interface this generation.
