ASUS G73SW + SNB: Third Time’s the Charm?
by Jarred Walton on March 4, 2011 12:00 AM EST
Battery Life
The G73 is a 17.3” notebook chassis with a 75Wh battery; we know what that means, right? You’re not going to get all-day computing, or even half a day (unless you have a six-hour workday). We also know that Sandy Bridge is more power efficient than Clarksfield, so we should see some gains relative to the G73JW. And that’s exactly what we get.
Idle battery life is up 17% to 3.75 hours, and that’s the smallest increase. Internet battery life is up 27%, though it falls just shy of the three-hour mark, and H.264 decoding is up 39% to over 2.5 hours. While those are all fairly impressive increases over the previous generation, let’s not lose sight of what we could get with switchable graphics. A vanilla i7-2820QM notebook running off the HD 3000 IGP manages roughly double the runtime in our battery life testing, and that’s with a slightly smaller battery.
We can actually estimate the idle power requirements of the GTX 460M based on those results, and let me tell ya, it ain't pretty. Based on its 71Wh battery and the various test results, the Compal SNB notebook we tested used an average of 9.04W at idle, 10.24W in the Internet test, and 16.38W in H.264 playback. In comparison, the G73SW averages 20.64W idle, 26.95W Internet, and 28.66W in H.264 decoding. That means the GTX 460M requires roughly 10W at idle with very low clocks (50MHz core, 270MHz RAM), around 15W with a web browser showing Flash advertisements, and the difference between the HD 3000 and GTX 460M in H.264 decoding is back to ~10W. Sadly, these high-power GPUs just aren't very friendly to battery life, with the GPU alone using about as much power as the rest of the notebook combined. Here's where NVIDIA's Optimus Technology would have been useful: twice the battery life, the ability to use Intel's Quick Sync technology, and discrete graphics performance when you want it.
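For anyone who wants to double-check that math, here's a quick Python sketch of the arithmetic, using only the battery capacities, runtimes, and power figures quoted above; the helper function is purely illustrative, and battery wear and other losses are ignored.

```python
# Sanity check on the power math above: average draw is just battery capacity
# divided by runtime. Only figures quoted in the text are used, so treat the
# results as ballpark numbers.

def avg_watts(capacity_wh, runtime_hours):
    """Average system power draw in watts over a full battery discharge."""
    return capacity_wh / runtime_hours

# G73SW at idle: 75Wh pack, roughly 3.75 hours of runtime
print(f"G73SW idle: ~{avg_watts(75, 3.75):.1f}W")  # ~20W, in line with the 20.64W above

# GPU overhead = G73SW draw minus the Compal/HD 3000 reference draw (both quoted above).
# Note the two notebooks also differ in CPU and panel, so these deltas are rough.
scenarios = {"idle": (20.64, 9.04), "Internet": (26.95, 10.24), "H.264": (28.66, 16.38)}
for name, (g73sw_w, compal_w) in scenarios.items():
    print(f"GTX 460M overhead, {name}: ~{g73sw_w - compal_w:.1f}W")
```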
Plugged-in power numbers are all higher, as the GPU isn't in "limp mode" and we also ran the LCD at maximum brightness. At idle, we measured 27W at the outlet using a Kill-A-Watt device; taking into account power adapter inefficiency, that figure matches up pretty well with our above calculation. When we put a 100% load on just the CPU (using Cinebench 11.5 SMP), power draw gets as high as 87W. Running through the various 3DMark tests, we saw a maximum "typical gaming" load of 114W, whereas Furmark manages to push the GPU just a little harder and we measured up to 125W. It's also interesting to note that we couldn't get power draw any higher by running Furmark with a CPU loading utility; the CPU load was apparently enough to reduce Furmark performance, so our "worst-case" CPU+GPU load actually dropped to 118W. Again, factoring in power adapter inefficiencies, there's still plenty of headroom on the 150W power brick (unlike the 300W brick on SLI-equipped Clevo X7200 systems), so you can even play games while charging the battery.
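As a rough sanity check on that headroom claim, here's a minimal sketch; the ~85% adapter efficiency is an assumed round number for illustration rather than a measured value.

```python
# Back-of-the-envelope check on the 150W adapter headroom. The ~85% adapter
# efficiency is an assumption for illustration, not a measured figure.
ADAPTER_EFFICIENCY = 0.85   # assumption
BRICK_RATING_W = 150        # DC output rating of the stock power brick

idle_wall_w = 27            # idle, measured at the outlet with the Kill-A-Watt
furmark_wall_w = 125        # worst case we measured at the outlet

print(f"Estimated DC draw at idle: ~{idle_wall_w * ADAPTER_EFFICIENCY:.0f}W")   # ~23W
dc_worst = furmark_wall_w * ADAPTER_EFFICIENCY
print(f"Estimated worst-case DC draw: ~{dc_worst:.0f}W")                        # ~106W
print(f"Headroom left for charging the battery: ~{BRICK_RATING_W - dc_worst:.0f}W")  # ~44W
```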
We've had some reader requests for a few other battery life metrics. Charging time on the G73SW (with the system powered up) checked in at 168 minutes; it might recharge slightly faster if the system is off, but with a 150W power brick there should be plenty of extra juice for the charging circuit. If you want to run the LCD at maximum brightness rather than 100 nits (or cd/m2 if you prefer), idle battery life drops to 195 minutes. So the extra 55 nits of brightness requires an additional ~2.5W of power. We also ran our Internet test with Pandora open and streaming music in a separate browser tab; that dropped battery life down to 140 minutes. Finally, what about gaming on battery? With the GPU set for maximum performance, we managed 67 minutes looping 3DMark06, but there’s still a catch.
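Before getting to that catch, a quick note on where the ~2.5W brightness figure comes from; it's the same capacity-over-runtime arithmetic as before:

```python
# Where the ~2.5W brightness figure comes from: the same capacity/runtime math.
capacity_wh = 75
idle_100nits_w = 20.64                        # idle draw calculated earlier (100 nits)
idle_max_bright_w = capacity_wh / (195 / 60)  # 195 minutes of idle at maximum brightness
print(f"Extra draw at max brightness: ~{idle_max_bright_w - idle_100nits_w:.1f}W")  # ~2.4W
```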
Even trying our best to achieve maximum gaming performance while on battery power, the GPU is still throttled—and the CPU appears somewhat slower as well. We ran the full Futuremark test suite on battery power, using the High Performance power profile. The following chart shows the percent of performance relative to the same test running off the mains (so a score of 100 would be no change in performance).
The best results are in PCMark, where battery performance is still 80% of plugged-in performance. Move over to the graphics tests, and 3DMark06 managed to maintain nearly 40% of AC performance, 3DMark03/05/Vantage Entry are in the range of 30-32% of AC performance, and the Vantage Performance preset is less than one fourth as fast. To put that in perspective, you’re looking at gaming performance that’s actually worse than what you’d get with a midrange GT 425M GPU.
The problem appears to be getting sufficient power to the GPU from the battery for the higher clocks, as the midrange GPUs don’t forcibly throttle performance on battery power (at least in my experience). Then again, it’s not like you get significantly better battery life for gaming on a midrange GPU—I measured 104 minutes with a GT 425M on an 84Wh battery with an i3-380M CPU and a 14” chassis. In other words, while it’s possible to have a gaming laptop that gets good battery life (i.e. by shutting off the GPU), unless something changes in a big way we’re not going to get great gaming performance while on battery power. So fire up your smartphone and play some Angry Birds instead :)
56 Comments
7Enigma - Friday, March 4, 2011 - link
Turbo negates much of your comment.
bennyg - Monday, March 7, 2011 - link
Good thing designers don't just ask you for your opinion and consider what others use. 8GB is not pointless; I run up against the limits of 4GB all the time.
Granted, some dual-core Arrandales were better than the quad-core Clarksfields, but:
1) this is comparing highest-end dual core vs lowest-end quad (i7 620M approx i7 720qm). Higher clocked quads esp. the XMs were still faster than the arrandales.
2) clarksfield was not a native mobile chip, it was a downclocked undervolted lynnfield released many months before
3) the total *work* which the quad cores can do is higher, the issue is optimisation of workloads, in particular games are pretty bad at using a 3rd and 4th core, and especially so when Turbo means that 3rd and 4th core use come at a clockspeed disadvantage if you want to think of it that way. I.E. the quad is slower because software doesn't use it well enough
4) the usage patterns and power of the i7 quads could be much better utilised but Intel locked all the Turbo logic away on-die except for the XMs. Meaning overclocking a QM can actually *decrease* performance due to preset limits causing less Turbo-ing. Hell, the BIOS in my laptop has absolutely nothing, I can't even disable hyperthreading - to reallocate more of the supposedly "unified" L3 to each core...
There is *so* much more the quads could have offered but tweakability was hugely reduced compared with the C2D/Qs.
bennyg - Monday, March 7, 2011 - link
bleh no edit feature
Of course the point I missed (along with all the typos) was that we are now in the era of Sandy Bridge and "Turboboost 2.0"; Nehalem and Turboboost is old news, and so are the opinions based off it.
DanNeely - Friday, March 4, 2011 - link
Depends how hard you use your system. My work laptop with MSSQL and Visual Studio 2010 running in addition to the normal set of office apps regularly uses 5-6GB of memory. My heavily used desktop (i7-930) is currently sitting at 12GB used (~10GB excluding BOINC).
DooDoo22 - Friday, March 4, 2011 - link
Please include comparisons to the new MBPs in your next laptop review. It is easier to judge the relative merits of these dime-store versions when we have a reference model like the MBP.
laytoncy - Friday, March 4, 2011 - link
How does the GT 555M 3GB that is in the new XPS 17" from Dell stack up to the GTX 460M or GTX 485M?
Kaboose - Friday, March 4, 2011 - link
To my knowledge the GT 555M is the GT 445M clocked higher; here is a quote from Notebookcheck: "We tested the gaming performance of the GeForce GT 555M with DDR3 graphics memory in a pre sample of the Dell XPS 17. The synthetic benchmarks show a performance slightly above the old GeForce 9800M GT. GDDR5 versions should be slightly faster. Therefore, nearly all current demanding games should run fluently in 1366x768 and high detail settings. Only extremely demanding games like Metro 2033 will only run in medium detail settings. Older and low demanding games like Fifa 11 should run in high detail settings and Full HD resolution"
This is what they had to say about the GTX 460M:
"We ran a set of benchmarks on an early pre-sample of a Qosmio X500 with a 740QM CPU. In the synthetic benchmarks, the GTX 460M was on par with the DDR3 based Mobility Radeon HD 5850. In actual game benchmarks and tests, the performance was better than a HD 5850 with GDDR5 on average. In some cases (e.g., Unigine Heaven 2.1 and Dirt 2 Demo), the card even beat a Mobility Radeon HD 5870. On average, the GF100 based GTX 480M was about 8-18% faster. The detailled benchmark and gaming results (including charts) can be found below"
So they aren't really very close in performance; the GT 540M is the low end for gaming, and the GT 555M is a step in the right direction but not into the "gaming" area yet.
rscoot - Friday, March 4, 2011 - link
Playing a little bit of guitar before you wrote this article Jarred? :P
JarredWalton - Friday, March 4, 2011 - link
Whoops... Freudian slip or something. ;)
Pessimism - Friday, March 4, 2011 - link
chord != cord