Metro 2033

Our first analysis is with the perennial reviewers’ favorite, Metro 2033. It appears in a lot of reviews for a couple of reasons: it has an easy-to-use built-in benchmark GUI, and it is often very GPU limited, at least in single-GPU mode. Metro 2033 is a strenuous DX11 title that can challenge most systems that try to run it at high-end settings. Developed by 4A Games and released in March 2010, we use the built-in DirectX 11 Frontline benchmark to test the hardware at 1440p with full graphical settings. Results are given as the average frame rate from a second batch of four runs, as Metro has a tendency to inflate the scores of the first batch by up to 5%.
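
As a rough illustration of that reporting method, here is a minimal Python sketch (not the actual test harness; the frame rates below are invented purely for illustration) that discards the first batch of runs and reports the average of the second:

```python
# Minimal sketch of the reporting method described above (not the real harness).
# The Frontline benchmark is run in two batches of four; the first batch can
# read up to ~5% high, so it is discarded and only the second batch is averaged.
first_batch = [52.4, 52.1, 51.9, 52.2]    # warm-up batch, discarded
second_batch = [49.8, 49.6, 50.1, 49.9]   # batch actually reported

average_fps = sum(second_batch) / len(second_batch)
print(f"reported result: {average_fps:.1f} FPS")
```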

One 7970

Metro 2033 - One 7970, 1440p, Max Settings

With one 7970 at 1440p, every processor gets a full x16 allocation and there is no meaningful split between any processors with four threads or more. Processors with two threads fall behind, but not by much: the X2-555 BE still manages 30 FPS. There is also no split between PCIe 3.0 and PCIe 2.0, or with respect to memory.

Two 7970s

Metro 2033 - Two 7970s, 1440p, Max Settings

When we start using two GPUs in the setup, the Intel processors have an advantage, with even those running PCIe 2.0 a few FPS ahead of the FX-8350. Both core count and single-thread speed seem to have some effect (the i3-3225 is quite low, and the FX-8350 beats the X6-1100T).

Three 7970s

Metro 2033 - Three 7970s, 1440p, Max Settings

More results in favour of Intel processors and PCIe 3.0: the i7-3770K in an x8/x4/x4 configuration surpasses the FX-8350 in an x16/x16/x8 configuration by almost 10 frames per second. So far there seems to be no advantage to a Sandy Bridge-E setup over an Ivy Bridge one.

Four 7970s

Metro 2033 - Four 7970s, 1440p, Max Settings

While we have limited results, PCIe 3.0 wins against PCIe 2.0 by 5%.

One 580

Metro 2033 - One 580, 1440p, Max Settings

From dual-core AMD all the way up to the latest Ivy Bridge, results for a single GTX 580 are all roughly the same, indicating a GPU throughput-limited scenario.

Two 580s

Metro 2033 - Two 580s, 1440p, Max Settings

Similar to one GTX 580, we are still GPU limited here.

Metro 2033 conclusion

A few points are readily apparent from the Metro 2033 tests: the more powerful the GPU setup, the more important the CPU choice becomes, and that choice does not really matter until you reach at least three 7970s. At that point, you want a PCIe 3.0 setup more than anything else.

Comments

  • Patrese - Wednesday, May 8, 2013 - link

    Awesome article, thanks! Is it possible to include some sort of gaming physics testing? Now that PhysX is beginning to catch some momentum, it'd be great to see if an 8-module AMD processor handles physics stuff better than a comparable 4-core Intel one, and at what point a dedicated physics card starts to make sense, if at all.

    It’d also be nice if a “mainstream gaming” article could be made too. Benchmarks at 1080p with cards like the 660 Ti and 7850, for instance. No need for 3-way SLI/CF on those, so you'll not need as much time in Narnia. :)
  • araczynski - Wednesday, May 8, 2013 - link

    Interesting read, although I find it too focused to be of much general use (or useful as a future reference). I'd like to have seen, for example, how an E8500 holds up (too big of a gap between the E6500 and the i5-2500), as well as at least ONE game I would even bother playing (Skyrim/Witcher/etc.), and of course, like you mentioned, a slightly bigger sampling of graphics cards (I think you mentioned that).

    Anywho, I realize this wasn't meant to be anything exhaustive (I do appreciate having the CPU/GPU benches available here as a good reference, though), and I do like the level of detail and explanation you went into.

    so thanks :)
  • xinthius - Wednesday, May 8, 2013 - link

    But AMD offers good price-to-performance at lower tiers; they should be recommended.
  • yougotkicked - Wednesday, May 8, 2013 - link

    Regarding your comments on the role of artificial intelligence in game performance/programming: I've just finished a course in AI, and while implementations may vary quite a bit from game to game, many AI programs can be reduced to highly parallel brute-force computation: simply evaluating the resulting states of many potential decisions for a numerical representation of their desirability, then selecting the best option from the set of evaluated actions. Obviously this is something that will vary greatly from game to game, but in games with many independent AI-managed elements, I would expect a certain amount of the processing to be offloaded to the GPU.

    Other than that I agree with you on the demands of AI in games; my professor (who specializes in game AI and has experience in the industry) said that the AI is usually given about 10% of the CPU time in a game, so it's rarely a limiting factor.

    I'm still working through the whole article (really enjoying it so far) so I'm sure I'll have many more comments/questions later.
  • IanCutress - Wednesday, May 8, 2013 - link

    Based on previous CUDA experience, CUDA doesn't like a lot of IF statements in its routines. So if you're offloading different AI parts onto the GPU, unless all the elements are being put through the same set of if commands (and states), it won't work too well, with some warps taking a lot longer than others if there is large branch divergence. It's a task suited to MIMD environments, like a CPU. Then again, it really depends on the game. Clever AI is already here, because we confine it to a self-created system. One could argue that the bots in Counter-Strike are not particularly smart, but the system can put their accuracy up to 100% to make it harder. It's a lot of give and take, perhaps. It is times like these I wish I did CompSci rather than Chemistry :) I need to take one of those MIT online AI courses. You know, in between testing!

    Ian
  • yougotkicked - Wednesday, May 8, 2013 - link

    I suppose conditionals would make offloading some AI components to the GPU impractical, but there still remains a subset of AI computations which seem very GPU-friendly to me. State evaluation functions seem like a prime example: the CPU would be responsible for deciding which options to evaluate, building an array of states to be evaluated by the GPU (a rough sketch of that pattern appears at the end of this thread). These situations probably don't come up very often in FPSes, but in something like Civilization I can see it being quite common.

    I've actually got to head over to that class now, I'll ask the professor if he knows of any AI's using GPU computing in modern games.
  • airmantharp - Wednesday, May 8, 2013 - link

    Like Ian said, GPUs aren't good 'branch' processors, but I do see where you're coming from. Things like real physics, audio environment maps, and pre-rendered lighting maps could be fed to AI routines running on the CPU. This would allow for much greater 'simulation awareness' in AI actions.
  • yougotkicked - Wednesday, May 8, 2013 - link

    I spoke with my professor and he said that, as far as he knows, many people have discussed the prospect of using GPUs for AI, but nobody has actually done so yet. He's going to ask some friends of his at some major game studios to see if they are working on it.

    He did agree with me that there are some aspects that could be computed on a GPU, but a lot of the existing AI methods are inherently sequential, so offloading it to the GPU will require new algorithms in many cases.
  • TheQweaker - Thursday, May 9, 2013 - link

    You may wish to check nVidia's GTC conference web site where you can find some GPU AI Research. Also, nVidia published various PDF slides on GPU Path Planning.

    If you look deeper into some specific AI domains such as, say, AI planning (first used in F.E.A.R. in 2005, lately used in Killzone 3 and Transformers: Fall of Cybertron), you can find papers investigating the use of GPUs.

    One of the bottom lines of current GPU AI research is that GPUs crunch large amounts of data very fast, so, currently, there is not much hope in using the many GPU threads on the tiny amounts of data involved in state-space search.

    Hoping this helps.

    -- The Qweaker.
  • yougotkicked - Thursday, May 9, 2013 - link

    Thanks for pointing me towards those papers, they look pretty interesting and I've been looking for a topic to write my final paper on ;)
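
To make the GPU-offload idea discussed in this thread concrete, here is a minimal, hypothetical Python/NumPy sketch of the batched “evaluate many candidate states, pick the best” pattern yougotkicked describes: the CPU side enumerates candidate moves and packs the resulting states into a single array, one uniform (branch-free) scoring pass evaluates every state at once, and the highest-scoring action is selected. All names, features, and the scoring weights are invented for illustration; this is not taken from any shipping game or engine.

```python
import numpy as np

# Hypothetical sketch of batched state evaluation. NumPy stands in for a GPU
# kernel here: the scoring step applies identical, branch-free arithmetic to
# every candidate state, which is the shape of work that maps well to
# SIMD/GPU hardware and avoids the branch-divergence issue Ian mentions.

def enumerate_candidate_states(position, moves):
    """CPU side: apply each candidate move to the current position and pack
    the resulting states into one (n_moves, n_features) array."""
    return np.stack([position + delta for delta in moves])

def evaluate_states(states, weights):
    """'GPU' side: score every state with the same weighted-sum arithmetic,
    with no per-state branching."""
    return states @ weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    position = rng.normal(size=8)                       # current state, 8 made-up features
    moves = [rng.normal(size=8) for _ in range(1000)]   # 1000 candidate actions
    weights = rng.normal(size=8)                        # made-up desirability weights

    states = enumerate_candidate_states(position, moves)
    scores = evaluate_states(states, weights)
    best = int(np.argmax(scores))                       # pick the most desirable action
    print(f"best move index: {best}, score: {scores[best]:.3f}")
```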
