Haswell isn't expected to launch until the beginning of June in desktops and quad-core notebooks, but Intel is beginning to talk performance. Intel used a mobile customer reference board in a desktop chassis featuring Haswell GT3 with embedded DRAM (the fastest Haswell GPU configuration that Intel will ship) and compared it to an ASUS UX15 with on-board NVIDIA GeForce GT 650M. 

Despite the chassis difference, Intel claims it will be able to deliver the same performance from the demo today in an identical UX15 chassis by the time Haswell ships.

The video below shows Dirt 3 running at 1080p on both systems with identical detail settings (High Quality preset, no AA, vsync off). Intel wouldn't let us report performance numbers, but subjectively the two looked to deliver very similar performance. Note that I confirmed all settings myself and ran the game on both systems independently of the demo. You can be the judge using the video below:

Intel wouldn't let us confirm clock speeds on Haswell vs. the Core i7 (Ivy Bridge) system, but it claimed that the Haswell part was the immediate successor to its Ivy Bridge comparison point. 

As proof of Haswell's ability to fit in a notebook chassis, Intel also had a second demo using older Haswell silicon running Call of Duty: Black Ops 2 in a notebook.

Haswell GT3e's performance looked great for processor graphics. I would assume overall platform power would be reduced since you wouldn't have a discrete GPU inside; there's also the question of the solution's cost, however. I do expect NVIDIA to continue driving discrete GPU performance up, but as a solution for some of the thinner, space-constrained form factors (think 13-inch MacBook Pro with Retina Display, maybe an 11-inch Ultrabook/MacBook Air?) Haswell could be a revolutionary step forward.


252 Comments


  • Krysto - Wednesday, January 9, 2013 - link

    Intel has been doing a lot of misleading marketing lately. Here:

    http://www.theverge.com/2013/1/9/3856050/intel-can...

    But it's not the first time. They did it with the "3D chips" for IVB: most people understood from their marketing that it would be BOTH ~40% more efficient AND ~37% more powerful, when in fact it was an "either/or" thing, and in the end they compromised between the two and got a lot less of each - 20% better energy efficiency and, I think, 15% higher performance.

    Then they did it with Medfield too, announcing it as a "2 GHz processor" (it was really 1.3 GHz). And I think with the recent ones too, like Lexington. They announced it as a 1.2 GHz processor, but it's probably more like 500 MHz, with a turbo boost that isn't used most of the time because it eats a lot of power. This is probably why The Verge also found it very slow:

    http://www.theverge.com/2013/1/8/3850734/intels-at...

    Lexington looks like no competitor to Cortex A7. I would distrust Intel's marketing by default, until proven otherwise, rather than trust what they say now.
  • IntelUser2000 - Wednesday, January 9, 2013 - link

    " They did it with the "3D chips" for IVB, which most people understood from their marketing that it will be BOTH 40% more efficient and 37% more powerful,"

    Uninformed people did, the rest did not.

    TSMC, GlobalFoundries, and Samsung's foundries all claim the same thing. But you need to know that each metric comes separately.
  • CeriseCogburn - Sunday, January 13, 2013 - link

    I read your linked article several times. Let's break down how honest you are in actually evaluating the information - which is why, of course, all you do is make broad-stroke accusations.

    The Intel chips in question in your linked article:

    a 17W that may be limited to a 13W

    a 13W that may be limited to 7W

    So, you and The Verge KNOW IT IS a 2 GHz processor, but you want the cheapo version that's limited for the demanding and POOR crowd, so you ignore the top-speed implementations and claim Intel "lied".
    Instead, you and your pocketbook lied.

    Same on the version that can be limited to 7W.

    Intel produces the chips they claimed, and ALSO offers low wattage control on those same chips, and you call Intel a liar because you don't want to PAY for the top end version implementations.

    Sorry, Charlie - Intel didn't lie. Mr. Cheapo lied, though; that's your wallet's name.
  • Spunjji - Wednesday, January 9, 2013 - link

    http://www.anandtech.com/show/5333/intel-sort-of-d...

    hescominsoon is wrong about this, of course. Pretty sure this is just the in-game time demo. But yeah, Intel still do that video bullshit.
  • Spunjji - Wednesday, January 9, 2013 - link

    Oh dear... superfluous post. Sorry!
  • hescominsoon - Thursday, January 10, 2013 - link

    A timedemo isn't real gaming, so it means nothing. I hope I'm wrong, but heck, ALL of the graphics companies have been scamming for quite some time.
  • Spunjji - Friday, January 11, 2013 - link

    I wouldn't say it means nothing. It's a standardised indication of game engine performance. It doesn't relate well to actual gaming performance, but it at least allows for cross-model comparisons.
  • CeriseCogburn - Sunday, January 13, 2013 - link

    What a bunch of crap.

    Anand actually played the real game that the demo "emulates", after verifying the in-game settings, and confirmed that "THEY RELATE DIRECTLY" to actual game playing.

    Nothing like ignoring the real-world test in this case in order to broadly brushstroke the "not so intelligent" cliché that is supposed to make you "seem smart", when it is immensely STUPID in this case.

    A stupid idiotic talking point that does not here apply.

    Betwixt your stupid talking points, we get the rest of the whiners claiming the game runs so very fast - THUS "PROVING" that real-world performance is in fact directly related, right? Right....

    And of course, with those attacks in mind, we see someone added "it's only really a 630LE it's taking on", "the laptop lid is closed so it's throttling", "it's an 80W Intel processor", "only the demo was tested", "it's (Haswell) overclocked to the moon".....

    The whiners lie a lot more than Intel ever dreamed of doing.
    Zero standards for the whining haterz, and the ultimate standard for the attacked Intel.

    I call you know what.
  • Spunjji - Monday, January 14, 2013 - link

    ...jeez.

    *In this article*, Anand stated that he "ran" the games independently of the demo. Forgive me for not assuming that meant "played". You can interpret it how you want, but in the language of the article there is no good reason to assume he played the games.

    So, what are you claiming here? You appear to be referring to other articles that I'm not aware of, then ranting about things I never said, whilst providing no evidence of your own.
  • CeriseCogburn - Monday, January 14, 2013 - link


    For not assuming? What else can it mean, you idiot?
