Intel’s Vision for Sharing Work: Deep Link & Additive AI

As mentioned towards the start of this article, Intel is taking an interesting tack with Xe MAX. From a graphics standpoint, the company has not developed a multi-GPU solution to combine the rendering power of Xe MAX with Tiger Lake’s iGPU. As a result, Xe MAX is not significantly more powerful than Tiger Lake’s iGPU for 3D rendering/gaming tasks, greatly limiting the utility of Xe MAX for gaming purposes.

Although this wipes out the obvious route for using Xe MAX to augment Intel’s iGPU – and thus make Xe MAX a significant upgrade for graphics purposes – on the whole it’s a decision that makes sense for Intel. Multi-GPU graphics is hard, and it’s only getting harder. Even NVIDIA, with all of its experience in the field, has essentially pulled out as of their latest generation of hardware, thanks to rendering techniques getting less and less multi-GPU friendly. So what chance would Intel have, especially with such low-end hardware? Probably not much.

But that doesn’t mean that Xe MAX doesn’t have a purpose. Even if it can’t be used to help with any single task/thread, it can still handle additional tasks/threads, essentially functioning as a co-processor that software can offload tasks to or spin up extra tasks on. This is a use case that professional-grade video cards and associated software have supported for a number of years, and it’s the same route Intel is taking with Xe MAX.

This kind of functionality is a core part of what Intel is terming its Deep Link technology, their umbrella name for the collection of technologies and capabilities that come from using Intel’s CPU and dGPU together. In practice, Deep Link is Intel’s software and firmware stack for Xe MAX, ranging from how they’re balancing TDP allocations between the CPU and GPU, out to how they present the additional processing blocks of an Xe MAX GPU to software so that it can easily use them. There is no real hardware magic here – as previously mentioned, Intel is using a standard PCIe 4.0 x4 link back to the CPU – but the company sees the synergy between their CPUs and Xe MAX as a defining factor of the graphics solution – and why customers would want it.

Arguably the most critical part of Deep Link is what Intel is terming “Additive AI”, which is the ability to use the iGPU and dGPU together within a program. As previously mentioned, Intel’s focus here is on enabling developers to use Xe MAX for additional workloads. Among other things, Intel’s examples have included using Xe MAX’s compute resources for batch processing images in Gigapixel AI, and using the chip’s video encode blocks to increase the number of video streams that can be simultaneously encoded.

This sort of batch-focused software is the ideal use case for Xe MAX. If a task can be broken down into multiple independent pieces, then it can easily be farmed out to both GPUs simultaneously – thus justifying adding Xe MAX to the mix rather than just relying on Tiger Lake’s iGPU.

As for what software can use these capabilities, conceptually any software that can handle issuing work to multiple GPUs is in a good place. Even if it can’t handle Xe MAX out of the box, it should take very little work to get it seeing multiple Intel GPUs. Beyond that, this is where Intel’s control of the software stack should be an advantage, as it gives them opportunities to abstract certain parts of the multi-GPU equation from software developers. Though at the end of the day, that software still needs to be able to issue independent workloads to properly make use of Xe MAX.
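
To illustrate the general pattern, below is a minimal sketch of how an application might enumerate both Intel GPUs and deal a batch of independent jobs out between them, written against the vendor-neutral OpenCL API via pyopencl. This is not Intel’s Deep Link code – the toy kernel, job sizes, and round-robin scheduling are all illustrative assumptions – but it shows the kind of multi-device dispatch such software ultimately performs.

```python
# Minimal multi-GPU batch dispatch sketch (pyopencl). Assumes a system that
# exposes the Tiger Lake iGPU and Xe MAX as separate OpenCL GPU devices.
import numpy as np
import pyopencl as cl

# A toy kernel standing in for real per-item work (e.g. image processing).
KERNEL = """
__kernel void scale(__global float *data, const float factor) {
    int i = get_global_id(0);
    data[i] *= factor;
}
"""

# Enumerate every GPU the OpenCL runtime can see.
gpus = []
for p in cl.get_platforms():
    try:
        gpus.extend(p.get_devices(device_type=cl.device_type.GPU))
    except cl.LogicError:  # platform exposes no GPUs
        pass
print("Found GPUs:", [d.name for d in gpus])

# One context/queue/program per device -- each GPU works independently.
ctxs = [cl.Context(devices=[d]) for d in gpus]
queues = [cl.CommandQueue(c) for c in ctxs]
progs = [cl.Program(c, KERNEL).build() for c in ctxs]

# A batch of independent jobs, dealt out round-robin across the GPUs.
batch = [np.random.rand(1 << 20).astype(np.float32) for _ in range(8)]
results = []
for i, job in enumerate(batch):
    n = i % len(gpus)
    buf = cl.Buffer(ctxs[n],
                    cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                    hostbuf=job)
    progs[n].scale(queues[n], job.shape, None, buf, np.float32(2.0))
    out = np.empty_like(job)
    cl.enqueue_copy(queues[n], out, buf)  # blocking copy back to the host
    results.append(out)
```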

The need for independent workloads and batch processing, in turn, is why Intel is focusing on what they term “mobile creation” workloads. These tasks aren’t typically processed in real-time, and broadly speaking have the greatest overlap with what the Xe MAX hardware can do. So although Xe MAX isn’t especially useful as a stand-alone graphics adapter, Intel sees it as an excellent accelerator.

Overall, Intel is still in the early days of software support for Deep Link and Xe MAX. The company is working with software developers to get multi-GPU support added to more software down the line, so that more programs can take advantage of farming work out to Xe MAX. Along with getting more batch-style software enabled, the company is also working to enable Xe MAX to help with large, single-stream video encoding. Since video encoding is not hard-bound to being a serial task, Intel is looking at ways to split up a large encoding job so that each Xe encode block gets a chunk of the video to work on, a similar process to how multi-core CPU encoding works today. For now, Intel is targeting the first half of next year.
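
As a rough illustration of the chunked approach, here’s a sketch that splits a single video into fixed-length segments, encodes them in parallel, and stitches the results back together. It assumes an ffmpeg build with Intel’s Quick Sync encoder (h264_qsv) and glosses over frame-accurate splitting; Intel’s actual implementation will live in their drivers and media stack, so treat this purely as the general idea.

```python
# Conceptual chunked parallel encode: split, encode concurrently, concatenate.
import subprocess
from concurrent.futures import ThreadPoolExecutor

SRC = "input.mp4"      # hypothetical source file
CHUNK_SECS = 60        # length of each chunk
N_CHUNKS = 4           # e.g. one chunk per available encode block

def encode_chunk(i: int) -> str:
    out = f"chunk_{i}.mp4"
    subprocess.run([
        "ffmpeg", "-y",
        "-ss", str(i * CHUNK_SECS),   # seek to this chunk's start (keyframe-
        "-t", str(CHUNK_SECS),        # accurate only; real splitting is finer)
        "-i", SRC,
        "-c:v", "h264_qsv",           # hardware encode via Intel Quick Sync
        out,
    ], check=True)
    return out

# Each chunk is an independent job, so chunks can be encoded concurrently --
# the same way multi-core CPU encoders split work across threads.
with ThreadPoolExecutor(max_workers=N_CHUNKS) as pool:
    chunks = list(pool.map(encode_chunk, range(N_CHUNKS)))

# Stitch the encoded chunks back together with ffmpeg's concat demuxer.
with open("list.txt", "w") as f:
    f.writelines(f"file '{c}'\n" for c in chunks)
subprocess.run(["ffmpeg", "-y", "-f", "concat", "-safe", "0",
                "-i", "list.txt", "-c", "copy", "output.mp4"], check=True)
```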

Sharing Power: Extending Adaptix to dGPUs

Along with sharing work, Deep Link also encompasses Intel’s technology for sharing/allocating power between their CPUs and Xe MAX. Intel calls this Dynamic Power Share, and it’s an extension of their Adaptix power management technology, which the company has offered since Ice Lake.

Intel’s Adaptix is a suite of technologies that includes Dynamic Tuning 2.0, which implements DVFS feedback loops on top of supposedly AI-trained algorithms to help the system deliver power to the parts of the processor that need it most, such as CPU, GPU, interconnect, or accelerators. With Adaptix enabled, the idea is that the power can be more intelligently managed, giving a longer turbo profile, as well as a better all-core extended turbo where the chassis is capable.
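
As a deliberately simplified illustration, the heart of such a scheme is a feedback loop that periodically shifts a shared power budget toward whichever block is busiest. The sketch below uses made-up numbers and a naive policy – the real Dynamic Tuning algorithms are (supposedly) AI-trained and live in firmware and drivers – but it captures the basic control-loop shape.

```python
# Toy DVFS-style power rebalancing loop; all figures are hypothetical.
PACKAGE_BUDGET_W = 28.0  # shared budget split between CPU and GPU
FLOOR_W = 5.0            # minimum allocation either block may hold

def rebalance(cpu_util, gpu_util, cpu_w, gpu_w, step=0.5):
    """Shift a slice of the shared budget toward the busier block."""
    if cpu_util > gpu_util and gpu_w - step >= FLOOR_W:
        cpu_w, gpu_w = cpu_w + step, gpu_w - step
    elif gpu_util > cpu_util and cpu_w - step >= FLOOR_W:
        cpu_w, gpu_w = cpu_w - step, gpu_w + step
    assert abs((cpu_w + gpu_w) - PACKAGE_BUDGET_W) < 1e-6  # budget conserved
    return cpu_w, gpu_w

# Example: a GPU-heavy load gradually pulls power away from the CPU.
cpu_w, gpu_w = 14.0, 14.0
for _ in range(10):
    cpu_w, gpu_w = rebalance(cpu_util=0.2, gpu_util=0.9,
                             cpu_w=cpu_w, gpu_w=gpu_w)
print(f"CPU: {cpu_w:.1f} W, GPU: {gpu_w:.1f} W")  # CPU: 9.0 W, GPU: 19.0 W
```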

Intel already uses Adaptix to allocate power between their CPU cores and iGPU, among other blocks, so extending it to include Xe MAX is a natural (and necessary) extension of the technology. According to Intel, the company has also learned a great deal from their previous dGPU-style effort, Kaby Lake-G and its on-package AMD dGPU, which they have taken into account when extending Adaptix for Xe MAX.

Like Adaptix for CPUs, just how well this feature is used is going to be largely in OEM hands. Intel provides the tools, but it’s up to OEMs to set their various tuning values and plan for how that interacts with the power delivery and cooling capabilities of a laptop. But with Intel starting small on Xe MAX’s rollout – there are only 3 laptops shipping this year – hopefully it means Intel has been able to give the OEMs and the devices an appropriate level of attention.

Ultimately, Intel considers Adaptix/Dynamic Power Share to be another software-driven advantage for their gear. From a competitive standpoint the company believes that their tech does a better job of power management than how MX350-enabled laptops handle power allocations – particularly, that Xe MAX laptops don't have to permanently and continually reserve thermal and power headroom for the dGPU – and thus can unlock more performance even in CPU-limited workloads. That said, it's a bit of a dubious (or at least, non-intuitive) claim, as laptops have been able to shut off dGPUs for years now. But, as is often the case with power-saving features, how well any of this is tuned in shipping systems is up to the OEMs – and Intel says that they've found that most systems in this class with (rival) dGPUs aren't allocating the CPU its full headroom.

A Word on Gaming Performance

Since Intel lacks a way to combine multiple GPUs for a single rendering/gaming task, the company is not really pushing Xe MAX as a gaming solution, for obvious reasons. Nonetheless, on paper Xe MAX should be around 20% faster than Tiger Lake-U integrated graphics thanks to the discrete adapter’s higher clockspeeds, so there are potential advantages to gaming on Xe MAX, and it’s something that Intel is making sure to support all the same.

As the final pillar of Intel’s software stack, Xe MAX’s drivers include an arbiter of sorts to help direct games to use the correct GPU. The “correct” GPU in this case is often – but not always – the Xe MAX GPU. But in a surprising (and welcome) bit of transparency from Intel, the company admits that in some scenarios Tiger Lake’s iGPU may outperform Xe MAX, and as a result those games shouldn’t run on Xe MAX. So the arbiter’s job is to direct a game to use whatever Intel’s software has deemed the best choice for a given game, be it the iGPU or the dGPU.

This is another case where Intel will be providing a degree of abstraction, ideally hiding all of this from a game developer. Unless a game specifically implements support to detect and select from multiple GPUs, Intel’s drivers should pick the right GPU for it.
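
Conceptually, such an arbiter boils down to a curated per-game preference table with a sensible default. Intel hasn’t detailed how their profiles work, so everything in the sketch below – the executable names, the table, and the default-to-dGPU policy – is a hypothetical illustration of the idea.

```python
# Hypothetical per-game GPU arbiter: profile lookup with a default.
GPU_PREFERENCE = {
    # game executable       -> GPU the (made-up) profile prefers
    "some_older_game.exe": "iGPU",  # e.g. a title where the iGPU wins
    "some_newer_game.exe": "dGPU",  # benefits from Xe MAX's higher clocks
}

def pick_gpu(exe_name: str) -> str:
    """Return the preferred GPU for a game, defaulting to the dGPU."""
    return GPU_PREFERENCE.get(exe_name, "dGPU")

print(pick_gpu("some_older_game.exe"))  # -> iGPU
print(pick_gpu("unknown_game.exe"))     # -> dGPU (the common case)
```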

Functionally, all this sounds very close to how NVIDIA’s Optimus technology works, just with an added wrinkle of purposely sending some games to the iGPU rather than favoring the dGPU for all games. Now that Intel has mobile dGPUs they need a way to manage their use, and this is it. Plus Intel’s long-term plans of course call for more powerful Xe-HPG GPUs, so getting their GPU switching tech out and debugged now is going to benefit them in the long run.

As for performance expectations, with Xe MAX’s higher clockspeeds, Intel is promoting Xe MAX as being generally performance competitive with MX350. Mind you, Intel isn’t aiming to set a very high bar here, but Xe MAX should at least be good for 1080p gaming (most of the time).

Launch Laptops: Acer, ASUS, & Dell

Last but not least, let’s take a look at the first laptops that will be shipping with Xe MAX graphics. Intel is starting things off with a relatively small number of laptops, with Acer, ASUS, and Dell all set to release their Xe MAX-equipped notebooks in November. These are the Acer Swift 3X, the ASUS VivoBook Flip TP470, and the Dell Inspiron 15 7000 2-in-1.

All three laptops generally fit the thin-and-light paradigm that Intel is pushing with Xe MAX. The Swift 3X is a 14-inch laptop at 3lbs, and the VivoBook Flip TP470 is also a 14-inch design at a slightly heavier 3.3lbs. Finally, Dell’s Inspiron is a 15-inch convertible notebook that weighs around 4lbs. All of these notebooks come with high-end versions of Intel’s Tiger Lake-U SoCs using G7-class iGPUs.

At this point we’re still waiting for pricing info on the complete set of laptops. With this being a major launch, Intel is going to want to put their best foot forward – and will likely eat most of the marketing costs in the process – though at the same time the company is looking to sell Xe MAX-equipped laptops as premium notebooks, so there is a careful balance to be had.

Officially, Xe MAX is launching today. However it’s not immediately clear whether any of these laptops are actually going to be available right away, or if they’re going to show up later in the month. So it may be a couple of weeks until there’s actual retail availability. On which note, as far as regional distribution goes, the Acer laptop will be China-only, the ASUS laptop will be sold in both China and North America, while the Dell will be North America-only (sold via Best Buy).

Overall, the launch of Intel’s Xe MAX graphics and the DG1 GPU is an important day for Intel, but it’s also a launch where Intel strikes me as having modest expectations. Xe MAX is only being launched in a small number of laptops for now, and Intel is not seriously chasing the gaming market with their first discrete laptop part. Nonetheless, it will be interesting to see what kind of traction Intel can get as a new player in the market, especially with their focus on mobile creation and selling Xe MAX as an accelerator for productivity and content creation tasks. No matter what, Xe MAX will be something that bears keeping an eye on.

Comments

  • Spunjji - Sunday, November 1, 2020

    I hate the "AMD drivers suck" mantra more than most, but it's a fact that they repeatedly screwed things up with their mobile APUs. My understanding is that the situation has improved but is still not ideal.

    Intel GPU drivers suck, though, for sure. They rarely crash the system, but they're bug ridden.
  • ozzuneoj86 - Saturday, October 31, 2020

    Serious question:

    How many people own a laptop with this level of graphics? Maybe it's just because I search for "deals" rather than specific models, but I hardly ever see a laptop with low-end discrete graphics that I would consider a good deal. Generally I see good solid systems with the latest AMD or Intel CPUs with integrated graphics, or affordable gaming systems with 1650-level graphics for surprisingly low prices.

    Who actually needs just a little bit more performance than integrated, but doesn't want too much performance? Seems oddly specific... but Intel clearly knows what is worth investing in, so there must be more of a market for these than I think?
  • lmcd - Saturday, October 31, 2020

    The boost is entirely via memory bandwidth and thermal separation. A 1650 exceeds the thermal usage of even a 35W CPU. An MX450 most likely does not, and an Xe MAX obviously does not.

    This means that two small, thin thermal solutions can be allocated: one for the GPU, one for the CPU.
  • Spunjji - Saturday, October 31, 2020

    Prior to Ice Lake and Renoir, this was a moderately popular segment well-served by Nvidia. Pretty much "how much GPU can we fit into this space where once there could have been none".

    But Intel just went and made sure pretty much all upcoming ultrabooks will already have a competent iGPU thanks to Tiger (drivers notwithstanding), rendering this product nearly pointless.
  • Tomatotech - Saturday, October 31, 2020

    Not sure why Intel are going into discrete graphics.

    Their Optane tech had real advantages - all Intel had to do was actually implement it as properly tiered storage (a well-understood technology) for almost instant restarting / wake / hibernate / app startup. A few gig in every chip or mobo would be cheap at scale and Intel would be onto a winner. Instead they half-assed it.

    Perhaps with this ‘content creation’ dGPU they’re making a play for phones, car sensors, smart device sensors etc, but it’s ferociously competitive and price sensitive. Otherwise, well, I don’t know. dGPU market is tiny for a company like Intel and desktops / laptops are a mature / shrinking market, especially if your name isn’t Apple.
  • Spunjji - Saturday, October 31, 2020

    They'll find plenty of OEMs to take these off them, and plenty of customers will end up with them - but IMHO it's all utterly pointless.
  • JayNor - Saturday, October 31, 2020

    Intel's Xe gpus can use Optane... read their patent applications.
  • Kevin G - Saturday, October 31, 2020

    Very meh on this. It is an important stepping stone for Intel releasing a discrete graphics card but feels late to market without anything really going for it.

    At ~72 mm^2, why not scale it upward to, say, a 144 mm^2 or 288 mm^2 design that'd be more performant? This really does feel like a Tiger Lake die with the x86 cores hacked off. Ditto for the idea of using the same memory controller as Tiger Lake. While there are some OEM benefits to sticking with LPDDR4X-4266 (same memory part for CPU and GPU), the lack of GDDR6 support is very surprising. Granted there can be a handful of benefits coupled with Tiger Lake (see Adaptix comments below), the big question is why would an OEM spend time/money on developing this for the advantages it gets on top of a higher bill of materials for the laptops? Intel would practically have to give these away for free.

    The ability to leverage multi-GPU with Tiger Lake is interesting but likely not enough to be competitive with a step up into NVIDIA's or AMD's mobile lineups. While everyone loves (and in many cases deservingly so) to make fun of AMD for poor drivers, Intel historically has been even worse. I'm hyper skeptical that Intel has a good multi-GPU solution working. I can see the marketing slides indicating solid performance gains in raw FPS, but I'd predict that it'll be a microstuttering mess. Doubly so if Intel were to release a dual Iris Xe MAX discrete gaming card for desktops down the road.

    In defense of Intel a little bit, the Adaptix technology can be beneficial in terms of power/cooling in some scenarios based upon the laptop design. Thermally you have more silicon area that generates heat and thus less thermal density. IE it is easier to move the generated heat to the cooling solution, which would result in lower aggregate system temperatures. Allowing the OEMs control over this is critical as the advantages to using Adaptix will be dependent upon the TigerLake + Iris XE Max Hyper Turbo Black Legendary Collector's Edition implementation. Intel does need to work with OEMs on this to avoid sub-optimal traps.

    I will say that this chip does look promising for the encoding/transcoding niche. Intel has the parts to build an impressive replacement for their VCA lineup around this chip. While the usage of LPDDR4X-4266 isn't that great for gaming performance, expected support for DDR4L to enable SO-DIMM slots does make it interesting as a data center part: you can work on huge data sets if you put in larger SO-DIMMs. Slapping four of these Iris Xe MAX Delux Gold Ti Special Limited Editions on a card with eight SO-DIMM slots and 256 GB would make a transcoding beast to chew through the most demanding 8K footage. Or take one of those chips, some SO-DIMM slots, and pair it with a high end Intel NIC and/or FPGA on the card, and things get interesting for some niche applications. Intel really needs to execute here, as these markets are niche but lucrative.

    Oh, and Intel's marketing team needs to figure out a better naming scheme.
  • lmcd - Saturday, October 31, 2020

    AMD's laptop iGPU drivers are substantially worse than their dGPU drivers. Intel took driver control away from OEMs a while ago, but AMD has not yet to the same degree. As someone who has had their AMD driver settings panel broken 3+ times by Windows Update installing an older driver than I previously had, I can promise you there is upside.
  • supdawgwtfd - Saturday, October 31, 2020

    Uhhhh....

    The drivers are the same.

    Has been for a while.

    AMD took back control of their drivers from OEMs because the OEMs were NOT updating them ever.

    I think you need to do some research before spouting more nonsense.
