Intel Dual Core Performance Preview Part II: A Deeper Look
by Anand Lal Shimpi on April 6, 2005 12:23 PM EST - Posted in CPUs
Multimedia Content Creation Performance
MCC Winstone 2004
Multimedia Content Creation Winstone 2004 tests the following applications in various usage scenarios:
- Adobe® Photoshop® 7.0.1
- Adobe® Premiere® 6.50
- Macromedia® Director MX 9.0
- Macromedia® Dreamweaver MX 6.1
- Microsoft® Windows Media™ Encoder 9 Version 9.00.00.2980
- NewTek's LightWave® 3D 7.5b
- Steinberg™ WaveLab™ 4.0f
All chips were tested with Lightwave set to spawn 4 threads.
Once again, AMD's Athlon 64 3500+ takes the lead in the MCC tests, despite the benefits that dual core offers in this area.
ICC SYSMark 2004
The first category that we will deal with is 3D Content Creation. The tests that make up this benchmark are described below:
"The user renders a 3D model to a bitmap using 3ds max 5.1, while preparing web pages in Dreamweaver MX. Then the user renders a 3D animation in a vector graphics format."
Next, we have 2D Content Creation performance:
"The user uses Premiere 6.5 to create a movie from several raw input movie cuts and sound cuts and starts exporting it. While waiting on this operation, the user imports the rendered image into Photoshop 7.01, modifies it and saves the results. Once the movie is assembled, the user edits it and creates special effects using After Effects 5.5."
The Internet Content Creation suite is rounded out with a Web Publishing performance test:
"The user extracts content from an archive using WinZip 8.1. Meanwhile, he uses Flash MX to open the exported 3D vector graphics file. He modifies it by including other pictures and optimizes it for faster animation. The final movie with the special effects is then compressed using Windows Media Encoder 9 series in a format that can be broadcast over broadband Internet. The web site is given the final touches in Dreamweaver MX and the system is scanned by VirusScan 7.0."
SYSMark shows the exact opposite result, with the Pentium D taking the lead in all three ICC tests.
106 Comments
saratoga - Friday, April 8, 2005 - link
#90: HT is the same thing as SMT. You can thank Intel's marketing for that one.
Reflex - Friday, April 8, 2005 - link
#93: Intel has labeled it as SMT; however, there is another name for what they are doing (which I cannot remember at the moment). What they are calling SMT is nowhere even close to solutions like Power. That aside, the implementation Intel has chosen is designed to make up for inefficiencies in the Prescott pipeline. Such an implementation would make zero sense on the Athlon architecture, which does not share the same inefficiencies that the P4 design has; it would actually harm rather than help performance.
True SMT is not a 'bolt-on' feature. It's something that has to be planned for from the very beginning of the CPU design cycle. You could not in any way add it to the current Athlon design and gain any performance. Whatever their next generation is may include it, depending on what direction they decide to go, but you will not see it on the current generation, and that's actually a good thing, as it would be purely a marketing move.
eeceret - Friday, April 8, 2005 - link
As always a very interesting article, but one thing comes to mind... In the gaming multitasking tests, you adjusted the priority of the DVD Shrink process to see the effect on gaming performance. What I was wondering is whether you could take a look at what effect explicitly binding the processes to separate cores (processor affinity) has on gaming performance.
defter - Friday, April 8, 2005 - link
Hyperthreading IS SMT. SMT stands for simultaneous multithreading (the ability to run two or more threads at once), and this is exactly what Hyperthreading does. Of course, CPUs from different manufacturers have vastly different internal structures, so SMT is also implemented differently.
"Intel's next major IA-32 processor release, codenamed Prescott, will include a feature called simultaneous multithreading (SMT)"
http://arstechnica.com/articles/paedia/cpu/hyperth...
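[Editor's note: on eeceret's question above about binding processes to separate cores, a minimal Win32 sketch of lowering a background process's priority and pinning it to one logical CPU is shown below. This is an illustration only, not how AnandTech ran the tests; the PID is supplied on the command line, and the mask value 0x1 (CPU 0) is an assumption.]

```c
/* Sketch: lower a process's priority and pin it to logical CPU 0.
   Roughly what adjusting DVD Shrink's priority/affinity would look
   like programmatically; PID and mask are illustrative. */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    if (argc < 2) {
        fprintf(stderr, "usage: pin <pid>\n");
        return 1;
    }
    DWORD pid = (DWORD)atoi(argv[1]);

    /* PROCESS_SET_INFORMATION is required for both calls below. */
    HANDLE hProcess = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                                  FALSE, pid);
    if (hProcess == NULL) {
        fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }

    /* Drop the background job below normal priority so the foreground
       task gets scheduled first. */
    if (!SetPriorityClass(hProcess, BELOW_NORMAL_PRIORITY_CLASS))
        fprintf(stderr, "SetPriorityClass failed: %lu\n", GetLastError());

    /* Bind the process to logical CPU 0 only (bit 0 of the mask);
       the foreground task could be bound to CPU 1 the same way. */
    if (!SetProcessAffinityMask(hProcess, 0x1))
        fprintf(stderr, "SetProcessAffinityMask failed: %lu\n", GetLastError());

    CloseHandle(hProcess);
    return 0;
}
```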
tynopik - Friday, April 8, 2005 - link
and of course that's just the net part; don't want to leave out other background tasks like that resource sucker Outlook and playing FLAC/APE files
tynopik - Thursday, April 7, 2005 - link
to get repeatable multi-tasking/NCQ benches, Anand is going to have to bite the bullet and set up a full-blown network simulation:
1. an NNTP server
2. a bittorrent swarm
3. an irc server
with this setup, you can test these multi-tasking scenarios that seem more reasonable:
1. firewall (a pig like zonealarm)
2. pulling news articles with either 2 clients or 1 client with 2 threads (writing to different places on hd simultaneously)
3. about 10 torrents where it is BOTH downloading and uploading (so pulling from a gazillion different places on hd at once)
4. mirc with about 5 open channels and some scripts (like filters). At least one channel should be very high traffic (like #mp3passion on undernet)
5. icq
6. running all this with software raid 5
this would represent a typical background load, and then you can benchmark foreground tasks to see how much they are affected by what's going on in the background (specifically, NCQ could be tested by seeing how long it takes to copy a file from one partition to another under these circumstances)
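[Editor's note: for the foreground measurement tynopik describes, timing a partition-to-partition copy while the background load runs, a minimal Win32 sketch follows. The file paths are placeholders, not anything from the article's test setup.]

```c
/* Sketch: time a partition-to-partition file copy while background
   load (torrents, news pulls, etc.) is running. Paths are placeholders. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, stop;
    const char *src = "D:\\testdata\\bigfile.bin";   /* hypothetical source */
    const char *dst = "E:\\scratch\\bigfile.bin";    /* hypothetical target */

    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&start);

    if (!CopyFileA(src, dst, FALSE)) {               /* FALSE = overwrite OK */
        fprintf(stderr, "CopyFile failed: %lu\n", GetLastError());
        return 1;
    }

    QueryPerformanceCounter(&stop);
    printf("copy took %.2f s\n",
           (double)(stop.QuadPart - start.QuadPart) / (double)freq.QuadPart);
    return 0;
}
```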
Reflex - Thursday, April 7, 2005 - link
Just to be clear: SMT is NOT the same thing as HyperThreading. They go about what they are doing in radically different ways. The only similarity is in the CPU being able to execute two simultaneous threads; how it goes about that is implemented completely differently.
Reflex - Thursday, April 7, 2005 - link
"#47, if HT is simply a "bandaid", then why is AMD the only major CPU vendor not using it? IBM uses it heavily in their Power5, Sun is making their next CPUs (Niagra) very highly SMT (same thing as HT). Arguably, both of those architectures have much more shallow pipelines than the P4, yet see reason to provide SMT. AMD is the only holdout."The SMT used in IBM's Power series is completely different from what Intel is doing with the P4 design. The only similarity is the fact that two threads can be run at once, the implementation has nothing even close to the same however. I do not have details on Sun's implementation, but I would assume it will be closer to IBM's than Intel's implementation considering the market they are targetting. The Power architecture was designed from the ground up to use SMT, it wasn't a tacked on feature, and you get considerably more of a performance boost in most scenerios with it than you would ever see with HT on Intel.
The Athlon64 architecture was not designed with SMT or HT in mind; it was designed around two physical cores. So adding HT to it would do very little, and SSE3 (which mostly optimizes HT-style multithreading) does almost nothing on the K8 architecture.
Not every feature would help every CPU design; it all depends on what was taken into account when the design was made. Power has some limitations you do not see on x86 (in-order execution, for example), and x86 has challenges you do not see on Power. The multi-threading implementations are similarly different and not comparable. In the x86 world, HT makes sense on Intel in some situations (not always). It makes no sense on AMD and would likely result in performance drops rather than gains. It certainly would not improve performance in any way, as the core does not often have idle units or execution steps due to its design.
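[Editor's note: one point worth adding to the HT/SMT/dual-core back-and-forth is that, however the extra thread capacity is implemented, Windows simply sees additional logical processors; a plain processor count cannot tell a HyperThreaded P4 from a dual-core chip. A minimal sketch, assuming a Win32 build environment:]

```c
/* Sketch: report how many logical processors Windows exposes.
   A HT-enabled P4 and a dual-core CPU both report 2 here. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);
    printf("logical processors visible to Windows: %lu\n",
           si.dwNumberOfProcessors);
    return 0;
}
```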
Icehawk - Thursday, April 7, 2005 - link
"I'm also curious to see what effects RAID would have on testing striped setups."Uh, delete the "striped setups" from the end ;)
Can we please, please get some kind of short-term editing abilities here?
Icehawk - Thursday, April 7, 2005 - link
So you'd rather wait for information than receive it now? Anand clearly shows that dual core is only a good choice now IF you use it in scenarios where it can run multiple applications. Otherwise, single-core chips are still the better choice, so I don't see the marketing hype you are referring to. Basically, we've been told for quite a few years now that multi-taskers can benefit from multiple CPUs, but the costs have been prohibitive. Now it looks like, within the next year, a two-CPU machine will cost no more than previous single-core processors did.
Thanks Anand for helping us out in planning for the future! The DVDShrink stuff was very interesting to me as was the NCQ information - makes switching to SATA drives a bit more appealing to me considering my usage profile.
I just recently went from 1.4 k7->3.2 P4 w/HT so I'm pretty happy at this point. It does look like a dual core system *might* allow me to get rid of my second box (the 1.4 K7) which would save me money in the long run - one less PC to power up and cool off. My home office requires year round A/C to cool my 2 21" CRTs and 2 PCs...
I'm also curious to see what effects RAID would have on testing striped setups. I'm very curious whether a RAID 5 setup with NCQ and a dual core might make chores like encoding & gaming at the same time realistic - it sounds from the review like, at this point, I/O may cause hiccups even when the processor still has headroom.