ATI Radeon HD 3870 X2: 2 GPUs 1 Card, A Return to the High End
by Anand Lal Shimpi on January 28, 2008 12:00 AM EST - Posted in GPUs
Single-board CrossFire
The Radeon HD 3870 X2 features a single CrossFire connector at the top of the PCB, meaning you'll eventually be able to add a second card to it to enable 3X or 4X CrossFire modes (depending on whether you add another 3870 X2 or just a single 3870).
Unfortunately, driver support for the ATI CrossFireX technology isn't quite there yet, although AMD tells us to expect something in the March timeframe. Given that CeBIT is at the beginning of March, we're guessing we'll see it at the show.
As we alluded to earlier, the fact that the 3870 X2 features two GPUs on a single board means that it doesn't rely on chipset support to enable its multi-GPU functionality: it'll work in any motherboard that would support a standard 3870.
Driver support is also seamless; you don't have to enable CrossFire or fiddle with any settings - the card just works. AMD's Catalyst drivers attempt to force an Alternate Frame Rendering (AFR) mode whenever possible, but be warned: if a game has issues with the 3870 X2's multi-GPU rendering modes, you may only get single-GPU performance until AMD can fix the problem. We didn't encounter any such issues in our testing, but as new games and OS revisions come out there's always the chance, as we saw with the GeForce 7950 GX2.
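For those unfamiliar with AFR, the idea is that complete frames are dealt out round-robin to the two GPUs, so each chip renders every other frame and, in the ideal case, frame rate nearly doubles. Here's a minimal sketch of the dispatch pattern (all names are invented for illustration - this is not actual driver code):

```cpp
#include <cstdio>
#include <vector>

// Illustrative model of Alternate Frame Rendering (AFR): whole frames are
// assigned round-robin, so each GPU renders every other frame.
struct Gpu {
    int id;
    int framesRendered = 0;
    void render(int frame) {
        ++framesRendered;
        std::printf("frame %d -> GPU %d\n", frame, id);
    }
};

int main() {
    std::vector<Gpu> gpus = { Gpu{0}, Gpu{1} };  // the 3870 X2 carries two RV670s

    for (int frame = 0; frame < 8; ++frame) {
        // Even frames go to GPU 0, odd frames to GPU 1.
        gpus[frame % gpus.size()].render(frame);
    }

    // With perfect AFR each GPU carries half the workload; when the AFR
    // path breaks for a title, everything lands on one GPU instead.
    for (const Gpu& g : gpus)
        std::printf("GPU %d rendered %d frames\n", g.id, g.framesRendered);
}
```

This is also why a broken AFR profile degrades to single-GPU performance rather than failing outright: the driver simply stops splitting frames between the two chips.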
AMD insists that releasing a multi-GPU card will encourage developers to take CrossFire more seriously. It is also committed to releasing future single-card, multi-GPU solutions, but we'll just have to wait and see how true that is.
Last Minute Driver Drop: Competitive Crysis Performance
Today's launch was actually supposed to happen last week, on January 23rd. At the last minute we got an email from AMD stating that the embargo on 3870 X2 reviews had been pushed back to the 28th and we'd receive more information soon enough.
The reason for the delay was that over the weekend, before the launch on the 23rd, AMD was able to fix a number of driver issues that significantly impacted performance with the 3870 X2. The laundry list of fixes is as follows:
• Company of Heroes DX10 – AA now working on R680. Up to 70% faster at 2560x1600 4xAA
• Crysis DX10 – Improves up to ~60% on R680 and up to ~9% on RV670 on Island GPU test up to 1920x1200.
• Lost Planet DX10 – 16xAF scores on R680 improved ~20% and more. AF scores were horribly low before and should have been very close to no AF scores
• Oblivion – fixed random texture flashing
• Call of Juarez (COJ) – no longer randomly goes to a black screen after the DX10 benchmark run
• World in Conflict – 2560x1600, 32-bit, 0xAA/16xAF, quality=high: we see a 77% increase
• Fixed random World in Conflict crashes to desktop
• Fixed CrossFire scaling for Colin McRae: DiRT, Tiger Woods 08, and Blazing Angels 2
• Fixed smeared text in World in Conflict DX9
With a list like that, we can understand why AMD pushed the NDA back - but most importantly, the Radeon HD 3870 X2 went from not scaling at all in Crysis to actually being competitive.
The Radeon 3800 series has always lagged behind NVIDIA when it comes to performance under Crysis, and with the old driver Crysis was a black eye on an otherwise healthy track record for the 3870 X2. The new driver improved performance in Crysis by around 44-54% at high quality defaults, depending on resolution. The driver update doesn't make Crysis any more playable at very high detail settings, but it makes the X2's launch a lot smoother than it would've been.
According to AMD, the fix in the driver that so positively impacted Crysis performance had to do with the resource management code. Apparently some overhead in the Vista memory manager had to be compensated for, and without the fix AMD was seeing quite poor scaling going to the 3870 X2.
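AMD didn't share the code in question, but the class of problem is familiar: if a driver pays a fixed OS-level cost (such as a trip through the video memory manager) for every frame or every resource copy, that overhead doesn't shrink when a second GPU is added, so scaling suffers. A hypothetical sketch of the usual remedy - paying the expensive allocation once and recycling buffers from a pool (all names and timings here are invented for illustration, not AMD's actual fix):

```cpp
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for an expensive OS allocation (e.g. a round trip through the
// Vista video memory manager); the sleep simulates fixed overhead.
std::vector<char> osAlloc(std::size_t bytes) {
    std::this_thread::sleep_for(std::chrono::milliseconds(2));
    return std::vector<char>(bytes);
}

// Trivial pool: buffers are fetched from the OS once and then recycled,
// so steady-state frames no longer pay the OS overhead.
class BufferPool {
    std::vector<std::vector<char>> free_;
public:
    std::vector<char> acquire(std::size_t bytes) {
        if (!free_.empty()) {
            std::vector<char> buf = std::move(free_.back());
            free_.pop_back();
            return buf;
        }
        return osAlloc(bytes);
    }
    void release(std::vector<char> buf) { free_.push_back(std::move(buf)); }
};

int main() {
    const int frames = 100;
    const std::size_t bytes = 1 << 16;

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < frames; ++i)
        osAlloc(bytes);                        // naive: pay the overhead every frame
    auto naive = std::chrono::steady_clock::now() - t0;

    BufferPool pool;
    t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < frames; ++i)
        pool.release(pool.acquire(bytes));     // pooled: overhead paid once
    auto pooled = std::chrono::steady_clock::now() - t0;

    using ms = std::chrono::milliseconds;
    std::printf("naive:  %lld ms\n", (long long)std::chrono::duration_cast<ms>(naive).count());
    std::printf("pooled: %lld ms\n", (long long)std::chrono::duration_cast<ms>(pooled).count());
}
```

Whatever the actual mechanism in Catalyst, the symptom AMD describes fits this pattern: once the fixed overhead was compensated for, the second GPU's work could actually show up as frame rate.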
The Test
Test Setup
CPU: Intel Core 2 Extreme QX9650 @ 3.00GHz
Motherboard: EVGA nForce 780i SLI (power measurements done on an ASUS P5E3 Deluxe)
Video Cards: ATI Radeon HD 3870 X2, ATI Radeon HD 3870, NVIDIA GeForce 8800 GTX, NVIDIA GeForce 8800 GTS 512, NVIDIA GeForce 8800 GT (512MB)
Video Drivers: ATI 8-451-2-080123a, NVIDIA 169.28
Hard Drive: Seagate 7200.9 300GB 8MB 7200RPM
RAM: 4x1GB Corsair XMS2 DDR2-800 4-4-4-12
Operating System: Windows Vista Ultimate 32-bit
74 Comments
boe - Monday, January 28, 2008 - link
I really appreciate this article. The thing I'd really like to see in the next one is FEAR benchmarks.
I'd also appreciate a couple of older cards added for comparison like the 7900 or the x1900.
Butterbean - Monday, January 28, 2008 - link
"And we all know how the 3870 vs. 8800 GT matchup turned out"Yeah it was pretty close except for Crysis - where Nvidia got busted not drawing scenes out so as to cheat out a fps gain.
Stas - Monday, January 28, 2008 - link
Conveniently, the tests showed how *2* GTs are faster in most cases than the X2, while the power consumption test only shows a *single* GT on the same chart with the X2.
geogaddi - Wednesday, January 30, 2008 - link
Conveniently, most of us can multiply by 2.
ryedizzel - Monday, January 28, 2008 - link
in the 2nd paragraph under 'Final Words' you put:
Even more appealing is the fast that the 3870 X2 will work in all motherboards:
but I think you meant to say:
Even more appealing is the fact that the 3870 X2 will work in all motherboards:
you are welcome to delete this comment when fixed.
abhaxus - Monday, January 28, 2008 - link
I would really, really like to see a Crysis benchmark that actually uses the last level of the game rather than the built-in GPU bench. My system (q6600@2.86ghz, 2x 8800GTS 320mb @ 620/930) gets around 40fps with 'high' defaults (actually some very high settings turned on per tweakguides) on both of the default Crysis benchies, but only got around 10fps on the last map. Even on all medium, the game only got about 15-20 fps on the last level. Performance was even lower with the release version; the patch improved performance by about 10-15%. Of course I'd be interested to see how 2 of these cards do in Crysis :)
Samus - Monday, January 28, 2008 - link
Even though Farcry is still unplayable at 1900x1200 (<30FPS), it's really close. My 8800GT only manages 18FPS on my PC using the same ISLAND_DEM benchmark AT did, so to scale, the 3870 X2 would do about 27FPS for me. Maybe with some overclocking it can hit 30FPS. $450 to find out is a bit hard to swallow though :\
customcoms - Monday, January 28, 2008 - link
Well, if you're still getting <30 fps on Far Cry, I think your PC is a bit too outdated to benefit from an upgrade to an HD3870 X2. I assume you meant Crysis. This game is honestly poorly coded, with graphical glitches in the final scenes. With my 8800GT (and a 2.6ghz Opteron 165, 2gb ram), I pull 50 FPS in the Island_Dem benchmark at 1680x1050 with a mixture of medium-high settings, 15 fps more than the 8800GTS 320mb it replaced. However, when you get to some of the final levels, those frames drop to more like 30 or less, and I am forced to drop to medium or a combination of medium-low settings (perhaps my 2.6ghz dual core cpu isn't up to 3ghz C2D snuff).
Clearly, the game was rushed to market (not unlike Far Cry). Yes, the visuals are stunning and the storyline is decent, but I much prefer Call of Duty 4, where the visuals are SLIGHTLY less appealing but the storyline is better, the game is more realistic, the controls are better, and I can crank everything to max without worrying about slowdowns in later levels. It's the only game I have ever immediately started replaying on veteran.
The point is, no card(s) appears to be able to really play Crysis with everything maxed at 1900x1200 or higher, and in my findings the built-in time demos do not realistically simulate the game's full demands in later levels.
swaaye - Wednesday, January 30, 2008 - link
Yeah, armchair game developer in action! In what way is CoD4 realistic, exactly? I suppose it does portray actual real military assets. It doesn't portray them in a realistic way, however.
Did you notice that Crysis models bullet velocity? It's not just graphical glitz.
Griswold - Monday, January 28, 2008 - link
Ahh, gotta love the armchair developers who can see the game code just by looking at the DVD.