Conclusion

The easiest kind of product for us to write about is the kind that’s clearly superior to its competition. The hardest kind to write about is the kind that’s stuck in the middle. For the 5870, we have the latter case.

Let’s be clear here: the 5870 is the single fastest single-GPU card we have tested, by a wide margin. Looking at its performance in today’s games, as a $379 card it renders the GTX 285 at its current prices ($300+) completely irrelevant; the GTX 285’s lower price isn’t enough to make up for the performance difference. NVIDIA also has to contend with the 5850, which should perform near the GTX 285 but at a price of $259. As is often the case with a new generation of cards, we’re going to see a shakeup in the market as NVIDIA in particular adjusts to these new parts.

The catch, however, is that we don’t see the same clear domination when it comes to single-card solutions. AMD was shooting to beat the GTX 295 with the 5870, but in our benchmarks that’s not happening. The 295 and the 5870 are close, perhaps close enough that NVIDIA will need to reconsider their position, but not close enough for the 5870 to outright dethrone the GTX 295. NVIDIA still has the faster single-card solution, although its $100 price premium is well in excess of its <10% performance premium.

Meanwhile AMD is retiring the 4870X2, which ended up beating the 5870 often enough that we would otherwise consider it a competitor. However, you can’t consider a card you can’t buy.

Then we have the multi-GPU space, where things are rather clear. Having the fastest single-GPU card makes the 5870 in Crossfire the fastest dual-GPU solution by far. Unfortunately we didn’t have a chance to benchmark a GTX 295 Quad SLI setup, but given the notoriously finicky nature of Quad SLI and Quad Crossfire, we’re comfortable calling the 5870 CF the better multi-GPU solution.

And that brings us to our next conundrum: if dual-GPU setups can outperform the 5870, does that make them better? At equal performance levels we would take a single-GPU setup any day of the week; there are no profiles to deal with, nor the sometimes inconsistent performance scaling (see: Dawn of War II). Even when the 4870X2 or GTX 295 holds a slight performance lead, we would still pick the 5870, so long as they were not significantly cheaper. As it stands, the 5870 is the greater value, even if it’s not the fastest card.

Moving away from performance, we have feature differentiation. AMD has a clear advantage here with DirectX 11, as the 5870 is going to be a very future-proof card. The 8800 GTX is a good parallel: it took 3 years for it to really be outclassed in terms of features, and its performance is still respectable today. DX11 is going to give the 5870 the same longevity when it comes to being up to date on features, although we’ll see whether its performance lasts quite as long. When DX11 games arrive, they’re going to bring a nice improvement in quality (particularly with tessellation). However, it’s going to be a bit of a wait to get there.

On that tangent, we have Eyefinity. Unlike DX11, Eyefinity is something we can take advantage of today; but also unlike DX11, it’s not necessarily an improvement. As Anand discussed when attempting to use it, when it works it’s absolutely great, but at the moment it has some real teething issues. And it’s expensive: even three cheap TV-quality monitors represent an investment of hundreds of dollars for most people, on top of everything else. It’s very much like a certain NVIDIA feature in terms of cost, goals, and hit-or-miss nature. Eyefinity is something we’re going to want to keep an eye on to see what AMD does with it in the future, because they’re on the right track. It’s just not something that’s going to tickle the fancy of very many people today.

Wrapping things up: for those of you who were expecting the 5870 to shake up the market, it is certainly going to do that. For those of you looking for the above plus a repeat of the RV770/GT200 launch, where prices went into free fall, you’re going to come away disappointed. That task falls to the 5850, and we’re looking forward to reviewing it as soon as we can.

At the end of the day, with its impressive performance and next-generation feature set, the Radeon HD 5870 kicks off the DirectX 11 generation with a bang and manages to take home the single-GPU performance crown in the process. It’s without a doubt the high-end card to get.

327 Comments

  • ClownPuncher - Wednesday, September 23, 2009

    Absolutely, I can answer that for you.

    Those 2 "ports" you see are for aesthetic purposes only, the card has a shroud internally so those 2 ports neither intake nor exhaust any air, hot or otherwise.
  • Ryan Smith - Wednesday, September 23, 2009

    ClownPuncher gets a cookie. This is exactly correct; the actual fan shroud is sealed so that air only goes out the front of the card, to the outside of the case. The holes do serve a cooling purpose, though: they allow airflow to help cool the bits of the card that aren't hooked up to the main cooler, various caps and what have you.
  • SiliconDoc - Wednesday, September 23, 2009

    Ok good, now we know.
    So the problem now moves to the tiny 1/2 exhaust port on the back. Did you stick your hand there and see how much that is blowing? Does it whistle through there? lol
    Same amount of air (or a bit less) in half the exit space... that's going to strain the fan and/or reduce flow, no matter what anyone claims to the contrary.
    It sure looks like ATI is doing a big favor to aftermarket cooler vendors.

  • GhandiInstinct - Wednesday, September 23, 2009

    Ryan,

    Developers aren't pushing graphics anymore. It's not economical, PC game support is slowing down, and everything is console now, which is DX9. What purpose does this ATI card serve with DX11 and all this other technology that games won't even make use of 2 years from now?

    Waste of money...
  • ClownPuncher - Wednesday, September 23, 2009

    Clearly he should stop reviewing computer technology like this because people like you are content with gaming on their Wii and iPhone.

    This message has been brought to you by Sarcasm.
  • Griswold - Wednesday, September 23, 2009

    So you're echoing what nvidia recently said, when they claimed DX11/gaming on the PC isn't all that (anymore)? I guess nvidia can close shop (at least the gaming-relevant part of it) now and focus on GPGPU. Why wait for GT300 as a gamer?

    Oh right, it's gonna be blasting past the 5xxx and suddenly DX11 will be the holy grail again... I see how it is.
  • SiliconDoc - Wednesday, September 23, 2009

    rofl - It's great to see red roosters now crowing and hopping around flapping their wings and screaming nvidia is going down.
    Don't take any of this personally except the compliments, you're doing a fine job.
    It's nice to see you doing my usual job, albeit from the other side, so allow me to compliment your fine perceptions. Sweltering smart.
    But, now, let's not forget how ambient occlusion got poo-pooed here, and shading in the game was said to be "an irritant" when Nvidia cards rendered it with just driver changes for the hardware. lol
    Then of course we heard endless crowing about "tessellation" for ati.
    Now it's what, SSAA (rebirthed), and Eyefinity, and we'll hear how great it is for some time to come. Let's not forget the endless screeching about how terrible and useless Nvidia's PhysX is, but boy, when "open standards" finally gets "Havok and Ati" cranking away, wow, the sky is the limit for in-game destruction and water movement and shooting and bouncing, and on and on....
    Of course it was "Nvidia's fault" that "open Havok" didn't happen.
    I'm wondering if 30" top resolution will now be "all there is!" for the next month or two until Nvidia comes out with their next generation - because that was quite a trick, switching from a top rez of 30" DOWN to 1920x when Nvidia put out their 2560x GTX275 driver and it whomped Ati's card at 30" 2560x, but switched places at 1920x, which was then of course "the winning rez" since Ati was stuck there.
    I could go on, but you're probably fuming already and will just make an insult back, so let the spam-posting IZ2000 or whatever its name will be this time handle it.
    BTW there's a load of bias in the article and I'll be glad to point it out in another post, but the reason the red rooster rooting is not going beyond any sane notion of "truthful" or even truthiness is because this 5870 Ati card is already perceived as "EPIC FAIL"!
    I cannot imagine this is all Ati has, and if it is, they are in deep trouble, I believe.
    I suspect some further releases with more power soon.
  • Finally - Wednesday, September 23, 2009

    Team Green - full foam ahead!
    *hands over towel*
    There you go. Keep on foaming, I'm all amused :)
  • araczynski - Wednesday, September 23, 2009

    Is DirectX 11 going to be as worthless as 10, in terms of being used in any meaningful way in a meaningful number of games?

    My 2 4850's are still keeping me very happy alongside my 'ancient' E8500.

    Curious to see how this compares to whatever nvidia rolls out; probably more of the same, better in some, worse in others, and the bottom line will be the price... maybe in a year or two I'll build a new system.

    Of course by that time these'll be worthless too.
  • SiliconDoc - Wednesday, September 23, 2009

    Well it's certainly going to be less useful than PhysX, which is here said to be worthless, but of course DX11 won't get that kind of dissing, at least not for the next two months or so, before NVidia joins in.
    Since there's only 1 game "kinda ready" with DX11, I suppose all the hype and heady talk will have to wait until... until... uhh.. the 5870's are actually available and not just listed on the egg and tiger.
    Here's something else in the article I found so very heartwarming:
    ---
    " Wrapping things up, one of the last GPGPU projects AMD presented at their press event was a GPU implementation of Bullet Physics, an open source physics simulation library. Although they’ll never admit it, AMD is probably getting tired of being beaten over the head by NVIDIA and PhysX; Bullet Physics is AMD’s proof that they can do physics too. "
    ---
    Unfortunately for this place, one of my friends pointed me to this little exposé that shows ATI uses NVIDIA CARDS to develop "Bullet Physics" - ROFLMAO
    -
    " We have seen a presentation where Nvidia claims that Mr. Erwin Coumans, the creator of Bullet Physics Engine, said that he developed Bullet physics on Geforce cards. The bad thing for ATI is that they are betting on this open standard physics tech as the one that they want to accelerate on their GPUs.

    "ATI’s Bullet GPU acceleration via Open CL will work with any compliant drivers, we use NVIDIA Geforce cards for our development and even use code from their OpenCL SDK, they are a great technology partner. “ said Erwin.

    This means that Bullet physics is being developed on Nvidia Geforce cards even though ATI is supposed to get driver and hardware acceleration for Bullet Physics."
    ---
    rofl - hahahahahha now that takes the cake!
    http://www.fudzilla.com/content/view/15642/34/
    --
    Boy do we "hate PhysX" as ati fans, but then again... why not use the nvidia PhysX card to whip up some B Physics, folks I couldn't make this stuff up.
