Rise of Nations: Rise of Legends

Rise of Nations: Rise of Legends is a game we recently added to our list of benchmarks as a good real-time strategy test. First-person shooters are always popular, but they aren't the only games people play on computers, and we need tests that reflect card performance across other genres as well.

This is the first of our FRAPS benchmarks, and it consists of a twenty-minute recording of a battle between several human players using different races. We run FRAPS throughout the whole game (sped up to 4x) to get a good average framerate from start to finish. We test Rise of Legends with the graphics slider all the way to "look better" to enable all of the high quality settings. RoL doesn't support 800x600 resolution, so if you are unlucky enough to have a monitor limited to this resolution, you won't be able to play this game (and you probably won't want to use an ATI card either, because their .NET driver interface won't display properly at 800x600).
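For reference, the average framerate that falls out of a run like this is just total frames rendered divided by elapsed time. Below is a minimal sketch of that calculation from a per-frame timestamp log; the file name and column layout are illustrative assumptions, not the exact format FRAPS writes out.

```python
# Minimal sketch: average framerate from a per-frame timestamp log.
# The file name and column layout here are assumptions for illustration.

def average_fps(path="frametimes.csv"):
    with open(path) as f:
        lines = [line.strip() for line in f if line.strip()]
    # Assume a header row, then one cumulative timestamp (in milliseconds)
    # per rendered frame in the last comma-separated column.
    times_ms = [float(line.split(",")[-1]) for line in lines[1:]]
    frames_rendered = len(times_ms) - 1                # intervals between samples
    elapsed_seconds = (times_ms[-1] - times_ms[0]) / 1000.0
    return frames_rendered / elapsed_seconds

if __name__ == "__main__":
    print(f"Average framerate: {average_fps():.1f} fps")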

[Rise of Legends benchmark charts]

While this isn't a stealth-style game like Splinter Cell, it is still more playable at lower framerates than games like Battlefield 2 and F.E.A.R. However, it is a much newer game than Splinter Cell: Chaos Theory, and as such it is more demanding on your graphics hardware, as our test results show.

A framerate of 20 fps is about the lowest you will want to experience while playing this game, and only the 7800 GT and 7600 GT can achieve this at up to 1600x1200 resolution. Unfortunately, this game isn't really playable at any resolution with cards like the X1300 Pro and 7300 GS without turning down the graphics quality settings. If Rise of Legends is your game of choice and you must have one of these silent cards, the Gigabyte 7600 GS is your best option given its performance and price.

Comments

  • Josh Venning - Thursday, August 31, 2006 - link

    I also forgot to mention that some people use their PCs in home theater systems as well. This would be another case where you want as little noise from your computer as possible.
  • imaheadcase - Thursday, August 31, 2006 - link

    That was not always the case. I still use my 9700 Pro even though its fan went out a year ago; it works like a charm without it. It was the high-end card of its time, let's hope those days come by again :D
  • eckre - Thursday, August 31, 2006 - link

    What a great review; when Tom's did their silent video card review, they included a grand total of three cards... pfft. Nice job, Anand.

    I have the 7600GT, very sweet and 0dB is oh so nice.
  • Josh Venning - Thursday, August 31, 2006 - link

    We just wanted to say thanks to all for your comments; we are still trying to make sure we've caught any errors. (There are actually only 20 cards in the roundup and not 21.) As Derek said, these cards were included in the article because we requested any and all silent cards that the manufacturers were willing to give us to review. That's also why we have more cards from ASUS and Gigabyte than the others.
  • Olaf van der Spek - Thursday, August 31, 2006 - link

    quote:

    If a general purpose CPU can offer a 40% improvement over its predecessor (Pentium D) while consuming 40% less power on average, why can't a GPU revolution accomplish the same thing?


    Because the video card industry hasn't introduced a design as bad as the NetBurst architecture.
  • epsilonparadox - Thursday, August 31, 2006 - link

    No, they've introduced worse. When they recommend a second PSU just for graphics, or even a single 1kW PSU, they've taken Intel's lack of thermal control to a whole new level.
  • DerekWilson - Thursday, August 31, 2006 - link

    Graphics cards use much, much less power in 2D mode than in 3D mode -- and even their 3D power-saving capabilities are really good.

    This is especially true when you consider the amount of processing power a GPU delivers compared to a CPU.

    Theoretical peak performance of a current desktop CPU is in the 10-15 GFLOPS range at best. For a GPU, theoretical peak performance is at least an order of magnitude larger, reaching over 200 GFLOPS in high-end cases (a rough back-of-the-envelope version of this calculation is sketched after the comments).

    I'm not saying we can reach these theoretical peak rates on either a CPU or a GPU, but a GPU is doing much much more work under load than a CPU possibly could.

    Keep in mind we aren't even up to 1GHz on GPU cores. On the CPU front, Intel just shortened the pipeline and decreased clock speeds to save power -- doing more work in one cycle. This is exactly what a GPU does.

    And the icing on the cake is the sheer number of options on the silent GPU front. Neither AMD nor Intel makes a fast desktop CPU that can be (easily) passively cooled. These parts are a testament to the efficiency of the GPU.

    On the flip side, ATI and NVIDIA push their high end parts way up in clock speed and power consumption trying as hard as possible to gain the performance crown.

    There are plenty of reasons GPUs draw more power than a CPU under load, but a lack of thermal control or an inefficient design is not one of them. It's about die size, transistor count, and the total amount of work being done.
  • JarredWalton - Saturday, September 2, 2006 - link

    I disagree with Derek, at least in some regards. The budget and midrange GPUs generally do a good job at throttling down power requirements in 2D mode. The high-end parts fail miserably in my experience. Sure, they consume a lot less power than they do in 3D mode, but all you have to do is look at the difference between using a Radeon Mobility X1400 and a GeForce Go 7800 in the Dell laptops (http://www.anandtech.com/mobile/showdoc.aspx?i=276...) to see the difference in battery life.

    In 2D mode, graphics chips still consume a ton of power relatively speaking -- probably a lot of that going to the memory as well. A lot of this can be blamed on transistor counts and die size, but I certainly think that NVIDIA and ATI could reduce power more. The problem right now is that power use is a secondary consideration, and ATI and NVIDIA both need to have a paradigm shift similar to what Intel had with the Pentium M. If they could put a lot of resources into designing a fast but much less power-hungry GPU, I'm sure they could cut power draw quite a bit in both idle and load situations.

    That's really the crux of the problem though: resources. Neither company has anywhere near the resources that AMD has, let alone the resources that Intel has. Process technology is at least a year behind Intel if not more, chip layouts are mostly computer generated as opposed to being tweaked manually (I think), and none of the companies have really started at square one trying to create a power efficient design; that always seems to be tacked on after-the-fact.

    GPUs definitely do a lot of work, although GFLOPS is a terrible measure of performance. The highly parallel nature of 3D rendering does allow you to scale performance very easily, but power requirements also scale almost linearly with performance when using the same architecture. It would be nice to see some balance between performance scaling and power requirements... I am gravely concerned about what Windows Vista is going to do to battery life on laptops, at least if you enable the Aero Glass interface. Faster switching to low-power states (for both memory and GPU) ought to be high on the list for next-generation GPUs.
  • DaveLessnau - Thursday, August 31, 2006 - link

    I'm wondering why Anandtech tested Asus' EN7800 GT card instead of their EN7600 GT. That card would be more in line with Gigabyte's 7600 GT version and, I believe, is more available than the 7800 version. In the near future, I'd like to buy one of these silent 7600GTs and was hoping this review would help. Oh, well.
  • DerekWilson - Thursday, August 31, 2006 - link

    You can get a really good idea of how it would perform by looking at Gigabyte's card.

    As I mentioned elsewhere in the comments, we requested all the silent cards manufacturers could provide. If we don't have a card, it is likely because they were unable to get it to us in time for inclusion in this review.
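As a rough illustration of the theoretical-peak figures discussed in the comments above, peak throughput works out to execution units × FLOPs per unit per cycle × clock speed. The unit counts, per-cycle FLOP figures, and clocks in this sketch are illustrative assumptions chosen to land in the ranges quoted, not the specifications of any particular chip.

```python
# Back-of-the-envelope sketch of theoretical peak throughput.
# All figures below are illustrative assumptions, not measured specs
# of any specific CPU or GPU.

def peak_gflops(units, flops_per_unit_per_cycle, clock_ghz):
    """Theoretical peak = units x FLOPs per unit per cycle x clock (GHz)."""
    return units * flops_per_unit_per_cycle * clock_ghz

# Desktop CPU of the era: 2 cores x ~4 single-precision FLOPs/cycle (SIMD)
# x ~1.9 GHz -> roughly 15 GFLOPS peak.
cpu_peak = peak_gflops(units=2, flops_per_unit_per_cycle=4, clock_ghz=1.9)

# High-end GPU of the era: ~24 shader units x ~16 FLOPs/cycle x ~0.55 GHz
# -> roughly 200+ GFLOPS peak.
gpu_peak = peak_gflops(units=24, flops_per_unit_per_cycle=16, clock_ghz=0.55)

print(f"CPU theoretical peak: ~{cpu_peak:.0f} GFLOPS")
print(f"GPU theoretical peak: ~{gpu_peak:.0f} GFLOPS")
```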
