Eyefinity

Somewhere around 2006 - 2007, ATI was working on the overall specifications for what would eventually become the RV870 GPU. GPUs like this are designed by combining the views of ATI's engineers with the demands of developers, end users and OEMs. In the case of Eyefinity, the initial demand came directly from the OEMs.

ATI was working on the mobile version of its RV870 architecture and, at the request of OEMs, had given it a number of DisplayPort (DP) outputs. The OEMs wanted up to six DP outputs from the GPU, but with only two active at a time. The six broke down as: two for internal panel use (if an OEM wanted to build a dual-monitor notebook, which has happened since), two for external outputs (one DP and one DVI/VGA/HDMI, for example), and two for passing through to a docking station. Again, only two had to be active at once, so the GPU had six sets of DP lanes but only the display engines to drive two simultaneously.

ATI looked at the effort required to enable all six outputs at the same time and made it so; thus the RV870 GPU can output to a maximum of six displays simultaneously. Not all cards support this, as you first need the requisite number of display outputs on the card itself. The standard Radeon HD 5870 can drive three outputs simultaneously: any combination of the DVI and HDMI ports for up to two monitors, plus a DisplayPort output independent of DVI/HDMI. Later this year you'll see a version of the card with six mini-DisplayPort outputs for driving six monitors.

It's not just hardware; there's a software component as well. The Radeon HD 5000 series driver allows you to combine all of these display outputs into one single large surface, visible to Windows and your games as a single display with tremendous resolution.
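The arithmetic behind that single large surface is straightforward: the driver presents the panel grid to the OS as one display whose resolution is the per-panel resolution multiplied out. A minimal sketch of the idea (`display_group` is a hypothetical helper for illustration, not part of any ATI/AMD driver API):

```python
# Sketch: the resolution the OS sees for a grid of identical panels
# combined into one "single large surface".

def display_group(panel_w, panel_h, cols, rows):
    """Return the combined (width, height) the OS would see."""
    return panel_w * cols, panel_h * rows

# Three 1920 x 1200 panels side by side (a 3x1 group):
w, h = display_group(1920, 1200, cols=3, rows=1)
print(f"{w} x {h}")  # 5760 x 1200
```

The same math gives a 1x3 portrait-ish stack (1920 x 3600) if the group is arranged vertically instead.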

I set up a group of three Dell 24" displays (U2410s). This isn't exactly what Eyefinity was designed for since each display costs $600, but the point is that you could group three $200 1920 x 1080 panels together and potentially have a more immersive gaming experience (for less money) than a single 30" panel.

For our Eyefinity tests I chose to use every single type of output on the card: one DVI, one HDMI and one DisplayPort:

With all three outputs connected, Windows defaults to cloning the display across all monitors. Going into ATI's Catalyst Control Center lets you configure your Eyefinity groups:

With three displays connected I could create a single 1x3 or 3x1 arrangement of displays. I also had the ability to rotate the displays first so they were in portrait mode.

You can create smaller groups, although the ability to do so disappeared after I created my first Eyefinity setup (even after deleting it and trying to recreate it). Once you've selected the type of Eyefinity display you'd like to create, the driver will make a guess as to the arrangement of your panels.

If it guessed correctly, just click Yes and you're good to go. Otherwise ATI has a handy way of determining the location of your monitors:

With the software side taken care of, you now have a Single Large Surface as ATI likes to call it. The display appears as one contiguous panel with a ridiculous resolution to the OS and all applications/games:


Three 24" panels in a row give us 5760 x 1200

The screenshot above should clue you into the first problem with an Eyefinity setup: aspect ratio. While the Windows desktop simply expands to provide you with more screen real estate, some games may not increase how much you can see - they may just stretch the viewport to fill all of the horizontal resolution. The resolution is correctly listed in Batman Arkham Asylum, but the aspect ratio is not (5760:1200 !~ 16:9). In these situations my Eyefinity setup made me feel downright sick; the weird stretching of characters as they moved towards the outer edges of my vision left me feeling ill.
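The distortion can be quantified. If a game renders a 16:9 view and simply stretches it across the full 5760 x 1200 surface without widening its field of view, everything gets widened by 2.7x, which is exactly the nauseating effect described above. A quick back-of-the-envelope check (numbers from this setup, not from any game engine):

```python
# How badly a 16:9 viewport is stretched when a game fills the whole
# 5760 x 1200 Eyefinity surface without adjusting its aspect ratio.

surface_w, surface_h = 5760, 1200
surface_aspect = surface_w / surface_h  # 4.8 (vs. ~1.78 for 16:9)
game_aspect = 16 / 9

horizontal_stretch = surface_aspect / game_aspect
print(f"surface aspect: {surface_aspect:.2f}")
print(f"horizontal stretch factor: {horizontal_stretch:.2f}x")  # 2.70x
```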


Despite Oblivion's support for ultra-wide aspect ratio gaming, by default the game stretches to occupy the full horizontal resolution

Other games have their own quirks. Resident Evil 5 correctly identified the resolution but appeared to maintain a 16:9 aspect ratio without stretching. In other words, while my display was only 1200 pixels high, the game rendered as if it were 3240 pixels high and only fit what it could onto my screens. This resulted in unusable menus and a game that wasn't actually playable once you got into it.
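Resident Evil 5's behavior works out like this: holding 16:9 at a width of 5760 implies a 3240-pixel-tall frame, of which only the middle 1200 rows fit on screen, so roughly a third of the image is visible (the cropping fractions are my inference from what was on screen, not confirmed engine behavior):

```python
# Why a game that locks to 16:9 at 5760 pixels wide becomes unusable
# on a 5760 x 1200 surface: most of the frame falls off screen.

surface_w, surface_h = 5760, 1200
implied_h = surface_w * 9 // 16       # height a 16:9 frame needs
visible_fraction = surface_h / implied_h

print(f"implied frame height: {implied_h}")  # 3240
print(f"visible: {visible_fraction:.0%}")    # 37%
```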

Games with pre-rendered cutscenes generally don't mesh well with Eyefinity either. In fact, anything that's not rendered on the fly tends to only occupy the middle portion of the screens. Game menus are a perfect example of this:

There are other issues with Eyefinity that go beyond properly taking advantage of the resolution. While the three-monitor setup pictured above is great for games, it's not ideal in Windows. You'd want your main screen to be the one in the center; however, since it's a single large display, your Start menu actually appears on the leftmost panel. The same applies to games with a HUD located in the lower left or lower right corner of the display. In Oblivion your health, magic and endurance bars all appear in the lower left, which in the setup above means the far left corner of the left panel is where you have to look for your vitals. Given that each panel is nearly two feet wide, that's a pretty long distance to look.

The biggest issue that everyone worried about was bezel thickness hurting the experience. To be honest, bezel thickness was only an issue for me when I oriented the monitors in portrait mode. Sitting close to an array of wide enough panels, the bezel thickness isn't that big of a deal. Which brings me to the next point: immersion.

The game that sold me on Eyefinity was actually one that I don't play: World of Warcraft. The game handled the ultra-wide resolution perfectly; it didn't stretch any content, it just expanded my viewport. With the left and right displays tilted inwards slightly, WoW was more immersive. It's not so much that I could see what was going on around me, but that whenever I moved forward I had the game world in more of my peripheral vision than I usually do. Running through a field felt more like running through a field, since there was more field in my vision. It's the only example where I actually felt like this was the first step towards the holy grail of creating the Holodeck. The effect was pretty impressive, although costly given that I only really attained it in a single game.

Before using Eyefinity for myself, I thought I would hate the bezel thickness of the Dell U2410 monitors and felt that the experience wouldn't be any more engaging. I was wrong on both counts, but I was also wrong to assume that all games would just work perfectly. Of the four I tried, only WoW worked flawlessly; the rest either had issues rendering at the unusually wide resolution or simply stretched the content without giving me enough additional view space to make the feature useful. Will this all change given that in six months ATI's entire graphics lineup will support three displays? I'd say that's more than likely. The last company to attempt something similar was Matrox, and it unfortunately didn't have the graphics horsepower to back it up.

The Radeon HD 5870 itself is fast enough to render many games at 5760 x 1200 even at full detail settings. I managed 48 fps in World of Warcraft and a staggering 66 fps in Batman Arkham Asylum without AA enabled. It's absolutely playable.

Comments

  • SiliconDoc - Thursday, September 24, 2009 - link

    Oh really? Now wait a minute, spin master. When the site here whined about "paper launch" it was Derek who brought up a two or three year old nvidia card, and cried and whined about it. Then speculated the GTX275 was paper, and then "a phantom card".
    Well, that didn't happen.... no apologies about it ever either.
    ---
    The PAPER launches of late are ATI ATI ATI ! ! !
    We have the 4770, and now this one !
    ----
    Gee, when ATI BLOWS IT, we suddenly talk in vague terms about "the companies" having "papery launches" as " the general rule of thumb of how it's done.." - and that makes us "not a fan boy!??!"
    R0FLMAO !!!!
    Yes, of course, since the red ati is bleeding paper launches and the last one from nvidia one can actually cite is YEARS AND YEARS ago, yes, of course, you're correct, it's "unnamed companies in the multiple" that "do it"....
    ---
    I swear to god, I cannot even believe the massive brainwashing that is a gigantic pall all over the place.
    ---
    If I'm WRONG, please be kind, and tell me what nvidia paper launches I missed.... PLEASE LET ME KNOW.
  • Genx87 - Wednesday, September 23, 2009 - link

    Is/was a good idea to shoot for, as this is most certainly what Nvidia is going to attempt to achieve. But I am a bit disappointed this card rarely achieved it.

    I do like angle independent AF though. Should be interesting to see what Nvidia brings to the table. But kind of like the CPU situation(i5) I am kind of meh. But will say this has more potential compared to its predecessor than the i5 series does compared to Core 2 Duo.
  • SiliconDoc - Wednesday, September 23, 2009 - link

    I thought that was just great, those pretty pictures, and then I get to reading. I see the 4890 and SQUARES. I see the GTX285, with CIRCLES and an outer rounded octagon.
    Then the 5870- and it's "perfectly round" angle independent algorithm, but I still see some distortions.
    --
    So I get to reading and am told "the 4890 and 285 are virtually the same". I guess the wheel was first made square, and rolled as well as when it became round. No chance the reviewer could tell the truth and remark that NVidia has the best, until today.. NOPE can't do that!
    ---
    Then, of course, the celebration for the "perfection" of the 5870 and ATI's superb success in the "round" category...
    EXCEPT:
    We get to the actual implementations and NO PERCEIVED DIFFERENCE IS VISUALLY THERE. It cannot be seen. The article even states they searched in vain for some game to show the difference. LOL
    All that extra effort to, for Pete's sake, show that ATI superiority... all WASTED EFFORT, but for red roosters I'm certain it was a very exciting quest, titillating, gee, a chance to take down big green...
    ---
    So bottom line is IT'S A BIG FAT ZERO, even the older, worse ati implementation is apparently "non distinguishable".
    It is remarked that NVidia doesn't "officially" support this method in game, and of course, after much red rooster effort, one finds out why.
    THERE IS NO DIFFERENCE in visual quality. Another phantom red "win".
    Another reason NVidia makes money (why waste it on worthless crap in development that makes no difference), while ATI does not.
    Yeah, that was so cool.
    So happy the "mental ideation of perfection in the card for ati fans" was furthered. ROFL
  • Dante80 - Wednesday, September 23, 2009 - link

    A quick question. Why is there no 5850 review available atm?

    1> Was there a separate NDA for the 2 cards?
    2> Were there no sample cards given by AMD to reviewers?
    3> Did AMD ask reviewers to postpone said reviews due to market supply problems/glitches?
    4> Was this a strategy decision by AMD, for marketing or other reasons?
  • Ryan Smith - Wednesday, September 23, 2009 - link

    AMD only provided us with 2 5870s, the 5850 was not sampled. 5800 series cards are in short supply, even for reviewers.
  • Dante80 - Thursday, September 24, 2009 - link

    Thank you for the prompt answer, that was what I was guessing too. Cheers...^^
  • Spoelie - Wednesday, September 23, 2009 - link

    To get enough 5870 cards in the channel for a hard launch, they used every possible die.

    There are probably not enough harvested dies to create the 5850 line just yet. And they're not gonna use fully functional ones that can go in a 5870 when supply for them is tight already.

    Once the 5850 is launched, demand for it is up and yields have matured, they'll have to use fully functional dies to keep supply up; for now they're building up inventory for a hard launch during the coming weeks.
  • SiliconDoc - Wednesday, September 23, 2009 - link

    Uhh, just a minute there feller. The SOFT or PAPER LAUNCH has already hit, the big LAUNCH DATE is today....
    Newegg is a big ZERO available... (one Powercolor was there 30 mins ago, the other 3 listed are NOT available, I watched them appear last night).
    ---
    So, when Ati has a "hard launch" they get "many weeks after the launch date" to "ramp up production" and "fill the need".
    ROFLMAO
    I was here when this site and the red roosters whined about Nvidia and paper launches, and I believe it was the GTX275 that was predicted to be PAPER (not very long ago in fact) here, and the article EVEN SPECULATED IT WAS A PHANTOM CARD.
    All the red roosters piled on, but.... the card was available on launch, it wasn't a PHANTOM, and all that bs was quickly forgotten and shoved into the memory hole like it never happened...
    ---
    Oh, but when it's ATI and not Nvidia, the 4770 can remain almost pure paper near forever, and this one, golly, it can be 95% paper and it's just "getting ready for a hard launch" WEEKS BEYOND the launch date!
    ROFLMAO
    --
    No bias here?!? "Where's da' bias?!?!" said the red rooster (to the green goblin)...
    Give me a break.
  • chrnochime - Friday, September 25, 2009 - link

    Just because you can't find it in the States doesn't mean it's a fake launch. And fake launch? What are you, a 12 year old or something? You're like the nvidia version of snakeoil. Just go play with your nvidia part, m'kay?

  • SiliconDoc - Sunday, September 27, 2009 - link

    Well, since you insulted, and mischaracterized, I came across the reminder about the 4870 paper launch.
    Yes, that's correct, this is how ATI rolls, a big fat lying launch date, a piddle of a few cards, then wait a couple weeks or a month.
    --
    " The cards are fast, but as many pointed out HD 5870 is not faster than Geforce GTX 295, which is something that many have expected. Radeon 5850 will also start selling in October time, but remember, last summer when ATI launched 4870, the card was almost impossible to buy and weeks if not months after, the availability finally improved. "
    http://www.fudzilla.com/content/view/15643/1/
    --
    Like I've kept saying, the bias is so bad... I keep discovering more big fat ati blunderous moves that are instead ascribed imaginarily to Nvidia.
    Thanks for the incorrect whining, anger, and standard PC e-mindless chatroom repeated, non-original, heard-ten-thousand-times, brainless insult; it actually helped me.
    I learned ATI blew their 4870 launch with paper lies as well.
    You're a great help friend.
