NVIDIA GeForce GTS 250: A Rebadged 9800 GTX+
by Derek Wilson on March 3, 2009 3:00 AM EST - Posted in GPUs
Why NVIDIA Did It
To understand the motivation behind NVIDIA's naming and renaming and renaming, we have to once again look at its approach to GPU design. NVIDIA continued to architect very high end GPUs and let their technology trickle down, over the course of 9 - 12 months, to mid range and lower end market segments. AMD stepped in and launched a very competitive performance mainstream part instead of a high end GPU, allowing its technology to filter down to lower price points and market segments more quickly than NVIDIA could manage this generation.
Let's attach some code names, shall we?
NVIDIA's flagship, the GT200 GPU used in the GTX 295, 285, 280 and 260, isn't available in a cheaper version yet. AMD's flagship, the RV770, is already more affordable and is available in cheaper versions. NVIDIA has to rely on its last generation GPU, the G92b, to compete in the rest of the market while the lower end GT200 derivatives get ready for production. Rather than continue to ship products with old names to vendors and customers, NVIDIA slaps a new name on an old GPU and hopes to at least provide the appearance of being just as agile and competitive as AMD despite being clearly caught off guard this generation.
Of course, NVIDIA has a case to make. This is their current generation of hardware, and it is practical and useful to maintain a consistent nomenclature so that the general public knows what the product positioning actually is. We agree, only our solution is top to bottom launches in line with new GPU architectures rather than simply changing the name of old parts so that they look shiny and new.
NVIDIA's take on this is also flawed in that it treats customers like idiots, and it underlines the fundamental issue we have. Do I need a card with a new name on it to believe that it is worthy of my purchase, or can I go read reviews comparing the hardware and learn for myself whether or not any card (regardless of the name) fills my need? Maybe, then, this name change is for people who don't know anything about graphics hardware. In that case the thing that "sells" the card is the simple fact that NVIDIA has convinced someone that this part is an affordable version of a card from its latest line of products. Saying the part needs a new name to fit the current naming scheme is essentially admitting that the only reason to change the name is to mislead uninformed buyers.
NVIDIA would love to have 40nm GT200 derivatives out today. Until that day comes, we'll get cards that sound like GT200 based products.
Anyway, we haven't previously tested a 1GB 9800 GTX+, and until this announcement prices hadn't been anywhere near reasonable (currently they're up at $200, so the $50 price drop will make a big difference). There is also a slight tweak between the GTS 250 1GB and the 9800 GTX+ 1GB: the memory on the 1GB 9800 GTX+ was underclocked by about 9.1%, and the GTS 250 1GB brings the memory clock back in line with the 512MB 9800 GTX+. So while the 512MB part performs no differently, moving up to 1GB should no longer mean a performance hit in games that don't benefit from the larger frame buffer but are sensitive to memory bandwidth.
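As a rough illustration of what that memory underclock means for bandwidth, here is a minimal sketch. The specific clocks used (1100MHz for the 512MB 9800 GTX+/GTS 250 and roughly 1000MHz for the underclocked 1GB 9800 GTX+) are our assumptions for the sake of example; the only figure stated above is the ~9.1% difference.

```python
# Rough sketch: theoretical memory bandwidth for a 256-bit GDDR3 interface.
# The clock values below are assumptions for illustration, not official specs.
def gddr3_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 256) -> float:
    """GDDR3 is double data rate, so it transfers data twice per clock."""
    transfers_per_sec = mem_clock_mhz * 2 * 1e6
    bytes_per_transfer = bus_width_bits / 8
    return transfers_per_sec * bytes_per_transfer / 1e9

full_clock = gddr3_bandwidth_gbs(1100)   # ~70.4 GB/s (assumed 512MB 9800 GTX+ / GTS 250 clock)
underclock = gddr3_bandwidth_gbs(1000)   # ~64.0 GB/s (assumed underclocked 1GB 9800 GTX+)
print(f"{full_clock:.1f} GB/s vs {underclock:.1f} GB/s "
      f"({(1 - underclock / full_clock) * 100:.1f}% lower)")  # ~9.1% lower
```

Under those assumed clocks the gap works out to roughly 70 GB/s versus 64 GB/s, which is where the ~9.1% figure comes from.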
Oh, also wide availability won't be until March 10th. Seriously.
Also not explained until now is how the new naming scheme will work going forward. Now GTX, GTS, GT and G (as far as we can gather) will indicate performance segment. The number will be the model number, and within a performance segment, higher is better. Essentially NVIDIA has swapped the meaning of letters and numbers in its naming. It has also clearly told us that naming will no longer be attached to GPU architecture, but that vendors may somehow still indicate architecture on the box if they so choose. If nothing else, the feature list and specifications will be a guide. Here's to requiring that people read the fine print to know what they're buying.
For What It's Worth
Early last week Charlie over at The Inquirer posted a story saying that a number of reviewers were cut out of the GeForce GTS 250 launch. We felt a bit hurt; by the time the story went up we hadn't even been asked to be briefed on the GTS 250. Cards had already gone out to other reviewers, but we weren't on any lists. Oh, pout.
Magically, a couple of days after Charlie's article we got invited to an NVIDIA briefing and we had a GTS 250 to test. Perhaps NVIDIA was simply uncharacteristically late in briefing us about a new GPU launch. Perhaps NVIDIA was afraid we'd point out that it was nothing more than a 9800 GTX+ that ran a little cooler. Or perhaps we haven't been positive enough about CUDA and PhysX and NVIDIA was trying to punish us.
Who knows what went on at NVIDIA prior to the launch; we're here to review the card. But for what it's worth - thank you, Charlie :)
103 Comments
Wurmer - Tuesday, March 3, 2009 - link
Performance aside, Nvidia should get their naming scheme straight. All this renaming and name swapping only serves to confuse customers. No matter how it's explained, make it simple: higher number, more powerful card! In this regard, I find that ATI has made an effort of late. I'll also agree with one of the above posters that Nvidia was taken aback by the release of the 4870 and 4850. ATI hit the nail right on the head and the green team seems to be having a bit of a hard time devising a proper response. Instead of getting their prey in their gun sight they use a shotgun and pepper the target all around......
SiliconDoc - Wednesday, March 18, 2009 - link
Shotguns usually KILL with just one shot - and ATI has caused another charge-off, another BILLION dollar loss for AMD. I'm not sure NVidia hit them (they obviously don't need to), so let's hope they don't, as well.
I'm also not sure what it means when people keep speculating that ATI "caught them off guard" - it doesn't really mean anything - it's just a stupid way of saying "ATI did better than I expected" ( but it's "cool" to not say that and put down NVidia instead, huh... since so many around the geek spots "taught you to say it that way" ).
Then after "being caught off guard" NVidia "drops a card" and it's because "they panicked" - right ?
DEREK - explained it - didn't he.... " NVidia released the 9800GTX... "on the eve of the 4850 launch"....
YES, THAT'S HOW OFF GUARD THEY WERE.... NVIDIA HUH...
They released a DAY BEFORE ati did....
And they're STILL USING the 9800 - to battle the 4850 that was released AFTER Nvidia....released their card.
Oh well....
DEREK tried to make nvidia sound evil, too - for releasing "on the eve of " the sainted red card 4850 release day - those nasty nvidia people spoiling the launch by releasing on ati's "eve"...
By golly, it's no wonder Obama was elected my friend.
Mr Perfect - Tuesday, March 3, 2009 - link
Hey, uhm, in that link posted on page two, there is mention that the press review cards are specially picked by Nvidia. Any idea if this is true?

SiliconDoc - Wednesday, March 18, 2009 - link
Doesn't matter when they are stock clocked. The reviewers can do all sorts of things to "compensate" for how they want outcomes, anyway, like editing ini files and choosing the games and the order - heck even using a card that isn't the card they are supposedly reviewing.

spinportal - Tuesday, March 3, 2009 - link
The GTS 250 1GB is barely useful over its 512MB counterpart except for power usage and slot size. The estimated street price of $149 is already countered by the AMD 4870 512MB, and tests show it's a hair better. Given the 250 uses less juice than the 4870, it's odd that 250 SLI is using more juice than 260 core 216 SLI, so there goes that benefit.

NVidia cannot strip down the GT200 core to reduce the power load from two 6-pin connectors to just one for 150W. Perhaps there is something to be said for GDDR5 power reduction.
Either way, the 250 is a win for NVidia in the mainstream budget segment: less power usage, CUDA, and PhysX at the same price as the 4870 512MB, which runs hotter, noisier (probably) and is less feature rich. Does DX 10.1 matter at this point? PureVideo 2 is a wash vs. AMD's UVD.
It's distasteful rebadging a G92b into the GT200 naming scheme. This helps NVidia's costs by EOLing the whole 8800GT architecture.
But by April, who's going to care? New spins are coming. This stopgap is only to reduce the bleeding. Hopefully next gen is executed better so performance grows as power demands decrease.
kx5500 - Thursday, March 5, 2009 - link
Shut the *beep* up f aggot, before you get your face bashed in and cut
to ribbons, and your throat slit.
SiliconDoc - Wednesday, March 18, 2009 - link
Take your finger off the key, you're repeating, you dummy.

Wolfpup - Tuesday, March 3, 2009 - link
I'm none too crazy about the random GPU/CPU naming this industry has always seen, but I disagree with Nvidia somehow needing a budget version of the 260/280 GPUs. I mean, the 8800 series is basically the same thing. Sure there are some improvements here and there, but basically the 260 is the same thing with massively more execution hardware. I don't really see this huge distinction between buying a 250/9800GTX+ and it not being a stripped down 260... I mean, it would almost be the same thing anyway.
And if nothing else, it's great to see how much less power this 250 uses. I mean, this is still a really nice GPU that I'd be glad to have in a system. (I'm on a laptop with a 9650GT, which is yet another 32 processor part... and even this isn't half bad at all!)
SiliconDoc - Wednesday, March 18, 2009 - link
The red fanboys need Nvidia to cut the arms and legs off the GT200 and turn it into a puny 4830 (that has the top core ati has made, can make). The PROBLEM for red fanboy red freaks is just that... their great and sainted ATI has their "superwhomper super efficient super technology core! Oh Goddd! it's so greaaattt ! (*red boy orgasm)" - in the lowly 4830 - now compare that to the GTX260/192 and THAT'S where the red fanboys stands (or, really cries, rather).
Now look at the 4830 top ATI core and compare it to the GTX285... oh that's PAINFUL.
Now do the reverse...
Compare the top core 4870 to the top Nvidia core GTX260/192 (lowest iteration) - and golly, it EVEN BEATS the 4870 sometimes...
So THERE you have "the big ugly problem" for the red fanboys - who keep wailing that Nvidia MUST make a lower GT200 part for them...
( their tiny inner red child needs some comfort - it's just not fair with that big evil green GT200 SPANKING the rv770 so badly ! It's "abuse" ! )
Can we have a GT200 core that is as LOUSY at the 4830 ?!?! please please pretty please!!! we have some really hurting little red fanboy crybaby whining fud propaganda doper diaper babies that need some satisfaction and their little red egos massaged...
DON'T make them face the truth, EVER , nvidia, you big mean greedy green ...
( Yes, dude, they keep begging for it, and THAT'S WHY ! )
There is no doubt about it.
LuxZg - Tuesday, March 3, 2009 - link
How about making an article where you'd test all those G92 renames, rebrands, overclocks and shrinks? So put up a fight between the 8800GT 256MB ($137), 8800GT 512MB ($154), 8800GTS ($171), 9800GT 512MB ($148), 9800GT 1GB ($171), 9800GTX ($205), 9800GTX+ ($214), 9800GTX 1GB ($228), GTS 250 512MB ($230+ ??), and GTS 250 1GB ($240+ ??). And if I've missed some variation, please include that too :)
Then test them all overclocked with their default cooling.
I just want to see how far they've come from the first revision to these latest ones. Prices in brackets are local prices in Croatia where I live. And yes, you can buy them all... I even found an 8800GTX 768MB card and an 8800GTS 320MB as well; interesting, both at the same $171 price that gets you the "new" 8800GTS (512MB) or 9800GT 1GB :D
Now, since you can buy all these cards, and most of them are really close in price (some $100 from the all-new top to the rock-bottom 8800GT 256MB; if you exclude that one, then it's just some $60 difference) - it would be an interesting article. Especially for those considering SLI with the old card they bought earlier.
And while we're at it, I support the above comment - try SLI'ing these 8800/9800 cards with the new GTS 250. Should be no problem, but check the performance (gains) anyway...