NVIDIA GeForce GTS 250: A Rebadged 9800 GTX+
by Derek Wilson on March 3, 2009 3:00 AM EST - Posted in GPUs
Why NVIDIA Did It
To understand the motivation behind NVIDIA's naming and renaming and renaming, we have to once again look at its approach to GPU design. NVIDIA continues to architect very high end GPUs and let that technology trickle down to midrange and lower end market segments over the course of 9 to 12 months. AMD stepped in and launched a very competitive performance mainstream part instead of a high end GPU, allowing the design to cascade down to lower price points and market segments more quickly than NVIDIA could manage this generation.
Let's attach some code names, shall we?
NVIDIA's flagship, the GT200 GPU used in the GTX 295, 285, 280 and 260, isn't available in a cheaper version yet. AMD's flagship, the RV770, is already more affordable and is available in cheaper versions. NVIDIA has to rely on its last generation GPU, the G92b, to compete in the rest of the market while the lower end GT200 derivatives get ready for production. Rather than continue to ship products with old names to vendors and customers, NVIDIA slaps a new name on an old GPU and hopes to at least provide the appearance of being just as agile and competitive as AMD despite being clearly caught off guard this generation.
Of course, NVIDIA has a case to make. This is their current generation of hardware, and it is practical and useful to maintain a consistent nomenclature so that the general public knows what the product positioning actually is. We agree; our solution, though, is top to bottom launches in line with new GPU architectures rather than simply changing the names of old parts so that they look shiny and new.
NVIDIA's take on this is also flawed in that it treats customers like idiots, and that underlines the fundamental issue we have. Do I need a card with a new name on it to believe it is worthy of my purchase, or can I go read reviews comparing the hardware and learn for myself whether any card (regardless of the name) fills my need? Maybe, then, this name change is for people who don't know anything about graphics hardware. In that case, the thing that "sells" the card is the simple fact that NVIDIA has convinced someone that this part is an affordable version of a card from its latest line of products. Saying the part needs a new name to keep the naming scheme consistent is essentially admitting that the only reason to change the name is to mislead uninformed buyers.
NVIDIA would love to have 40nm GT200 derivatives out today. Until that day comes, we'll get cards that sound like GT200 based products.
Anyway, we haven't previously tested a 1GB 9800 GTX+, and until this announcement its price hasn't been anywhere near reasonable (these cards currently sit at about $200, so the $50 price drop will make a big difference). There is also a slight tweak that separates the GTS 250 1GB from the 9800 GTX+ 1GB: the memory on the 1GB 9800 GTX+ was underclocked by about 9.1%, and the GTS 250 1GB brings the memory clock back in line with the 512MB 9800 GTX+. So while the 512MB part doesn't perform any differently, moving up to 1GB should no longer degrade performance in games that don't benefit from the larger frame buffer but are sensitive to memory bandwidth.
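For the curious, here's a back-of-the-envelope sketch of what that underclock means for bandwidth. The clocks below are the commonly listed figures for these boards - our assumption, not anything from NVIDIA's official spec sheet:

```python
# Rough memory bandwidth math for the 9800 GTX+ / GTS 250 1GB parts.
# Clocks are assumptions based on commonly listed specs: ~1100MHz GDDR3
# on the 512MB 9800 GTX+ (and now the GTS 250 1GB), ~1000MHz on the
# underclocked 1GB 9800 GTX+.

BUS_WIDTH_BYTES = 256 // 8   # 256-bit memory interface
DDR_TRANSFERS = 2            # GDDR3 moves data twice per clock

def bandwidth_gb_s(mem_clock_mhz):
    """Peak theoretical memory bandwidth in GB/s."""
    return mem_clock_mhz * 1e6 * DDR_TRANSFERS * BUS_WIDTH_BYTES / 1e9

full = bandwidth_gb_s(1100)          # ~70.4 GB/s
underclocked = bandwidth_gb_s(1000)  # ~64.0 GB/s

print(f"GTS 250 1GB / 512MB 9800 GTX+: {full:.1f} GB/s")
print(f"old 1GB 9800 GTX+: {underclocked:.1f} GB/s "
      f"({1 - underclocked / full:.1%} less)")
```

That missing ~9% of bandwidth is exactly the gap the GTS 250 1GB closes.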
Oh, and wide availability won't come until March 10th. Seriously.
Also not explained until now is how the new naming scheme will work going forward. GTX, GTS, GT and G (as far as we can gather) will now indicate performance segment, while the number is a model number; within a performance segment, higher is better. Essentially, NVIDIA has swapped the meaning of the letters and numbers in its naming. NVIDIA has also clearly told us that names will no longer be tied to GPU architecture, but that vendors may somehow still indicate the architecture on the box if they so choose. If nothing else, the feature list and specifications will be a guide. Here's to requiring that people read the fine print to know what they're buying.
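If it helps, here's a rough sketch of how we understand the decoder ring is supposed to work. The segment labels are our own reading of NVIDIA's guidance, not official marketing terms:

```python
# Illustrative decoding of NVIDIA's new naming scheme as we understand
# it: the letter prefix marks the performance segment, the number is
# just a model number (higher is better within a segment), and neither
# says anything about the underlying GPU architecture. Segment labels
# are our interpretation, not NVIDIA's official terms.

SEGMENTS = {
    "GTX": "high end",
    "GTS": "performance",
    "GT": "mainstream",
    "G": "entry level",
}

def decode(name):
    prefix, model = name.split()
    return f"{name}: {SEGMENTS.get(prefix, 'unknown')} segment, model {model}"

print(decode("GTS 250"))   # performance segment -- still a G92b underneath
print(decode("GTX 285"))   # high end segment -- GT200
```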
For What It's Worth
Early last week, Charlie over at The Inquirer posted a story saying that a number of reviewers were cut out of the GeForce GTS 250 launch. We felt a bit hurt: by the time the story went up, we hadn't even been asked to be briefed on the GTS 250. Cards had already gone out to other reviewers, but we weren't on any lists. Oh, pout.
Magically, a couple of days after Charlie's article, we got invited to an NVIDIA briefing and had a GTS 250 to test. Perhaps NVIDIA was simply uncharacteristically late in briefing us about a new GPU launch. Perhaps NVIDIA was afraid we'd point out that the card is nothing more than a 9800 GTX+ that runs a little cooler. Or perhaps we haven't been positive enough about CUDA and PhysX, and NVIDIA was trying to punish us.
Who knows what went on at NVIDIA prior to the launch; we're here to review the card. But for what it's worth: thank you, Charlie :)
103 Comments
sbuckler - Wednesday, March 4, 2009
I don't understand the hate. They rebranded, but more importantly they dropped the price too. This forced ATI to drop the price of the 4850 and 4870. That's a straight win for the consumer, whether you want ATI or NVIDIA in your machine.
SiliconDoc - Wednesday, March 18, 2009
Oh, now stop that silliness! Everyone worthy knows only ATI drops prices and causes the evil green beast to careen from another fatal blow. (The evil beast has more than one life, of course - the death blow has been delivered by the sainted ATI many times; there's even a shrine erected as proof.) Besides, increasing memory, creating a better core rollout, redoing the PCB for better efficiency and pricing, THAT ALL SUCKS - because the evil green beast sucks, ok?
Now follow the pack over the edge of the cliff into total and permanent darkness, please. You know when it's dark, red looks black, yes, isn't that cool? Ha! ATI wins again! /sarc
Hrel - Wednesday, March 4, 2009
I can't wait to read your articles on the new mobile GPUs, and I'm REALLY looking forward to a comparison between 1GB 4850 and GTS 250 cards, as well as a comparison between the new design for the GTS 250 512MB and the HD 4850 512MB.
It seems to me, if Nvidia wanted to do right by their customers, they'd just scrap the 1GB GTS 250 and offer the GTX 260 Core 216 at the $150 price point; it has a little less RAM, so there's a little savings for them there. But then, that's if they wanted to do the right thing for their customers.
It's about time they introduced some new mobile GPUs; I hope power consumption and price go down as performance goes up!
I look forward to AMD releasing a new GPU architecture that uses significantly less power, like the GT200 series cards do. 40nm should help with that a bit though.
Finally, a small rant: when you think about it, we really haven't seen a new GPU architecture from Nvidia since the G80. I mean, the G90 and G92 are just derivatives of it, and on their own they only offer marginally better performance; if you disregard the smaller manufacturing process, the release prices should even have been similar. Then even the GT200 series cards, while making great gains in power efficiency, are still based on the G92 and STILL only offer marginally better performance than the G92 parts; worse, they cost a lot to make, so they're overpriced for what they offer in performance. I sincerely hope that by the end of this year there has been an official press release, and at least review samples sent out, of completely new architectures from both AMD and Nvidia. Of course it'd be even better if those parts were released to market some time around November. Those are my thoughts anyway; congrats to you if you actually read through all of this :)
SiliconDoc - Wednesday, March 18, 2009
" It seems to me, if Nvidia wanted to do right by their customers, that they'd just scrap the 1GB GTS250 and offer the GTX260 Core216 at the $150 price point, it has a little less RAM so there's a little savings for them there. But then, that's if they wanted to do the right thing for their customers. "_________________
So, they should just price their cards the way you want them to, with their stock in the tank, to satisfy your need to destroy them?
Have fun; it would be the LAST Nvidia card you could ever purchase. "The right thing for you" - WHATEVER YOU WANT.
Man, it's just amazing.
Get on the governing board and protect the shareholders with your scheme, would you, fella?
Hrel - Saturday, March 21, 2009
Hey, I know they can't do that. But that's their fault too; they made the GT200 die TOO BIG. I'm just saying, in order for them to compete well in the marketplace, that's what they'd have to do. I DO want them to still make a profit, 'cause I wanna keep buying their GPUs. It's just that compared to the next card down, that's what the GTX 260 is worth, 'cause it's just BARELY faster; maybe $160. But that's their fault too. The GT200 die is probably the WORST Nvidia GPU die EVER made, from a business AND performance standpoint.
SiliconDoc - Saturday, March 21, 2009
PS - you do know you're insane, don't you? The "GT200 is probably the worst die from a performance standpoint." Yes, you're a red loon rooster freak wacko.
Hrel - Thursday, April 9, 2009
You left out the business standpoint, so I guess you at least concede that the GT200 die is bad for business.
SiliconDoc - Saturday, March 21, 2009
Now you claim you know, and now you ADMIT there is no place for it if they did, anyhow. Imagine that, but "you know" - even after all your BLABBERING to the contrary.
Now, be aware - Derek has already stated that 40nm is coming, with the GT200 shrunk and INSERTED into the lower bracket.
Maybe he was shooting off his mouth? I'm sure "you know" -
(Like heck I am.)
Six months or more from now, with 40nm, it will be a different picture.
Hrel - Wednesday, April 1, 2009
Seriously, what are you talking about? Pretty sure I'm gonna just ignore you from now on; pretty certain you are medically insane!
I'd respond to what you said, but I honestly have no idea what you were TRYING to say.
SiliconDoc - Wednesday, April 8, 2009
You don't need to respond, friend. You blabber out idiocies of your twisted opinion that no one in their right mind could agree with, so it's clear you wouldn't know what anyone else is talking about.
You whine that Nvidia made the GT200 core too big, which is merely your stupid opinion.
The G92 core (DDR3) with DDR5 would match the 4870 (DDR5), which is a 4850 (DDR3) core.
So Nvidia ALREADY HAS a 4850 killer, already has EVERYTHING the ATI team has in that region - AND MORE, BECAUSE OF THE ENDLESS "REBRANDING".
But you're just too screwed up to notice it. You want a GT200 that is PATHETIC like the 4830 - a hacked down top core. Well, only ATI can do that, because only their core SUCKS THAT BADLY without DDR5.
Nvidia ALREADY HAS DDR3 ON IT.
SHOULD THEY GO TO DDR2 TO MOVE THEIR GT200 CORE DOWN TO YOUR DESIRED LEVEL?
Now, you probably cannot understand ALL of that either, and being stupid enough to miss it, or so emotionally petrified, isn't MY problem, it's YOURS - and by the way, it CERTAINLY is not Nvidia's. They are way ahead of your tinny, sourpussed whine, with JUST SOME VERY BASIC ELEMENTARY FACTS THAT SHOULD BE CLEAR TO A SIXTH GRADER.
Good lord.
The GT200 chips already have just DDR3 on them, Mr. Fuddy Duddy; they CANNOT cut 'em down off DDR5 to make them as crappy as the 4850 or 4830, which BTW is matched by the two-year-old revised G80 core - right, Mr. Rebrand?
Wow.
Whine whine whine whine.
I bet Nvidia people look at that crap and wonder how STUPID you people are. How can you be so stupid? How is it even possible? Do the red roosters completely brainwash you?
I know, you don't understand a word; I have to spell it out explicitly - just the very simple, basic, drooling-idiot facts need to be spelled out. Amazing.