
Review: NVIDIA (eVGA) GeForce GTS 250 1GB: much ado about nothing?

by Tarinder Sandhu on 3 March 2009, 08:00

Tags: GeForce GTS 250 Superclocked 1GB, EVGA, NVIDIA (NASDAQ:NVDA), PC

Quick Link: HEXUS.net/qarad


Introduction


NVIDIA chose the start of 2009 as an opportune time to release a couple of high-end graphics cards, GeForce GTX 295 and GeForce GTX 285, e-tailing at £400 and £300, respectively, whose introduction solidified NVIDIA's position as a provider of cutting-edge graphics. Whilst performance was undeniably good, especially from the twin-GPU GTX 295, the asking price deterred many.

The battle for volume sales clearly happens far lower in the pricing spectrum, and both NVIDIA and arch-rival ATI need compelling discrete cards from, say, £30 to £150. ATI showed its performance hand last year with a mid-range upgrade in the form of the Radeon HD 4800-series, whilst NVIDIA has been content to extend the longevity of the GeForce 9800 class of GPUs, which, in turn, are based on GeForce 8800 cards.

Long story short, NVIDIA currently has three options for increasing the attractiveness of its mid-range wares, priced at between £100 and £150. Firstly, it can build an entire programming and application ecosystem around modern GPUs that makes them more than just pixel-crunchers. It has done this, reasonably successfully, with the nascent CUDA-driven and Graphics Plus initiatives, beating ATI to the punch. Secondly, it can reduce the price at which it sells various GPUs to its partners, and the GeForce 9800 GTX+ is currently available from around £115. Lastly, to bring greater performance gains, the company can leverage the existing high-end architecture found in the GTX 285 (GT200), cheapen it through architecture-cutting manoeuvres, and release a £125 card that traces its lineage directly back to GT200.

Today, NVIDIA introduces the GeForce GTS 250 GPU, designed to be the mid-range standard-bearer for much of 2009. Read on to find out which option NVIDIA chose, exactly what the card is, and how it stacks up against the competition.