
Review: NVIDIA's GeForce 6200 Launch

by Ryszard Sommefeldt on 11 October 2004, 00:00






At the end of my recent preview of NVIDIA's GeForce 6600 GT, powered by their NV43 GPU, I remarked, "Now there's only one piece of the pie left", in reference to NVIDIA's claim that they'd offer top-to-bottom GeForce 6-series models before the end of 2004, based on the same technology that debuted this April with NV40 and 6800 Ultra.

With the mid-range 6600 and high-end 6800 range of hardware covered, the bit that's left is 6200, which makes its initial mark today. With Euro-zone review samples held up at customs for the majority of last week and only just released to NVIDIA in Europe today, from what we can make out, it's up to our American and Canadian counterparts to provide you with reviews. However, I can still go over the basics with you before our own sample arrives in due course.

PCI Express once again

The push to get everyone using PCI Express graphics cards on the desktop marches on at full speed. 6200 assists in that endeavour by being PCI Express only, and it will stay that way for the foreseeable future. NVIDIA have no concrete plans for an AGP-bridged version, especially since that would require a new GPU package and added cost, something that's not attractive at the 6200's price point.

NV43's basic building blocks

Like GeForce 6600, 6200s use NV43 GPUs. In 6600 that's eight fragment pipelines, three vertex shaders and four ROPs fed by a fragment crossbar. That means up to eight pixel fragments output per clock, into four output units, with three of NV40's six vertex shaders. In 6200 everything's the same as far as basic architecture is concerned, except that four of the eight fragment pipelines are disabled, presumably after failing QA testing at the fab.

Four pipes, four ROPs, three vertex units. Easy peasy.

But there's more

But on top of those basic building blocks, 6200 has other differentiating features that mark it out as more than just a four-pipe 6600. 6200 has no support for NVIDIA's SLI, meaning you won't be able to combine a pair of 6200s in a dual PCI Express system for more graphics power. It also has none of NV43's colour or Z-buffer optimisations enabled: no colour compression on pixel fragments, no Z-buffer compression, no fast Z-clear. You could argue that at this end of the market, where the GPU is less capable and clocked slower than its mid-range counterparts, the colour and Z optimisations under NVIDIA's Intellisample technology umbrella are needed more to aid performance than they are in the higher-performing parts.

6200 also has no support for 64-bit texture formats, meaning it can't do 64-bit texture blending, combining or rendering of any kind where a 64-bit precision surface is required, integer or otherwise. That has knock-on effects for certain kinds of rendering techniques and means 6200 is even less capable than "a four-pipe 6600" would suggest.

It's not all bad

NVIDIA's UltraShadow II technology is complete and working in 6200, offering rendering speedups for graphics techniques that make heavy use of stencil shadows. You've also got full support for DirectX Shader Model 3.0, NVIDIA's newest 4X RGMS multi-sampling anti-aliasing, 16X angle-adaptive anisotropic filtering and the rest of what defines NV4x GPUs.


NVIDIA's price points for 6200 boards sit at $129 for a 128MB version (all 6200 variants will have a 128-bit memory bus initially, with AIBs able to implement even lower cost 64-bit versions if they choose, at a later date) and $149 for a 256MB version. NVIDIA in the UK fully expect that to translate to < £100 across the board, unless AIBs decide to go nuts on the bundle and charge you a bit more.

To sum up

Think 6600 and NV43. Think the same 0.11µ process production at TSMC. Think half the pixel power at the same clocks, without some optimisations before pixels are generated (Z opts) and after they're written to their render buffers (colour opts), no SLI, but complete SM3.0. If that makes sense, you're most of the way there.

With NVIDIA's basic reference clocks for 6200 at 300MHz core and 275MHz (550MHz DDR) memory, it's an X300/X600 competitor, and I'd expect 6600 GT and ATI's X700 XT to be at least twice as fast in most situations. That pitches 6200 as the step up from integrated graphics that most people look at when they're considering low-cost systems. It'll be interesting to see how that works out.
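For the curious, here's a quick back-of-the-envelope sketch of what those reference clocks translate to in theoretical peak numbers, assuming the four ROPs and 128-bit bus described above. These are peak on-paper rates, not benchmark results:

```python
# Theoretical peaks for the reference 6200, from the figures quoted above.
CORE_MHZ = 300            # reference core clock
MEM_EFFECTIVE_MHZ = 550   # 275MHz memory, double data rate
BUS_WIDTH_BITS = 128      # initial boards; 64-bit variants may come later
ROPS = 4                  # pixels written per clock

# Peak pixel fillrate: ROPs multiplied by core clock
fillrate_mpixels = ROPS * CORE_MHZ
print(f"Peak fillrate: {fillrate_mpixels} Mpixels/s")   # 1200 Mpixels/s

# Peak memory bandwidth: bus width in bytes times effective memory clock
bandwidth_gbs = (BUS_WIDTH_BITS / 8) * MEM_EFFECTIVE_MHZ / 1000
print(f"Peak bandwidth: {bandwidth_gbs:.1f} GB/s")      # 8.8 GB/s
```

Those 1200 Mpixels/s and 8.8 GB/s peaks put it squarely in X300/X600 territory on paper, which matches where the performance lands in practice.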

Pretty pictures

While we didn't get actual boards for today, we did get pretty pictures. So here's some geek porn to satisfy that low-end graphics lust. Kleenex at the ready, budget fans!



It's a single slot board using a small PCB and the cooler many of you might remember from some GeForce3 boards back in the day. Equipped with S-Video input and output, DVI-I and VGA, the pictured reference board doesn't need any external power source and should run very cool and quiet, to steal a phrase from AMD.

We'll have our own review up as soon as possible. In the meantime, The Tech Report were kind enough to let me preview their article, full of performance numbers, an hour or so before you'll get to read this. It's great stuff as always from those guys, so check it out here.

Their reference board has 128MB of memory clocked at 500MHz DDR, with the core at 300MHz, and Geoff pits it against GMA900 and a couple of PCI Express Radeons from the X300 and X600 camps. Performance sits just shy of X600 Pro in most cases. Be sure and read their review for full details.