
Review: NVIDIA Cg

by Ryszard Sommefeldt on 13 June 2002, 00:00

Tags: NVIDIA (NASDAQ:NVDA)

Quick Link: HEXUS.net/qal3


Page 3



What does Cg mean for the industry and the consumer?

I had a nice chat with NVIDIA, specifically with Dave Kirk, NVIDIA's Chief Scientist and one of the people at NVIDIA responsible for Cg. We covered Cg as far as their presentation laid things out, and David Ross and I had a chance to talk to Dave about Cg specifics, NVIDIA's plans for Cg and what it means for the industry and the consumer (you!).

On the surface it looks like Cg is nice for NVIDIA hardware and you'll have to have an NVIDIA card to make use of apps and games that use Cg-generated effects, but Dave Kirk was very keen to point out that this isn't the case, which is very good news.

Since Cg outputs DX8 and OpenGL 1.3 compliant shader programs (and soon, although we can't tell you when just yet, DirectX 9.0 and OpenGL 1.4), it's feasible that other compliant DX8/DX9 and OpenGL 1.3/1.4 hardware should be able to run Cg output too, right? Correct! Any compliant shader hardware on any of Cg's target platforms (Windows, Linux, Mac OS X, Xbox and so on), provided it implements the API specifications correctly, will be able to make use of Cg output.
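To give you a flavour of what the compiler is actually working with, here's an illustrative Cg vertex program (a made-up example, not taken from NVIDIA's materials) that transforms a vertex into clip space and passes its colour through. The point is that this one piece of source can be compiled down to a DirectX 8 vertex shader or an OpenGL vertex program, whichever target the hardware speaks:

```cg
// Illustrative Cg vertex program: transform the vertex into clip space
// and pass the vertex colour straight through to the rasteriser.
struct VertIn
{
    float4 position : POSITION;
    float4 colour   : COLOR0;
};

struct VertOut
{
    float4 position : POSITION;
    float4 colour   : COLOR0;
};

VertOut main(VertIn IN, uniform float4x4 modelViewProj)
{
    VertOut OUT;
    OUT.position = mul(modelViewProj, IN.position); // clip-space transform
    OUT.colour   = IN.colour;                       // pass-through colour
    return OUT;
}
```

The compiler takes high-level source like this and emits the low-level shader instructions for whichever target profile is requested, which is exactly why any compliant DX8 or OpenGL 1.3 part should be able to run the result.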

So Radeon 8500 users, future ATi R300 users and any user of compatible, standards-compliant hardware will be able to use Cg output.

But what if the R8500, or whatever else, doesn't quite output correct visuals when using Cg as provided by NVIDIA? This is where the open source nature of Cg comes in. That's right, NVIDIA is committed to Cg being an open standard. They argue, quite correctly, that they can't expect people to adopt Cg right across the industry unless it has broad, pervasive support. So they are committed to making the spec, the compiler and other key parts of what makes up Cg open, in terms of both information and source code.

So let's take a look at the hardware side of things first, as far as Cg is concerned.

The runtime compiler takes care of any microarchitecture-specific nuances. So if NVIDIA's reference Cg implementation for NVIDIA GPUs doesn't quite output correct visuals on, say, ATi's R300 then, due to the open nature of Cg, ATi are quite free to write their own fixes for their hardware inside the Cg compiler and ship their version of Cg to consumers of their boards.

This means that Cg output runs correctly on all hardware, and GPU makers can make any fixes inside Cg directly, right in the compiler. NVIDIA argue that their implementation is the most standards-compliant; after all, they did help write the DirectX 8 shader specifications, and Cg is their baby (with some help from Microsoft regarding DX9 compliance), so provided everyone else's hardware does the same things, Cg should run fine on other hardware.

But the facility is there for other GPU vendors to tweak Cg to suit, if needed. A good thing as far as you are concerned, as you know your hardware will be supported.

What about on the software side of things? How will software support Cg?

With the interface between Cg and the render APIs taken care of, what else needs to be done? Well, of course, 3D applications need to use Cg to write their shader programs; writing shader programs the old way bypasses Cg, so what would be the point? To this end, NVIDIA have been silently recruiting developers over the past three or four months, giving them access to Cg and to NVIDIA's developer relations team, and working with them to see what they think.

According to NVIDIA, they've had nearly a 100% thumbs-up rate from games developers with regard to Cg. That means you'll see massive industry-wide support from game developers and lots of Cg usage.

NVIDIA have also been recruiting tool developers to see if Cg support is something they'd like to add to their tools, and again the answer was a resounding yes. SoftImage|XSI, Maya, 3D Studio Max, LightWave and other tools will all support Cg, either via plug-ins right now or natively in the next major iteration of the products. Dave Kirk made a big thing about getting the tool developers on board, and you can expect an announcement from SoftImage at the next SIGGRAPH conference about native support for Cg output in the next major release of their tools.

Cg output from these tools can then be used in 3D applications alongside the data those tools export. Say a game developer uses Maya to model and animate the in-game models. They can then use the Cg output facilities in Maya to describe the shader effects needed, and the engine developer can take that exact same Cg output from Maya, use it inside the game and see the same effects.
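As a sketch of that pipeline, an artist-authored surface effect might leave the tool as a small Cg fragment program like this hypothetical example, which the engine then compiles and binds without anyone rewriting it:

```cg
// Hypothetical artist-authored Cg fragment program: modulate a decal
// texture by the interpolated vertex colour. The engine can compile
// this exact source for its own target, no hand translation needed.
float4 main(float4 colour   : COLOR0,
            float2 texCoord : TEXCOORD0,
            uniform sampler2D decal) : COLOR
{
    return colour * tex2D(decal, texCoord);
}
```

Because both the tool and the engine speak the same high-level language, what the artist sees in the viewport and what the player sees in-game come from one and the same shader source.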

This cuts down the time needed to develop game effects by a huge factor.

So, safe in the knowledge that NVIDIA is recruiting software developers, and is committed to Cg being a pervasive open standard that works regardless of the GPU on your graphics card, let's take a look at why Cg will help reduce the lag between features becoming available on GPUs and you seeing them in games.