
Review: NVIDIA Cg

by Ryszard Sommefeldt on 13 June 2002, 00:00

Tags: NVIDIA (NASDAQ:NVDA)




Games!

One of the biggest concerns these days, for you and me alike, is GPUs gaining features that never get fully taken advantage of. We pay for the development of new features when we buy our cards, then pay again when we upgrade to the card that supports them, so it's not unreasonable to expect those features to actually be used!

Dave Kirk was keen to point out that, because of Cg, he thinks the lag between features being introduced on GPUs and games actually supporting them should drop to virtually nothing.

Features should be supported almost instantaneously using Cg, and here's how he and NVIDIA think it will happen. Writing shaders in Cg that are too complex for today's hardware is quite easy, so there's nothing stopping developers writing those shader programs now, shipping them in their applications, and having them enabled and optimised by a new version of the Cg runtime compiler when hardware arrives that can run them at a good speed.
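To picture how that might look in practice, here's a minimal sketch of the idea: host code that compiles shipped Cg source at run time, asking the runtime for the best profile the installed hardware supports rather than hard-coding one. It's hedged; the entry points used (cgCreateContext, cgGLGetLatestProfile, cgCreateProgram) are from the Cg Toolkit runtime as it later shipped, and the beta covered here may differ.

    /*
     * Hedged sketch: compiling shipped Cg source at run time, so the
     * runtime can target whatever profile the installed GPU supports.
     * Assumes an OpenGL context has already been created; entry point
     * names may differ slightly in the beta discussed in this article.
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <Cg/cg.h>
    #include <Cg/cgGL.h>

    /* The Cg source a game would ship with, however complex. */
    static const char *shaderSource =
        "float4 main(uniform float4 colour) : COLOR\n"
        "{\n"
        "    return colour;\n"
        "}\n";

    int main(void)
    {
        CGcontext context = cgCreateContext();

        /* Ask the runtime for the best fragment profile the current
           hardware and driver can manage, instead of hard-coding one. */
        CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);

        /* Compile the shipped source against that profile; a future
           runtime on future hardware simply picks a better profile. */
        CGprogram program = cgCreateProgram(context, CG_SOURCE,
                                            shaderSource, profile,
                                            "main", NULL);
        if (program == NULL) {
            fprintf(stderr, "Cg compile failed: %s\n",
                    cgGetErrorString(cgGetError()));
            return EXIT_FAILURE;
        }

        cgGLLoadProgram(program); /* hand the result to the driver */
        /* ... bind the program and render as normal ... */

        cgDestroyContext(context);
        return EXIT_SUCCESS;
    }

The key design point is that profile query: because the application never names a shader model, a newer runtime on newer hardware can quietly substitute a more capable profile, and the more complex shaders light up without the application changing at all.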

So games should start to look better over time, picking up features as the hardware gains them, rather than with the huge lag we see today.

Combine that with the big reason for Cg in the first place, letting developers achieve high quality effects without needing to be a Jedi Shader Master (thanks for that quote, Dave!), and Cg should be a big hit for gamers.
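To give a flavour of why no Jedi training is needed, here's a hypothetical per-pixel diffuse lighting shader written in Cg. The parameter names and semantics are our own illustration, not NVIDIA's, but the C-like style is the whole point.

    // Hypothetical per-pixel diffuse lighting in Cg: a few lines of
    // C-style code rather than hand-scheduled GPU assembly.
    float4 main(float3 normal   : TEXCOORD0,
                float3 lightDir : TEXCOORD1,
                uniform float4 diffuseColour) : COLOR
    {
        // Standard Lambertian term: N.L, clamped at zero.
        float intensity = max(dot(normalize(normal), normalize(lightDir)), 0);
        return diffuseColour * intensity;
    }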

Overall

Overall, while this article, and I imagine other Cg articles too, will sound like an NVIDIA advert, the technology as laid out to me by Dave really does sound like a boost for everyone. Games developers are behind it wholeheartedly, tool developers will add support to their tools very soon if a plugin doesn't exist already, and it will be supported by any DX8, DX9, OpenGL 1.3 or OpenGL 1.4 capable hardware with a compliant driver. And if things aren't quite right, Cg being an open standard means GPU vendors can tweak the compiler to suit their own GPUs.

It wouldn't surprise us to see ATi announce a Cg compiler for their GPUs soon, along with Matrox, 3DLabs and anyone else producing GPUs in the future.

According to Kurt Akeley, one of the founding fathers of SGI and OpenGL, Cg is "The biggest revolution in graphics in 10 years, and the foundation for the next 10". What bigger endorsement for a technology do you need? And with Microsoft behind it 100%, helping get it off the ground and supporting it in their tools, it should be a resounding win for the industry and the consumer.

It was a pleasure to talk to Dave Kirk about Cg, and many thanks to the guys at NVIDIA and Bastion here in the UK for arranging for us to talk to him about Cg for this article, future GPUs and other things ;)

Here's hoping you, the consumer, think it's as cool as NVIDIA do; let us know by dropping into our forums! You can also drop by developer.nvidia.com and Cgshaders.org for more information, and even download the beta release and take a look yourself!

Again, thanks to Dave Kirk, NVIDIA UK and Bastion for hooking us up!