
Intel is preparing to annihilate NVIDIA: the war is on

by Tarinder Sandhu on 11 April 2008, 15:05

Tags: NVIDIA (NASDAQ:NVDA)

Quick Link: HEXUS.net/qamoe


Shouldn't they just be friends?


From what we know of the inside track, NVIDIA's relationship with Intel has, at times, been fair to good - born of mutual respect between two companies that have consistently pushed the performance envelope through innovation and timely execution.

Intel's been around for 40 years and has established itself as the foremost semiconductor manufacturer on the planet, with its fingers in many associated, and profitable, pies. NVIDIA's the relatively young upstart, a 15-year-old whiz that's inexorably grown into a multi-billion-dollar company.

Prima facie, the companies' products appear to complement each other. Intel produces quality processors and chipsets. NVIDIA, too, provides chipsets but also the discrete GPUs that power the mainstay of the mid-to-high-end PC ecosystem. Crucially, and here's where the beef begins, Intel doesn't currently engineer discrete GPUs and NVIDIA doesn't play in the consumer/enterprise x86 CPU market. Live and let live, shall we say.

Now, there is some further commonality between the two. Both companies engineer onboard IGPs (integrated graphics processors) for their chipsets. Intel has traditionally oriented its IGPs towards the volume business market, where basic 2D display output is key. NVIDIA, like ATI (now AMD), has focussed on adding a greater feature-set and pandering to the whims of the occasional gamer, through discrete-derived technology. Again, live and let live.

The squeeze is on

Intel now wants a large portion of that lovely, profitable discrete GPU pie that NVIDIA and ATI have been devouring, year after year. The realisation of that goal goes by the name of Larrabee. Rather than imitate the established raster-based architectures put forward by NVIDIA and ATI, Intel wants to rewrite the rule book.

Larrabee, due to be launched late next year, is a many-core architecture built from (loosely) x86 cores with a wide vector unit bolted on, aimed at GPGPU-style work. What that means is that a completely different SDK (software development kit) is required to get the best from it, yet it will still retain compatibility with DirectX and OpenGL.
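To make the idea concrete, here's a minimal C sketch of that vectorised-x86 style. Intel hasn't published Larrabee's actual instruction set or SDK, so today's standard SSE intrinsics serve purely as a stand-in, and the four-pixel shading calculation is a toy of our own devising. The point is simply that one instruction operates on several pixels at once on an ordinary x86 core, with no fixed-function raster hardware involved.

/* Toy illustration, not Larrabee code: Intel's vector ISA and SDK are
   unpublished, so standard SSE intrinsics stand in for the general idea
   of doing graphics work as data-parallel x86 code. */
#include <stdio.h>
#include <xmmintrin.h>  /* SSE: 128-bit registers holding four floats */

int main(void)
{
    /* Base colour and light intensity for four pixels, packed into
       one vector register apiece. */
    __m128 base  = _mm_set_ps(0.9f, 0.7f, 0.5f, 0.3f);
    __m128 light = _mm_set_ps(0.2f, 0.4f, 0.6f, 0.8f);

    /* A single multiply shades all four pixels at once - exactly the
       data-parallel pattern a wide vector unit rewards. */
    __m128 shaded = _mm_mul_ps(base, light);

    float out[4];
    _mm_storeu_ps(out, shaded);  /* store the vector back to memory */
    printf("shaded: %.2f %.2f %.2f %.2f\n", out[0], out[1], out[2], out[3]);
    return 0;
}

Scale that pattern across dozens of cores and a much wider vector unit, and you have, in rough outline, what Intel appears to be attempting.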

NVIDIA seems annoyed for various reasons. The very fact that Larrabee will work best with an Intel-developed SDK means that, with the help of a mighty chequebook - and Intel's is mighty indeed - games developers may, and probably will, be swayed into optimising code for the new GPUs, at NVIDIA's and ATI's collective expense.

NVIDIA, too, knows that if Intel is successful, the company will own the crucial CPU-chipset-graphics chain in every pricing sector, just like AMD, but with the financial resources to really make it pay.

Further perceived annoyance stems from Intel's declaration that it will use the AIB (add-in board) partner model, potentially luring partners such as XFX, eVGA, and Gainward away from being NVIDIA-only houses. This is where, we reckon, NVIDIA will really feel it, on the bottom line.

Then, conjecturing a little further down the line, Intel could squeeze NVIDIA at the top end of the market while AMD does the same at the low end, significantly shrinking NVIDIA's footprint in the discrete sector.

But conjecture really is just that. NVIDIA's chief scientist, David Kirk, has publicly cast some doubt on the future of Intel's raytracing-based adventure, rightly so in our opinion, because starting a cutting-edge GPU from scratch isn't like shelling peas.

Ultimately, given the appropriate funding and dollops of persistence, it could be a question of who will blink first, NVIDIA or Intel, and that's a match-up that'll be fun to watch.