
Scale this

by Jon Peddie on 18 April 2006, 09:24

Quick Link: HEXUS.net/qafgo


The law is the law

And here you see Amdahl’s law kick in, acting like Catch-22. Amdahl’s law says the speedup is limited by how much of the work the algorithm can actually parallelize, not by the number of processors, so you eventually reach a place where adding processors doesn’t buy you anything: the point of diminishing returns.
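To put rough numbers on that point, here is a minimal sketch of Amdahl’s law in Python; the parallel fractions used are illustrative assumptions for the sake of the example, not measured GPU or game workloads.

    # Minimal sketch of Amdahl's law: speedup = 1 / ((1 - p) + p / n),
    # where p is the fraction of the work that can be parallelised
    # across GPUs and n is the number of GPUs.
    def amdahl_speedup(p: float, n: int) -> float:
        return 1.0 / ((1.0 - p) + p / n)

    # Illustrative (assumed) parallel fractions, not measured workloads.
    for p in (0.70, 0.90, 0.95):
        scaling = [round(amdahl_speedup(p, n), 2) for n in (1, 2, 3, 4)]
        print(f"parallel fraction {p:.0%}: speedup for 1-4 GPUs = {scaling}")

    # Even at 95% parallel work, four GPUs top out around 3.5x, and at
    # 70% each GPU past the second buys noticeably less than the last.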

Although not explicit in Amdahl’s law, trying to scale to four (or more) GPUs or AIBs in a PC is compounded by associated factors like basic Cost (can anyone other than Bill Gates afford to buy such a thing), Cost/Benefit (assuming normal humans can afford it, do they get 4X the benefit for the purchase), Cost-per-Watt (can anyone besides a nuclear reactor power such a thing), and Cost-per-Decibel (creating a wind tunnel to cool the system).

All but one of the cost factors can be rationalized and justified (under the heading, I don’t give a damn, I just want it), and that’s Cost/Benefit. Here is the Achilles’ heel of SLI >2. Compromises in packaging and power eat into the performance, the struggle to get the drivers right eats into it further, and then finding a problem to solve that justifies the cost eats into the experience.

To rationalize the last point, when queried about the benefit of Quad SLI, Nvidia says, go big, get a 2560 x 1600 monitor (or two) and run your games with everything maxed out. Now I have no problem with that concept, remember, in computer graphics too much is not enough (Jon Peddie 1981), so gimme that Quad and get outta my way. Oh, but there’s more to the problem.

Quad is big

Quad is big - Source JPR


As of now, I can’t build one. Well, I can build one, and did (see photo), but Nvidia and its partners don’t really want me to build one. Not because they’re mean or stingy, but because it’s not child’s play. As you can see from my Frankenstein system (so aptly named by Derek Perez at Nvidia), kludging together multiple power supplies and getting those long AIBs to fit is challenging. And you have to have really rock-solid drivers, both chipset and GPU. And then you’ve got to use games that will support the really big monitors, and as of now you have two sources for those monitors, Apple and Dell.

Franky

Jon’s Frankenstein, kids don’t do this at home - Source JPR


And although I could get my monster to walk and talk, I couldn’t get him to sing and dance. Nvidia points out that the Quad was not designed to be used in the way I was trying to use it, nor was I working with the most up-to-date drivers. So I packed it all away and went back to Ghost Recon. (BTW, if you haven’t seen Logitech’s G15 gamer’s keyboard, I heartily recommend you check it out.)

So Quad is going to be for the next generation of big games on big displays driven by big boards that cost big bucks.

So where’s the beef?

Well, to find that out, we have to wait a while. The powers that be have said they’ll loan me a fully assembled Quad system with up-to-date drivers, so I can see for myself the benefits of $2,000 worth of AIBs. In the meantime, I remain politely skeptical. But lest I pass judgment without the benefit of facts and/or experience (empirical research), we must remember that the folks promoting this technology, Alienware, Dell, Falcon, and Nvidia, are not fools, nor are they looking for someplace to waste money. So they must see a benefit and a need, or else they wouldn’t plow so much time and money into it. I just hope the emperor really has clothes on.



HEXUS Forums :: 1 Comment

It's certainly a good bet that going from 2 to 3 GPUs will provide less of a benefit than going from 1 to 2, and quad SLI has formidable issues with power consumption and cooling to deal with. That “dual-layer card” is also going to be a challenge for anyone wanting to use water-cooling or phase-change on their GPUs.

However, there is one case that would seem to be a good argument for multiple GPUs - surround gaming using 3 monitors and Matrox's TripleHead2Go. Having a GPU for each monitor would seem an excellent way of splitting the workload and providing a better gaming experience. I just wonder why ATI/Nvidia never bothered following this up.