PhysX As Standard on DELL XPS 1730

With Warmonger and UT3 to come, Ageia's confidence is at a new high following Dell's decision to include a PhysX daughter card as standard on its brand-new, top-of-the-line gaming notebook, the XPS 1730. Dan Forster from Ageia explains...

HEXUS Forums :: 5 Comments

The next few months will be very busy - no doubt

With Intel taking over Havok, it will be interesting to see if there is a movement from nVidia and AMD/ATI toward non-Havok engines like Ageia's.

The option Dan showed us is smaller, neater and less power-hungry than previous generations.

There is also the subject of drivers - I was told that the new driver is much better than any of the previous generations, but that remains to be seen.

Another factor in the PPU-or-not-PPU question will be the take-up of PhysX within the UT3 engines that are going to be licensed over the next 3-36 months.

Overall, a fascinating battle around a technology whose goal is to improve realism/immersion WITHOUT involving the classic CPU/GPU combination.

Will it end with Ageia :wallbash: or :rockon: ?
Did we not already have an “add-on” solution years ago with the 3dfx add-on cards? Will this not end the same way, with GPUs in, say, 18 months' time doing what this card does today?
Hi Trig,

Just started writing a reply - and it ended up being too long-winded!

I'll check my facts with a colleague and look at posting the reply as a mini-feature.

The GPU physics idea seems to be petering out. Havok doesn't even mention its FX version on its website.

It would be nice if Ageia were able to put their product on an ExpressCard, though looking at that daughter card it seems they'd have some slimming down to do. Not everyone has $3,500 for a semi-luggable.
…GPU physics idea seems to be petering out…
The gist of the reply I started to write (but gave up on at the end of page 2!) centres on exactly that point: if no middleware firm is creating physics specifically targeted at the GPU, then who will be enabling that parallel processing power?
For a game developer to do it would (a) cost an arm and a leg, and (b) mean a huge bias in methodology toward nVidia or ATI, depending on which set of shaders they target.