
View Full Version : support PhysicsX and/or Havok?


Flyby
06-17-2008, 11:18 PM
New video cards are coming out from ATi and Nvidia. Both have physics support on the cards themselves (I think). Will SoW_BoB support this technology, or will the CPU have to handle the whole wad? BTW, will the sim support quad cores right out of the box, or will a simple dual core processor (like the E8400) be sufficient to handle those clouds?
thanks,
Flyby

IceFire
06-18-2008, 03:32 AM
Definitely a worthwhile question, Flyby, but I think the answer is going to be complicated.

PhysX hasn't made a lot of waves since its initial outing. Yes, accelerated and dedicated physics hardware is nice, but there really aren't any mature APIs for it like there are for 3D graphics. Once upon a time the situation was the same for 3D accelerated graphics.

It'll probably take a DirectX component for physics acceleration, and some agreed-upon standards, before it really hits the mainstream. With nVidia and ATI/AMD working on competing but similar products, no doubt some standardization will come into effect sooner or later.

mondo
06-18-2008, 09:23 AM
I've played most of the games that support PhysX, with and without a PhysX card, and there is little noticeable difference other than some added visuals like bits of debris.

I bet that if they do put in support for any third-party hardware physics engine (bear in mind Oleg and Co have created their own physics engine for IL2 and have been using facets of it in the updated flight engine for some time now), it won't be core to the game, so I wouldn't worry. Given that users who don't have PhysX hardware will make up the majority of those who play the game, I'd expect the load to be taken on the CPU, as the current IL2 engine does.

Flyby
06-18-2008, 11:41 AM
thanks for the replies, guys. Good food for thought. I seem to recall reading somewhere that implementation might be through DirectX, as you said, Ice. Hey, maybe that means there will be a choice, and a price break on models without the...nah! Never happen.
Flyby out

Codex
06-18-2008, 11:50 AM
nVidia bought the PhysX API from Ageia and have now incorporated it into their drivers, which means all 8000, 9000 and the new 200 series cards can run the PhysX API via a GPU shader. Smart move really; this means about 70 million nVidia cards can run Ageia PhysX in hardware mode.

Don't know if Oleg will use the API in SoW, but I've always believed this technology can be employed for things far more sophisticated than just eye candy; I think it could actually enhance damage and flight modelling in the sim.

Also Google CUDA ... very interesting reading.

proton45
06-18-2008, 12:31 PM
nVidia bought the PhysX API from Ageia and have now incorporated it into their drivers, which means all 8000, 9000 and the new 200 series cards can run the PhysX API via a GPU shader. Smart move really; this means about 70 million nVidia cards can run Ageia PhysX in hardware mode.

Don't know if Oleg will use the API in SoW, but I've always believed this technology can be employed for things far more sophisticated than just eye candy; I think it could actually enhance damage and flight modelling in the sim.

Also Google CUDA ... very interesting reading.

So the physics(X) is calculated on the GPU instead of the CPU??

Is the 8000 series a dual-chip graphics card?

That seems kind of weird to me (but if it works, it works)... my graphics card works pretty hard calculating the eye candy already; I can't imagine what my FPS would be if the GPU were also calculating a bunch of bricks flying about and their shadows & reflections in a rolling mist...

It seems to me that you really need two chips (one for graphics, one for physics) to see any true improvement in performance and visuals...

Interesting...

mondo
06-18-2008, 01:00 PM
No, not two chips, more than one core on the CPU ;) Look at the tech demos for "Alan Wake". They explain everything you'll need to know about how physics in games is progressing.



Don't know if Oleg will use the API in SoW, but I've always believed this technology can be employed for things far more sophisticated than just eye candy; I think it could actually enhance damage and flight modelling in the sim.

You won't see a game engine that is a two-tier system where more complex calculations are made on more capable hardware when it comes to calculations that all users have to make, e.g. FMs or DMs. Doing such things with eye candy is viable because it doesn't affect the game's balance, although even that can be argued.

If that were the case you'd find people buying and using hardware that gives them the most advantage online, not to mention that the developer would have to design and build every calculation twice: once for CPU-only users and again for the specialist hardware.

For anything other than a bespoke bit of software, you build to the most common specifications, and especially to set standards, so the widest possible market can use the product.

Flyby
06-18-2008, 01:55 PM
I think, therefore I am (lucky I posted here!). Good replies and info!
thanks!
Flyby out

Thunderbolt56
06-18-2008, 02:22 PM
To directly answer your questions, it was once stated (by Oleg or Ilya in some interview) that SoW:BoB will support dual core processors, so we know it will do that for certain. They may have actually coded in support for quads since that interview, but who knows. Secondly, it was directly stated that they will NOT support the PhysX hardware but, again, that was at least 6 months ago (probably longer) and could have changed since it was originally written. As was already stated, PhysX has made some strides since it was first introduced, though real-world applications are still showing less than stellar performance gains.


TB

Flyby
06-18-2008, 04:52 PM
thanks TB,
It would be nice to know whether quad cores were coded in or not, especially for those gamers among us looking to build new systems. IIRC, of the two new Nvidia cards, only the uber GTX 280 will have the PhysX implementation. Maybe Oleg can address these issues in his next update?
Flyby out

IceFire
06-18-2008, 10:10 PM
I imagine that once you get over the hurdle of making something dual-core capable, moving things around to four cores isn't that big a deal, as you already have the framework in place. I'm saying this without any real programming knowledge...just some general reading on the subject.

What they should really say is multicore programming or multithreaded programming. Once you get past doing everything on one core, I would think you could scale upwards...as long as the work could be broken out into threads and still remain synchronized.

Feuerfalke
06-19-2008, 06:43 AM
Both cards are at the moment basically used for eye candy only. I think it would be pretty hard to make them worth much more. Consider that not everybody will buy this hardware, and if you dedicate parts of the physics to these systems, you still have to make the physics the same for single- and multi-core PCs, and for both in combination with a physics coprocessor.

As tempting as it may sound, I doubt it will be more than providing additional eye candy.

Flyby
06-19-2008, 12:33 PM
I thought Nvidia was implementing PhysX through its drivers somehow. Maybe I misunderstood what I read (somewhere).

Feuerfalke
06-19-2008, 12:49 PM
I doubt that - at least not as the PhysX chip. The statements were merely plans for how it could be implemented, but on the technical side, the interface between the card and the mainboard is still a bottleneck, especially at higher resolutions with 16xAF & 16xFSAA.

Considering the bandwidth and functionality of this chip, I doubt a shared interface will be the solution.

On the other hand, there may be parts of the former PhysX chip implemented on the GFX card to support graphics further.

It's basically the same thing as the step from pure 2D cards, which were then aided with 3D acceleration by the 3dfx Voodoo chipset. Everyone cried out about how completely useless it was; a short while later everybody implemented the functionality on their own chips in one way or another.

mondo
06-19-2008, 02:44 PM
I thought Nvidia was implementing PhysX through its drivers somehow. Maybe I misunderstood what I read (somewhere).

All Nvidia 8XXX cards have the ability to run the PhysX software. The new ATI boards about to come out will also be able to run PhysX, because they went into partnership with Nvidia, as long as they use the CUDA programming model.

JG27CaptStubing
06-19-2008, 05:57 PM
Well, just to further comment about PhysX if you will...

I would really like to see the damage models improve dramatically. Not only from taking damage from weapons but possibly from "bending" your plane as well. Being hit while under a strong G load could cause a catastrophic failure, or, if the game models more complex systems like O2 and fuel management, those could fail on you. Now I know that doesn't really have a lot to do with PhysX, but it would be nice to see this sim take it to the next level.

Codex
06-20-2008, 09:19 AM
Don't forget each shader is a programmable core. The GTX 280 has 240 of them. That is some serious parallel power, and they can be programmed using C and C++.

I suspect that is why Intel is making a lot of noise lately about how GPUs will never be good for everyday computing. But I reckon they're $hit'n their pants at the prospect that GPGPUs (general-purpose GPUs) are becoming more than just a fleeting hobby now.
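
Just to give a feel for what programming those shaders in C actually means, here is a rough, hypothetical CUDA sketch (nothing from SoW's engine or the PhysX SDK, purely an illustration) that steps a pile of debris particles forward one frame, one GPU thread per particle:

#include <cuda_runtime.h>
#include <cstdio>

// One thread per debris particle: gravity plus a simple Euler step.
// Each particle is independent, which is why this kind of work maps
// nicely onto a couple of hundred shader cores.
__global__ void integrateDebris(float3* pos, float3* vel, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    vel[i].y -= 9.81f * dt;
    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

int main()
{
    const int n = 100000;           // 100k bits of debris
    const float dt = 1.0f / 60.0f;  // one 60 Hz frame

    float3 *pos, *vel;
    cudaMalloc((void**)&pos, n * sizeof(float3));
    cudaMalloc((void**)&vel, n * sizeof(float3));
    cudaMemset(pos, 0, n * sizeof(float3));
    cudaMemset(vel, 0, n * sizeof(float3));

    // 256 threads per block, enough blocks to cover every particle.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    integrateDebris<<<blocks, threads>>>(pos, vel, n, dt);
    cudaDeviceSynchronize();

    printf("Stepped %d debris particles for one frame\n", n);
    cudaFree(pos);
    cudaFree(vel);
    return 0;
}

The point is that every particle is independent of the others, so the same code keeps scaling whether the card has 16 shader cores or 240.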

airguitarist
06-20-2008, 12:24 PM
There's a little bit of info about PhysX on NVidia with the new drivers.

http://guru3d.com/news.html

proton45
06-20-2008, 02:22 PM
NOT to hijack this thread, but I'm starting to wonder about "copy protection" software and "BoB SoW". I wonder if they will be using something like SecuROM?

Sorry about the quiet aside...

PS...I'm also not clear on the issue of running PhysX software on a graphics card without slowing down the graphics performance...

airguitarist
06-20-2008, 04:52 PM
"PS...I'm also not clear on the issue of running PhysX software on a graphics card without slowing down the graphics performance..."

Many people rate flight sims as being CPU-limited; indeed, I only got a ~20% framerate increase when I went from an NV 6800 to an ATI 1950 Pro, despite the new card having about twice the performance.

From what I remember, the PhysX card was good at collision detection, hence the tech demo showing a stack of boxes falling and interacting with one another. GPU-based collision detection could improve the framerate at critical moments (high anti-aircraft fire rates, bomb fragments, many firing aircraft with many damage hit boxes).

True, you would have a trade-off with the look of the sim, but with the rate of change in graphics cards you may not have long to wait before ALL settings are maxed out again.
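
To make that concrete, here is a crude, hypothetical CUDA sketch (not from SoW or the PhysX SDK, just an illustration) of the kind of brute-force pair testing that would get offloaded: one GPU thread per shell fragment, each checking against every hit sphere:

#include <cuda_runtime.h>
#include <cstdio>

// Brute-force sphere-vs-sphere overlap testing: one GPU thread per
// fragment, each checking against every hit sphere on the target planes.
// This O(n*m) pair testing is the sort of load a physics chip or GPU
// is supposed to soak up.
__global__ void countContacts(const float4* frags, int nFrags,
                              const float4* spheres, int nSpheres, int* total)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= nFrags) return;

    float4 f = frags[i];   // xyz = position, w = radius
    int hits = 0;
    for (int j = 0; j < nSpheres; ++j) {
        float4 s = spheres[j];
        float dx = f.x - s.x, dy = f.y - s.y, dz = f.z - s.z;
        float r  = f.w + s.w;
        if (dx * dx + dy * dy + dz * dz < r * r)
            ++hits;
    }
    if (hits > 0)
        atomicAdd(total, hits);   // contacts found this frame
}

int main()
{
    const int nFrags = 20000, nSpheres = 512;
    float4 *frags, *spheres;
    int *total;
    cudaMalloc((void**)&frags, nFrags * sizeof(float4));
    cudaMalloc((void**)&spheres, nSpheres * sizeof(float4));
    cudaMalloc((void**)&total, sizeof(int));
    cudaMemset(frags, 0, nFrags * sizeof(float4));
    cudaMemset(spheres, 0, nSpheres * sizeof(float4));
    cudaMemset(total, 0, sizeof(int));

    int threads = 256;
    int blocks = (nFrags + threads - 1) / threads;
    countContacts<<<blocks, threads>>>(frags, nFrags, spheres, nSpheres, total);
    cudaDeviceSynchronize();

    int h = 0;
    cudaMemcpy(&h, total, sizeof(int), cudaMemcpyDeviceToHost);
    printf("%d contacts this frame\n", h);
    cudaFree(frags);
    cudaFree(spheres);
    cudaFree(total);
    return 0;
}

A real engine would add a broad phase to cull most pairs first, but the shape of the work, thousands of independent checks per frame, is exactly what suits a GPU or a dedicated physics chip.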

K_Freddie
06-20-2008, 10:07 PM
From what I remember, the PhysX card was good at collision detection, hence the tech demo showing a stack of boxes falling and interacting with one another. GPU-based collision detection could improve the framerate at critical moments (high anti-aircraft fire rates, bomb fragments, many firing aircraft with many damage hit boxes).

You've hit it on the button here.
So what we're going to see is a drastic improvement in physics, not eye candy.
The eye candy is all there already, but what has been lacking is the accuracy of physical things, due to the computational complexity.

I haven't read any articles yet, but this has been a problem for decades, and it has finally come down from the old 'Cray supercomputer' to a little chip in a box.
Luv this stuff
:)

Feuerfalke
06-21-2008, 09:07 AM
Well, just to further comment about PhysX if you will...

I would really like to see the damage models improve dramatically. Not only from taking damage from weapons but possibly from "bending" your plane as well. Being hit while under a strong G load could cause a catastrophic failure, or, if the game models more complex systems like O2 and fuel management, those could fail on you. Now I know that doesn't really have a lot to do with PhysX, but it would be nice to see this sim take it to the next level.

This is what Oleg has already planned to be in the game. As you posted, it has nothing to do with PhysX, though. The amount of debris coming off a plane, the way those bits tumble and fall to the ground, smoke effects and visible effects from explosions: that's what the PhysX chip does right now.

While the demo showed it handling collisions, you cannot easily transfer that to a game. You cannot base complex collision modelling on a hardware system that only 3% of the market has available.

virre89
06-21-2008, 09:09 AM
Guys, chill out on the requirements a bit, will ya.

We all know BoB will be a huge leap from IL2 in terms of realism, graphics, physics, hardware and on and on, but it's not gonna be a benchmark as demanding as Crysis.

Don't get me wrong, I'm sure it will look just awesome and sweep everything away. I'm just saying, if you're looking at buying a system now, just go for a quad core, at least an Nvidia 8800 GTS, and say 2 gigabytes of DDR2 memory. If you do that I can personally promise you'd be able to play on high settings.

Codex
07-02-2008, 10:56 PM
http://www.atomicmpc.com.au/article.asp?CIID=115375

Flyby
07-03-2008, 12:27 AM
Guys, chill out on the requirements a bit, will ya.

We all know BoB will be a huge leap from IL2 in terms of realism, graphics, physics, hardware and on and on, but it's not gonna be a benchmark as demanding as Crysis.

Don't get me wrong, I'm sure it will look just awesome and sweep everything away. I'm just saying, if you're looking at buying a system now, just go for a quad core, at least an Nvidia 8800 GTS, and say 2 gigabytes of DDR2 memory. If you do that I can personally promise you'd be able to play on high settings.
Does that include the dreaded new clouds too?

BadAim
07-03-2008, 12:55 PM
From my understanding, Nvidia bought PhysX's software and is incorporating it into its new drivers for the 200 series cards. I would be surprised if BOB doesn't use the graphics card for at least some of the physics; gaming theory has been moving that way for some time. As for the PhysX hardware, I believe that horse is deader than a doornail.

mondo
07-03-2008, 04:55 PM
I posted that in this thread already. Nvidia incorporated PhysX into its drivers some time ago for the 8xxx series.

BadAim
07-03-2008, 05:46 PM
I posted that in this thread already. Nvidia incorporated PhysX into its drivers some time ago for the 8xxx series.

Oops, missed that one.

Urufu_Shinjiro
07-03-2008, 07:46 PM
Yeah, I'm running physx on my 8800GTS 512 at home and it works great.

Skoshi Tiger
07-05-2008, 06:18 AM
Yeah, I'm running physx on my 8800GTS 512 at home and it works great.

Hi there, I'm running a 9800GTX and I've installed the latest beta drivers. How do I find out if PhysX is running? When I run the Ageia PhysX properties program, it says I don't have an Ageia card. Does Nvidia have a test program?

Solrac
07-05-2008, 06:34 PM
Please, Urufu_Shinjiro!
I have the 175.80 driver.
Can you give us some more specific information on how to do that? (PhysX with an 8800GTS)
I would really be very thankful.

Regards.

BadAim
07-05-2008, 11:41 PM
Please, Urufu_Shinjiro!
I have the 175.80 driver.
Can you give us some more specific information on how to do that? (PhysX with an 8800GTS)
I would really be very thankful.

Regards.

I think it is transparent to the user on the graphics card end. As far as I know, only a few games use it.

Urufu_Shinjiro
07-07-2008, 08:02 PM
Ok, you need one of the latest beta drivers; I'm using 177.41. These will work just fine for the G92 cards and up, but you need a modded .inf file to install them on the G92 8800s. Once you have that driver installed, install the PhysX software from the Nvidia site. You should then be able to open the PhysX control panel and select GeForce for PhysX. More info here: http://www.overclock.net/software-news/349699-nvidia-forceware-177-41-released.html

Zoom2136
07-07-2008, 08:24 PM
All Nvidia 8XXX cards have the ability to run the PhysX software. The new ATI boards about to come out also will be able to run PhysX too because they went into partnership with Nvidia as long as they used the CUDA programming model.


I thought ATI went with Intel and Havok.... No?

Feuerfalke
07-07-2008, 09:15 PM
I thought ATI went with Intel and Havok.... No?

That's what I heard, yes. At least the new 4xxx cards are being released with full Havok support.

@ urufu's post:

Be advised that this driver modification does NOT work with G80 GPUs. Not all 8800-series cards have the newer G92 chip, but you will need it in order to have PhysX support. Without it, you run the risk of destroying your current drivers and settings!

Solrac
07-07-2008, 09:59 PM
Got it all now!

Thank you Urufu_Shinjiro! Thx all!

Regards, Solrac.:grin:

Urufu_Shinjiro
07-08-2008, 09:52 PM
That's what I heard, yes. At least the new 4xxx cards are being released with full Havok support.

@ urufu's post:

Be advised that this driver modification does NOT work with G80 GPUs. Not all 8800-series cards have the newer G92 chip, but you will need it in order to have PhysX support. Without it, you run the risk of destroying your current drivers and settings!

Right, that's why I said G92 and up.

Zoom2136
07-09-2008, 01:37 PM
That's what I heard, yes. At least the new 4xxx cards are being released with full Havok support.

@ urufu's post:

Be advised that this driver modification does NOT work with G80 GPUs. Not all 8800-series cards have the newer G92 chip, but you will need it in order to have PhysX support. Without it, you run the risk of destroying your current drivers and settings!

OK, just read something on Tom's Hardware... stating that a guy is working on a way to run CUDA on ATI cards... It says that he is actually getting NVIDIA's support to do so...

Here is the link:

http://www.tomshardware.com/news/nvidia-ati-physx,5841.html

Urufu_Shinjiro
07-09-2008, 07:34 PM
Yeah, Nvidia is not being stingy with CUDA or PhysX and has stated that ATI is fully welcome to use them (though CUDA carries a license fee). ATI just chose to go in another direction.

Solrac
08-07-2008, 07:07 PM
Attention to this!!!

http://www.guru3d.com/article/physx-by-nvidia-review

On the 12th of August, NVIDIA will release new graphics drivers with the PhysX driver integrated for all CUDA-ready GPUs (G92 and later).

Regards, Solrac

Tenebrae
08-07-2008, 10:38 PM
If that were the case you'd find people buying and using hardware that gives them the most advantage online, not to mention that the developer would have to design and build every calculation twice: once for CPU-only users and again for the specialist hardware.

People already do have an advantage online with better hardware configurations: TrackIR, TripleHead2Go, CH joystick + rudder versus keyboard (yikes), etc.

WhiteSnake
08-11-2008, 08:43 PM
The official drivers should be released tomorrow (August 12th) with a whole lot of extras.

Keep in mind your FPS will suffer massively when running PhysX on an Nvidia card, but if you're running an SLI rig you can let one GPU deal with the PhysX and the other with the graphics...

I think ATI's/AMD's solution is a hell of a lot better for both PhysX and Havok, as it decides whether to run it on the GPU or the CPU or both, depending on what would give the optimal performance, but with Nvidia you're forced to use the GPU, which is far from optimal in some cases.

Still can't wait to give it all a try on my two 8800GT cards :)

mondo
08-12-2008, 10:09 AM
People already do have an advantage online with better hardware configurations: TrackIR, TripleHead2Go, CH joystick + rudder versus keyboard (yikes), etc.

That's a bit different from having a DM, FM or physics based on hardware ;)