Will SoW:BoB support PhysX and/or Havok?
New video cards coming out from ATi and Nvidia. Both have physics support on the cards themselves (I think). Will SoW:BoB support this technology, or will the CPU have to handle the whole wad? BTW, will the sim support quad cores right out of the box, or will a simple dual-core processor (like the E8400) be sufficient to handle those clouds?
thanks, Flyby |
Definitely a worthwhile question, Flyby, but I think the answer is going to be complicated.
PhysX hasn't made a lot of waves since its initial outing. Yes, accelerated and dedicated physics hardware is nice, but there really aren't any mature APIs for it like there are for 3D graphics. Once upon a time the same was true of 3D-accelerated graphics. It'll probably take a DirectX component for physics acceleration and some agreed-upon standards before it really hits the mainstream. With nVidia and ATI/AMD working on competing but similar products, no doubt some standardization will come into effect sooner or later. |
I've played most of the PhysX-supported games with and without a PhysX card, and there is little noticeable difference other than some added visuals like bits of debris.
I bet that if they do put in support for any third-party hardware physics engine (bear in mind Oleg and co. have created their own physics engine for IL-2 and have been using facets of it in the updated flight engine for some time now), it won't be core to the game, so I wouldn't worry. Given that users who don't have PhysX hardware will make up the majority of those who play the game, I'd expect the load to be taken on the CPU like the current IL-2 engine does. |
Thanks for the replies, guys. Good food for thought. I seem to recall reading somewhere that implementation might be through DirectX, as you said, Ice. Hey, maybe that means there will be a choice, and a price break on models without the... nah! Never happen.
Flyby out |
nVidia bought the PhysX API from Ageia and has now incorporated it into its drivers, which means all 8000, 9000 and the new 200 series cards can run the PhysX API on the GPU shaders. Smart move really; this means about 70 million nVidia cards can run Ageia PhysX in hardware mode.
I don't know if Oleg will use the API in SoW, but I've always believed this technology can be employed for things far more sophisticated than just eye candy; I think it can actually enhance damage and flight modelling in the sim. Also, Google CUDA... very interesting reading. |
Is the 8000 series a dual-chip graphics card? That seems kind of weird to me (but if it works, it works)... My graphics card works pretty hard calculating the eye candy already; I can't imagine what my FPS would be if the GPU was calculating a bunch of bricks flying about and their shadows & reflections in a rolling mist... It seems to me that you really need two chips (one for graphics and one for physics) to see any true improvement in performance and visuals... Interesting... |
No, not two chips, more than one core on the CPU ;) Look at the tech demos for "Alan Wake". They explain everything you'll need to know about how physics in games is progressing.
If that were the case you'd find people buying and using whatever hardware gives them the biggest advantage online, not to mention the developer has to design and build every calculation twice, once for CPU-only users and again for specialist hardware. For anything other than a bespoke bit of software you build to the most common specifications, and especially to set standards, so the widest possible market can buy and use the product. |
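One common way to avoid literally building every calculation twice is to hide the physics behind a single interface and plug in either a CPU backend or an accelerated one at startup. Here is a minimal C++ sketch of that idea; the names are hypothetical and this is not how IL-2 or SoW is actually structured:

[code]
#include <memory>

// Hypothetical abstraction: the rest of the game only ever talks to this interface.
class PhysicsBackend {
public:
    virtual ~PhysicsBackend() = default;
    virtual void step(float dt) = 0;  // advance the simulation by one frame
};

// Reference path that every player can run.
class CpuPhysics : public PhysicsBackend {
public:
    void step(float /*dt*/) override {
        // Plain C++ integration loop would live here.
    }
};

// Optional accelerated path for players with suitable hardware.
class GpuPhysics : public PhysicsBackend {
public:
    void step(float /*dt*/) override {
        // The same update dispatched to dedicated physics hardware / the GPU.
    }
};

// Pick the backend once at startup; the rest of the game never cares which one it got.
std::unique_ptr<PhysicsBackend> make_backend(bool accelerator_present) {
    if (accelerator_present) {
        return std::make_unique<GpuPhysics>();
    }
    return std::make_unique<CpuPhysics>();
}

int main() {
    auto physics = make_backend(false);  // e.g. no PhysX-capable hardware detected
    physics->step(0.016f);               // one 60 Hz frame
}
[/code]

The catch the post points out still applies, though: both backends have to produce identical results, otherwise players on different hardware would diverge online.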
I think, therefore I am (lucky I posted here!). Good replies and info!
thanks! Flyby out |
To directly answer your questions: it was once stated (by Oleg or Ilya in some interview) that SoW:BoB will support dual-core processors, so we know it will do that for certain. They may have actually coded in support for quads since that interview, but who knows. Secondly, it was directly stated that they will NOT support the PhysX hardware, but, again, that was at least six months ago (probably longer) and could have changed since it was originally written. As was already stated, PhysX has made some strides since it was first introduced, though real-world applications are still showing less than stellar performance gains.
TB |
Thanks, TB.
It would be nice to know whether quad cores were coded in or not, especially for those gamers among us looking to build new systems. IIRC, of the two new Nvidia cards, only the uber GTX 280 will have the PhysX implementation. Maybe Oleg can address these issues in his next update? Flyby out |
I imagine that once you get over the hurdle of making something dual-core capable, moving things around to four cores isn't that big of a deal, as you already have the frameworks in place. I'm saying this without any real programming knowledge... just some general reading on the subject.
What they should say is multicore programming or multithreaded programming. Once you get past just doing everything on one core, I would think you could scale upwards... as long as you had work that could be broken out into threads and still remain synchronized. |
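As a rough illustration of what that kind of scaling looks like, here is a minimal C++ sketch, entirely hypothetical and not based on the actual IL-2 or SoW code, that splits a per-frame physics update across however many cores the machine has and then synchronises before the next frame:

[code]
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical per-object state; stands in for whatever the real engine tracks.
struct Body { float pos; float vel; };

// Advance a contiguous slice of objects; each thread gets its own slice, so no locking is needed.
void integrate(std::vector<Body>& bodies, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        bodies[i].pos += bodies[i].vel * dt;  // trivial Euler step, purely for illustration
    }
}

void step_frame(std::vector<Body>& bodies, float dt) {
    // Scale to however many hardware threads exist: 2 on a dual core, 4 on a quad, and so on.
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (bodies.size() + cores - 1) / cores;

    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end = std::min(bodies.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(integrate, std::ref(bodies), begin, end, dt);
    }

    // The synchronisation point: every worker must finish before the next frame starts.
    for (auto& w : workers) {
        w.join();
    }
}

int main() {
    std::vector<Body> bodies(100000, Body{0.0f, 1.0f});
    step_frame(bodies, 0.016f);  // one 60 Hz frame
}
[/code]

The splitting and joining themselves are cheap; the hard part, as the post says, is carving the simulation into pieces that really are independent for the length of a frame.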
Both cards are at the moment basically used for eye candy only. I think it would be pretty hard to make them worth much more. Consider that not everybody will buy this hardware, and if you dedicate parts of the physics to these systems, you have to keep the physics the same for single- and multi-core PCs, both with and without a physics coprocessor.
As tempting as it may sound, I doubt it will be more than providing additional eye candy. |
I thought Nvidia was implementing PhysX through its drivers somehow. Maybe I misunderstood what I read (somewhere).
|
I doubt that, at least not as the PhysX chip. The statements were merely plans for how it could be implemented, but on the technical side the interface between the card and the mainboard is still a bottleneck, especially at higher resolutions with 16xAF & 16xFSAA.
Considering the bandwidth and functionality of this chip, I doubt a shared interface will be the solution. On the other hand, parts of the former PhysX chip may be implemented on the graphics card to support graphics further. It's basically the same as the step from pure 2D cards that were aided with 3D acceleration by the 3dfx Voodoo chipset: everyone cried out about how completely useless it was, and a short while later everybody had implemented the functionality on their own chips in one way or another. |
Well, just to further comment about PhysX if you will...
I would really like to see the damage models improve dramatically, not only from taking damage from weapons but possibly from "bending" your plane as well. Being hit while under a strong G load could cause a catastrophic failure, or, if the game models more complex systems like O2 and fuel management, those could fail on you. Now I know that doesn't really have a lot to do with PhysX, but it would be nice to see this sim take it to the next level. |
Don't forget each shader is a programmable core. The GTX 280 has 240 of them. That is some serious parallel power, and they can be programmed using C and C++.
I suspect that is why Intel is making a lot of noise lately about how GPUs will never be good for everyday computing. But I reckon they're $hit'n their pants at the prospect that GPGPUs (general-purpose GPUs) are becoming more than just a fleeting hobby now. |
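For anyone curious what "programming the shaders in C" looks like in practice, here is a minimal CUDA sketch; it is only an illustration of the GPGPU programming model, not anything to do with how PhysX itself is implemented. The same small function runs once per data element, in parallel, across the card's shader cores:

[code]
// physx_style_demo.cu -- compile with: nvcc physx_style_demo.cu
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

// Kernel: every GPU thread advances exactly one particle. The hardware spreads
// these threads across the card's shader cores (240 of them on a GTX 280).
__global__ void advance(float* pos, const float* vel, float dt, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        pos[i] += vel[i] * dt;
    }
}

int main() {
    const int n = 1 << 20;  // about a million particles
    std::vector<float> h_pos(n, 0.0f), h_vel(n, 1.0f);

    float* d_pos = nullptr;
    float* d_vel = nullptr;
    cudaMalloc((void**)&d_pos, n * sizeof(float));
    cudaMalloc((void**)&d_vel, n * sizeof(float));
    cudaMemcpy(d_pos, h_pos.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_vel, h_vel.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch one thread per particle, 256 threads per block.
    advance<<<(n + 255) / 256, 256>>>(d_pos, d_vel, 0.016f, n);

    cudaMemcpy(h_pos.data(), d_pos, n * sizeof(float), cudaMemcpyDeviceToHost);
    std::printf("particle 0 moved to %f\n", h_pos[0]);

    cudaFree(d_pos);
    cudaFree(d_vel);
    return 0;
}
[/code]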
There's a little bit of info about PhysX on NVidia with the new drivers.
http://guru3d.com/news.html |
NOT to hijack this thread, but I'm starting to wonder about "copy protection" software and BoB SoW. I wonder if they will be using something like SecuROM?
Sorry about the quiet aside... PS...I'm also not clear on the issue of running PhysX software on a graphics card without slowing down the graphics performance... |
"PS...I'm also not clear on the issue of running PhysX software on a graphics card without slowing down the graphics performance..."
Many people rate flight sims as being CPU-limited; indeed, I only got a ~20% framerate increase when I went from an NV6800 to an ATI 1950 Pro despite the new card having about twice the performance. From what I remember, the PhysX card was good at collision detection, hence the tech demo showing a stack of boxes falling and interacting with one another. GPU-based collision detection could improve the framerate at critical moments (high anti-aircraft fire rates, bomb fragments, many firing aircraft with many damage hit boxes). True, you would have a trade-off with the look of the sim, but with the rate of change in graphics cards you may not have long to wait before ALL settings are maxed out again. |
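To make the hit-box point concrete: collision detection at that scale largely boils down to enormous numbers of simple, independent box-versus-box tests, which is exactly the kind of work parallel hardware is good at. A rough C++ sketch with hypothetical names, not the IL-2 or PhysX implementation:

[code]
#include <utility>
#include <vector>

// Axis-aligned bounding box: the usual cheap representation for a damage hit box.
struct AABB {
    float min[3];
    float max[3];
};

// Two boxes overlap only if their extents overlap on every axis.
bool overlaps(const AABB& a, const AABB& b) {
    for (int axis = 0; axis < 3; ++axis) {
        if (a.max[axis] < b.min[axis] || b.max[axis] < a.min[axis]) {
            return false;
        }
    }
    return true;
}

// Naive pass: test every shell fragment against every hit box. Each test is
// independent of all the others, which is why this kind of workload maps so
// well onto many CPU cores or onto a GPU.
std::vector<std::pair<int, int>> find_hits(const std::vector<AABB>& fragments,
                                           const std::vector<AABB>& hitboxes) {
    std::vector<std::pair<int, int>> hits;
    for (int f = 0; f < static_cast<int>(fragments.size()); ++f) {
        for (int h = 0; h < static_cast<int>(hitboxes.size()); ++h) {
            if (overlaps(fragments[f], hitboxes[h])) {
                hits.emplace_back(f, h);
            }
        }
    }
    return hits;
}

int main() {
    std::vector<AABB> fragments = { { {0, 0, 0}, {1, 1, 1} } };
    std::vector<AABB> hitboxes  = { { {0.5f, 0.5f, 0.5f}, {2, 2, 2} } };
    return find_hits(fragments, hitboxes).empty() ? 1 : 0;  // 0 means a hit was detected
}
[/code]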
So what we're going to see is a drastic improvement in physics, not eye candy. The eye candy is all there already, but what has been lacking is the accuracy of physical things, due to the computational complexity. I haven't read any articles yet, but this has been a problem for decades, and it has finally come down from the old 'Cray supercomputer' to a little chip in a box. Luv this stuff :)
While the demo showed it working with collision, you cannot easily transfer that to a game. You cannot base complex collision modelling on a hardware system that only 3% of the market has available. |
Guys, chill out on the requirements a bit, will ya?
We all know BoB will be a huge leap from IL-2 in terms of realism, graphics, physics, hardware and on and on, but it's not gonna be a benchmark as demanding as Crysis. Don't get me wrong, I'm sure it will look just awesome and sweep everything away; I'm just saying that if you're looking at buying a system now, go for a quad and at least an Nvidia 8800 GTS with, say, 2 gigabytes of DDR2 memory. If you do that I can personally promise you'd be able to play on high settings. |
|
curious
|
From my understanding, Nvidia bought the PhysX software and is incorporating it into its new drivers for the 200 series cards. I would be surprised if BoB doesn't use the graphics card for at least some of the physics; game design has been moving that way for some time. As for the PhysX hardware, I believe that horse is deader than a doornail.
|
I posted that in this thread already. Nvidia incorporated PhysX into its drivers some time ago for the 8xxx series.
|
Yeah, I'm running PhysX on my 8800 GTS 512 at home and it works great.
|
Please Urufu_Shinjiro!
I have the 175.80 driver. Can you give us some more specific information on how to do that (PhysX with an 8800 GTS)? I would really be very thankful. Regards. |
OK, you need one of the latest beta drivers; I'm using 177.41. These will work just fine for the G92 cards and up, but they need a modded .inf file to install on the G92 8800s. Once you have that driver installed, install the PhysX software from the Nvidia site. You should then be able to open the PhysX control panel and select GeForce for PhysX. More info here: http://www.overclock.net/software-ne...-released.html
|
I thought ATI went with Intel and Havok.... No? |
@ Urufu's post: be advised that this driver modification does NOT work with G80 GPUs. Not all 8800-series cards have the newer G92 chipset, but you will need it in order to have PhysX support. Without it, you run the risk of breaking your current drivers and settings! |
Got it all now!
Thank you Urufu_Shinjiro! Thx all! Regards, Solrac.:grin: |
Here is the link: http://www.tomshardware.com/news/nvi...hysx,5841.html |
Yeah, Nvidia is not being stingy with CUDA or PhysX and has stated that ATI is fully welcome to use it (CUDA carries a license fee, though). ATI just chose to go in another direction.
|
Attention to this!!!
http://www.guru3d.com/article/physx-by-nvidia-review
On the 12th of August, NVIDIA will release new graphics drivers with the PhysX driver integrated for all CUDA-ready GPUs (G92 and later). Regards, Solrac |
The official drivers should be released tomorrow (August 12th) with a whole lot of extras.
Keep in mind your FPS will suffer massively when running PhysX on an Nvidia card, but if you're running an SLI rig you can let one GPU deal with the PhysX and the other with the graphics... I think ATI's/AMD's solution is a hell of a lot better for both PhysX and Havok, as it decides to run it on the GPU, the CPU or both, depending on what would give the optimal performance, whereas with Nvidia you're forced to use the GPU, which is far from optimal in some cases. Still can't wait to give it all a try on my two 8800 GT cards :) |