#51 | KG26_Alpha | 12-17-2010, 03:47 PM

Quote:
Originally Posted by T}{OR View Post
And there always will be. I just wrote what I read on the net, based on the various reviews.

Take it or leave it, your choice.

I always find it best to comment on what you have first-hand experience of.

As we know, the internet is always 100% factually correct and totally unbiased, so it must all be true.


@ klem

Looks like a good system, so long as it's stress tested.

#52 | Triggaaar | 12-17-2010, 04:45 PM

Quote:
Originally Posted by klem View Post
Power and Heat: The 570 edges the 6970 out by something like 30w and 10 degrees under load.
You seem to have compared them fairly, but the 570 actually uses a little more power than the 6970.

AnandTech - Load Power Consumption - Crysis:
570 = 361 W
6970 = 340 W
(I'll ignore FurMark, since the 570 is power-throttled there while AMD cards are throttled much less.)

Hardware Canucks - 1 hr 3DMark batch size test:
570 = 337 W
6970 = 364 W

That's a 48 W swing between those two reviews. Watt is that all about?

Now checking Guru3D, which estimates the GPU's power draw on its own:
570 = 213 W
6970 = 207 W
And HardOCP, which measures total system draw:
570 = 454 W
6970 = 428 W
And bit-tech:
570 = 330 W
6970 = 306 W
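
If you want to sanity-check those numbers, the arithmetic is simple enough to script. A throwaway Python sketch (the wattages are just the figures quoted above):

Code:
# Load power draw in watts, as quoted in the reviews above.
reviews = {
    "AnandTech (Crysis)":        {"570": 361, "6970": 340},
    "Hardware Canucks (3DMark)": {"570": 337, "6970": 364},
    "Guru3D (GPU only)":         {"570": 213, "6970": 207},
    "HardOCP (total system)":    {"570": 454, "6970": 428},
    "bit-tech":                  {"570": 330, "6970": 306},
}

for site, w in reviews.items():
    # Positive means the 570 drew more than the 6970 in that test.
    print(f"{site}: 570 vs 6970 delta = {w['570'] - w['6970']:+d} W")

# The 48 W "swing": +21 W at AnandTech versus -27 W at Hardware Canucks.
swing = (361 - 340) - (337 - 364)
print(f"Swing between the first two reviews: {swing} W")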
#53 | JG27CaptStubing | 12-17-2010, 04:46 PM

Some very interesting comments coming from both camps... I look at it in much simpler terms.

Regarding dual-GPU setups: while I enjoyed the scaling of my SLI setup back in the day, I didn't enjoy the headaches involved every time there was a driver update. Stutters galore in IL-2, and even with a lot of tweaking and messing around with the newer Nvidia drivers you still can't entirely get rid of stutters in IL-2.

My choice to go with a 580 was simple: I wanted the fastest possible card without the tweaking crap that comes with multi-GPU setups. For those trying to save a buck, multi-GPU certainly does scale well, and it looks like AMD stepped up with their new revision of drivers and cards.

Sure, you can save a few bucks and get great scaling with multi-GPU setups, but it does come at a cost; you will pay for it eventually. I have yet to see Nvidia or ATI come out with perfect drivers.

I'm now running an EVGA 580 and it's been a fantastic NO-hassle experience. In terms of SoW I don't think it's going to matter much which camp you're in, though I know for a fact Oleg likes Nvidia.
#54 | C_G | 12-17-2010, 05:01 PM

Just a bit of speculation here, but I think that in most cases SoW is going to be bottlenecked by the CPU, not the GPU.

I think this because there is a lot more going on "under the hood" (i.e. non-visually) in terms of DM than before.
We know that each subsystem of the aircraft is modelled internally for DM purposes.
We also know from Oleg's thread on radar a while back that ground objects are also subject to a sophisticated DM. IIRC he stated that if the power generator for a radar station (or maybe it was for a searchlight) is knocked out, then the radar station (or searchlight) will not work, even though it has not been damaged itself.
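
Purely to illustrate that idea, here is a toy Python sketch (the names and structure are invented for illustration; this is nothing to do with SoW's actual code). The point is that an object can stop functioning because something it depends on was knocked out, without being damaged itself:

Code:
class GroundObject:
    # Toy damage-model node: it works only if it is intact AND
    # everything it depends on is itself still working.
    def __init__(self, name, depends_on=()):
        self.name = name
        self.intact = True                  # destroyed or not
        self.depends_on = list(depends_on)

    def working(self):
        return self.intact and all(d.working() for d in self.depends_on)

generator = GroundObject("power generator")
radar = GroundObject("radar station", depends_on=[generator])

generator.intact = False                    # the generator is knocked out
print(radar.intact)                         # True:  the radar is undamaged
print(radar.working())                      # False: but it no longer works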

I think it's safe to assume that AI routines will be more complex, and that FM routines will be too.

On the other hand, we also know (IIRC; I may be mistaken) that SoW is being designed to utilize all the cores of multicore CPUs, so it may well be that, notwithstanding the increase in "under the hood" complexity, CPUs will not be the chokepoint and a need for bleeding-edge cards will still exist.
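
As a rough illustration of why that would matter (again, just a toy sketch under my own assumptions, not SoW's actual architecture): within a single simulation tick, per-aircraft FM/AI updates are largely independent of one another, so they can be spread across however many cores are available:

Code:
from concurrent.futures import ProcessPoolExecutor

def update_aircraft(state):
    # Stand-in for one aircraft's expensive FM/AI step; a real sim
    # would do physics integration and AI decision-making here.
    position, velocity = state
    return (position + velocity * 0.01, velocity)

if __name__ == "__main__":
    aircraft = [(float(i), 100.0) for i in range(200)]
    # One worker process per core by default; each tick, the
    # per-aircraft work is farmed out to all of them instead of
    # queueing up on a single core.
    with ProcessPoolExecutor() as pool:
        aircraft = list(pool.map(update_aircraft, aircraft, chunksize=25))
    print(aircraft[0])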

This uncertainty is a large part of the reason I'm holding off my upgrades until after SoW is released and we have some definite information on this.

C_G
#55 | Coen020 | 12-17-2010, 05:34 PM

OK guys, I tried to understand as much of this thread as I possibly could, but without much success I'm afraid.

My only question is: will it make much difference, AMD or Nvidia?

And will SoW run well on my PC?

Processor / CPU - AMD Athlon II X4 635
Mainboard - Gigabyte GA-790XTA-UD4
PSU - Antec 550W Basiq Modular
Memory - 4GB OCZ DDR3 1333MHz
OS - Windows 7 64-bit
VGA - XFX 5850
#56 | Triggaaar | 12-17-2010, 05:46 PM

Quote:
Originally Posted by Coen020 View Post
My only question is: will it make much difference, AMD or Nvidia?
Honestly, we don't know. These are the facts though:
Oleg has said it will run best with a DX11 card - so both Nvidia and AMD are fine there.
Oleg has said that Nvidia have been more helpful so far - this may indicate that drivers will be better for Nvidia, but we are guessing.

Quote:
And will SoW run well on my PC? ...
Here we have to guess, and it depends on what you mean by "good". I would happily bet that you would need a top-end system to run at the highest settings. But your system is quite nice, so just enjoy it until SoW is released, and then we'll know for sure whether any parts are holding it back.
#57 | C_G | 12-17-2010, 05:58 PM

Quote:
Originally Posted by Coen020 View Post
OK guys, I tried to understand as much of this thread as I possibly could, but without much success I'm afraid. My only question is: will it make much difference, AMD or Nvidia? And will SoW run well on my PC? ...
I would think that as long as you go with an AMD or nVidia card that roughly matches your upper-range machine, you'll be fine. There may be some bells-and-whistles option at the very high end where you'll have to decide whether to trade fps for it, but Oleg won't have designed SoW:BoB such that a machine like yours won't run it well.

IMO, the AMD/ATI vs nVidia wars are rather ridiculous, as they often come down to a few percentage points of difference in performance. If you look at the fps figures for the two companies in the charts in the first (Tom's) article linked, the difference between same-price cards is marginal and unlikely to be noticeable in practice (can anyone ACTUALLY tell the difference between 111 fps and 93 fps?).

The bottom line is that you get what you pay for (and the companies price against each other for roughly equivalent performance), with diminishing returns per dollar as you approach the "bleeding edge" of new technology.

I don't think going nVidia or going AMD will make a huge difference in practice, so long as you get roughly equivalent cards.
#58 | LoBiSoMeM | 12-17-2010, 06:16 PM

It always amazes me that people think in terms of upgrading because some video card brand "supports" a game's development...

The problem with IL-2 and ATI was the OpenGL drivers. IL-2 was optimized for OpenGL and runs with higher quality under it, and ATI did some bad work in those drivers. But IL-2 is a "discontinued" product, and TODAY's ATI cards with TODAY's drivers can run IL-2 just like NVIDIA cards. IL-2 is "easy" for today's cards.

But let's look at ALL OTHER TITLES. Neither ATI nor NVIDIA has "gigantic trouble" running any other title, and all the major titles today run under DirectX. So why on earth do you believe SoW will run so much better on NVIDIA hardware, when SoW will use the DX9-11 APIs?! Just because you think ATI didn't take Oleg by the hand for a walk in the park?

Unbelievable. Any modern DX title will run "better" on a "better" card, ATI or NVIDIA. I have always bought cards from both brands just by looking at the current market, and I have my share of good cards from both, and my share of melting cards too. And I never had to change brands to make some title "run better". NEVER.

Pointless discussion. Read reviews, look at the specs of each card, and let the developers use the cards' potential. It's their job, and Oleg's team will do great with either ATI or NVIDIA; it's just the DirectX API.

And if one of the two companies screws up a driver, that's the way it is. Just remember: blocky text in IL-2 because of a bad driver is better than a melting card... Do you remember? Both of the major companies make HUGE mistakes...

I'm out of this kind of discussion.
#59 | KG26_Alpha | 12-17-2010, 06:46 PM

Quote:
Originally Posted by LoBiSoMeM View Post
It always amazes me that people think in terms of upgrading because some video card brand "supports" a game's development ...
Because Nvidia have supported IL-2 for over 10 years.
#60 | T}{OR | 12-17-2010, 06:58 PM

Quote:
Originally Posted by JG27CaptStubing View Post
I'm now running an EVGA 580 and it's been a fantastic NO-hassle experience. In terms of SoW I don't think it's going to matter much which camp you're in, though I know for a fact Oleg likes Nvidia.
Excellent point. No hassle with single GPUs.


Quote:
Originally Posted by C_G View Post
Just a bit of speculation here, but I think that in most cases SoW is going to be bottlenecked by the CPU, not the GPU. ...
IIRC, a long time ago when news about SoW first appeared, Oleg confirmed only dual-core support. If they have implemented support for more than two cores, it will be a pleasant surprise. The fact that we will have an x64.exe is more important than 6+ core support.


Quote:
Originally Posted by LoBiSoMeM View Post
It always amazes me that people think in terms of upgrading because some video card brand "supports" a game's development ...
Nvidia (allegedly) has already had SoW code for over a month (so said Oleg) for driver optimization purposes. And if you had looked at all those reviews, you would have seen that rival cards like the GTX 570 and HD 6970 run better in some games and worse in others, by margins of up to 10 FPS. Not much, but the difference exists nonetheless.