Official Fulqrum Publishing forum > Fulqrum Publishing > IL-2 Sturmovik

IL-2 Sturmovik The famous combat flight simulator.

#61
Old 12-17-2010, 08:16 PM
JG27CaptStubing
Approved Member

Join Date: Dec 2007
Posts: 330

Another point I wanted to bring up... We are used to judging performance by IL-2 and its older, if updated, OpenGL engine. I can assure you of one thing: the move to DX9 and above will offer much better scaling and visuals. In other words, if IL-2 used DX9 you would already see much better performance for the same visual quality.

What will be interesting is the impact on the CPU. Getting a game to scale across CPU cores is usually more myth than reality; the extra headroom may buy more options rather than more FPS.
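The "CPU scaling is usually a myth" point has a textbook explanation in Amdahl's law: if only a fraction of each frame's work can be spread across cores, the serial remainder caps the speedup no matter how many cores you add. A minimal sketch (the fractions used are illustrative assumptions, not measurements from any game):

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work can run in parallel. Purely illustrative numbers.

def amdahl_speedup(p, n):
    """Overall speedup with n cores when fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# If only half of a game's frame time parallelizes (p = 0.5),
# even infinitely many cores can never exceed a 2x speedup:
for cores in (2, 4, 8):
    print(cores, round(amdahl_speedup(0.5, cores), 2))  # 1.33, 1.6, 1.78
```

This is why adding cores tends to unlock "more options" (AI, physics detail) rather than raw FPS: the serial part of the frame still sets the floor.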
#62
Old 12-17-2010, 08:24 PM
Biggs [CV]
Approved Member

Join Date: Dec 2010
Location: Ohio
Posts: 84

One brand or the other is not going to make a huge difference. If Nvidia gets 10 more frames per second, so be it. If you can notice a 10 fps difference, you need a new GPU.
#63
Old 12-17-2010, 09:13 PM
klem
Approved Member

Join Date: Nov 2007
Posts: 1,653

Quote:
Originally Posted by Triggaaar View Post
You seem have compared them fairly, but the 570 uses a little more power than the 6970.

AnandTech - Load Power Consumption - Crysis
570 = 361 Watts
6970 = 340 Watts
(I'll ignore FurMark as it's power-limited, but AMD much less so)

Hardware Canucks - 1hr 3DMark Batch size test
570 = 337 Watts
6970 = 364 Watts

That's a 48 Watt swing. Watt is that all about?

Now checking Guru3D, which estimates GPU-only power draw:
570 = 213 W
6970 = 207 W
And HardOCP, which measures total system power:
570 = 454 W
6970 = 428 W
And bit-tech:
570 = 330 W
6970 = 306 W
Yes Triggaaar, it was Hardware Canucks' figures I noted down. I seem to have overlooked the others.
My brain was probably addled by then.

For those pointing out the minimal difference between the 570/6970, I can only say I had to decide where to throw my £300. And I don't throw money easily, so I got quite deeply into those small differences.

And they are right. Between the two, it probably doesn't matter too much which way you jump if you are only considering "my position". High tessellation and resolution needs would have pushed me towards the 6970.
__________________
klem
56 Squadron RAF "Firebirds"
http://firebirds.2ndtaf.org.uk/



ASUS Sabertooth X58 /i7 950 @ 4GHz / 6Gb DDR3 1600 CAS8 / EVGA GTX570 GPU 1.28Gb superclocked / Crucial 128Gb SSD SATA III 6Gb/s, 355Mb-215Mb Read-Write / 850W PSU
Windows 7 64 bit Home Premium / Samsung 22" 226BW @ 1680 x 1050 / TrackIR4 with TrackIR5 software / Saitek X52 Pro & Rudders
#64
Old 12-18-2010, 12:12 AM
swiss
Approved Member

Join Date: Mar 2010
Location: Zürich, Swiss Confederation
Posts: 2,266

Quote:
Originally Posted by speculum jockey View Post
I'd love to see some kWh usage per month/year for some of those cards. Given the heat they produce they might rival your average TV or small appliance.

be my guest:

http://www.guru3d.com/article/geforce-gtx-580-review/7

less than $10/month
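The "less than $10/month" figure is easy to sanity-check with a back-of-envelope calculation. The wattage, gaming hours, and electricity price below are illustrative assumptions, not numbers from the linked Guru3D review:

```python
# Rough monthly running cost of a graphics card.
# All inputs here are illustrative assumptions.

def monthly_cost(card_watts, hours_per_day, price_per_kwh):
    """Estimated monthly electricity cost for the card alone (30-day month)."""
    kwh_per_month = card_watts / 1000.0 * hours_per_day * 30
    return kwh_per_month * price_per_kwh

# e.g. a ~250 W card gamed 4 h/day at $0.12/kWh:
cost = monthly_cost(250, 4, 0.12)  # 30 kWh -> $3.60/month
```

Even doubling the hours or the tariff keeps it under $10/month, which matches the review's conclusion; idle draw, being far lower, barely moves the total.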
#65
Old 12-18-2010, 09:53 AM
Flanker35M
Approved Member

Join Date: Dec 2009
Location: Finland
Posts: 1,806

S!

What I meant by comparing the 980X and 1090T BE is that those are the "top of the line" each manufacturer has. Now AMD has launched the 1100T BE, which is a bit more expensive than the 1090T. My comparison just showed that I can build a top-of-the-line AMD rig capable of running ANY game well with less money invested than if I chose Intel/nVidia. If I could pour out money just like that, then sure, I would run an Intel rig, but my hard-earned money is needed to run a family and real life too. So AMD was the logical choice for me and has not disappointed me in any game I play.

As stated above, we seem to cling to IL-2 and its rather old OpenGL engine to judge the capabilities of a GPU. Sure, in the Black Death track there is a difference in FPS, but when actually playing you cannot tell the difference at all. And I have used both brands with IL-2, online and offline. I do not fully trust benchmarks; I play the games I have and see for myself. With a little tweaking I have gotten them to run as I want, again on both brands.

I hope and wish SoW will NOT be optimized for just one brand, just to get a logo spinning or appearing on startup. DirectX is the same for both brands; they just need to get their drivers right. Forcing a player to change hardware because of some code writing is stupid and short-sighted from any developer. I am pretty sure Oleg & Team have not fallen into this pit.
#66
Old 12-19-2010, 07:43 AM
klem
Approved Member

Join Date: Nov 2007
Posts: 1,653

Quote:
Originally Posted by T}{OR View Post
..............
IIRC, long ago when news about SoW started, Oleg confirmed only dual-core support. If they implemented support for more than two cores it would be a pleasant surprise. The fact that we will have an x64 .exe is more important than 6+ core support. .................
I hope he has by now! I don't know what's involved in making it run on more than two cores, but "flight sims tend to be CPU- rather than GPU-limited" (a quote I picked up somewhere), so that has to be an important focus for him in a cutting-edge sim. Anyone upgrading for SoW now is going to have more than two cores and will be pretty £!$$*^ off if they can't get the best out of their investment. I know I will be.

Can you imagine Oleg saying (eventually), "best system requirements: AMD 3800+ dual core CPU (recommend overclocking)"?!

btw the 64-bit exe won't be available for some time and won't be as important as multi-core support (another quote I picked up somewhere, please don't shoot the messenger).
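On what "support for more than two cores" means in practice: a sim only scales when its per-frame work splits into independent tasks. A hypothetical sketch follows; the state tuples and `update_aircraft` function are invented for illustration and say nothing about how SoW is actually coded:

```python
# Illustrative only: farming independent per-aircraft updates out to a
# pool of worker processes. Not how IL-2/SoW is actually implemented.
from concurrent.futures import ProcessPoolExecutor

def update_aircraft(state, dt):
    # stand-in for an expensive flight-model step: (position, velocity)
    x, v = state
    return (x + v * dt, v)

def step_world(states, dt, pool):
    # each aircraft's update is independent of the others,
    # so the work spreads cleanly across however many cores exist
    return list(pool.map(update_aircraft, states, [dt] * len(states)))

if __name__ == "__main__":
    states = [(0.0, 1.0), (10.0, -2.0)]
    with ProcessPoolExecutor() as pool:
        states = step_world(states, 0.1, pool)
```

The hard part is what the sketch hides: collision checks, damage, and rendering couple the aircraft together, and those dependencies are what makes retrofitting multi-core support into an engine so much work.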
#67
Old 12-19-2010, 08:05 AM
Hecke
Guest

Posts: n/a

Quote:
Originally Posted by klem View Post

btw the 64-bit exe won't be available for some time and won't be as important as multi-core support (another quote I picked up somewhere, please don't shoot the messenger).
What the ...? Didn't Oleg see what happened with only 2 GB of RAM at Igromir?
#68
Old 12-19-2010, 08:17 AM
swiss
Approved Member

Join Date: Mar 2010
Location: Zürich, Swiss Confederation
Posts: 2,266

Quote:
Originally Posted by Hecke View Post
What the ...? Didn't Oleg see what happened with only 2 GB of RAM at Igromir?
1st: klem is not Oleg or part of the dev team.
2nd: Oleg programmed it; he probably knows better than you what he is doing.

#69
Old 12-19-2010, 08:26 AM
LoBiSoMeM
Approved Member

Join Date: May 2010
Posts: 963

People are thinking about ATI/NVIDIA support, and I can only think about full multi-core support and x64 memory addressing.

I dream of the devs working hard on these two points. It's time to go in this direction, and people with less than 4 GB of RAM and fewer than four cores should start thinking about an upgrade...

In sims it will be a must: much more performance for less money, with engines fully optimized for x64 and four or more cores.
#70
Old 12-19-2010, 09:05 AM
klem
Approved Member

Join Date: Nov 2007
Posts: 1,653

Quote:
Originally Posted by swiss View Post
1st: klem is not Oleg or part of the dev team.
2nd: Oleg programmed it; he probably knows better than you what he is doing.

Quite right.
But Oleg did say in a post during the last couple of months that the 64-bit exe would not be available for a while, as they have more pressing things on their minds. That's all I know.

Then again, it's my understanding that 32-bit Windows supports 3.?? GB of RAM and can assign a full 2 GB to a 32-bit application. That's much better than trying to share a bare 2 GB across the system AND the application, as the demo PCs apparently had to.
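The arithmetic behind those limits: a 32-bit pointer can address 4 GiB of virtual memory in total, and 32-bit Windows splits each process's 4 GiB into a 2 GiB user share and a 2 GiB kernel share by default (the /3GB boot switch plus a LARGEADDRESSAWARE executable can raise the user share to 3 GiB). A quick sketch:

```python
# Back-of-envelope figures for 32-bit address-space limits.
GIB = 2 ** 30

total_virtual = 2 ** 32            # every address a 32-bit pointer can form
default_user = total_virtual // 2  # default per-process user share on Win32
large_address = 3 * GIB            # with /3GB + LARGEADDRESSAWARE

print(total_virtual // GIB)  # 4 (GiB of virtual address space)
print(default_user // GIB)   # 2 (GiB usable by the application by default)
```

Note this is virtual address space per process, separate from the ~3.x GiB of physical RAM that 32-bit Windows can see in total, which is the other limit the demo PCs would have hit.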


Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
Copyright © 2007 Fulqrum Publishing. All rights reserved.