IL-2 Sturmovik: Cliffs of Dover - the latest instalment in the acclaimed IL-2 Sturmovik series from award-winning developer Maddox Games.

  #1  
Old 07-20-2012, 07:48 AM
Icebear Icebear is offline
Approved Member
 
Join Date: Aug 2011
Location: Antarctica
Posts: 156
Default

Quote:
Originally Posted by SKUD View Post
Although this has about as much chance of grabbing attention and causing real action as a late-night infomercial in a nursing home, here goes...

FIX SLI!

Dumbing down the graphics to increase performance has done nothing to help the experience of a lot of players with high-end equipment. Even more folks have multiple mid-range cards that would benefit greatly from functional SLI capability. IMHO the fact that this extremely graphics-intensive application cannot use SLI or Crossfire is its biggest fault now that the CTDs are fixed. It's like having a Corvette Z06 with P185 tires or a 3/4 lb burger on a saltine cracker.

If this app had SLI the devs could turn back on the "REAL" original textures and landscape environment that were last seen two patches ago. Our machines would be able to handle tree rendering out as far as we could see. Micro-stuttering would likely disappear.

Then they could quit messing with the graphics engine and focus on the gameplay features. The graphics card industry will not be able to patch up this huge deficiency any time soon. I know this because my GTX 580 outperforms my 590, even though Nvidia lists the 590 as a 50% improvement.

So, my question is, why is there no focus on fixing SLI? Is it just too hard? Does it require enlisting help from the card builders?
  #2  
Old 07-20-2012, 08:54 AM
Damixu Damixu is offline
Approved Member
 
Join Date: Jul 2010
Location: Finland
Posts: 128
Default

Hey, don't be stupid and buy SLI/Crossfire GPU combos. Every hardcore, and I mean hardcore, game/sim only gets worse with those. Those run-of-the-mill sim look-alikes don't count...

My golden rule is to always buy the fastest single-GPU card available, and it works out of the box right away...
  #3  
Old 07-20-2012, 09:03 AM
pstyle pstyle is offline
Approved Member
 
Join Date: Mar 2011
Posts: 328
Default

Quote:
Originally Posted by Damixu View Post
Hey, don't be stupid and buy SLI/Crossfire GPU combos. Every hardcore, and I mean hardcore, game/sim only gets worse with those. Those run-of-the-mill sim look-alikes don't count...

My golden rule is to always buy the fastest single-GPU card available, and it works out of the box right away...
oooof? I like the use of this vague term "hardcore"... a good way of ensuring you can weasel out of any examples to the contrary by claiming they just ain't hardcore enough.
I bought 2x GTX 570s in SLI. They worked out of the box right away. On games that actually support SLI (Skyrim for one) it's bloody great.
  #4  
Old 07-20-2012, 09:17 AM
Damixu Damixu is offline
Approved Member
 
Join Date: Jul 2010
Location: Finland
Posts: 128
Default

Quote:
Originally Posted by pstyle View Post
oooof? I like the use of this vague term "hardcore"... a good way of ensuring you can weasel out of any examples to the contrary by claiming they just ain't hardcore enough.
I bought 2x GTX 570s in SLI. They worked out of the box right away. On games that actually support SLI (Skyrim for one) it's bloody great.

Go ahead and fly decisive aerial battles over the English Channel in 1940 with Skyrim.

I know most major gaming titles support SLI/Crossfire, but strangely several hardcore simulation games don't. I suppose it's because sims are quite a niche market and the developer companies are very small or limited, so they can't afford to program support for those dual/triple-GPU cards (or they don't use a ready-made game engine that would support them out of the box).

Edit: By the way, I love Skyrim

Last edited by Damixu; 07-20-2012 at 09:19 AM.
  #5  
Old 07-20-2012, 09:40 AM
Stublerone Stublerone is offline
Approved Member
 
Join Date: Sep 2011
Posts: 250
Default

SLI is just a waste, sorry! It seems you really think that SLI would improve CloD performance by far? I say: you only get a big performance increase from SLI in CloD if you already have high-end cards and run a multiple-display setup. SLI with mid-range cards would only help if every card had 3 GB of RAM. Otherwise the tree rendering and other things wouldn't improve that much.

Be aware that as soon as your card runs out of memory, CloD performance will suffer.

I do not know why some users really think that SLI is good!?! The technique as currently used is insufficient until they rethink this whole SLI thing and give us a new SLI.

It's simply dumb to talk about that technology in its current state. Never think of SLI as a cheap upgrade to get more power. It is just for a small part of the community who spend a lot of money to max everything out as far as possible.

Skyrim is for sure a good game, but it's not comparable. Skyrim for me is casual gaming, and its engine is optimized to work on consoles as well, so the load and the memory usage have to be low. The PS3 has almost no memory. So those games never reach the memory usage of a PC sim. That is why you can really increase performance in Skyrim but not in CloD (except with the hardware explained above, which is not mid-range in most cases).
  #6  
Old 07-20-2012, 09:27 AM
flyingblind flyingblind is offline
Approved Member
 
Join Date: Oct 2009
Posts: 255
Default

The OP does have a point. I have a mid-range card with 1 GB of memory and wish, at the least, that I had got the 2 GB version. The best option for me would be to buy a second matching card, if only I could be sure that it would work properly. Do I write off the first card and go for a better single GPU, or do I risk wasting more money, albeit a smaller amount, on an SLI setup?
  #7  
Old 07-20-2012, 10:21 AM
Stublerone Stublerone is offline
Approved Member
 
Join Date: Sep 2011
Posts: 250
Default

Quote:
Originally Posted by flyingblind View Post
The OP does have a point. I have a mid-range card with 1 GB of memory and wish, at the least, that I had got the 2 GB version. The best option for me would be to buy a second matching card, if only I could be sure that it would work properly. Do I write off the first card and go for a better single GPU, or do I risk wasting more money, albeit a smaller amount, on an SLI setup?
Take into consideration that 2 x 1 GB cards won't help. It is not 2 GB!!!!! It is still 1 GB in SLI, so your memory is way too low. Even a 2 GB card will have problems at maxed settings and 1080p resolution, as the game will consume more than 2 GB. My HD 7970 consumes up to 2.7 GB!!!! So even SLI with 2 x 2 GB cannot help that much. So 2 GB is the minimum memory you should have to run it with somewhat higher details, and still with stutters if you are unlucky.
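
As a rough back-of-the-envelope illustration of that arithmetic (a sketch with hypothetical card sizes, using only the ~2.7 GB peak reported above and the rule that SLI/Crossfire mirror memory rather than adding it):

Code:
# Illustrative sketch only: hypothetical card configurations checked against
# the ~2.7 GB peak usage reported above. With SLI/Crossfire each card mirrors
# the same data, so the usable budget is the smallest card's memory, not the sum.

GAME_PEAK_GB = 2.7  # peak CloD usage reported above at maxed settings, 1080p


def usable_vram_gb(cards_gb):
    """SLI/Crossfire mirror assets across cards: the budget is the smallest card."""
    return min(cards_gb)


for setup in ([1.0], [2.0], [3.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]):
    budget = usable_vram_gb(setup)
    verdict = "fits" if budget >= GAME_PEAK_GB else "will swap textures / stutter"
    print(f"{setup} GB cards -> usable {budget:.1f} GB: {verdict}")

On those numbers only the 3 GB configurations clear the reported peak, which matches the 3 GB condition given in post #5 above.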
  #8  
Old 07-20-2012, 10:36 AM
Blackdog_kt Blackdog_kt is offline
Approved Member
 
Join Date: Jan 2008
Posts: 2,715
Default

Quote:
Originally Posted by flyingblind View Post
The OP does have a point. I have a mid-range card with 1 GB of memory and wish, at the least, that I had got the 2 GB version. The best option for me would be to buy a second matching card, if only I could be sure that it would work properly. Do I write off the first card and go for a better single GPU, or do I risk wasting more money, albeit a smaller amount, on an SLI setup?
Sorry to be the bearer of bad news, but SLI doesn't increase your available video RAM.

I'm not exactly sure why, but I think that with split-frame rendering each card renders half of every frame, yet still has to store the data for the entire frame in its own memory.

When they render alternate frames (e.g. card 1 renders frame 250 and card 2 renders frame 251) the same holds, because each card has its RAM filled with an entire frame.

You do have double the video RAM but you also use double, so there is no gain in available memory.
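
A tiny sketch of that point (made-up sizes, not a real driver model): whether the cards split each frame or alternate whole frames, each one still holds a full copy of the scene assets plus full-size frame data, so a second card doubles the copies rather than the headroom.

Code:
# Illustrative only, with made-up sizes; not how the driver actually allocates.
# In both split-frame (SFR) and alternate-frame (AFR) rendering each card keeps
# its own full copy of the textures/geometry plus full-size frame data, so a
# second card doubles the copies, not the usable memory.

SCENE_ASSETS_GB = 1.2   # hypothetical: textures + geometry for one scene
FRAME_DATA_GB = 0.2     # hypothetical: full-resolution frame/render targets


def per_card_use_gb(mode):
    """Memory each card needs; the same for single-card, SFR and AFR setups."""
    assert mode in ("single", "SFR", "AFR")
    return SCENE_ASSETS_GB + FRAME_DATA_GB


def headroom_gb(mode, cards_gb):
    """Free memory on the smallest card; it does not grow with the card count."""
    return min(cards_gb) - per_card_use_gb(mode)


print(headroom_gb("single", [1.5]))    # one 1.5 GB card
print(headroom_gb("AFR", [1.5, 1.5]))  # GTX 590-style SLI: identical headroom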
  #9  
Old 07-20-2012, 12:17 PM
6BL Bird-Dog's Avatar
6BL Bird-Dog 6BL Bird-Dog is offline
Approved Member
 
Join Date: Feb 2008
Posts: 209
Default X-Fire / SLI

From experience I got various results with different card combos, single and X-Fire.
1x AMD Sapphire Toxic 5870 OC: poor frame rates and micro-stutters; had to use medium settings.
2x AMD Sapphire Toxic 5870 OC: best-ever performance when the early patch release enabled X-Fire; max settings, good high frame rates and only low micro-stutters. Ruined when a small patch was brought out within hours because of graphics anomalies on Nvidia systems.
1x XFX HD 7970 Black Edition: an improvement on a single 5870, but running at Very High in-game settings made little difference; frame rates improved slightly.
2x XFX HD 7970 Black Edition: with both the last alpha and beta patches, more or less the same as a single XFX HD 7970 Black Edition; performance slightly improved at the highest settings.
I tested the same cards and setups on the rig in my sig and compared performance with RoF and IL-2 1946, the latter of which does not have X-Fire support.
In Rise of Flight, as I improved my GPU hardware I was able to increase my quality settings along with an increase in FPS. The 2x AMD Sapphire Toxic 5870 OC performance was slightly slower than a single XFX HD 7970 Black Edition.
2x XFX HD 7970 Black Edition runs fluidly with all settings at max, with the exception of anti-aliasing, which I never run above 2x as it murders the FPS in RoF.
IL-2 1946 ran maxed out on all combinations, seamlessly on 2x XFX HD 7970 Black Edition, with the average FPS limited to 60 (my screen's native refresh rate), but max frames are almost always over 1000 FPS.
The CloD dev team hit the sweet spot for me once with X-Fire but unfortunately have never got close since.
In Cliffs of Dover's present state, apart from letting me raise the in-game video quality settings, X-Fire performance is not functioning anywhere near the early X-Fire patch release, which on my system was awesome compared to any patches before or after; the textures looked much better then too, if a little bit too bright.
__________________
ASUS Sabertooth 990FX R2.0 AMD FX-8350@4GHz Watercooled
2X8GB Crucial 1866 - 2x XFX HD 7970 Black Edition in X-Fire, Water Cooled 1900X1200 Native res
OCZ AGILITY 3 240GB O/S WINDOWS 7 Home Premium OCZ AGILITY 4 240GB WESTERN DIGITAL 500GB

Last edited by 6BL Bird-Dog; 07-20-2012 at 12:23 PM.
  #10  
Old 07-21-2012, 07:11 AM
SKUD SKUD is offline
Approved Member
 
Join Date: Jun 2010
Posts: 55
Default

Quote:
Originally Posted by Blackdog_kt View Post
Sorry to be the bearer of bad news, but SLI doesn't increase your available video RAM.

I'm not exactly sure why, but I think that with split-frame rendering each card renders half of every frame, yet still has to store the data for the entire frame in its own memory.

When they render alternate frames (e.g. card 1 renders frame 250 and card 2 renders frame 251) the same holds, because each card has its RAM filled with an entire frame.

You do have double the video RAM but you also use double, so there is no gain in available memory.
So let me make sure I have this straight. Nvidia wasted $$ putting an extra 1.5 GB of GDDR5 on my 590 in the full knowledge that it would never be used?? Now they've done it again with the 690, throwing away 2 GB of VRAM, because SLI can never use the VRAM from both cards. Those silly guys. Thanks for the tip.