Official Fulqrum Publishing forum (http://forum.fulqrumpublishing.com/index.php)
-   IL-2 Sturmovik: Cliffs of Dover (http://forum.fulqrumpublishing.com/forumdisplay.php?f=189)
-   -   Suggestion for the 1C Devs (http://forum.fulqrumpublishing.com/showthread.php?t=33358)

SKUD 07-20-2012 01:05 AM

Suggestion for the 1C Devs
 
Although this has about as much chance of grabbing attention and causing real action as a late-night infomercial in a nursing home, here goes...

FIX SLI!

Dumbing down the graphics to increase performance has done nothing to help the experience of a lot of players with high-end equipment. Even more folks have multiple mid-range cards that would benefit greatly from functional SLI capability. IMHO, the fact that this extremely graphics-intensive application cannot use SLI or Crossfire is its biggest fault now that the CTDs are fixed. It's like having a Corvette Z06 with P185 tires, or a 3/4 lb burger on a saltine cracker.

If this app had SLI the devs could turn back on the "REAL" original textures and landscape environment that were last seen two patches ago. Our machines would be able to handle tree rendering out as far as we could see. Micro-stuttering would likely disappear.

Then they could quit messing with the graphics engine and focus on the gameplay features. The graphics card industry will not be able to patch up this huge deficiency any time soon. I know this because my GTX 580 outperforms my 590, even though NVidia lists the 590 as a 50% improvement.

So, my question is, why is there no focus on fixing SLI? Is it just too hard? Does it require enlisting help from the card builders?

Robert 07-20-2012 03:59 AM

Except that the vast majority don't play with SLI. Personally I think it's a waste and doesn't seem very cost-effective. Even if I'm wrong (and I contend I may be) and SLI is the bee's knees, how many are going to invest in it? We have continents of CoD players using Windows XP and DX9. Having the graphics engine working on SLI (sheer brute force) will do nothing for mid- to high-level systems. How will it help with the coding that's obviously a large issue?

Maybe I just don't see how functioning SLI would benefit single-card players.

Fjordmonkey 07-20-2012 05:15 AM

Why focus on something only a small minority actually has (SLI and Crossfire still aren't all that common, since they're usually more hassle than they're worth and can drive up the cost of a build rather hideously) when the majority of your customers have issues running the game on a single-card setup?

Yes, indeed...

SKUD 07-20-2012 05:30 AM

Quote:

Originally Posted by Robert (Post 446647)
Except that the vast majority don't play with SLI. Personally I think it's a waste and doesn't seem very cost-effective. Even if I'm wrong (and I contend I may be) and SLI is the bee's knees, how many are going to invest in it? We have continents of CoD players using Windows XP and DX9. Having the graphics engine working on SLI (sheer brute force) will do nothing for mid- to high-level systems. How will it help with the coding that's obviously a large issue?

Maybe I just don't see how functioning SLI would benefit single-card players.

It certainly is a waste in CoD's current form. If SLI were working you could get good performance out of a couple of relatively cheap cards rather than having to shell out for a bleeding-edge single GPU. I think most folks would rather fork over $200 for a second GTX 560 than $600 for a new GTX 680, if it would do them some good. Right now the only hope of running at max settings is to mortgage the farm.
Anyway, it appears that brute force is just what it takes for this sim. If that weren't true, my 590 would be killing my 580.

skouras 07-20-2012 06:13 AM

Quote:

Originally Posted by Robert (Post 446647)
Except that the vast majority don't play with SLI. Personally I think it's a waste and doesn't seem very cost-effective. Even if I'm wrong (and I contend I may be) and SLI is the bee's knees, how many are going to invest in it? We have continents of CoD players using Windows XP and DX9. Having the graphics engine working on SLI (sheer brute force) will do nothing for mid- to high-level systems. How will it help with the coding that's obviously a large issue?

Maybe I just don't see how functioning SLI would benefit single-card players.

Totally agreed.

furbs 07-20-2012 06:18 AM

SLI is fixed, we're just waiting for a driver release.

Well, that's what Luthier said around May last year.

David198502 07-20-2012 06:32 AM

Quote:

Originally Posted by SKUD (Post 446629)
Although this has about as much chance of grabbing attention and causing real action as a late-night infomercial in a nursing home, here goes...

FIX SLI!

Dumbing down the graphics to increase performance has done nothing to help the experience of a lot of players with high-end equipment. Even more folks have multiple mid-range cards that would benefit greatly from functional SLI capability. IMHO, the fact that this extremely graphics-intensive application cannot use SLI or Crossfire is its biggest fault now that the CTDs are fixed. It's like having a Corvette Z06 with P185 tires, or a 3/4 lb burger on a saltine cracker.

If this app had SLI the devs could turn back on the "REAL" original textures and landscape environment that were last seen two patches ago. Our machines would be able to handle tree rendering out as far as we could see. Micro-stuttering would likely disappear.

Then they could quit messing with the graphics engine and focus on the gameplay features. The graphics card industry will not be able to patch up this huge deficiency any time soon. I know this because my GTX 580 outperforms my 590, even though NVidia lists the 590 as a 50% improvement.

So, my question is, why is there no focus on fixing SLI? Is it just too hard? Does it require enlisting help from the card builders?

Who says the CTDs are fixed?
There are still many people who suffer from them...

Icebear 07-20-2012 07:48 AM

Quote:

Originally Posted by SKUD (Post 446629)
Although this has about as much chance of grabbing attention and causing real action as a late-night infomercial in a nursing home, here goes...

FIX SLI!

Dumbing down the graphics to increase performance has done nothing to help the experience of a lot of players with high-end equipment. Even more folks have multiple mid-range cards that would benefit greatly from functional SLI capability. IMHO, the fact that this extremely graphics-intensive application cannot use SLI or Crossfire is its biggest fault now that the CTDs are fixed. It's like having a Corvette Z06 with P185 tires, or a 3/4 lb burger on a saltine cracker.

If this app had SLI the devs could turn back on the "REAL" original textures and landscape environment that were last seen two patches ago. Our machines would be able to handle tree rendering out as far as we could see. Micro-stuttering would likely disappear.

Then they could quit messing with the graphics engine and focus on the gameplay features. The graphics card industry will not be able to patch up this huge deficiency any time soon. I know this because my GTX 580 outperforms my 590, even though NVidia lists the 590 as a 50% improvement.

So, my question is, why is there no focus on fixing SLI? Is it just too hard? Does it require enlisting help from the card builders?


Damixu 07-20-2012 08:54 AM

Hey, don't be stupid and buy SLI/Crossfire GPU combos. Every hardcore, and I mean hardcore, game/sim only gets worse with those. Those run-of-the-mill sim look-alikes don't count...

My golden rule is to always buy the fastest single-GPU card available; it works out of the box right away...

pstyle 07-20-2012 09:03 AM

Quote:

Originally Posted by Damixu (Post 446691)
Hey, don't be stupid and buy SLI/Crossfire GPU combos. Every hardcore, and I mean hardcore, game/sim only gets worse with those. Those run-of-the-mill sim look-alikes don't count...

My golden rule is to always buy the fastest single-GPU card available; it works out of the box right away...

Oooof? I like the use of this vague term "hardcore"... a good way of ensuring you can weasel out of any examples to the contrary by claiming they just aren't hardcore enough.
I bought 2 x GTX 570s in SLI mode. Worked out of the box right away. On games that actually support SLI (Skyrim, for one) it's bloody great.

Damixu 07-20-2012 09:17 AM

Quote:

Originally Posted by pstyle (Post 446693)
Oooof? I like the use of this vague term "hardcore"... a good way of ensuring you can weasel out of any examples to the contrary by claiming they just aren't hardcore enough.
I bought 2 x GTX 570s in SLI mode. Worked out of the box right away. On games that actually support SLI (Skyrim, for one) it's bloody great.


Go ahead and fly decisive aerial battles over the English Channel in 1940 with Skyrim. :)

I know most major gaming titles support SLI/Crossfire, but strangely several hardcore simulation games don't. I suppose it's because sims are quite a niche market and the developer companies are very small, so they can't afford to program support for those dual/triple-GPU cards (or don't use a ready-made game engine that supports it out of the box).

edit: By the way, I love Skyrim :)

flyingblind 07-20-2012 09:27 AM

The OP does have a point. I have a mid-range card with 1GB memory and wish I had at least got the 2GB version. The best option for me would be to buy a second matching card, if only I could be sure that would work properly. Do I waste the first card and go for a better single GPU, or do I risk wasting more, albeit a smaller amount of money, on an SLI setup?

Stublerone 07-20-2012 09:40 AM

SLI is just a waste, sorry! It seems you really think SLI will improve CloD performance by far? I say you only get a big performance increase with SLI in CloD if you already have high-end cards and run multiple-display setups. SLI with mid-range cards would only help if each card has 3 GB of RAM; otherwise, tree rendering and other things wouldn't improve that much.

Be aware that as soon as your card runs out of memory, CloD will run poorly.

I do not know why some users really think SLI is good. The current technique is insufficient until they rethink this whole SLI thing and give us a new SLI.

It's simply pointless to talk about that technology in its current state. Never think of SLI as a cheap upgrade to get more power. It is just for a small part of the community who spend a lot of money to max everything out to the highest possible level.

Skyrim is for sure a good game, but not comparable. Skyrim for me is casual gaming, and the engine is optimized to work on consoles as well, so the load and the memory usage have to be low. The PS3 has almost no memory, so these games never reach the memory usage of a PC sim. That is why you can really increase performance in Skyrim but not in CloD (except with the hardware explained above, which is not mid-range in most cases).

Stublerone 07-20-2012 10:21 AM

Quote:

Originally Posted by flyingblind (Post 446700)
The OP does have a point. I have a mid-range card with 1GB memory and wish I had at least got the 2GB version. The best option for me would be to buy a second matching card, if only I could be sure that would work properly. Do I waste the first card and go for a better single GPU, or do I risk wasting more, albeit a smaller amount of money, on an SLI setup?

Take into consideration that 2 x 1 GB cards won't help. It is not 2 GB! It is still 1 GB in SLI, so your memory is way too low. Even a 2 GB card will have problems at maxed settings and 1080p resolution, as the game will consume more than 2 GB. My HD 7970 is consuming up to 2.7 GB! So even SLI with 2 x 2 GB cannot help that much. 2 GB is the minimum memory you should have to run it with some higher details, and still with stutters if you are unlucky. :)

Blackdog_kt 07-20-2012 10:36 AM

Quote:

Originally Posted by flyingblind (Post 446700)
The OP does have a point. I have a mid-range card with 1GB memory and wish I had at least got the 2GB version. The best option for me would be to buy a second matching card, if only I could be sure that would work properly. Do I waste the first card and go for a better single GPU, or do I risk wasting more, albeit a smaller amount of money, on an SLI setup?

Sorry to be the bearer of bad news, but SLI doesn't increase your available video RAM.

I'm not exactly sure why, but I think that with split-frame rendering each card renders half of each frame, yet still needs to store the entire frame in its own memory.

When they render alternate frames (e.g. card 1 renders frame 250 and card 2 renders frame 251) the same holds, because each card has its RAM filled with an entire frame.

You do have double the video RAM but you also use double, so there is no gain in available memory.
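
To put the same arithmetic in code, here is a minimal Python sketch (my own illustration, resting on the assumption above that each GPU mirrors the whole working set; the function name is made up):

Code:

# Why SLI/CrossFire doesn't add usable VRAM, assuming each GPU must
# mirror the full working set whether it renders alternate frames
# (AFR) or half of each frame (SFR).
def vram_gb(per_gpu_vram_gb, num_gpus):
    advertised = per_gpu_vram_gb * num_gpus  # what the box says
    usable = per_gpu_vram_gb                 # what the game can actually fill
    return advertised, usable

print(vram_gb(1.0, 2))  # (2.0, 1.0): two 1 GB cards -> still 1 GB usable
print(vram_gb(1.5, 2))  # (3.0, 1.5): a "3 GB" GTX 590 -> 1.5 GB usable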

6BL Bird-Dog 07-20-2012 12:17 PM

x-fire sli
 
From experience I got various results with different card combos, single and X-Fire.
1x AMD Sapphire Toxic 5870 OC: poor frame rates and micro-stutters; had to use medium settings.
2x AMD Sapphire Toxic 5870 OC: best-ever performance when the early patch release enabled X-Fire; max settings, good high frame rates, and only low micro-stutter. Ruined within hours when a small patch was brought out because of graphics anomalies on Nvidia systems.
1x XFX HD 7970 Black Edition: an improvement on a single 5870, but running at Very High in-game settings made little difference; frame rates improved slightly.
2x XFX HD 7970 Black Edition: with both the last alpha and beta patches, more or less the same as a single XFX HD 7970 Black Edition; performance slightly improved at the highest settings.
I tested the same cards and setups on the rig in my sig and compared performance with RoF and IL-2 1946, the latter of which does not have X-Fire support.
In Rise of Flight, as I improved my GPU hardware I was able to increase my quality settings along with an increase in FPS. The 2x AMD Sapphire Toxic 5870 OC performance was slightly slower than a single XFX HD 7970 Black Edition.
2x XFX HD 7970 Black Edition runs fluidly with all settings at max, with the exception of anti-aliasing, which I never run above 2x as it murders the FPS in RoF.
IL-2 1946 ran maxed out on all combinations, seamlessly on 2x XFX HD 7970 Black Edition, with the average FPS limited to 60 (my native refresh rate), but max frames are almost always over 1000 FPS.
The CloD dev team hit the sweet spot once for me on X-Fire but unfortunately have never got close since.
In Cliffs of Dover's present state, apart from increased in-game video quality settings, performance-wise X-Fire is not functioning anywhere near the early X-Fire patch release, which on my system was awesome compared to any patch before or after; the textures looked much better then too, if a little bit too bright.

Wolf_Rider 07-20-2012 02:30 PM

Quote:

Originally Posted by flyingblind (Post 446700)
The OP does have a point. I have a mid-range card with 1GB memory and wish I had at least got the 2GB version. The best option for me would be to buy a second matching card, if only I could be sure that would work properly. Do I waste the first card and go for a better single GPU, or do I risk wasting more, albeit a smaller amount of money, on an SLI setup?


Two 1 GB cards in SLI would still only give you 1 GB to work with, and SLI inherently has small stutters.

Ataros 07-20-2012 03:40 PM

Quote:

Originally Posted by Damixu (Post 446691)
Hey, don't be stupid and buy SLI/Crossfire GPU combos. Every hardcore, I mean Hardcore game/sim only get worse with those. Those run of the mill sim look-a-likes doesn't count...

My golden rule is to buy always fastest single core GPU available and it works out of the box right away....

+1
All demanding games like ArmA, ArmA 2, and RoF had/have issues with SLI/X-Fire. It works in one patch and then does not work in the next one. IIRC Jason wrote they could hardly get the needed docs from NV, and none at all from ATI. NV/ATI managers do not talk to companies as small as sim developers. Any patch or driver update can cause stutters, especially when many single-GPU users have stutters already (usually due to big textures on higher settings, or the use of vsync, antivirus, etc.).

Icebear 07-20-2012 08:42 PM

Quote:

Originally Posted by Ataros (Post 446925)
+1
All demanding games like ArmA, ArmA 2, and RoF had/have issues with SLI/X-Fire. It works in one patch and then does not work in the next one. IIRC Jason wrote they could hardly get the needed docs from NV, and none at all from ATI. NV/ATI managers do not talk to companies as small as sim developers. Any patch or driver update can cause stutters, especially when many single-GPU users have stutters already (usually due to big textures on higher settings, or the use of vsync, antivirus, etc.).

And why are those games demanding? There is no real multicore support, and the graphics engines are totally antiquated. E.g. Armed Assault still uses the Operation Flashpoint engine. Even though BI keeps improving it, the engine is more than 10 years old! They have sold it so many times, and I'm sure they will sell it again with ArmA 3. So no surprise for me at all.

But this does not explain the problems we face in CloD, as the engine is brand-new and up to date...

SKUD 07-21-2012 07:11 AM

Quote:

Originally Posted by Blackdog_kt (Post 446736)
Sorry to be the bearer of bad news, but SLI doesn't increase your available video RAM.

I'm not exactly sure why, but I think that with split-frame rendering each card renders half of each frame, yet still needs to store the entire frame in its own memory.

When they render alternate frames (e.g. card 1 renders frame 250 and card 2 renders frame 251) the same holds, because each card has its RAM filled with an entire frame.

You do have double the video RAM but you also use double, so there is no gain in available memory.

So let me make sure I have this straight. Nvidia wasted $$ putting an extra 1.5 GB of GDDR5 on my 590 in full knowledge that it would never be used? And now they have done it again with the 690, throwing away 2 GB of VRAM, because SLI can never use the VRAM from both cards. Those silly guys. Thanks for the tip.

Blackdog_kt 07-21-2012 09:38 AM

Quote:

Originally Posted by SKUD (Post 447136)
So let me make sure I have this straight. Nvidia wasted $$ putting an extra 1.5 GB of GDDR5 on my 590 in full knowledge that it would never be used? And now they have done it again with the 690, throwing away 2 GB of VRAM, because SLI can never use the VRAM from both cards. Those silly guys. Thanks for the tip.

Well, Google is your friend.

EVGA forums, scroll down to the 9th reply by user HeavyHemi: http://www.evga.com/forums/tm.aspx?m=1421266&mpage=1
Also note that the prevailing advice in that thread, if you want to really crank up the resolution while keeping the detail settings high, is to get a single card with the most RAM you can afford. So it's not only modern flight sims that work this way (RoF also had a lot of problems with SLI early on, but I can't comment on its current state because I don't have it on my PC); it seems to be a more widespread trend in other games too.

Tom's Hardware SLI and Crossfire FAQs:
http://www.tomshardware.co.uk/forum/...crossfire-faqs

Incidentally, in the above link you can also find this little gem:
Quote:

Do SLI or CrossFire always improve performance?
Not always.
There are some games that don't benefit from multi-GPU technology (or require a patch in order to utilize it).
For example, Flight Simulator X doesn't benefit from either SLI or CrossFire.
Another example is StarCraft 2, which barely benefits from more than one card.

So FSX, a flight sim that was very demanding on graphics and CPU until hardware could catch up with it, doesn't benefit from it. Sounds very familiar. Also note that StarCraft 2 is a blockbuster AAA title.

Both of them are made by companies that could throw tons of cash at the issue. FSX is getting old and Microsoft is more concerned with selling DLC for its new MS Flight, but this wasn't always the case. Yet they didn't fix it.
Also, SC2 is at its peak and it's only part one of a trilogy, with a highly competitive multiplayer scene (think professional gamers who get paid like football players to take part in tournaments, etc.), and the company behind it (Blizzard) has the enormous World of Warcraft MMO cash cow at its disposal and raving mad fans who buy everything they release (e.g. the recent Diablo III).

If these guys can't do it, or won't spend the time and money to, then the only reason I can think of is that SLI/X-Fire setups are a bit too particular about how you code your game in order to work correctly. It seems like the game has to be written around it, and since it's a somewhat rigid and not-so-evolving technology (the cards evolve, but the technologies that pair them not so much), maybe it's not worth the compromises in other parts of the engine?

I'm just thinking out loud here, but the whole thing seems to completely debunk the entire "two cards = double the performance" logic. I've been ordering my PC components separately since forever, and the only people I routinely see going for SLI setups are those that primarily focus on action/shooter games (simpler engines, small maps, elementary game mechanics, so all the PC really has to do is run good graphics at a high frame rate).

The bottom line is, just because we might have some extra money to burn on a PC build doesn't mean we should go for the most expensive options. They might be kind of specialised in what they work well with.

He111 07-21-2012 12:50 PM

DRAT! You mean I bought an extra GTX 5980 for nothing!? :(

But wait, I'm future-proof for when game developers advance and develop for 2+ GPUs... but WAIT!

Hopefully Rome 2 will take advantage of multiple cores and GPUs?

SKUD 07-21-2012 06:44 PM

Quote:

Originally Posted by Blackdog_kt (Post 447154)
Well, Google is your friend.

EVGA forums, scroll down to the 9th reply by user HeavyHemi: http://www.evga.com/forums/tm.aspx?m=1421266&mpage=1
Also note that the prevailing advice in that thread, if you want to really crank up the resolution while keeping the detail settings high, is to get a single card with the most RAM you can afford. So it's not only modern flight sims that work this way (RoF also had a lot of problems with SLI early on, but I can't comment on its current state because I don't have it on my PC); it seems to be a more widespread trend in other games too.

Tom's Hardware SLI and Crossfire FAQs:
http://www.tomshardware.co.uk/forum/...crossfire-faqs

Incidentally, in the above link you can also find this little gem:

So FSX, a flight sim that was very demanding on graphics and CPU until hardware could catch up with it, doesn't benefit from it. Sounds very familiar. Also note that StarCraft 2 is a blockbuster AAA title.

Both of them are made by companies that could throw tons of cash at the issue. FSX is getting old and Microsoft is more concerned with selling DLC for its new MS Flight, but this wasn't always the case. Yet they didn't fix it.
Also, SC2 is at its peak and it's only part one of a trilogy, with a highly competitive multiplayer scene (think professional gamers who get paid like football players to take part in tournaments, etc.), and the company behind it (Blizzard) has the enormous World of Warcraft MMO cash cow at its disposal and raving mad fans who buy everything they release (e.g. the recent Diablo III).

If these guys can't do it, or won't spend the time and money to, then the only reason I can think of is that SLI/X-Fire setups are a bit too particular about how you code your game in order to work correctly. It seems like the game has to be written around it, and since it's a somewhat rigid and not-so-evolving technology (the cards evolve, but the technologies that pair them not so much), maybe it's not worth the compromises in other parts of the engine?

I'm just thinking out loud here, but the whole thing seems to completely debunk the entire "two cards = double the performance" logic. I've been ordering my PC components separately since forever, and the only people I routinely see going for SLI setups are those that primarily focus on action/shooter games (simpler engines, small maps, elementary game mechanics, so all the PC really has to do is run good graphics at a high frame rate).

The bottom line is, just because we might have some extra money to burn on a PC build doesn't mean we should go for the most expensive options. They might be kind of specialised in what they work well with.

Clearly my 590 is not using all of the 3 GB of VRAM it has in CoD, because I can directly compare it to my 3 GB 580. So my question again is, why would Nvidia build a card that can't use half of its VRAM under any circumstances? If this is the case, then a 590/690 is nothing more than a 580/680 with a bonus space heater attached.

Never mind... my friend Google found this gem:

"Originally Posted by CousinVin
I think I understand that putting in two 3 GB cards still only limits you to 3 GB of usable VRAM... right? If that is wrong please correct me.

You are correct.

Quote:
Now my confusion comes in with the GTX 590. It is labeled as a 3 GB card, but from the assumption above, and considering that it is 1.5 GB per core, is it really only 1.5 GB of usable VRAM?

It's marketing. Joe Average can't tell the difference between total memory and dedicated memory."

So anyone looking at the 690, beware.

Royraiden 07-21-2012 11:06 PM

It's amazing that a feature almost every current game has, and a feature that was promised to be introduced/fixed more than a year ago, is still not present in a sim that has so many performance problems. I'm amazed that some guys here seem to encourage the devs to forget about a feature that could be really helpful if implemented correctly. When I bought my second video card it made a world of difference, especially in RoF; other more common titles ran a lot better, to say the least. I never experienced micro-stuttering or any problem related to the use of Crossfire. GPU prices go down really quickly, so buying a second card for a Crossfire/SLI setup is a lot cheaper than buying a single monster card. People in this community need to see things from a wider angle: having this feature could only make positive changes, while not having it, as you can see, is a negative thing in my opinion. By the way, I'm only running one card now, so don't assume I wrote this just because I had a dual-GPU setup. I would be happy if multi-GPU support got improved even if I can't benefit from it right now.

Blackdog_kt 07-22-2012 03:47 AM

Quote:

Originally Posted by SKUD (Post 447260)
Clearly my 590 is not using all of the 3 GB of VRAM it has in CoD, because I can directly compare it to my 3 GB 580. So my question again is, why would Nvidia build a card that can't use half of its VRAM under any circumstances? If this is the case, then a 590/690 is nothing more than a 580/680 with a bonus space heater attached.

Never mind... my friend Google found this gem:

"Originally Posted by CousinVin
I think I understand that putting in two 3 GB cards still only limits you to 3 GB of usable VRAM... right? If that is wrong please correct me.

You are correct.

Quote:
Now my confusion comes in with the GTX 590. It is labeled as a 3 GB card, but from the assumption above, and considering that it is 1.5 GB per core, is it really only 1.5 GB of usable VRAM?

It's marketing. Joe Average can't tell the difference between total memory and dedicated memory."

So anyone looking at the 690, beware.

Yup, that's how it works. Each GPU needs its own RAM, regardless of whether you have two cards with one GPU each or a single card with two GPUs on it (like the 690, I think). Also, each card's RAM has to store the entire frame before it's displayed, regardless of the SLI method used (either card 1 renders frame 1 and card 2 renders frame 2, or each card renders half of each frame).

So, when you see a dual-GPU card specifying 3 GB of RAM on the box, what it means is 3 GB divided equally between the two GPUs and mirrored for each frame -> 1.5 GB of effectively usable video RAM.

If your preferred games are heavy on textures and have long viewing distances (more textures need to be loaded per frame), it's better to go for a single 3 GB card or two separate cards with 3 GB each. View distance is probably the main reason that players of action and shooter games get great performance with SLI: their view distance is nothing compared to a flight sim, so this RAM issue is not so perceptible.

In the above 3 GB example, to load the textures that the single card or the two-card SLI setup can, a dual-GPU card like the 690 would have to have a total of 6 GB of RAM (3 GB for each GPU).

I hope I didn't make any typos to make this confusing (it's a bit late at the moment and I'm sleepy) and that it sufficiently explains the limitations of the architecture in terms of RAM usage. Cheers ;)
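
If it helps, here is the same 3 GB example as a quick Python sketch (the helper name is hypothetical; it assumes the full mirroring behaviour described above):

Code:

# Total board memory a mirrored multi-GPU setup needs so that each
# GPU can hold a given texture working set (assumption: full mirroring).
def total_vram_needed_gb(working_set_gb, num_gpus):
    return working_set_gb * num_gpus

# A dual-GPU card would need 6 GB in total to match one 3 GB card.
print(total_vram_needed_gb(3.0, 2))  # 6.0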

Warhound 07-22-2012 04:47 AM

I understand that people with SLI/Crossfire want their setups to be utilized fully.
But from the devs' standpoint, I'd say it still makes more sense to focus on general performance.
The graphics engine is obviously not 100% optimized yet, and the time invested in making SLI/Crossfire work is likely better spent finding performance gains that benefit both single-card users and multi-card ones.

First of all, the percentage of multi-card users is small; that's a fact.
And secondly, they could spend X weeks making both SLI and Crossfire work, only to have it broken again by one of the monthly driver updates.
So as a small team it makes more sense to focus on general optimization and performance, which benefits everyone, than to spend hours working on something which only a small percentage of users will benefit from and which might be broken the moment you get it ready for release.
Never mind breaking it yourself when you push through the general optimizations that are being worked on, forcing you to start over on SLI/X-Fire once again.

KG26_Alpha 07-22-2012 11:20 AM

One word covers GPU manufacturers and SLI & X-Fire:

Marketing

Jaws2002 07-22-2012 08:04 PM

When I got my GTX 590 there was no 3 GB GTX 580, and the 590 had a solid advantage over ALL other cards in a lot of games. While it only uses 1.5 GB of video RAM, it still runs a lot faster than a single card in games where SLI works.
I still think they would be better off working on multi-GPU support than spending months on the bloody DX9. But that's just me.
Anyway, I recently tried the SLI profile I found in one of the technical threads, and the performance boost I got was impressive. There are still some bad slowdowns around the clouds, but everything else went from 15-20 FPS to 57-60 FPS.
I think they would please more people by fixing multi-GPU support than by fixing the bloody DX9, which is used by people who most likely don't have a powerful enough system for this game anyway.

Stublerone 07-23-2012 10:54 AM

They have to fix DX9. It is not what they really want, but they have to conform to the specs on the package.

Concerning SLI: sure, the optimized profiles show that this helps with the SLI situation, but again, the technique is old, and to fully realize the original intention of SLI, all the components would have to be harmonized to a high standard. Communication between the PC components would need high-end bus systems in every case. Such a harmonized complete PC will reach the market as soon as the manufacturers of every part work closely together -> perhaps this will never happen!
Another possibility: a manufacturer builds its own all-in-one solution. First steps can be seen in the GPUs integrated into Intel CPUs.

In its current technical configuration, SLI is for the high-end benchmark community, or simply for those who want to burn money heating up the room and using more electricity for nothing. :) Just my point of view.

von Pilsner 07-25-2012 02:28 PM

Good news for everyone but Warhound and Stublerone...

Quote:

Originally Posted by Luthier
We haven't made changes to our SLI and Quad SLI support in the upcoming patch.

More specifically, about the SLI question:
SLI support is two-way; it needs to be in our game and in NVidia's drivers. We have it in our game, but the patch needs to be certified by NVidia and a change needs to be added on their end.
We will probably try to get the upcoming beta patch certified by NVidia and get SLI support for the game with the next driver release.

Official nVidia support for CloD may be just what we need... :D

Stublerone 07-26-2012 11:00 AM

Hopefully it helps you a lot. Those with older SLI setups will nevertheless suffer from their low VRAM. Wish you good luck! ;)

FS~Phat 07-26-2012 11:22 AM

Quote:

Originally Posted by Stublerone (Post 447611)
They have to fix DX9. It is not what they really want, but they have to conform to the specs on the package.

Concerning SLI: sure, the optimized profiles show that this helps with the SLI situation, but again, the technique is old, and to fully realize the original intention of SLI, all the components would have to be harmonized to a high standard. Communication between the PC components would need high-end bus systems in every case. Such a harmonized complete PC will reach the market as soon as the manufacturers of every part work closely together -> perhaps this will never happen!
Another possibility: a manufacturer builds its own all-in-one solution. First steps can be seen in the GPUs integrated into Intel CPUs.

In its current technical configuration, SLI is for the high-end benchmark community, or simply for those who want to burn money heating up the room and using more electricity for nothing. :) Just my point of view.

Interesting opinion, but my experience is not what you suggest.

I have many games in my library that absolutely work very well with SLI. Scaling for a lot of them is 60-80% (sometimes 100%) for the second card and 50% or better for every additional card after that. I have done testing with many games from single-card to 2-way, 3-way and 4-way SLI, and I have to say it's not just about benchmarking; there is a real, tangible difference. Sure, it's debatable whether there's any benefit beyond 120 FPS, but a lot of games are very graphically demanding these days with all features turned on.
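
As a rough back-of-the-envelope, those figures can be turned into a small Python sketch (the scaling factors are the assumptions quoted above, not measurements, and assume no CPU or VRAM bottleneck):

Code:

# Estimate multi-GPU frame rate from a single-GPU baseline, assuming
# ~70% gain from the second card and ~50% from each card after that.
def estimated_fps(single_fps, num_gpus, second=0.7, extra=0.5):
    fps = single_fps
    if num_gpus >= 2:
        fps += single_fps * second
    fps += single_fps * extra * max(0, num_gpus - 2)
    return fps

print(estimated_fps(40, 4))  # 108.0 FPS from a 40 FPS single-card baseline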

There are several very demanding games out there that can fully utilise all 4 of my cards at 99% and give me 100 FPS or so, when many others without SLI struggle to get 30 FPS.

CoD currently does scale in SLI at higher resolutions too. When I run it at 6048x1080 in surround with the 3 monitors, it pushes all 4 cards properly to 99% and runs really smoothly, although at a much lower frame rate of 25-40 FPS, compared to 40-80 FPS at 1920x1080 with a single card.

I expect I will achieve 200+ FPS when the SLI profile is matched to the current game code. I was already achieving that a year ago, when the game first came out, with my old 4-way 5870 Crossfire setup. The only thing that created an issue for me back then was the limited VRAM, which is why I upgraded to 3 GB video cards. At 6048x1080 the game uses 2.5-2.8 GB of VRAM, compared to 1.6-1.9 GB at 1920x1080 with max settings. This is why many people experience stuttering and slowdowns: they just don't appreciate how many textures the massive maps and the large number of objects need to load. The development team appear to have been tweaking the textures and the texture engine for this very reason, to make sure textures can fit in mid-range hardware, are only loaded when needed, and stream in and out of memory more smoothly. It's a big challenge for such big, highly detailed maps and detailed models with lots of objects.

In any case, I am very excited about Luthier's news on trying to get the next official patch (or possibly the next beta?) certified with an Nvidia SLI profile!

A couple of videos... CoD in ultra-widescreen, and the beast. ;)

IL2 COD 6030x1080 Nvidia 2D Surround - Hurri's vs BF110's

4-way SLI, GPUs all maxed at 99%, avg 30 FPS

http://www.youtube.com/watch?v=JbJfvrd553U

The beast that powers IL2 COD

http://www.youtube.com/watch?v=-mHCNW1lkk4


A year ago on my old 4-way 5870 setup: 200+ FPS
http://forum.1cpublishing.eu/showthread.php?t=23604
http://w7sn5a.blu.livefilestore.com/...adeonpro-4.jpg

Stublerone 07-26-2012 12:13 PM

You got it right, Phat. I just wanted to warn everybody not to listen to inexperienced SLI fanboy friends who never really got to know SLI and say that BF3 has the best graphics and is the most demanding game. If someone advises adding a second GTX 285 to an existing one, such a guy is not the right one to talk to.

First get a single card which runs everything sufficiently, and then start building your SLI machine. There is no doubt you get some more performance, but the load on your cards does not translate 1:1 into the performance increase. For 3 monitors you certainly need more graphics power, but first of all more VRAM. What I want to say is: build SLI with 2 x 580s with 1.5 GB each and max out to multiple monitors, and you will see the limitations in VRAM. You can add 2 more cards and it will never solve the underlying problem of VRAM texture loading.

2 x 680 at triple-monitor resolution will not run as smoothly as 2 ATI HD 7970s. They simply do not have enough VRAM to handle all the data the game needs instantly. Every load from normal RAM across the bus is too slow, and you get stutters. For CoD, even at normal 1080p resolution, a 1.5 GB card will cause some stutters. I maxed out everything at this resolution and also have a 2.5 GB load, but my card can manage it.

Nevertheless, SLI is helpful for multiple monitors, but the efficiency of the technology forces me to say: it is damn expensive, and the tech needs rework. The data buses for every component should be as fast as the bus between a GPU and its VRAM. Efficiency can only increase if this happens.

