Official Fulqrum Publishing forum

Official Fulqrum Publishing forum (http://forum.fulqrumpublishing.com/index.php)
-   IL-2 Sturmovik: Cliffs of Dover (http://forum.fulqrumpublishing.com/forumdisplay.php?f=189)
-   -   Suggestion for the 1C Devs (http://forum.fulqrumpublishing.com/showthread.php?t=33358)

Blackdog_kt 07-21-2012 09:38 AM

Quote:

Originally Posted by SKUD (Post 447136)
So let me make sure I have this straight. Nvidia wasted $$ putting an extra 1.5 GB of DDR5 on my 590 in full knowledge that it would never be used ?? Now they did it again with the 690 throwing away 2GB of VRAM because SLI can never use the VRAM from both cards. Those silly guys. Thanks for the tip.

Well, Google is your friend.

EVGA forums, scroll down to the 9th reply by user HeavyHemi: http://www.evga.com/forums/tm.aspx?m=1421266&mpage=1
Also note that the prevailing advice in that thread, if you want to really crank up the resolution while also keeping the detail settings high, is to get a single card with the highest amount of RAM you can afford. So it's not only modern flight sims that work this way (RoF also had a lot of problems with SLI early on, but I can't comment on its current state because I don't have it on my PC); it seems to be a more widespread trend in other games too.

Tom's Hardware SLI and Crossfire FAQs:
http://www.tomshardware.co.uk/forum/...crossfire-faqs

Incidentally, in the above link you can also find this little gem:
Quote:

Do SLI or CrossFire always improve performance?
Not always.
There are some games that don't benefit from multi-GPU technology (or require a patch in order to utilize it).
For example, Flight Simulator X doesn't benefit from either SLI or CrossFire.
Another example is StarCraft 2, which barely benefits from more than one card.
So FSX, a flight sim that was very demanding on both graphics and CPU until hardware could catch up with it, doesn't benefit from it. Sounds very familiar. Also note that StarCraft 2 is a blockbuster AAA title.

Both of them are made by companies that could throw tons of cash at the issue. FSX is getting old and Microsoft is more concerned with selling DLC for its new MS Flight, but this wasn't always the case. Yet they didn't fix it.
Also, SC2 is at its peak and it's only part one of a trilogy, with a highly competitive multiplayer scene (think professional gamers who get paid like football players to take part in tournaments, etc.), and the company behind it (Blizzard) has the enormous World of Warcraft MMO cash cow at its disposal and raving-mad fans who buy everything they release (e.g. the recent Diablo III).

If these guys can't do it, or won't spend the time and money to, then the only reason I can think of is that SLI/Xfire setups are a bit too particular about how you code your game in order to work correctly. It seems like the game has to be written around it, and since it's a somewhat rigid and slowly evolving technology (the cards evolve, but the technologies that pair them not so much), maybe it's not worth the compromises in other parts of the engine?

I'm just thinking out loud here, but the whole thing seems to completely debunk the "two cards = double the performance" logic. I've been ordering my PC components separately since forever, and the only people I routinely see going for SLI setups are those who primarily focus on action/shooter games (simpler engines, small maps, elementary game mechanics, so all the PC really has to do is run good graphics at a high frame rate).

The bottom line is, just because we might have some extra money to burn on a PC build doesn't mean we should go for the most expensive options. They might be somewhat specialised in what they work well with.

He111 07-21-2012 12:50 PM

DRAT! you mean I bought an extra GTX 5980 for nothing!? :(

But wait, I'm future proof when game developers advance and develop for 2+ Gpus .. but WAIT!

Hopefully Rome2 will take advantage of multiple cores and GPUs?


SKUD 07-21-2012 06:44 PM

Quote:

Originally Posted by Blackdog_kt (Post 447154)
Well, Google is your friend.

EVGA forums, scroll down to the 9th reply by user HeavyHemi: http://www.evga.com/forums/tm.aspx?m=1421266&mpage=1
Also note that the prevailing advice in that thread, if you want to really crank up the resolution while also keeping the detail settings high, is to get a single card with the highest amount of RAM you can afford. So it's not only modern flight sims that work this way (RoF also had a lot of problems with SLI early on, but I can't comment on its current state because I don't have it on my PC); it seems to be a more widespread trend in other games too.

Tom's Hardware SLI and Crossfire FAQs:
http://www.tomshardware.co.uk/forum/...crossfire-faqs

Incidentally, in the above link you can also find this little gem:


So FSX, a flight sim that was very demanding on both graphics and CPU until hardware could catch up with it, doesn't benefit from it. Sounds very familiar. Also note that StarCraft 2 is a blockbuster AAA title.

Both of them are made by companies that could throw tons of cash at the issue. FSX is getting old and Microsoft is more concerned with selling DLC for its new MS Flight, but this wasn't always the case. Yet they didn't fix it.
Also, SC2 is at its peak and it's only part one of a trilogy, with a highly competitive multiplayer scene (think professional gamers who get paid like football players to take part in tournaments, etc.), and the company behind it (Blizzard) has the enormous World of Warcraft MMO cash cow at its disposal and raving-mad fans who buy everything they release (e.g. the recent Diablo III).

If these guys can't do it, or won't spend the time and money to, then the only reason I can think of is that SLI/Xfire setups are a bit too particular about how you code your game in order to work correctly. It seems like the game has to be written around it, and since it's a somewhat rigid and slowly evolving technology (the cards evolve, but the technologies that pair them not so much), maybe it's not worth the compromises in other parts of the engine?

I'm just thinking out loud here, but the whole thing seems to completely debunk the "two cards = double the performance" logic. I've been ordering my PC components separately since forever, and the only people I routinely see going for SLI setups are those who primarily focus on action/shooter games (simpler engines, small maps, elementary game mechanics, so all the PC really has to do is run good graphics at a high frame rate).

The bottom line is, just because we might have some extra money to burn on a PC build doesn't mean we should go for the most expensive options. They might be somewhat specialised in what they work well with.

Clearly my 590 is not using all 3GB of its VRAM in CloD, because I can directly compare it to my 3GB 580. So my question again is: why would Nvidia build a card that can't use half of its VRAM under any circumstances? If that's the case, then a 590/690 is nothing more than a 580/680 with a bonus space heater attached.

Never mind... my friend Google found this Gem.

"Originally Posted by CousinVin
I think I understand that putting in two 3GB cards still only limits you to 3GB of usable VRAM... right? If that is wrong please correct me.

You are correct.

Quote:
Now my confusion comes in with the GTX 590. It is labeled as a 3GB card, but from the assumption above, and considering that it is 1.5GB per core, is it really only 1.5GB usable VRAM?

It's marketing. Joe Average can't tell the difference between total memory and dedicated memory."

So anyone looking at the 690 beware.

Royraiden 07-21-2012 11:06 PM

It's amazing that a feature almost any current game has, and one that was promised to be introduced/fixed more than a year ago, is still not present in a sim with so many performance problems. And I'm amazed that some guys here seem to encourage the devs to forget about a feature that could be really helpful if implemented correctly.

When I bought my second video card it made a world of difference, especially in RoF; other more common titles ran a lot better, to say the least. I never experienced micro-stuttering or any problem related to the use of CrossFire. GPU prices drop really quickly, so buying a second card for a CrossFire/SLI setup is a lot cheaper than buying a single monster card.

People in this community need to see things from a wider angle: having this feature could only bring positive changes, while not having it, as you can see, is a negative thing in my opinion. By the way, I'm only running one card now, so don't assume I wrote this just because I had a dual-GPU setup. I'd be happy if multi-GPU support got improved even if I can't benefit from it right now.

Blackdog_kt 07-22-2012 03:47 AM

Quote:

Originally Posted by SKUD (Post 447260)
Clearly my 590 is not using all 3GB of its VRAM in CloD, because I can directly compare it to my 3GB 580. So my question again is: why would Nvidia build a card that can't use half of its VRAM under any circumstances? If that's the case, then a 590/690 is nothing more than a 580/680 with a bonus space heater attached.

Never mind... my friend Google found this Gem.

"Originally Posted by CousinVin
I think I understand that putting in two 3GB cards still only limits you to 3GB of usable VRAM... right? If that is wrong please correct me.

You are correct.

Quote:
Now my confusion comes in with the GTX 590. It is labeled as a 3GB card, but from the assumption above, and considering that it is 1.5GB per core, is it really only 1.5GB usable VRAM?

It's marketing. Joe Average can't tell the difference between total memory and dedicated memory."

So anyone looking at the 690 beware.

Yup, that's how it works. Each GPU needs its own RAM, regardless of whether you have two cards with one GPU each, or a single card with two GPUs on it (like the 690, I think). Also, each GPU's RAM has to store the entire frame before it's displayed, regardless of the SLI method used (either card 1 renders frame 1 and card 2 renders frame 2, or each card renders half of each frame).

So, when you see a single dual-GPU card specifying 3GB of RAM on the box, what it means is 3GB divided equally between the two GPUs and mirrored for each frame -> 1.5GB of effectively usable video RAM.

If your preferred games are heavy on textures and have long viewing distances (more textures need to be loaded per frame), it's better to go for a single 3GB card or two separate cards with 3GB each. View distance is probably the main reason players of action and shooter games get great performance with SLI: their view distance is nothing compared to a flight sim's, so this RAM issue is not so perceptible.

In the above 3GB example, to load the textures that the single card or the two-card SLI setup can, a dual-GPU card like the 690 would have to have a total of 6GB of RAM (3GB for each GPU).
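The mirrored-memory arithmetic above can be sketched in a few lines of Python. This is purely illustrative (the function name and numbers are made up for the example, not anything from NVIDIA's tools): in both common SLI modes each GPU keeps its own full copy of the textures, so the usable pool is the per-GPU amount, not the advertised total.

```python
# Illustrative sketch: why mirrored VRAM halves the "box" number on a
# dual-GPU card. In AFR (alternate frames) and SFR (split frames) alike,
# every GPU must hold its own complete copy of the scene's textures.

def effective_vram_gb(vram_per_gpu_gb, num_gpus):
    """Return (advertised total, actually usable) VRAM in GB.

    The advertised figure sums all GPUs' memory; the usable figure is
    per-GPU, because each GPU mirrors the same assets.
    """
    total_advertised = vram_per_gpu_gb * num_gpus  # what the box says
    usable = vram_per_gpu_gb                       # what a game can address
    return total_advertised, usable

# GTX 590-style card: two GPUs with 1.5GB each, sold as a "3GB" card
advertised, usable = effective_vram_gb(1.5, 2)
print(f"advertised: {advertised} GB, usable: {usable} GB")
# -> advertised: 3.0 GB, usable: 1.5 GB
```

Run against the 690-style example in the post, `effective_vram_gb(3.0, 2)` gives an advertised 6GB but only 3GB usable, matching the reasoning above.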

I hope I didn't make any typos to make this confusing (it's a bit late at the moment and I'm sleepy) and that it sufficiently explains the limitations of the architecture in terms of RAM usage. Cheers ;)

Warhound 07-22-2012 04:47 AM

I understand people with SLI/crossfire want their setups to be utilized fully.
But from the devs' standpoint, I'd say it still makes more sense to focus on general performance.
The graphics engine is obviously not 100% optimized yet, and time invested in making SLI/CrossFire work is likely better spent finding performance gains that benefit both single-card and multi-card users.

First of all, the percentage of multi-card users is small; that's a fact.
And secondly, they could spend X weeks making both SLI and CrossFire work, only to have it broken again by one of the monthly driver updates.
So, as a small team, it makes more sense to focus on general optimization and performance, which benefits everyone, than to spend hours on something only a small percentage of users will benefit from, and which might be broken the moment you get it ready for release.
Never mind that you might break it yourself when you push through the general optimizations currently being worked on, forcing you to start over on SLI/Xfire once again.

KG26_Alpha 07-22-2012 11:20 AM

One word with GPU manufacturers and SLI & X-Fire.

Marketing

Jaws2002 07-22-2012 08:04 PM

When I got my GTX 590 there was no 3GB GTX 580, and the 590 had a solid advantage over ALL other cards in a lot of games. While it only uses 1.5GB of video RAM, it still runs a lot faster than a single card in games where SLI works.
I still think they would be better off working on multi-GPU support than spending months on the bloody DX9. But that's just me.
Anyway, I recently tried the SLI profile I found in one of the technical threads, and the performance boost I got was impressive. There are still some bad slowdowns around the clouds, but everything else went from 15-20 fps to 57-60 fps.
I think they would please more people by fixing multi-GPU support than by fixing the bloody DX9, which is used by people who most likely don't have a powerful enough system for this game anyway.

Stublerone 07-23-2012 10:54 AM

They have to fix DX9. It's not what they really want, but they have to conform to the specs on the package.

Concerning SLI: sure, the optimized profiles show that this helps with the SLI situation, but again, the technique is old, and normally (to fully realize SLI's original intent) all the components have to be harmonized to a high standard. Communication between the PC's components would need high-end bus systems everywhere. Such a fully harmonized PC will reach the market as soon as the manufacturers of every part work closely together -> perhaps this will never happen!
The other possibility: a manufacturer builds its own all-in-one solution. First steps can be seen in the GPU integrated into Intel CPUs.

In the current technical configuration, SLI is for the high-end benchmark community, or simply for those who want to burn money heating up the room or using more electricity for nothing. :) Just my point of view.

von Pilsner 07-25-2012 02:28 PM

Good news for everyone but Warhound and Stublerone...

Quote:

Originally Posted by Luthier
We haven’t made changes to our SLI and Quad SLI support in the upcoming patch.

More specifically about the SLI question.
SLI support is two-way: it needs to be in our game and in NVidia's drivers. We have it in our game, but the patch needs to be certified by NVidia and a change needs to be added on their end.
We will probably try to get the upcoming beta patch certified by NVidia and get SLI support for the game with the next driver release.

Official nVidia support for CloD may be just what we need... :D



Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
Copyright © 2007 Fulqrum Publishing. All rights reserved.