View Full Version : Luthier-Will SLI be fixed in the patch?
With the new Nvidia 680 series out I was thinking about a new graphics card.
My GTX590 on this Z68 rig is handily outperformed by my 3GB GTX580 on my X58 rig because SLI isn't working in CloD. If SLI is going to remain broken forever or a very long time I will hold out for the 4GB 680 or the phantom 685 but avoid the 690.
thanks,
SKUD
PotNoodles
04-13-2012, 04:40 AM
That's a bit of overkill, isn't it? Especially since they said the FPS have been improved by 50%. I have a GeForce GTX 570 and get 25-30 FPS. With the new patch I should get 50-60 FPS if you get a 50% increase.
Lots of hypotheticals and "should" or "they said" in there.
I think it's an excellent question. I too am planning on a 680, and this would be good to know.
Tree_UK
04-13-2012, 06:33 AM
The problem we have with SLI is that Luthier believes it isn't broken; if it stays that way it will never get fixed. The last word from Luthier on SLI was that Nvidia's next driver release would have a profile for CLOD in it; there have been at least 5 driver releases since, with no mention of CLOD. Luthier's track record with SLI is somewhat shady. I asked in 2009 if CLOD would support SLI; after many avoided posts Luthier finally answered, said 'why wouldn't it work', and then went on to say that he would talk no more about SLI.
Then on release he blamed the 'Epilepsy filter' for killing SLI, but when it was removed SLI still didn't work... strange, that. Since then Luthier claims that SLI has been fixed but that it's now Nvidia's fault for not releasing a driver to support it. We know that Luthier would never tell us any pork pies, he has stressed this on many occasions, so I guess he's just been unlucky with this particular problem, in that other companies like Ubisoft with the epilepsy filter and Nvidia with no driver support affect the perfectly good code of Luthier's game.
Wolf_Rider
04-13-2012, 06:39 AM
There's more of a problem with nVidia and whether or not SLI is working (and why bother, as stutter is an inherent fault with SLI)... the bigger bug is how any game with "Launcher.exe" as the name of its exe gets picked up as another game.
The driver doesn't differentiate by path; it just assumes that any "Launcher.exe" which triggers is the game it has listed in its profiles.
Tree_UK
04-13-2012, 06:41 AM
There's more of a problem with nVidia and whether or not SLI is working (and why bother, as stutter is an inherent fault with SLI)... the bigger bug is how any game with "Launcher.exe" as the name of its exe gets picked up as another game.
The driver doesn't differentiate by path; it just assumes that any "Launcher.exe" which triggers is the game it has listed in its profiles.
Yes, I do believe you are right about the launcher.exe issue, Wolf. It's a schoolboy error by the devs, IMHO.
Flanker35M
04-13-2012, 07:05 AM
S!
CrossFire and SLI still have problems in many games, no matter the brand. It's really strange this support hasn't become more widespread, since even as far back as the Voodoo II I used 2 cards and they worked out of the box, for example in EAW and Jane's WW2 Fighters :) Since then I've stuck to single-card configurations, which have less hassle and fewer problems while still performing well.
ATAG_Doc
04-13-2012, 07:25 AM
The problem we have with SLI is that Luthier believes it isn't broken; if it stays that way it will never get fixed. The last word from Luthier on SLI was that Nvidia's next driver release would have a profile for CLOD in it; there have been at least 5 driver releases since, with no mention of CLOD. Luthier's track record with SLI is somewhat shady. I asked in 2009 if CLOD would support SLI; after many avoided posts Luthier finally answered, said 'why wouldn't it work', and then went on to say that he would talk no more about SLI.
Then on release he blamed the 'Epilepsy filter' for killing SLI, but when it was removed SLI still didn't work... strange, that. Since then Luthier claims that SLI has been fixed but that it's now Nvidia's fault for not releasing a driver to support it. We know that Luthier would never tell us any pork pies, he has stressed this on many occasions, so I guess he's just been unlucky with this particular problem, in that other companies like Ubisoft with the epilepsy filter and Nvidia with no driver support affect the perfectly good code of Luthier's game.
2009? You've been here asking the same question since then? For real? Not answered. Not fixed. And I thought I couldn't take a hint. Holy cow.
RickRuski
04-13-2012, 07:28 AM
Luthier said just after the release of CoD that SLI had been broken and they were working to fix it. So far there has been nothing from the development team to say that it has been fixed, so I'm sticking with my own workaround. That was nearly 12 months ago; I guess we won't see anything happen until the release of BoM. Very frustrating.
Ataros
04-13-2012, 07:49 AM
The example of another great simulator shows that even when SLI is fixed it can be broken again in the next patch. They managed to fix it again only after NVidia sent them some documentation, after several unsuccessful requests for support. NV did not even talk to them, just sent doc files for them to figure out by themselves. This happened 2 years after the launch. They were not so lucky with ATI CrossFire, as ATI never responded to them.
The sim market is too small for either NV or ATI to bother with.
Luthier tells the forum only what the graphics developer tells him. The previous graphics developer was fired.
SLI will probably not be available in the 1st beta but may be supported later. That would not mean it cannot be broken again in a following patch, as happened in the competitor's sim.
It is reasonable to avoid SLI and CrossFire setups at all costs, unless it is a temporary solution of adding a second cheap 3-year-old card to your old one.
It seems that anyone who cannot handle his emotions, blaming the devs for not including SLI in the old graphics engine (before the new one is introduced), just does not know what he is talking about or is deliberately trolling. 1st step: have a stable graphics engine (test the beta and fix bugs); 2nd: talk to NV and ATI asking for support if facing any difficulties; 3rd: wait till they pay any attention. At least this is how it happened in the competitor's camp.
Tree_UK
04-13-2012, 08:35 AM
2009? You've been here asking the same question since then? For real? Not answered. Not fixed. And I thought I couldn't take a hint. Holy cow.
No, I haven't been asking since 2009; I raised the question in 2009. I was giving you a history of SLI events, but I'm sure you knew that.
RickRuski
04-13-2012, 08:42 AM
Ataros,
All is not lost. I'm getting about double the fps with SLI compared with a single card. Those of us that are trying are getting it to work reasonably well under the current situation. Most of my settings are on high, with about 2 on medium, working with 2x GTS 450s, which most would laugh at with this sim, but for me it's working. I'm happy getting 40-60 fps in the London Attack single-player mission with SLI (only 25-34 with a single card working), and around 80-90 fps over water with SLI. I'm waiting to see what happens with the new graphics patch and the quoted approx 50% improvement in fps. My biggest problem at the moment seems to be the 1GB of V/ram; my indications are that with CoD it is running at max. I don't know what others are finding.
III/JG53_Don
04-13-2012, 01:17 PM
Why are so many people here talking about an increase of 50%?
When the devs talk about the patch nearly doubling the fps, that's actually an increase of 100%, or near to it :D
We are talking about total numbers here! There is even the mathematical possibility of an increase of 200%, believe it or not ;-)
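The arithmetic the posters are debating can be sketched plainly (a Python illustration using the made-up FPS figures from the posts above, not tied to any actual benchmark):

```python
def percent_increase(before, after):
    """Percentage increase going from one FPS figure to another."""
    return (after - before) / before * 100.0

# Going from 25 FPS to 50 FPS is a 100% increase, not 50%:
print(percent_increase(25, 50))  # -> 100.0

# A genuine 50% increase on 25 FPS would only give 37.5 FPS:
print(25 * 1.5)  # -> 37.5
```

So "doubling the fps" and "a 50% increase" are very different claims; the posts below keep mixing the two up.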
JG5_emil
04-13-2012, 02:17 PM
What is the deal with SLI... does it not give any improvement at all?
I ask because I am planning a second 3GB GTX 580 so that I can run a 3rd monitor for work.
Are there any negative implications to running SLI?
BlackSix
04-13-2012, 02:51 PM
With the new Nvidia 680 series out I was thinking about a new graphics card.
My GTX590 on this Z68 rig is handily outperformed by my 3GB GTX580 on my X58 rig because SLI isn't working in CloD. If SLI is going to remain broken forever or a very long time I will hold out for the 4GB 680 or the phantom 685 but avoid the 690.
thanks,
SKUD
SLI and AA will not be fixed in first beta.
Thanks so much for the quick answer.
louisv
04-13-2012, 08:54 PM
That's a bit of overkill, isn't it? Especially since they said the FPS have been improved by 50%. I have a GeForce GTX 570 and get 25-30 FPS. With the new patch I should get 50-60 FPS if you get a 50% increase.
That's a 100% increase...
But that is what we are talking about here, a 100% increase in FPS...your mileage may vary.
furbs
04-13-2012, 09:47 PM
SLI and AA will not be fixed in first beta.
Thanks Blacksix, that's good news if the fix is at least coming in a follow-up patch.
RickRuski
04-13-2012, 11:54 PM
JG5 emil,
SLI does give an improvement with CoD. It approximately doubles my frame rate compared with a single card. You just need to try things. See my reply further back in this thread:
Ataros,
All is not lost. I'm getting about double the fps with SLI compared with a single card. Those of us that are trying are getting it to work reasonably well under the current situation. Most of my settings are on high, with about 2 on medium, working with 2x GTS 450s, which most would laugh at with this sim, but for me it's working. I'm happy getting 40-60 fps in the London Attack single-player mission with SLI (only 25-34 with a single card working), and around 80-90 fps over water with SLI. I'm waiting to see what happens with the new graphics patch and the quoted approx 50% improvement in fps. My biggest problem at the moment seems to be the 1GB of V/ram; my indications are that with CoD it is running at max. I don't know what others are finding.
Wolf_Rider
04-13-2012, 11:59 PM
Yes, I do believe you are right about the launcher.exe issue, Wolf. It's a schoolboy error by the devs, IMHO.
Perhaps there is a naivety on the various developers' part in giving their executables identical names, but it is more a childish balls-up by nVidia in not having a profile system which follows the path to the required program's exe when adding a custom profile.
SLI and AA will not be fixed in first beta.
Thanks
BSix
That's all I was looking for.
I'll get the 4GB version then and wrap this 590 up in Christmas paper for the kids.
SKUD
Skud, any indication of when the 4GB version will become available?
He111
04-14-2012, 07:13 AM
My 580s in SLI work OK with CloD, although I'm only using one card for graphics and the other for the physics engine. :grin:
.
jcenzano
04-14-2012, 03:06 PM
My 580s in SLI work OK with CloD, although I'm only using one card for graphics and the other for the physics engine. :grin:
.
As far as I know CloD does not support PhysX either, so basically you are running it on a single card.
http://developer.nvidia.com/physx-games
Skud, any indication when the 4gig will become available?
Internet rumors say May, so expect June-July. The 685 in October, so expect Christmas or the first quarter of next year.
furbs
04-14-2012, 03:49 PM
Is that the 680 4GB or the 690 4GB?
Anders_And
04-14-2012, 04:46 PM
That's a bit of overkill, isn't it? Especially since they said the FPS have been improved by 50%. I have a GeForce GTX 570 and get 25-30 FPS. With the new patch I should get 50-60 FPS if you get a 50% increase.
From 25-30 FPS to 50-60 FPS is a 100% increase in FPS...;)
As far as I know, Luthier expected an increase of "ONLY" 50% in FPS, and 50% of 25-30 FPS would be about 12.5-15 FPS more than you have now ;)... So after the patch you should be looking at roughly 37-45 FPS at best! Still significant though!!
Just saying!:grin:
senseispcc
04-14-2012, 06:22 PM
For what I can judge, SLI with two GeForce GTX 580s (1.5GB GDDR5) works without any problem. That is what I have in my PC. :cool:
is more a childish balls-up by nVidia in not having a profile system which follows the path to the required program's exe
That's not true, Nvidia won't support it until the bugs are fixed
For what I can judge, SLI with two GeForce GTX 580s (1.5GB GDDR5) works without any problem.
You can force it, but it has never worked properly.
Thee_oddball
04-14-2012, 10:20 PM
That should be possible.
According to the nVidia docs, you can query this via NVCPL.DLL (linked documentation).
The call to be used is NvCplGetDataInt() (page 67); with the argument NVCPL_API_NUMBER_OF_SLI_GPUS or NVCPL_API_SLI_MULTI_GPU_RENDERING_MODE you should obtain the information required.
In order to access this information you'll need P/Invoke. If it is OK to statically link against NVCPL.DLL, you just have to create the correct import (static external method) and you're fine. Otherwise, you can use the LoadLibrary and GetProcAddress way and use the Marshal class to create an instance of a delegate (declared with the correct arguments) which represents the function to be called.
Edit: The following snippet may get you started (I don't have an nVidia card though, so it's completely untested and at your own risk ;) ):
using System;
using System.Runtime.InteropServices;

public static class NvSliQuery
{
    public const int NVCPL_API_NUMBER_OF_GPUS = 7;               // Graphics card number of GPUs.
    public const int NVCPL_API_NUMBER_OF_SLI_GPUS = 8;           // Number of SLI GPU clusters available.
    public const int NVCPL_API_SLI_MULTI_GPU_RENDERING_MODE = 9; // Get/Set SLI multi-GPU rendering mode.

    // The documented export is NvCplGetDataInt (capital N), so pin it via EntryPoint.
    [DllImport("NVCPL.DLL", EntryPoint = "NvCplGetDataInt", CallingConvention = CallingConvention.Cdecl)]
    public static extern bool NvCplGetDataInt([In] int lFlag, [Out] out int plInfo);

    public static void Main()
    {
        int sliGpuCount;
        if (NvCplGetDataInt(NVCPL_API_NUMBER_OF_SLI_GPUS, out sliGpuCount))
        {
            // we got the result
            Console.WriteLine(string.Format("SLI GPUs present: {0}", sliGpuCount));
        }
        else
        {
            // something went wrong
            Console.WriteLine("Failed to query NV data");
        }
    }
}
source: http://stackoverflow.com/questions/1890456/in-net-3-5-c-is-there-a-way-to-detect-if-nvidia-sli-mode-is-active
For what I can judge, SLI with two GeForce GTX 580s (1.5GB GDDR5) works without any problem. That is what I have in my PC. :cool:
My GTX590 is two 1.5GB 580s on one card. It looks like both GPUs are running full bore when viewed in the control tool. Nevertheless, this card is slower in CloD than my single-GPU 3GB GTX580. In SLI both GPUs are processing the same data, so no benefit.
SKUD
Icarus1
04-15-2012, 02:07 AM
My GTX590 is two 1.5GB 580s on one card. It looks like both GPUs are running full bore when viewed in the control tool. Nevertheless, this card is slower in CloD than my single-GPU 3GB GTX580. In SLI both GPUs are processing the same data, so no benefit.
SKUD
However, two GTX 580s in SLI beat a 590 by 15-20%. It actually has a tough time beating 2 GTX 570s. I believe the GTX 590 is underclocked to roughly GTX 480 speeds for heat reasons, which is why it is so quiet.
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/41903-nvidia-geforce-gtx-590-3gb-review-23.html
Working SLI in CoD would make a fair difference to the 590 because you cannot turn off SLI.
However, two GTX 580s in SLI beat a 590 by 15-20%. It actually has a tough time beating 2 GTX 570s. I believe the GTX 590 is underclocked to roughly GTX 480 speeds for heat reasons, which is why it is so quiet.
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/41903-nvidia-geforce-gtx-590-3gb-review-23.html
Working SLI in CoD would make a fair difference to the 590 because you cannot turn off SLI.
Agreed
I'm not certain whether the clock speed or the memory is the biggest contributor.
RickRuski
04-15-2012, 08:23 AM
SKUD,
In SLI you don't double the RAM of one card, unfortunately, so your 590 (2x 580s on one card @ 1.5GB V/ram) doesn't give 3GB of RAM, only 1.5GB. The 580 with 3GB is therefore the reason for the better performance. We will have to wait and see if the patch addresses the V/ram issue. I have 2x 1GB cards in SLI and both are working to full capacity according to my precision tool. I think at the moment V/ram, or the lack of it, is part of the reason that those of us with 1GB are noticing more problems than those with 1.5GB or more. My frame rates are very good but the stutters down low are a pain.
SKUD,
In SLI you don't double the RAM of one card, unfortunately, so your 590 (2x 580s on one card @ 1.5GB V/ram) doesn't give 3GB of RAM, only 1.5GB. The 580 with 3GB is therefore the reason for the better performance. We will have to wait and see if the patch addresses the V/ram issue. I have 2x 1GB cards in SLI and both are working to full capacity according to my precision tool. I think at the moment V/ram, or the lack of it, is part of the reason that those of us with 1GB are noticing more problems than those with 1.5GB or more. My frame rates are very good but the stutters down low are a pain.
Again agreed.
I believe if my 590 had 6GB (3GB per GPU) and the same clock speed as my 580, it would get equal performance. As it is, the second GPU and the 1.5GB of memory it has are wasted. My precision tool also says both GPUs are going at it full bore. Trouble is, I don't think they are splitting the workload, only doubling it. I believe SLI works by splitting the data stream, sending half to each GPU, and then recombining the output by interleaving after processing. If the software isn't controlling that, then both processors get all the data. Someone correct me if my perception is wrong.
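For what it's worth, the common SLI mode is alternate-frame rendering (AFR): whole frames alternate between GPUs rather than each frame being split, and each GPU still holds a full copy of the scene in its own VRAM, which is why SLI never doubles usable video memory. A toy sketch of the scheduling idea, in Python purely for illustration (the driver obviously does nothing like this literally):

```python
# Toy model of alternate-frame rendering (AFR): frames are dealt out
# round-robin across GPUs.  Every GPU keeps a full copy of the scene
# data, so VRAM is mirrored, not pooled.
def assign_frames_afr(frame_count, gpu_count=2):
    """Return {gpu_index: [frame numbers it renders]} for an AFR split."""
    schedule = {gpu: [] for gpu in range(gpu_count)}
    for frame in range(frame_count):
        schedule[frame % gpu_count].append(frame)
    return schedule

# With a working profile, 6 frames split evenly across 2 GPUs:
print(assign_frames_afr(6))  # -> {0: [0, 2, 4], 1: [1, 3, 5]}

# With no profile there is effectively no split; one GPU does everything:
print(assign_frames_afr(6, gpu_count=1))  # -> {0: [0, 1, 2, 3, 4, 5]}
```

This matches what SKUD observes: without a working CloD profile, both GPUs can show full load while delivering single-card frame rates.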
RickRuski
04-16-2012, 03:05 AM
Hi SKUD,
Interesting that you say your precision tool shows both your cards working full bore; mine says the same. I have tried disabling one of my cards, and that shows zero use on the V/mem on it, and the fps drops to about 60% of what I get with both cards working, but the V/mem still shows near max use on the working card. I have my clocks and voltages set the same for each card, even though they are from different stables, and the percentage of use on each card only varies about 2% from one card to the other; that also varies, with one card using more, then the other. Temps on both cards are within about 2-3 deg of each other, with card 2 always slightly lower than card 1 (this matches the information I get from the different makers, the EVGA running slightly hotter than the Asus), and both are within the 50-60 deg range. I wonder if others using SLI are getting the same results; it would be interesting to find out.
Icarus1
04-16-2012, 01:42 PM
Again agreed.
I believe if my 590 had 6GB (3GB per GPU) and the same clock speed as my 580, it would get equal performance. As it is, the second GPU and the 1.5GB of memory it has are wasted. My precision tool also says both GPUs are going at it full bore. Trouble is, I don't think they are splitting the workload, only doubling it. I believe SLI works by splitting the data stream, sending half to each GPU, and then recombining the output by interleaving after processing. If the software isn't controlling that, then both processors get all the data. Someone correct me if my perception is wrong.
I have 2x GTX 580 3GB (SLI disabled for CoD). I never come close to using the 3GB with CoD, and I use 2560 x 1600 res on an SSD (helped slightly but not much). I believe just over 2GB of vRAM is the max I have ever used. My GPU and CPU are never maxed out, and my RAM is never maxed out, but the stutters are really a problem, especially low down or over cities. Proof the stutters are not from a lack of resources, at least on my rig.
Thee_oddball
04-16-2012, 01:57 PM
I have 2x GTX 580 3GB (SLI disabled for CoD). I never come close to using the 3GB with CoD, and I use 2560 x 1600 res on an SSD (helped slightly but not much). I believe just over 2GB of vRAM is the max I have ever used. My GPU and CPU are never maxed out, and my RAM is never maxed out, but the stutters are really a problem, especially low down or over cities. Proof the stutters are not from a lack of resources, at least on my rig.
more rendering = more interop... more interop = more stutters :(
Icarus1
04-16-2012, 02:06 PM
more rendering=more interop...more interop=more stutters :(
Could you please expand your explanation. Thanks.
RickRuski
04-16-2012, 08:25 PM
Icarus,
Thanks for your information, from what you have said it looks like lack of V/mem isn't the cause of stutters with CoD that a lot of us are getting. My two cards are only 1gb each and with CoD they are both using between 990mb--1007mb each (cards @1024mb each), no other games that I have come anywhere near this. RoF is using under 75% on each card, so I'm hoping the patch will address CoD's V/mem demand.
Thee_oddball
04-16-2012, 08:52 PM
Could you please expand your explanation. Thanks.
The game itself is not written in C++ but in .NET :( Interop is what happens when you go from managed to unmanaged code (SpeedTree, DirectX, etc.), which causes a performance hit.
here is a much more in depth explanation
http://forum.1cpublishing.eu/showthread.php?t=30774
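The managed/unmanaged boundary cost described above is not unique to .NET; any runtime pays something at the native boundary. A rough analogy in Python (ctypes standing in for P/Invoke; this is not the game's actual code, and the timings are machine-dependent):

```python
import ctypes
import ctypes.util
import timeit

# Load the platform C library; every call into it crosses the
# managed/native boundary and pays argument-marshalling overhead,
# the same kind of cost a .NET program pays on each P/Invoke call.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

native_abs = libc.abs  # native function reached via interop
python_abs = abs       # built-in, no boundary crossing

interop_time = timeit.timeit(lambda: native_abs(-5), number=100_000)
builtin_time = timeit.timeit(lambda: python_abs(-5), number=100_000)

# The interop call is typically several times slower per invocation;
# a renderer making thousands of such transitions per frame can
# stutter even when no single resource (CPU, GPU, RAM) is maxed out.
print(f"interop: {interop_time:.4f}s  builtin: {builtin_time:.4f}s")
```

The point is that the per-call overhead is paid on every transition, which is consistent with stutters appearing without any resource being saturated.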
Icarus1
04-16-2012, 11:47 PM
The game itself is not written in C++ but in .NET :( Interop is what happens when you go from managed to unmanaged code (SpeedTree, DirectX, etc.), which causes a performance hit.
here is a much more in depth explanation
http://forum.1cpublishing.eu/showthread.php?t=30774
Thanks.
From what I read, the stutters are from badly/hastily coded software, not hardware issues. Correct?
Thee_oddball
04-17-2012, 12:20 AM
Thanks.
From what I read, the stutters are from badly/hastily coded software, not hardware issues. Correct?
Sloppy coding just exacerbates the problem, but the interop is still there even with clean code, and if you read the latest update Luthier said "reduced" and "decreased", not fixed! As I understand it this is just the nature of the beast; interop incurs a performance hit, whether minor or major.
A robust rig will show less of the issue, but it will still be there :(
Hi SKUD,
Interesting that you say your precision tool shows both your cards working full bore; mine says the same. I have tried disabling one of my cards, and that shows zero use on the V/mem on it, and the fps drops to about 60% of what I get with both cards working, but the V/mem still shows near max use on the working card. I have my clocks and voltages set the same for each card, even though they are from different stables, and the percentage of use on each card only varies about 2% from one card to the other; that also varies, with one card using more, then the other. Temps on both cards are within about 2-3 deg of each other, with card 2 always slightly lower than card 1 (this matches the information I get from the different makers, the EVGA running slightly hotter than the Asus), and both are within the 50-60 deg range. I wonder if others using SLI are getting the same results; it would be interesting to find out.
Here are my GPUs full bore in CLoD
Hoogs
04-17-2012, 06:33 AM
Sloppy coding just exacerbates the problem, but the interop is still there even with clean code, and if you read the latest update Luthier said "reduced" and "decreased", not fixed! As I understand it this is just the nature of the beast; interop incurs a performance hit, whether minor or major.
A robust rig will show less of the issue, but it will still be there :(
It was the same with 1946. The stutters ended once a rig had a processor fast enough to compensate. It's just going to be a long road to perfection, but I'm happy to shoot you all down with the odd stutter till then ;)
335th_GRAthos
04-17-2012, 07:29 AM
Your precision tool shows both your cards working full bore; mine says the same. I have tried disabling one of my cards, and that shows zero use on the V/mem on it, and the fps drops to about 60% of what I get with both cards working, but the V/mem still shows near max use on the working card.
Using SLI doubles the fps (130 fps), but there is a very annoying "unsmooth" flow on the screen, mainly due to the lack of GPU power and the lack of V/mem.
I have experienced exactly the same result trying CoD on a PC with an NV420 GPU.
So, non-SLI is fewer fps but a smoother game. Also, the moment V/mem reaches 99% capacity, the fps drops by half.
Temps on both cards are within about 2-3 deg of each other, but it depends on which card's fan has better air intake; both GAINWARDs are in the 75 deg range with optimised cooling, 83 deg without optimised cooling.
~S~
Icarus1
04-17-2012, 01:04 PM
It was the same with 1946. The stutters ended once a rig had a processor fast enough to compensate. It's just going to be a long road to perfection, but I'm happy to shoot you all down with the odd stutter till then ;)
I'm confused. If it's because my processor is too slow, why are none of the cores ever maxed out? I never go above 60% on any core.
vBulletin® v3.8.4, Copyright ©2000-2025, Jelsoft Enterprises Ltd.