How many users have SLI/Crossfire?
Straightforward poll, requested by BlackSix. No comments needed, just tick the box.
Do you use SLI? Do you use Crossfire? Single card?
OK, sorry for the comment, but it's important. I think this is like the DX9 issue we had: SLI/Crossfire isn't in all computers, but I think it's necessary to have it working in the game, just like DX9. Salute, and sorry for the comment.
|
I voted for a single card as I only have one GTX 670 at the moment, but I will be adding another at the end of the month. So maybe there should be an option for those who do intend to SLI/Crossfire.
|
I voted single card as I use that now, but when the patch and drivers allow I will go for SLI.
|
Quote:
I think that most of the users are using a single card because they know that SLI or Crossfire simply won't work.
But an important note is that this community is probably nerd heaven... a lot of engineers with too much money who love aircraft and hardware, I guess (I know the flying club I've been a member of for a long time is something like 50% engineers, so I guess this group is rather similar).
So we are the "few" within the whole target group of 500,000-plus "medium-high average" users. Looking at the Steam hardware survey from August 2012, the percentage of "average" gamers with a GTX 580 is 1.16%, and GTX 680 users are 0.51%. In the sigs here it looks more like 50% 580, 20% 680 and 30% other stuff, with high-end ATI and a bunch of 560s. Check it out yourself. Can't find the SLI/Crossfire usage there, unfortunately: http://store.steampowered.com/hwsurvey
Would have been good to see 'How many would get SLI if it worked properly'. It is the cheapest way to upgrade GPU performance for older cards but only if the game correctly scales performance.
|
The poll should have added the option "Would you add a second GPU if SLI/Crossfire worked?" I have an SLI-capable system, but just bought the best single GPU, as I know from experience that games don't always work that much better with two GPUs. If people start seeing significant improvement in CloD using SLI/Crossfire, I would certainly add the second card. The sim runs fine on a single GPU, except for some stutters at treetop level over London. If the second GPU got rid of those stutters and improved performance, I couldn't buy it quickly enough.
|
Xfire does work! Doesn't it? :| :?
|
I would be using 2 cards if it was supported as suggested above. As soon as it is, I will be getting a second card.
|
If a lot of users have a single 580 and were told SLI works and increases performance by about 50% = more planes on screen / better FPS, would they upgrade? I suspect the answer would be YES! (as soon as I get some money)
Why waste a perfectly good 580 to upgrade to a 680 when a second 580 would do the job? The 580 is a great card! Like comparing Spits and Mossies. :grin:
Quote:
I have the same, just two 570s instead. I'm not an engineer, but I know it can be complex. Still, it amazes me that here in 2012 it is only now appearing in the sim. How long have Crossfire and SLI been around? It should be basic stuff. Maybe it's just more complex than I would understand?
Quote:
EDIT: Sorry, that was meant to be quoted to the post above...
Quote:
So I guess I would say that at present X-Fire does not function. If the jump in performance could be made available as it was with the first X-Fire patch, I am pretty sure that once the figures were seen, many more game users would consider X-Fire or SLI as the quickest, and for some maybe the cheapest, way of seeing a notable performance boost.
What dx10 profile are you using?
I am using twoworlds2.exe and get pretty much 60 fps (v-sync enabled) everywhere.
Quote:
Cheers
I have steady FPS from 45 to 60+
On the new GTX 680 4GB card there is no need for SLI any more. I only get crashes to the launcher when I press F12 for Steam screenshots; I run Fraps with no issues. I have had no launcher crashes since I upgraded to this much VRAM on the video card.
Before the 6970 in use now, I used to have 2x 4890s in CF... due to problems with CloD I dropped them.
|
I have 'used' SLI for years and my personal verdict is: it generally doesn't work.
If it did, it would almost double the FPS of every supported game, and honestly, I have never seen that other than with a special indicator in RoF... and without the indicator, I would not have noticed... so that's hardly double the speed. And I have all the good (real) flight sims... Also, you buy twice the memory but you don't get it: each card needs to hold the same thing in memory. If one day I want 3 screens, I'll buy 3 cards and link the desktops; no need for SLI there either. In short, buy a good, almost top-of-the-line card that would have cost the same as two crappy cards, and sleep tight. My two cents. Lou. PS: What kind of experience do you have with other programs and SLI/Crossfire? :!:
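The memory point above can be sketched in a few lines. This is my illustration, not Lou's: in SLI/Crossfire alternate-frame rendering, each card holds a full copy of the textures and buffers, so usable VRAM is that of a single card while the purchase price scales with the card count. All figures below are made-up examples.

```python
# Illustrative sketch (not from the thread): in mirrored multi-GPU
# rendering, VRAM does not add up, but the cost does.

def effective_vram_mb(cards_mb):
    """Usable VRAM of a mirrored multi-GPU setup: the smallest card's."""
    return min(cards_mb)

def setup_cost(price_each, n_cards):
    """Total price scales linearly with the number of cards."""
    return price_each * n_cards

# Two hypothetical 1024 MB cards: twice the money, same usable VRAM.
assert effective_vram_mb([1024, 1024]) == 1024
assert setup_cost(250, 2) == 500
```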
Quote:
True about the memory.
@JG52 Krupi
I tested:
- Default profile, medium settings (grass and roads on, shadows off): 87 fps with X-Fire off vs 69 fps with X-Fire on.
- Default profile, V-High settings (grass and roads on, shadows off): 45 fps with X-Fire off vs 46 fps with X-Fire on.
- Two Worlds 2 DX10.exe profile, medium settings (grass and roads on, shadows off): 85 fps with X-Fire off vs 65 fps with X-Fire on.
- Two Worlds 2 DX10.exe profile, V-High settings (grass and roads on, shadows off): 52 fps with X-Fire off vs 55 fps with X-Fire on.
Lower altitudes produce hesitations on all tests, but slightly less annoying on medium settings. The rendering looked better in the Two Worlds profile.
@luisv
I started RoF with a single 5870 and, when SLI/X-Fire was enabled, got a second one. Performance went up by 50% easily, so I upped the eye-candy settings and got a nice balance. When I upgraded to a single 7970, performance improved, probably due to the extra memory. Adding a second card allowed me to max out all settings other than anti-aliasing, which is always left at 2x, as that's the real frame killer. A recent AMD driver update introduced flicker in the clouds, so I had to cut points of light from max to 15, but otherwise it runs as smooth as silk.
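The "went up by 50% easily" figure above is just the standard scaling calculation; a one-liner makes it explicit. The 40/60 fps pair below is a made-up example, not a measurement from the thread.

```python
# Percent FPS gain (or loss) from enabling the second GPU.
def scaling_percent(fps_single, fps_dual):
    return round((fps_dual - fps_single) / fps_single * 100)

# Hypothetical example: 40 fps on one card, 60 fps with two is the
# kind of "easily 50%" uplift described for RoF above.
assert scaling_percent(40, 60) == 50
# Broken multi-GPU support shows up as negative scaling:
assert scaling_percent(87, 69) == -21
```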
Not worth it
SLI/Crossfire is not worth it. First you have to buy two or three cards and a good cooling solution. Then, after a couple of years, when you want to upgrade, you have to buy two or three new cards again. It is like multiplying the costs.
And very few games support SLI/Crossfire, and in those that do there is barely any difference at all. Some give you maybe 5 or 10 fps more.
Quote:
Winger
Regarding earlier statements, shouldn't the thread be labelled "How many forum users have SLI?"
Or does it depend on who/what is polled?
I have a single GTX 670 but when the game is patched and if it adds this feature I would go SLI.
|
Quote:
All the games I have (close to 300 just on Steam) have SLI support except CloD, and I get a 50%-plus FPS increase. Most games are supported at launch; rarely do you have to wait a week.
Perhaps another poll ?
Perhaps a follow-up poll would be a good idea, along the lines of: Who has X-Fire or SLI?
How many would upgrade to X-Fire or SLI if there was a good performance increase to be had in Cliffs of Dover? How many would not upgrade even if there was?
Quote:
I'm not saying it doesn't work at all; I'm saying it 'generally doesn't work', i.e. it's not really worth it. I had a couple of 9800 GTXs with a slightly overclocked quad core, and generally with flight simulators the bottleneck was the CPU... but that was true with a single card too, with the exception of RoF (with the indicator). Flight sims used to be heavier on the CPU side; that has changed now, I admit. What gets me is that for a given price you get either two mid-range cards or one really good one, and you pay twice for memory. And many games don't support it. Right now RoF and DCS are the ones I'm aware of; with other flight sims, all you are left with is one mid-range card... :( Lou
Both RoF and DCS do support SLI; however, DCS doesn't scale as well as RoF.
|
Quote:
Otherwise using SLI does not have the desired effect. And when you consider that in general SLI only uses 50% or 70% of the second GPU on average, the costs are not that low. Add the power consumption and the problems you usually have, and a new single card will be the better upgrade. SLI is a method for enthusiasts or for people with very high resolutions, due to triple monitors, for example. To upgrade an insufficient card with an additional insufficient card is not a very clever idea. Sorry, but I always recommend buying a sufficient new card, preferably a single GPU. Let us see what future techniques can change there. I personally think that no business-minded company would grant you a sufficient upgrade via a phased-out and cheap card that they do not earn money on! Think about that. ATI and NVIDIA have no priority to give you such an opportunity. They want to make money, and as long as game studios work closely with them, the studios will not have this as a priority either: they get some support from the card producer, and they give support to sales of new products. :) I would go so far as to say that game devs know how to program a game badly enough to make you buy a new card.
Quote:
Correct. A new generation of a card is better than two or even three of the old ones together.
I have SLI for 3D hardware too, which is the future, and present, of video games. Luthier posted some 3D images of CloD (which never worked on my NVIDIA hardware) a few months ago...
What a strange marketing strategy...
Quote:
Most annoying in my eyes. As long as the viewer cannot focus on what he wants in the image, with the image reacting accordingly by refocusing as well, it produces even more unnatural effects for the eyes and causes weird eye behaviour. This is definitely not the future, or at least not the near future. You need devices that track where you look on the image and react in real time with natural focusing effects, so that the eyes cannot see a difference from real focusing. Only with such techniques can you get good 3D without damaging your eyes. Better to look at a 2D screen, because we are used to it and it causes no weird eye or brain malfunctions of the kind that make epileptic episodes more likely. Sorry: a nice effect for a while, but decades from being sufficient. It is a gimmick. You will switch 3D off after several sessions. It looks cool, especially in flight simulators, no doubt. But your eye cannot do what it wants to do: look around and focus on different things at high speed. You make your eyes even more confused, because your brain simply knows that something is going wrong. :)
Everything in a 3D image without depth of field is in focus; you are talking about a convergence problem, or a shutter/frame-rate problem, which are different problems.
3D is a quite new, ultra-immersive way of playing; that's what my sentence "3D is the present and the future of video games" is about. 3D is not a new technology, but it is a technology that is now widely available, because now it's time to make money with it, and a lot of people are trying to solve the problems you're talking about. 3D viewing and HDR monitors are definitely the future of gameplay and video entertainment.
SLI/X-Fire support is needed. I have GTX 280 AMP! cards in SLI. I can play many new games only in SLI (with decent settings). I will be lucky to sell these cards for $80 each; a single 570 or 660 Ti with 2 GB+ is $300+.
95% of games in the last 3 years support SLI/X-Fire. So what is really the question? It should be standard. -=FA=- Klabo
Quote:
Perhaps we will soon see that in lifetimes too, designed to fail after some years or a certain number of hours. Mobile phones are already designed in that direction; they will break after 3 or 4 years. That is business. Concerning 3D: okay, I know what you want to say. Infinite depth without focus is used in 3D games. In films you often have to follow the focus chosen by the producer, and the parts of the scene behind it are washed out. Both infinite depth and producer-customised depth and focus are insufficient, unnatural methods of solving the problem. The eye simply behaves differently. With infinite depth, your eye can look at all the details, but the surroundings do not get washed out or go out of focus. That is the same problem, only in reverse. And the amount of deception your eye has to manage is so high that it is a very big problem. I talked to some specialists from the Fraunhofer institute, and they also mentioned technologies like eye tracking and eye-focus tracking, but this is decades away and still not working sufficiently well, because our eye is so damn fast. ;) So, simply a gimmick for the next 10 or 20 years. Fact!
Quote:
In summary, I understand you may not like SLI and don't use it yourself, but that doesn't mean it shouldn't be supported for the benefit of those that do. Boko.
I think the poll has served the purpose intended.
The results over the last couple of days arguably have not changed. How much is enough?
Where is the option "I play CloD with onboard graphics"???
:razz:
+1
P.S.: GTX 590. I don't know if it works in SLI mode.
So does this from Luthier's post mean that SLI is fixed for the next patch or not? It doesn't specifically say SLI, just NVIDIA:
"2) Will the next patch be fully NVidia certified, and will it have Crossfire support? NVidia – definitely. CF – still TBD."
Quote:
I have a setup like this, and in almost any game it can really achieve good results! Even in CoD I can fly over London at an average 33 fps in 5760x1080... with all settings at full. With the new patch I have no crashes after 1 hour of continuous play and no noticeable stutters... Not comparable, of course, with the 250 average in IL-2 1946, but you can't have it all ;) Overall it is very good! The GPUs rock in BF3 and ARMA II at 5760x1080, and the games are so smooth that you forget you're playing on such a big screen.
Quote:
What I don't know is: do the two cores of a single GTX 590 work like SLI, or is only one core working? But it is very good to know that in the future I can add one more GPU and get very good frames.
Quote:
If you want quad-SLI, I recommend buying a water-cooling system for the cards so you can overclock them to real GTX 580 levels or more. I am thinking of doing so in the near future. Otherwise you will have really big noise from the fan system, and you cannot overclock them as far; if you do, you will probably burn the cards... :sad: But be aware:
1. I have combined the cards with an i7 X980 and 12 GB of RAM, so the results I am getting from benchmarks and games are more than good, always in a triple-monitor setup.
2. I don't know your system specs, so I suppose you have one card and want to buy a second. From my point of view, do it only if you intend to play the game on triple monitors. When you play a game on a single monitor, even at 2560x1440, one card is enough to do the job (more than enough). Quad-SLI at full HD and higher resolutions is not much different from dual SLI.
With respect, 335th_GRDedalos
To add my two cents: I had a GTX 460 SLI setup but got so frustrated with the lack of SLI support and poor performance that I bought a GTX 670 to replace them... I would love to have the option of adding a second 670 sometime down the road, instead of having to replace a perfectly good card for a marginal performance boost simply because a game doesn't have a working implementation of a genuinely NICE feature to have...
|
Quote:
So it is anything but 4 cards running at slightly lower performance; they are running at very low performance. And the expensive GDDR5 VRAM is just wasted, having no influence. An expensive way. It is like buying another 100 GB of DDR3 RAM just to look good, with no benefit. Sorry, but do not say that SLI has only a small downgrade effect on the cards. It has a very big one!!!!
Single card here. AMD Sapphire 7990 ;)
|
Quote:
So it is not in between. Your 4x 580s with 1.5 GB will not have a single bit more than 1x 1.5 GB. That is SLI, and that needs to change; otherwise an upgrade simply gets far more expensive than a new single card! Here in CloD you can add 4 more 580s, still have the bottleneck, and still not run a really high resolution or a triple-monitor rig sufficiently. And to run one card at 100% and, with the VRAM issue included, 60-70% of the other 3 cards is not very intelligent. It is like buying four cars and throwing one straight into the trash; on top of that, you pay for fuel for 4 cars although 3 of them are full, and you leave that extra fuel untouched forever, except one third which you burn in your garden for nothing. Sounds weird and funny, but anyone intending to upgrade an old system with an additional SLI card should think about it. SLI is for enthusiasts who pimp their well-running rig with a second card for an extra bang (to run, for example, higher resolutions or triple monitors). Those guys know what they have to do, and they know SLI only makes sense after taking a look at the current card and its abilities: if it were too slow or had too little VRAM, they wouldn't use it to build an SLI rig. And I just want to tell you these facts before you all think it is a good idea to build SLI with 2x 460s because it is cheap. It is simply a bad idea for most of the innocent guys out there who do not know the facts of SLI and what you have to think about or how to evaluate it. Sorry, but please do not all buy SLI because some BF3 kiddies tell you to. You could end up very upset about your investment. :)
VRAM isn't the be-all and end-all; a crappy card with a load of VRAM is still a crappy card.
|
Quote:
Maxing out BF3 at 1920x1080 while maintaining a minimum of 60 fps requires a decent i7 and SLI 580s or better. SLI/Crossfire is not a bad idea IF you understand what it is and what its advantages and disadvantages are.
OMG, sorry, but you did not understand my intention. All I want to say is: all the guys with insufficient VRAM have to reduce settings!!! Some simply do not get that fact when buying an SLI setup! You can hang 200 video cards in CloD with no real benefit when the VRAM is 512 MB each. Some guys have SLI and keep saying that the fact that only one card's VRAM counts seems wrong to them. But it is not wrong. Buy SLI if your first card is still sufficient in most respects! Then you can benefit from the extra graphics power. And that is also what you said, Heavy Hemi!
When your initial card is running out of VRAM, for example, that should never lead to the decision to buy SLI on that basis. SLI will only be sufficient if your initial card is up to date for your games. So, as I said, SLI is normally not a real solution for upgrading very old cards. It is more an upgrade for those who want to max out their fps; normally those guys already have a good card, they know these conditions, and their buying decision is okay. But I currently see SLI users running 2x GTX 285s or 2x 460s and now complaining about stutters in CloD. I just say that they simply do not know their own rigs' possibilities. I also know kids without knowledge buying SLI without thinking about the preconditions for running it. They bought SLI because they saw a kid playing BF3 on YouTube under the headline "SLI rocks, awesome", but they are not aware of the problems and the initial questions you should ask yourself before buying an SLI upgrade. And all I can say to the complainers here: you are just uninformed if you think you can max out CloD with an insufficient SLI setup. In the case of CloD, the recommended card requirements (even for SLI) are VERY high. You need the VRAM, not only raw power.
I read somewhere about 2 GB of DDR3 on a GT 430.
Will that do it? Almost certainly not. The amount of RAM is maybe right, but the card is too weak, and that sort of RAM is too slow. You need not only the right card, but also the right sort of RAM, as well as enough of it.
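The point above is essentially a checklist: a card only qualifies if it clears every bar at once. A minimal sketch, where the threshold defaults and the GFLOPS/bandwidth figures for the example cards are my illustrative guesses, not official requirements:

```python
# Illustrative sketch: shader power, VRAM amount, AND memory bandwidth
# must all be sufficient. Thresholds and figures are assumptions.

def card_is_sufficient(gflops, vram_mb, bandwidth_gbs,
                       need_gflops=1000, need_vram_mb=1024, need_bw_gbs=100):
    return (gflops >= need_gflops
            and vram_mb >= need_vram_mb
            and bandwidth_gbs >= need_bw_gbs)

# A GT 430-class card with 2 GB of slow DDR3: plenty of RAM, but too
# weak and too little bandwidth, so still insufficient.
assert card_is_sufficient(270, 2048, 29) is False
# A GTX 580-class card clears all three bars.
assert card_is_sufficient(1580, 1536, 192) is True
```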
It was silly even to put more VRAM on the GTX 460 and GTX 560; the 1 GB cards always had faster memory and performed better. Maybe even the GTX 660, I'm not sure.
If you run out of frame buffer it will swap textures with the hard drive; a spinning disk will give a slight stutter. You shouldn't notice much with an SSD.
Quote:
I just always say the same thing: you didn't heed my advice and now you have problems. Stick with it, because there is no solution except buying a new card again. I cannot help you and do not want to repeat myself. You took the advice of a BF3 kid and now you have the problem; go and ask him how to solve it, not me. As I said, a YouTube comment counts for more than the opinion of an honest gaming friend who explains things. I cannot help dumb and ignorant people who will not listen. :) Again: SLI is in most cases not for upgrading, only for enhancing. If your card is rubbish and cannot run the game, a second one will not help.
The thing that prevents me from going to an SLI/X-Fire setup is all this talk of micro-stutter.
The last thing I need is to spend a healthy amount of money on a dual-card setup and experience micro-stutter. I know many claim never to have experienced it, but others have. How many users here with dual cards have experienced this phenomenon?
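For anyone unsure what micro-stutter actually is: average FPS can look fine while alternating long/short frame times feel jerky. A crude way to see it, with made-up frame-time samples (not measurements from this thread), is to look at the largest jump between consecutive frame times:

```python
# Illustrative sketch: same average frame rate, very different pacing.

def max_pacing_gap_ms(frame_times_ms):
    """Largest jump between consecutive frame times, in milliseconds."""
    return max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

even = [16.7, 16.6, 16.8, 16.7]    # ~60 fps, smooth pacing
jerky = [10.0, 23.0, 10.0, 23.0]   # ~60 fps average, classic micro-stutter
assert max_pacing_gap_ms(even) < 1.0
assert max_pacing_gap_ms(jerky) == 13.0
```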
Not so much a stut..tut...tut...er, more like render lag down low when flying at very high detail. X-Fire does make a considerable difference to performance now.
|
I still have micro-stutters now; I thought I had got rid of nearly all of them, but the last two patches have brought the problem back.
The promised SLI support from NVIDIA (Luthier has stated that NVIDIA have the information but now need to produce a profile for it) may cure what is left. The micro-stutters aren't noticeable when dogfighting at altitude, only when chasing a fleeing enemy down low. I have limited the amount of stutter by capping fps with the EVGA Precision tool, without any loss of performance. Normally I get 45-80 fps in the London Attack single mission without frame limiting, and 80-110 fps over water. I've limited fps to 35 and have smoother gameplay and very little stutter even down low, but I've had to experiment with a lot of settings to achieve that. CloD is the only modern sim I play that has this problem (certainly not RoF). SLI/X-Fire support has been promised for so long now that it is very frustrating for all of us who have built dual-card systems; let's hope the next NVIDIA driver release comes to the party. Luthier said they had also been in touch with AMD/Radeon about support for their product, but at that stage they hadn't had a reply.
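An external fps cap of the kind described above boils down to sleeping out the unused part of each frame's time budget, which evens out pacing at the cost of peak frame rate. A toy sketch (the `render` callback is a hypothetical stand-in for a frame of game work, not anything from EVGA's tool):

```python
import time

def run_frame_limited(render, target_fps, n_frames):
    """Render n_frames, sleeping so each frame takes >= 1/target_fps sec."""
    budget = 1.0 / target_fps
    for _ in range(n_frames):
        start = time.perf_counter()
        render()                      # hypothetical per-frame work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)

# Capping at 100 fps means 5 empty frames take at least ~50 ms in total.
t0 = time.perf_counter()
run_frame_limited(lambda: None, 100, 5)
assert time.perf_counter() - t0 >= 0.045
```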
Luthier has already posted the NVIDIA profile that will eventually become part of the official drivers. It works very well in SLI now, even in 4-way SLI.
If you're not capable of manually creating a profile using the info Luthier provided, then just wait for NVIDIA to add it to a driver soon. It does work very well now; the only limiting factor is that the rendering thread on the CPU is now completely maxed on my system, which means GPU usage sits around 50% for each GPU over complex terrain, though they all go up to 80% on occasion when the scene is less complex. So, bottom line, it works exactly as intended; the limiting factor for performance increases pretty much comes down to how fast your CPU is. My CPU is overclocked to 5 GHz, which explains why I get more of a performance increase than others. To be honest, I think today's CPUs have only enough horsepower to drive a 2-way SLI system to 80% usage or higher regularly in CloD, so anything more than 2 cards is pretty much wasted on CloD because of its high CPU clock-speed dependence.
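The bottleneck arithmetic in the post above can be written out explicitly. The round numbers are mine, chosen to match the "around 50% per GPU" report, not measurements:

```python
# Illustrative sketch: if the CPU render thread can only issue frames at
# cpu_fps, each GPU in an n-way setup gets proportionally less work.

def gpu_utilization(cpu_fps, gpu_fps_each, n_gpus):
    """Fraction of each GPU used when the CPU caps the frame rate."""
    gpu_capacity = gpu_fps_each * n_gpus
    return min(cpu_fps, gpu_capacity) / gpu_capacity

# CPU feeds 60 fps; each GPU alone could render 60 fps. A 2-way setup
# then sits at 50% per GPU, matching the usage reported above.
assert gpu_utilization(60, 60, 2) == 0.5
# With 4 GPUs it drops to 25%, i.e. the extra cards are wasted.
assert gpu_utilization(60, 60, 4) == 0.25
```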
And also 2 cards below 1.5 GB of VRAM :) I just ran CloD again on highest settings with a single GPU at about 50 fps average, but saw a reduction in VRAM use compared to my last (but very old) tests. It uses up to 1.8 GB of VRAM at normal 1080p.
So a card like Rick's 450 with 1 GB will definitely have trouble; 1 GB is too little, and even the 1.5 GB of the standard 680 is marginal. Note that some of the biggest stutters occur with streaming. I tested with my i7 920 @ 4 GHz and an SSD; some stutters are simply not avoidable on some system configs. So I will need to think about a better CPU to reduce them, although my CPU runs CloD sufficiently well; I am not at its performance bottleneck... So, Rick, if you want to get rid of VRAM stutters, you will have to tweak texture load down to 1 GB of VRAM usage. That means reducing texture quality ;)
Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
Copyright © 2007 Fulqrum Publishing. All rights reserved.