
Go Back   Official Fulqrum Publishing forum > Fulqrum Publishing > IL-2 Sturmovik

IL-2 Sturmovik The famous combat flight simulator.

  #11  
Old 12-16-2010, 11:51 AM
Baron Baron is offline
Approved Member
 
Join Date: Dec 2007
Posts: 705
Default

Quote:
Originally Posted by Triggaaar View Post
Well, it's unlikely the 580 will be the fastest card when SOW is released, since the 6990 should be out by then. But if you want a single GPU, and have loads of money, the 580 should be the best.

I wouldn't bother with Tom's Hardware anymore - it was great some years ago, but it was bought out and the reviews have gone downhill, particularly on Nvidia vs ATI/AMD.

Indeed, the lack of support from AMD is rubbish (I wonder if the unreturned emails can be linked to the AMD/ATI takeover - staff more worried about their jobs than about helping a developer at the time?). But we won't really know what's best until release time - it'll be a DX11 game after all, so it should run equally well on either brand, as long as AMD eventually catch up and work with Oleg.
I doubt very much the 6990 will be much faster in general. As it stands now, AMD have pushed the 6950/70 to the limit in regards to TDP (power/heat). The cards are essentially down-clocked just to pass AMD's optimistic low-consumption mantra. This is why they are so much slower than everyone expected, imo. A 6990 will probably be even more hampered.

This speaks volumes imo:

"Should your shiny new 6000-series card’s death turn out to be PowerTune-related, a warranty won’t cover it. Sounds a little like Nissan equipping its GT-R with launch control, and then denying warranty claims when someone pops the tranny. Nevertheless, you’ve been warned."


Stock overclocking gives pretty much zero boost. The only time anything happens is when you crank PowerTune up +20% (and oops, there goes the warranty out the window), which, if I'm not mistaken, is the same as using a voltage tuner = more heat/power consumption = AMD mantra out the window.


This is how I interpret it; please prove me wrong if possible, because it's looking bleak for AMD in my opinion.


Anyway, in the end it won't make a difference. People buy the card they want no matter what, so it will probably even itself out in the end.

Last edited by Baron; 12-16-2010 at 12:02 PM.
  #12  
Old 12-16-2010, 12:16 PM
albx albx is offline
Approved Member
 
Join Date: Jun 2010
Location: Italy
Posts: 716
Default

and... here the ATI/AMD vs. Nvidia battle begins again...
  #13  
Old 12-16-2010, 12:34 PM
Flanker35M Flanker35M is offline
Approved Member
 
Join Date: Dec 2009
Location: Finland
Posts: 1,806
Default

S!

Hopefully we do NOT see the "TWAT" symbol when starting up SoW one day. It would be a big error to tie a game already in a small niche to just one brand... again. They should have learned that from IL-2.

Blaming AMD for NOT developing SoW with Oleg is pretty darn useless, as we do NOT know what goes on deep behind the scenes. We get tidbits of info, but Oleg does NOT share confidential information with the community. Be sure!

I read at some point that Luthier (if I remember right) had a machine with an AMD HD 5870 for testing SoW. Both AMD and NV support DX9-11, which SoW is to use, so I would not worry about it.

I run an AMD system and have a GTX 580 too, and I must say that in the games I play (VSync is ON) I see no difference in performance AT ALL between the cards. IL-2 is an exception, but it is "TWAT" certified, as we all know.

Benchmarks are a different matter, but if BOTH brands can push a steady 60+ fps at your preferred resolution/settings, then it is just stupid to argue over whether one brand can do 200 fps and the other 190 fps... as you cannot see the difference at all.
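A quick sketch of the frame-time arithmetic behind that last point (the fps figures are the post's own hypotheticals, and a 60 Hz refresh rate is assumed):

```python
# Frame-time gap between a hypothetical 190 fps card and a 200 fps card,
# versus the 60 Hz refresh interval that VSync locks you to anyway.
def frame_time_ms(fps):
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

gap = frame_time_ms(190) - frame_time_ms(200)  # ~0.26 ms per frame
vsync = frame_time_ms(60)                      # ~16.7 ms between displayed frames

print(f"190 vs 200 fps differ by {gap:.2f} ms per frame")
print(f"VSync at 60 Hz shows a new frame every {vsync:.1f} ms")
```

A quarter of a millisecond per frame, set against a 16.7 ms display interval, is the "difference" being argued over.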
  #14  
Old 12-16-2010, 01:59 PM
Triggaaar Triggaaar is offline
Approved Member
 
Join Date: Sep 2010
Posts: 535
Default

Quote:
Originally Posted by Baron View Post
I doubt very much the 6990 will be much faster in general.
The 5970 is already up there with the 580, and the 5xxx series had rubbish scaling. The 6970 in CrossFire matches the 580 in SLI, so a dual-GPU 69xx card will easily be faster than the 580. It will cost more too, and have the issues associated with dual cards. A dual GPU will obviously use more power than a single one, and they won't apply the same restrictions.

Quote:
Stock overclocking gives pretty much zero boost.
Says who? I haven't got one, so I can't test it, but Gibbo (an OcUK employee) tested the 69xx series before release and wrote: "Overclocking - Both cards did not support voltage tweak *YET*, hence the best stable speeds I could reach were 930MHz Core and 5950MHz Memory on the 6950 and 980MHz Core and 6200MHz Memory on the 6970. I believe with voltage control we will see above 1000MHz and 6500MHz on the 6970's."
So they clearly do overclock.
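For a rough sense of how much headroom those quoted clocks represent, here is a small sketch; the reference core clocks of roughly 800 MHz (HD 6950) and 880 MHz (HD 6970) are my assumption, not figures from the post:

```python
# Percentage overclock implied by the clocks Gibbo quoted, relative to
# assumed reference core clocks (~800 MHz HD 6950, ~880 MHz HD 6970).
def headroom_pct(stock_mhz, reached_mhz):
    """Gain of the reached clock over stock, as a percentage."""
    return (reached_mhz - stock_mhz) / stock_mhz * 100.0

print(f"6950: {headroom_pct(800, 930):.1f}% over stock")  # ~16%
print(f"6970: {headroom_pct(880, 980):.1f}% over stock")  # ~11%
```

Either way, double-digit percentage gains without voltage control hardly look like "zero boost".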

Quote:
This is how I interpret it; please prove me wrong if possible, because it's looking bleak for AMD in my opinion.
It doesn't look bleak at all. Both manufacturers have good cards out. The 580 is the fastest single GPU, and will be for 12 months or so. But so what? It's expensive. The 570 and 6970 perform similarly, and which is right for you (if that's your price range) depends on the games you play, your resolution, the number of screens, and whether you're likely to run 2 cards at any point. Under the 570/6970 the best card is the 6950, and that will probably be matched by the 560 shortly. It doesn't look bleak for either of them.
  #15  
Old 12-16-2010, 02:00 PM
Meusli Meusli is offline
Approved Member
 
Join Date: Oct 2007
Posts: 376
Default

Well, the question I want answered is: will the game use PhysX at any point? Because that's a great Nvidia selling point.
  #16  
Old 12-16-2010, 02:03 PM
Triggaaar Triggaaar is offline
Approved Member
 
Join Date: Sep 2010
Posts: 535
Default

I doubt it. Oleg certainly said it doesn't at the moment.
  #17  
Old 12-16-2010, 02:29 PM
kalimba
Guest
 
Posts: n/a
Default SOW and NVidia

Since this is a SOW-related debate, I would say that Nvidia seems to be the best choice for SOW... Otherwise, Oleg would have mentioned AMD's interest in the game... Be sure that if we have graphics issues with the game, they will fall on AMD's shoulders fast enough...
And about the price... We are talking about $150.00 between the very top GTX 580 and whatever comes under it... We have been waiting for this game for, what, about 5 years?... That's $30.00 a year that I have put in my piggybank... So, yep, the Nvidia is the fastest for the upcoming months...
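For what it's worth, the piggybank arithmetic above checks out (the dollar figures are the poster's own):

```python
# $150 price gap spread over the ~5-year wait for the game.
price_gap_usd = 150.00   # gap between the top GTX 580 and the card below it
wait_years = 5
print(f"${price_gap_usd / wait_years:.2f} per year")  # $30.00 per year
```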

Salute !
  #18  
Old 12-16-2010, 02:42 PM
addman's Avatar
addman addman is offline
Approved Member
 
Join Date: Mar 2010
Location: Vasa, Finland
Posts: 1,593
Default

Isn't it very simple, though? Buy the best you can for the money you have! I can't afford a new monster card at the moment, but even if I could, why would I want to fork out X amount of € now? SoW is still far off. General advice: wait until SoW is released and try it with your current hardware. You might be surprised/disappointed by the performance, but at least you'll know what the game demands, IMHO of course.
  #19  
Old 12-16-2010, 03:04 PM
speculum jockey
Guest
 
Posts: n/a
Default

Quote:
Originally Posted by addman View Post
Isn't it very simple, though? Buy the best you can for the money you have! I can't afford a new monster card at the moment, but even if I could, why would I want to fork out X amount of € now? SoW is still far off. General advice: wait until SoW is released and try it with your current hardware. You might be surprised/disappointed by the performance, but at least you'll know what the game demands, IMHO of course.
Good advice. Everyone always quotes Nvidia/ATI's flagship cards, but really, few people buy them. They're just bragging rights, because anyone with half a brain waits for the "middle of the road" card released 3-4 months later that has the same or better performance at half the price. I'm waiting a few months after it's out, maybe for a patch or two. Then, after all the reviews and tests are out, I'm going to pick the card (whatever brand) that runs it best for the money I'm willing to spend.

As for AMD/ATI not sending a card or a rep over, I can totally understand that.

"Hey, we got a call from Russia, about SOW."
"What game?"
"Storm of war, the guys that made IL-2, it's been in development for the past 5 years."
"A flight sim. . . been developed for 5 years? Forget about it."

Sounds like ATI missed out on a highly lucrative opportunity! It's a wonder they own over half the market. Oh wait! They don't have to do jack, since anyone making a game would be stupid not to use ATI samples to appeal to over half the market.

Hmm... Spend money by sending a rep and free cards over so you get a logo on boot-up? Or save your money and time on a niche release and have the developers do it themselves?
  #20  
Old 12-16-2010, 05:54 PM
Bricks Bricks is offline
Approved Member
 
Join Date: Nov 2010
Location: Online
Posts: 51
Default

Quote:
Originally Posted by Meusli View Post
Well, the question I want answered is: will the game use PhysX at any point? Because that's a great Nvidia selling point.
You can enable similar rendering on AMD's cards as well, so that argument is a little thin.

Another good hint that neither one is better than the other in general. Both have their pros and cons.

If everybody accepted that, we could end the discussion here.



After all, since there is no info on how well the final SOW will run with either card, what is the point in making a fuss anyway?