
View Full Version : Good thing SOW is Nvidia's friendly !


kalimba
12-16-2010, 02:55 AM
http://www.tomshardware.com/reviews/radeon-hd-6970-radeon-hd-6950-cayman,2818.html

Not quite there AMD ! Seems like the GTX580 will be the beast to fly SOW at release time ...:rolleyes:

Salute !

TeeJay82
12-16-2010, 05:27 AM
oh happy dayyyys

JAMF
12-16-2010, 09:05 AM
So limited, posting one link. :P

AnandTech (http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950)
HardOCP (http://www.hardocp.com/article/2010/12/14/amd_radeon_hd_6970_6950_video_card_review)
Guru3D (http://www.guru3d.com/article/radeon-6950-6970-review/)
the Tech Report (http://techreport.com/articles.x/20126)
PC Perspective (http://www.pcper.com/article.php?aid=1051)
HotHardware (http://hothardware.com/reviews/AMD-Radeon-HD-6970--6950-GPU-Reviews-Enter-Cayman/)
Hexus (http://www.hexus.net/content/item.php?item=27983)
Dutch review on Tweakers (http://tweakers.net/reviews/1916/hd-6950-en-hd-6970-het-nieuwe-high-end-duo-van-amd.html)
Tweakers Google translated (http://translate.google.com/translate?js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&sl=nl&tl=en&u=http%3A%2F%2Ftweakers.net%2Freviews%2F1916%2Fhd-6950-en-hd-6970-het-nieuwe-high-end-duo-van-amd.html)
German review on ComputerBase (http://www.computerbase.de/artikel/grafikkarten/2010/test-amd-radeon-hd-6970-und-hd-6950/#abschnitt_einleitung)
ComputerBase Google translated (http://translate.google.com/translate?js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&sl=de&tl=en&u=http%3A%2F%2Fwww.computerbase.de%2Fartikel%2Fgra fikkarten%2F2010%2Ftest-amd-radeon-hd-6970-und-hd-6950%2F%23abschnitt_einleitung)
HardwareCanucks (http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/38899-amd-radeon-hd-6970-hd-6950-review-13.html)
Uncle Tom's (http://www.tomshardware.com/reviews/radeon-hd-6970-radeon-hd-6950-cayman,2818.html)
PCPro.co.uk (http://www.pcpro.co.uk/reviews/graphics-cards/363682/amd-radeon-hd-6970)
Legit Reviews (http://www.legitreviews.com/article/1488/1/)

BitTech 6970 review (http://www.bit-tech.net/hardware/graphics/2010/12/15/ati-radeon-hd-6970-review/1)
TechPowerUp 6970 review (http://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/)
TechSpot 6970 review (http://www.techspot.com/review/348-amd-radeon-6970/)
Benchmark Reviews (http://benchmarkreviews.com/index.php?option=com_content&task=view&id=605&Itemid=72)

Qpassa
12-16-2010, 09:17 AM
I hope SoW will be neutral

albx
12-16-2010, 09:20 AM
....

JG52Krupi
12-16-2010, 09:24 AM
The 5970 can still give the 580 a run for its money, so I doubt I'll be looking for a new card any time soon.

JG52Krupi
12-16-2010, 09:33 AM
I hope SoW will be neutral

Yeah, it would be foolish to go exclusively with Nvidia; their cards are expensive. AMD is going for cheap, efficient and still quite high-performance cards, so IMHO you will find that AMD has a larger share of the market...

Feathered_IV
12-16-2010, 10:42 AM
It was very interesting reading the SimHQ interview with Oleg where he said he was in constant contact with Nvidia, who have assisted greatly in helping him get the most out of NV technology for SoW. They even sent tech reps to visit them in Moscow. Even more interesting was that Oleg said he has received no such help from AMD/ATI. They won't even answer his emails!

I know what brand of card I'll be buying for SoW. :rolleyes:

Codex
12-16-2010, 11:28 AM
It was very interesting reading the SimHQ interview with Oleg where he said he was in constant contact with Nvidia, who have assisted greatly in helping him get the most out of NV technology for SoW. They even sent tech reps to visit them in Moscow. Even more interesting was that Oleg said he has received no such help from AMD/ATI. They won't even answer his emails!

I know what brand of card I'll be buying for SoW. :rolleyes:

They're still not getting involved with developers? I thought they changed all that and increased their level of engagement last year.

Triggaaar
12-16-2010, 11:34 AM
http://www.tomshardware.com/reviews/radeon-hd-6970-radeon-hd-6950-cayman,2818.html

Not quite there AMD ! Seems like the GTX580 will be the beast to fly SOW at release time ...:rolleyes:

Well it's unlikely the 580 will be the fastest card when SOW is released, since the 6990 should be out by then. But if you want a single GPU, and have loadsa money, the 580 should be the best.

I wouldn't bother with Toms Hardware anymore - was great some years ago, but was bought out and reviews have gone downhill. Particularly nvidia vs ATI/AMD.

It was very interesting reading the SimHQ interview with Oleg where he said he was in constant contact with Nvidia, who have assisted greatly in helping him get the most out of NV technology for SoW. They even sent tech reps to visit them in Moscow. Even more interesting was that Oleg said he has received no such help from AMD/ATI. They won't even answer his emails!

I know what brand of card I'll be buying for SoW. :rolleyes:

Indeed, the lack of support from AMD is rubbish (I wonder if emails not returned can be linked to the AMD ATI takeover - staff more worried about their jobs than helping a developer at the time?). But we won't really know what's best until release time - it'll be a DX11 game after all, so it should run equally on either brand, as long as AMD eventually catch up and work with Oleg.

Baron
12-16-2010, 11:51 AM
Well it's unlikely the 580 will be the fastest card when SOW is released, since the 6990 should be out by then. But if you want a single GPU, and have loadsa money, the 580 should be the best.

I wouldn't bother with Toms Hardware anymore - was great some years ago, but was bought out and reviews have gone downhill. Particularly nvidia vs ATI/AMD.

Indeed, the lack of support from AMD is rubbish (I wonder if emails not returned can be linked to the AMD ATI takeover - staff more worried about their jobs than helping a developer at the time?). But we won't really know what's best until release time - it'll be a DX11 game after all, so it should run equally on either brand, as long as AMD eventually catch up and work with Oleg.

I doubt very much 6990 will be much faster in general. As it stands now AMD have pushed the 6950/70 to the limit in regards to TDP (power/heat). The cards are essentially down-clocked just to pass AMD's optimistic low-consumption mantra. This is why they are so much slower than everyone expected imo. A 6990 will probably be even more hampered.

This speaks volumes imo:

"Should your shiny new 6000-series card’s death turn out to be PowerTune-related, a warranty won’t cover it. Sounds a little like Nissan equipping its GT-R with launch control, and then denying warranty claims when someone pops the tranny. Nevertheless, you’ve been warned."


Stock overclocking gives pretty much 0 boost. The only time anything happens is when you crank up the PowerTune +20% (and oops, there goes the warranty out the window) which, if I'm not mistaken, is the same as using a voltage tuner = more heat/power consumption = AMD mantra out the window.


This is how I interpret it; please show me wrong if possible, because it's looking bleak for AMD in my opinion.


Anyway, in the end it won't make a difference. People buy the card they want no matter what, so it will probably even itself out in the end.

albx
12-16-2010, 12:16 PM
and... here begins again the ATI/AMD vs. Nvidia battle.... :rolleyes:

Flanker35M
12-16-2010, 12:34 PM
S!

Hopefully we do NOT see the "TWAT" symbol when starting up SoW one day. It would be a big error to limit a game already in a small niche to just one brand..again. Should have learned that from IL-2.

Blaming AMD for NOT developing SoW with Oleg is pretty darn useless, as we do NOT know anything about what goes on deep behind the scenes. We get tidbits of info, but Oleg does NOT share confidential information with the community. Be Sure!

I read at some point that Luthier (if I remember right) had a machine with an AMD 5870HD for testing SoW. Both AMD and NV support DX9-11, which are to be used in SoW, so I would not worry about it :)

I run an AMD system, have a 580GTX too, and must say that in the games I play (VSync is ON) I see no difference in performance AT ALL between the cards. IL-2 is an exception but it is "TWAT" certified as we all know.

Benchmarks are different, but if BOTH brands can push a steady 60fps+ at your preferred resolution/settings then it is just stupid to argue whether one brand can do 200fps and the other 190fps..as you can not see the difference at all ;)

Triggaaar
12-16-2010, 01:59 PM
I doubt very much 6990 will be much faster in general.

The 5970 is already up with the 580, and the 5xxx series had rubbish scaling. The 6970 in crossfire matches the 580 in SLI, so a dual GPU 69xx series will easily be faster than the 580. It will cost more too, and have the dual-card associated issues. A dual GPU will obviously use more power than a single, and they won't apply the same restrictions.

Stock overclocking gives pretty much 0 boost.

Says who? I haven't got one, so I can't test it, but Gibbo (OcUK employee) tested the 69xx series before release and wrote "Overclocking - Both cards did not support voltage tweak *YET* hence the best stable speeds I could reach was 930MHz Core and 5950MHz Memory on the 6950 and 980MHz Core and 6200MHz Memory on the 6870. I believe with voltage control we will see above 1000MHz and 6500MHz on the 6970's."
So they clearly do overclock.

This is how I interpret it; please show me wrong if possible, because it's looking bleak for AMD in my opinion.

It doesn't look bleak at all. Both manufacturers have good cards out. The 580 is the fastest single GPU, and will be for 12 months or so. But so what, it's expensive. The 570 and 6970 perform similarly, and which is right for you (if that's your price range) depends on the games you play, resolution, number of screens, and whether you're likely to have 2 cards at any point. Under the 570/6970 the best card is the 6950, and that will probably be matched by the 560 shortly. It doesn't look bleak for either of them.

Meusli
12-16-2010, 02:00 PM
Well, the question I want answered is will the game use PhysX at any point, because that's a great Nvidia selling point.

Triggaaar
12-16-2010, 02:03 PM
I doubt it. Oleg certainly said it doesn't at the moment.

kalimba
12-16-2010, 02:29 PM
Since this is a SOW-related debate, I would say that NVidia seems to be the best choice for SOW... Otherwise, Oleg would have mentioned AMD's interest in the game... Be sure that if we have graphic issues with the game, it will fall on AMD's shoulders fast enough...
And about the price... We are talking about $150.00 between the very top GTX580 and whatever is coming under it... We have been waiting for this game for about, what? 5 years?... That's $30.00 a year that I have put in my piggybank... So, yep, the NVidia is the fastest for the upcoming months...

Salute !

addman
12-16-2010, 02:42 PM
Isn't it very simple though? Buy the best you can for the money you have! I can't afford a new monster card at the moment but even if I could, why would I want to fork out X amount of € now? SoW is still far off. General advice: wait till SoW is released and try it with your current hardware. You might be surprised/disappointed at the performance, but at least you'll know what the game demands, IMHO of course.

speculum jockey
12-16-2010, 03:04 PM
Isn't it very simple though? Buy the best you can for the money you have! I can't afford a new monster card at the moment but even if I could, why would I want to fork out X amount of € now? SoW is still far off. General advice: wait till SoW is released and try it with your current hardware. You might be surprised/disappointed at the performance, but at least you'll know what the game demands, IMHO of course.

Good advice. Everyone always quotes Nvidia/ATI's flagship card, but really few people buy them. They're just bragging rights because anyone with 1/2 a brain waits for the next "middle of the road" card release 3-4 months later that has the same or better performance at 1/2 the price. I'm waiting a few months after it's out. Maybe wait for a patch or two. Then after all the reviews and tests are out I'm going to pick the card (whatever brand) that runs it best for the money I'm willing to spend.

As for AMD/ATI not sending a card or a rep over, I can totally understand that.

:grin:"Hey, we got a call from Russia, about SOW."
:cool:"What game?"
:grin:"Storm of war, the guys that made IL-2, it's been in development for the past 5 years."
:cool:"A flight sim. . . been developed for 5 years? Forget about it."

Sounds like ATI missed out on a highly lucrative opportunity! :rolleyes: It's a wonder they own over 1/2 the market. Oh wait! They don't have to do jack since anyone making a game would be stupid to not use ATI samples to appeal to over half the market.

Hmm. . . Spend money by sending a rep and free cards over so you get a logo on boot-up? Or save your money and time on a niche release and have the developers do it themselves?

Bricks
12-16-2010, 05:54 PM
Well, the question I want answered is will the game use PhysX at any point, because that's a great Nvidia selling point.

You can enable similar rendering on AMD's cards as well, so that argument is a little thin.

Another good hint that neither is generally better than the other. Both have their pros and cons.

If everybody would accept that, we could end the discussion here.



After all, since there is no info on how well the final SOW will run with either card, what is the point in making a fuss anyway?

robtek
12-16-2010, 06:31 PM
Rightly said!
The correct thread-title should be: Nvidia is SOW friendly.

T}{OR
12-16-2010, 07:00 PM
Important thing to note when looking at all those reviews (e.g. when comparing the two current rivals, the HD 6970 vs. GTX 570):

GTX 570:
quieter & colder
better value for money

HD 6970:
hotter and louder
has more RAM - must have if you're playing on more than one monitor


Rightly said!
The correct thread-title should be: Nvidia is SOW friendly.

Oleg himself confirmed that he will be sending early versions of the game to Nvidia for optimizing. That was over a month ago.


Performance wise, both cards are almost equal, the only difference being in certain games for which one or the other is better optimized. Since the 570 is an nVidia card, the choice is obvious here.

However, the real diamond here is the HD 6950. In CF its performance is awesome. Decisions, decisions... :)

JG52Krupi
12-16-2010, 07:04 PM
Important thing to note when looking at all those reviews (e.g. when comparing the two current rivals, the HD 6970 vs. GTX 570):

GTX 570:
quieter & colder
better value for the money

HD 6970:
hotter and louder
has more RAM - must have if you're playing on more than one monitor




Oleg himself confirmed that he will be sending early versions of the game to Nvidia for optimizing. That was over a month ago.


Performance wise, both cards are almost equal, the only difference being in certain games for which one or the other is better optimized. Since the 570 is an nVidia card, the choice is obvious here.

However, the real diamond here is the HD 6950. In CF its performance is awesome. Decisions, decisions... :)

LOL when has nvidia ever been the cheaper option :D

T}{OR
12-16-2010, 07:05 PM
LOL when has nvidia ever been the cheaper option :D

Apparently Hell froze over, as 570s are priced lower than 6970s. ;)

JG52Krupi
12-16-2010, 07:11 PM
Apparently Hell froze over, as 570s are priced lower than 6970s. ;)

Dammit, I must have slept myself into a parallel dimension again... :-?

robtek
12-16-2010, 08:20 PM
The prices I see atm are HD 6970 330€ and GTX 570 350€!

T}{OR
12-16-2010, 08:28 PM
I only quoted what the various reviews on the first page said. If that is the case then AMD made a good choice. Or just your retailer. :)

swiss
12-16-2010, 08:46 PM
Good advice. Everyone always quotes Nvidia/ATI's flagship card, but really few people buy them. They're just bragging rights because anyone with 1/2 a brain waits for the next "middle of the road" card release 3-4 months later that has the same or better performance at 1/2 the price. I'm waiting a few months after it's out. Maybe wait for a patch or two. Then after all the reviews and tests are out I'm going to pick the card (whatever brand) that runs it best for the money I'm willing to spend.

As for AMD/ATI not sending a card or a rep over, I can totally understand that.

:grin:"Hey, we got a call from Russia, about SOW."
:cool:"What game?"
:grin:"Storm of war, the guys that made IL-2, it's been in development for the past 5 years."
:cool:"A flight sim. . . been developed for 5 years? Forget about it."

Sounds like ATI missed out on a highly lucrative opportunity! :rolleyes: It's a wonder they own over 1/2 the market. Oh wait! They don't have to do jack since anyone making a game would be stupid to not use ATI samples to appeal to over half the market.

Hmm. . . Spend money by sending a rep and free cards over so you get a logo on boot-up? Or save your money and time on a niche release and have the developers do it themselves?


Really?
You may have noticed the PC market is not emerging but shrinking.
As a result any manufacturer would be well advised to fill any niche they can find or are offered.
Not at any price of course.

The price of the cards is peanuts: they would need maybe 6 cards, and even if those retail at $600, the cost to AMD will be around $150 apiece.
Big bucks, huh?

As for sending a rep over - AMD have a Moscow office...

BTW: I know of a few bicycle companies who had exactly the attitude you suggested - all of them went into Chapter 11.
Rock Shox and Cannondale are probably the most famous of them.

Chivas
12-16-2010, 11:13 PM
I couldn't care less which company makes the fastest GPU. The competition should make things cheaper for all of us. I buy whichever GPU is faster at the time I'm upgrading. I like the idea of Nvidia working with the SOW developers. This could lead to a situation similar to when Nvidia cards were better than ATI cards at rendering water in IL-2.

I know I will be upgrading quickly after SOW is released, and I see the first benchmarks.

Blackdog_kt
12-16-2010, 11:53 PM
S!

Hopefully we do NOT see the "TWAT" symbol when starting up SoW one day. It would be a big error to limit a game already in a small niche to just one brand..again. Should have learned that from IL-2.

Blaming AMD for NOT developing SoW with Oleg is pretty darn useless, as we do NOT know anything about what goes on deep behind the scenes. We get tidbits of info, but Oleg does NOT share confidential information with the community. Be Sure!

I read at some point that Luthier (if I remember right) had a machine with an AMD 5870HD for testing SoW. Both AMD and NV support DX9-11, which are to be used in SoW, so I would not worry about it :)

I run an AMD system, have a 580GTX too, and must say that in the games I play (VSync is ON) I see no difference in performance AT ALL between the cards. IL-2 is an exception but it is "TWAT" certified as we all know.

Benchmarks are different, but if BOTH brands can push a steady 60fps+ at your preferred resolution/settings then it is just stupid to argue whether one brand can do 200fps and the other 190fps..as you can not see the difference at all ;)

Isn't it very simple though? Buy the best you can for the money you have! I can't afford a new monster card at the moment but even if I could, why would I want to fork out X amount of € now? SoW is still far off. General advice: wait till SoW is released and try it with your current hardware. You might be surprised/disappointed at the performance, but at least you'll know what the game demands, IMHO of course.

Good advice. Everyone always quotes Nvidia/ATI's flagship card, but really few people buy them. They're just bragging rights because anyone with 1/2 a brain waits for the next "middle of the road" card release 3-4 months later that has the same or better performance at 1/2 the price. I'm waiting a few months after it's out. Maybe wait for a patch or two. Then after all the reviews and tests are out I'm going to pick the card (whatever brand) that runs it best for the money I'm willing to spend.



That's pretty much the way I see it as well. I've had nVidia cards all my life but got an ATI for the first time when I bought my latest system, simply because it was better and cheaper than nVidia's offerings for my preferred price range at that point in time.

All I care about is having a card that gives me about 50-60 FPS in mid-high settings or 35-50 FPS in high settings (not perfect for now, just a bit more than middle of the road), runs cool and low on wattage and is not terribly expensive. I have a single 1680x1050 22" monitor as well, so it's not like I'm going to need something that can push an awful lot of pixels anytime soon; in fact I will probably bring my RAM up from 3 to 6GB before I even consider swapping my ATI 4890 1GB.

Also, the thread title is misleading as there's no official verdict on which brand runs SoW better and the reason is simple: we don't have SoW yet to run any benchmarks on. All we know is nVidia is interested in pushing some cards to the flight sim crowd, but we don't know how well their architecture works with the game engine or what AMD will or won't do.

IL2 is nVidia friendly due to OpenGL. SoW is a DirectX game engine, so as long as both GPU manufacturers come up with driver updates to correct any possible glitches it will be just fine either way. And that's the way it should be, because lack of competition only hurts us, the consumers. For example, if the end result of picking one brand is a gain of 10 FPS for an extra $150 then they can keep their extra 10 FPS; I can wait and run things in lower detail for another 6 months before things get cheaper ;)

In any case, there's a lot of speculating and quoting benchmarks from games that are fundamentally different from flight sims (like first person shooters, for example) in the way they work and what kind of resources they depend on. Well, I don't need 150 FPS; give me 50 FPS but stable/constant frame rates so as not to mess up my gunnery, some fast RAM to shorten the map loading times, and that's what I need for my flight simming. We don't really know much about the kind of graphics features the SoW engine will or will not use, or how much it will benefit from each GPU architecture. Case in point, the excitement about tessellation (used mainly in shooters up to now) until Oleg Maddox himself went forward and said that SoW won't use any tessellation at all.

The whole situation reminds me of a Greek saying that goes "the fish are still in the water but the frying pan is already on the fire" :grin:

Flanker35M
12-17-2010, 06:59 AM
S!

Well summed up Blackdog. People rave about FPS shooter benchmarks, snip parts suitable for them and tout either brand being superior to another. Which is pretty useless.

Going to the bang for the buck thing. Take two AMD 6950HDs: they cost less than a 580GTX and kick the living crap out of it in performance. Even a 580GTX in SLI is only marginally faster than the 6950HD in XFire. The 6970HD is equal or even faster due to the much improved scaling AMD has managed to achieve. Even so, with the 6970HD you save almost 300USD compared to 580GTX SLI..with that money you can get an SSD to boost loading times and/or more RAM ;)

It is all about what you want to invest. Let's take an example. I have now an AMD Phenom II X6 1090T BE which loses to the Intel 980X in raw performance. But when looking at what I get for the money AMD wins. Here the AMD costs ~230€ vs. ~930€ of the Intel 980X. So a whopping 700€ difference there.

Motherboards for both are roughly the same price (X58 and 890FX) so not much difference. Intel uses triple channel so needs 3 RAM sticks where AMD with its dual channel takes 2. A small saving there too, yet insignificant IMO.

Now take the 580GTX which here is ~500€ vs. AMD 6970HD at ~390€. I save 110€ on the spot. So far I have spent 810€ less on the AMD than an Intel rig with nVidia. For the 6970HD my 650W Corsair PSU is enough, for the 580GTX safer bet is 700W+, again some tens, even over 100€ depending on PSU, more if need to upgrade.

So let's do some calculations and conclusions. I buy Intel 980X for 930€ + 580GTX for 500€ totalling 1430€. Mobo/RAM are not an issue, quite same prices. Now I buy AMD X6 1090T for 230€ + 6970HD for 390€ totalling 620€. The difference is 810€! With that I can add another 6970HD, a better PSU for XFire, a SSD drive and STILL costing LESS than the Intel/nVidia rig..and being comparable or even faster performance wise.
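To make that comparison explicit, here is the same arithmetic as a quick Python sketch; the euro figures are simply the approximate local prices quoted in the post above, so treat them as placeholders rather than current prices:

```python
# Rough build-cost comparison using the approximate street prices quoted above (EUR).
intel_build = {"i7 980X": 930, "GTX 580": 500}
amd_build = {"Phenom II X6 1090T": 230, "HD 6970": 390}

intel_total = sum(intel_build.values())  # 1430
amd_total = sum(amd_build.values())      # 620

print(f"Intel/nVidia rig: {intel_total} EUR")
print(f"AMD rig:          {amd_total} EUR")
print(f"Difference:       {intel_total - amd_total} EUR")  # 810
```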

Bear in mind that if you go SLI on the 580GTX the gap widens even more, as you need a 1000W+ PSU to run it and those cost over 200€ here. And still the AMD rig is very close in performance, and both are overkill for ANY game around at this point.

My point? Really what Blackdog said: the biggest and baddest stuff is used by far fewer players than the mainstream products, yet those are capable of running any game with good performance. Therefore claiming one brand is better than another is quite narrow minded. It boils down to what you get for the money invested; not all have this "daddy pays" or were born with a "golden spoon in mouth" ;)

SoW will be a stress test for the systems we have, yet it seems it needs more RAM (texture loading etc.), a good CPU (physics, DM/FM etc.) and less GPU (no tessellation or PhysX there). The top end GPU would be needed at high resolutions with lots of FSAA/AF used IMO.

Thinking that AMD or nVidia/Intel is the only way to play a game is stupid. Use what you want, whatever floats your boat. There is no need to be a fanboy of either brand; both do their job in any game today.

Have a nice weekend.

klem
12-17-2010, 09:10 AM
Well I have to make a decision today as my new PC is built, awaiting my GPU decision. I've spent the past three days poring over the reviews. I have been steered by the written reviews but this link shows graphically what many are saying.
http://www.pcinlife.com/article_photo/cayman/results_game/total.png
Call of Duty is interesting because I believe it is OpenGL which is IL-2 country but these cards should spank IL-2 anyway. The Directx/3D table at the bottom looks at Directx/3D.

All the reviews are pointing to a couple of general conclusions.

General: the 6970 targets the 570GTX, the 6950 pitches in between the 480GTX and the 570GTX. (btw, the 6970 does not target the 580GTX. That is 5970 and 6990 territory.)

Performance: Below 1900 resolution the 570GTX has the edge on the 6970. However, if tessellation is in use the 6970 wins (but SoW won't have tessellation). At 1900 it's a bit of a toss-up. Above 1900 the 6970 takes over.

Power and Heat: The 570 edges the 6970 out by something like 30w and 10 degrees under load. One review stated that two 6970s in Crossfire might lead to heat problems if they are very close, depending on the motherboard/ventilation.

Price: Not much between the 570 and the 6970 depending on which supplier/badge you pick.


My position is: 1680x1050 resolution and not likely to change in the next 2-3 years; new PC is overclocked from 3GHz to 4.2GHz so heat is a point to watch; my budget is £300, squeezable to £350 for a very good cause.

I am not a fanboy. I was very happy with my ATI 9600GT and have been very happy with my NV 7800GTs. As much as I wanted to believe the hype and buy a shiny new 570GTX-killing 6970, for me it looks like the 570GTX. My only thought is that a few reviewers feel that driver improvements might increase the 6970 performance. I wonder.......

Got to make my mind up in the next few hours and make the phone call....... :(

Baron
12-17-2010, 09:48 AM
Well I have to make a decision today as my new PC is built, awaiting my GPU decision. I've spent the past three days poring over the reviews. I have been steered by the written reviews but this link shows graphically what many are saying.
http://www.pcinlife.com/article_photo/cayman/results_game/total.png
Call of Duty is interesting because I believe it is OpenGL which is IL-2 country but these cards should spank IL-2 anyway. The Directx/3D table at the bottom looks at Directx/3D.

All the reviews are pointing to a couple of general conclusions.

General: the 6970 targets the 570GTX, the 6950 pitches in between the 480GTX and the 570GTX. (btw, the 6970 does not target the 580GTX. That is 5970 and 6990 territory.)

Performance: Below 1900 resolution the 570GTX has the edge on the 6970. However, if tessellation is in use the 6970 wins (but SoW won't have tessellation). At 1900 it's a bit of a toss-up. Above 1900 the 6970 takes over.

Power and Heat: The 570 edges the 6970 out by something like 30w and 10 degrees under load. One review stated that two 6970s in Crossfire might lead to heat problems if they are very close, depending on the motherboard/ventilation.

Price: Not much between the 570 and the 6970 depending on which supplier/badge you pick.


My position is: 1680x1050 resolution and not likely to change in the next 2-3 years; new PC is overclocked from 3GHz to 4.2GHz so heat is a point to watch; my budget is £300, squeezable to £350 for a very good cause.

I am not a fanboy. I was very happy with my ATI 9600GT and have been very happy with my NV 7800GTs. As much as I wanted to believe the hype and buy a shiny new 570GTX-killing 6970, for me it looks like the 570GTX. My only thought is that a few reviewers feel that driver improvements might increase the 6970 performance. I wonder.......

Got to make my mind up in the next few hours and make the phone call....... :(

Feels a bit "unfair" for us "outsiders" coming in at this late stage trying to sway you one way or the other. :)

I can only go with what I personally would have chosen, and imo you can't go wrong with an EVGA 570. (Remember, drivers for the 5xx series are very new too.)

Recently got myself an EVGA 470 and am extremely happy with it: way cooler than I expected, no noise when surfing (ALL cards are noisy under load), very good service from EVGA (Step-Up program etc). Overclocks like a motherf**er too, which doesn't hurt either :)

In the next few months I will probably use the Step-Up program to get a GTX 570 instead; no hassle trying to sell off my used 470, just swap and pay the difference and voila :)

Flanker35M
12-17-2010, 09:54 AM
S!

Klem would not go wrong with either card IMO. If he plays titles with PhysX then he should go for nVidia. Otherwise a tough call.

JG52Krupi
12-17-2010, 10:29 AM
S!

Klem would not go wrong with either card IMO. If he plays titles with PhysX then he should go for nVidia. Otherwise a tough call.

I have played PhysX games such as Mafia 2 with my 5970; the effects were amazing.

Flanker35M
12-17-2010, 11:15 AM
S!

Mere gimmicks to me. I will look into it when EVE Online gets its new physics system that possibly uses PhysX or something..but that's not for a couple of years. For me the game and the plot, how it is done, is more important than seeing a flag waving or some pieces of rock flying around and bouncing like they are weightless, as usually the ground in games is solid and does not give way as IRL.

And you can use AMD + PhysX..you just have to use drivers that have been "enhanced" to bypass nVidia's code. ;)

lbuchele
12-17-2010, 11:28 AM
I'll probably go for the GTX 580 or the upcoming GTX 590 IF the SOW reviews point to needing more GPU horsepower.
It's important to put some money into a really fast CPU too.

speculum jockey
12-17-2010, 12:38 PM
When figuring the cost of one card vs. another and then adding SLI/Crossfire, you should take into account the increased power consumption. GPUs these days are not like the Voodoo3s and 64MB GeForce 4s of the late 90s and early 00s. They are big, and they eat a lot of electricity. It might not be a factor for people who have an apartment or are living at their parents' house, but it adds up over the year, especially if you're one of those gamers who spends a few hours a night gaming and not just having your cards sitting idle while surfing the net.

I'd love to see some kWh usage per month/year for some of those cards. Given the heat they produce they might rival your average TV or small appliance.

klem
12-17-2010, 01:40 PM
Well the deed is done. I have gone for the EVGA Superclocked 570GTX.

Power consumption of the 6970 and 570 at stock speeds is very similar although one report put the 6970 as the higher of the two on higher resolutions/game demands.

It was a fine-line decision. I first had to shake off the disappointment of the 6970 not being a 570 killer in order to give it a fair go. I almost did the sulky-kid thing :) Ultimately it came down to the points (for my needs) in my previous post. Add to that the fact that Oleg is developing SoW using Nvidia cards, which may have a sliver of an effect, and the EVGA stock 570GTX review I found that put it ahead of stock 570 cards without even o/clocking and I reckon that the Superclocked EVGA is the best choice for me.

Can't wait to get my hands on my little beauty :)

Scan.co.uk 3XS SLIK i7 X58 950 BlackOps Bundle, tailored, comprising:-
Asus Sabertooth motherboard, Intel X58 chipset, Socket 1366, PCI-E 2.0 , DDR3 1866MHz, USB3/SATA 6Gb/s, SATA RAID, ATX
Intel Core i7 950 Bloomfield 45nm, 3.06 GHz, QPI 4.8GT/s, 8MB Cache, 23x Ratio, 130W, Retail, o/c to 4.2GHz
Prolimatech Megahalems Rev B,Super 6 Heatpipe Tower Cooler (Fanless, cooled by case roof fan)
EVGA Superclocked 570GTX
128GB Crucial RealSSD C300, MLC-Flash, SATA3 6Gbps, 2.5" SSD, Read 355MB/s, Write 70MB/s
6GB (3x2GB) Corsair Dominator, DDR3 PC3-12800 (1600), CAS 8-8-8-24, DHX, XMP, New connector, 1.65V
Coolermaster HAF 912 Plus, Black, Mid Tower Case
750W Corsair HX Series Modular PSU
2 x 120mm Akasa Apache Ultra Silent Fan HDB Bearing PWM fan w/4 pin connector & Rubber Pins
Microsoft Windows 7 Home Premium 64 Bit, Operating System, Single, - OEM
Acronis True Image 2010 software
Built and Overclocked to 4.2GHz, Tested and Heat Soaked
1 year on site Warranty plus 1 year rtb

T}{OR
12-17-2010, 01:43 PM
S!

Well summed up Blackdog. People rave about FPS shooter benchmarks, snip parts suitable for them and tout either brand being superior to another. Which is pretty useless.

Going to the bang for the buck thing. Take two AMD 6950HDs: they cost less than a 580GTX and kick the living crap out of it in performance. Even a 580GTX in SLI is only marginally faster than the 6950HD in XFire. The 6970HD is equal or even faster due to the much improved scaling AMD has managed to achieve. Even so, with the 6970HD you save almost 300USD compared to 580GTX SLI..with that money you can get an SSD to boost loading times and/or more RAM ;)

It is all about what you want to invest. Let's take an example. I have now an AMD Phenom II X6 1090T BE which loses to the Intel 980X in raw performance. But when looking at what I get for the money AMD wins. Here the AMD costs ~230€ vs. ~930€ of the Intel 980X. So a whopping 700€ difference there.

Motherboards for both are roughly the same price (X58 and 890FX) so not much difference. Intel uses triple channel so needs 3 RAM sticks where AMD with its dual channel takes 2. A small saving there too, yet insignificant IMO.

Now take the 580GTX which here is ~500€ vs. AMD 6970HD at ~390€. I save 110€ on the spot. So far I have spent 810€ less on the AMD than an Intel rig with nVidia. For the 6970HD my 650W Corsair PSU is enough, for the 580GTX safer bet is 700W+, again some tens, even over 100€ depending on PSU, more if need to upgrade.

So let's do some calculations and conclusions. I buy Intel 980X for 930€ + 580GTX for 500€ totalling 1430€. Mobo/RAM are not an issue, quite same prices. Now I buy AMD X6 1090T for 230€ + 6970HD for 390€ totalling 620€. The difference is 810€! With that I can add another 6970HD, a better PSU for XFire, a SSD drive and STILL costing LESS than the Intel/nVidia rig..and being comparable or even faster performance wise.

The error here is comparing the ridiculously expensive i7 980X with an X6 1090T, when even the i7 950 (the real competitor) beats it in the majority of tests. Even if the program (i.e. video or audio editing software) can use all 6 cores, it is highly unlikely that you will benefit from that, except marginally. Not today, not yet. Nowadays 4 cores are more than enough. But that isn't my point. What I am trying to say is that it isn't down to how many cores a CPU has, but how many threads it can handle. And this is where Intel has AMD beaten. In its raw power and performance.

Does this all mean that Intel is a better choice? No. AMD is better value for money, I do fully agree here.

Intel currently has 2 sockets out on the market - 1156 (soon to be replaced by 1155 - Sandy Bridge) and 1366, which is the high end model.

AM3 and 1156/1155 should compete with one another; 1366 is still in a class of its own IMHO.

For example: while triple channel RAM isn't really an advantage speed wise, it is money wise when you have to choose between 6 or 8 GB of RAM. Placing 8 GB (4x2) will force you to run them in 2T mode. ;)


Bear in mind that if you go SLI on the 580GTX the gap widens even more, as you need a 1000W+ PSU to run it and those cost over 200€ here. And still the AMD rig is very close in performance, and both are overkill for ANY game around at this point.

My point? Really what Blackdog said: the biggest and baddest stuff is used by far fewer players than the mainstream products, yet those are capable of running any game with good performance. Therefore claiming one brand is better than another is quite narrow minded. It boils down to what you get for the money invested; not all have this "daddy pays" or were born with a "golden spoon in mouth" ;)

SoW will be a stress test for the systems we have, yet it seems it needs more RAM (texture loading etc.), a good CPU (physics, DM/FM etc.) and less GPU (no tessellation or PhysX there). The top end GPU would be needed at high resolutions with lots of FSAA/AF used IMO.

Thinking that AMD or nVidia/Intel is the only way to play a game is stupid. Use what you want, whatever floats your boat. There is no need to be a fanboy of either brand; both do their job in any game today.

Have a nice weekend.

I fully agree with this.


When figuring the cost of one card vs. another and then adding SLI/Crossfire, you should take into account the increased power consumption. GPUs these days are not like the Voodoo3s and 64MB GeForce 4s of the late 90s and early 00s. They are big, and they eat a lot of electricity. It might not be a factor for people who have an apartment or are living at their parents' house, but it adds up over the year, especially if you're one of those gamers who spends a few hours a night gaming and not just having your cards sitting idle while surfing the net.

I'd love to see some kWh usage per month/year for some of those cards. Given the heat they produce they might rival your average TV or small appliance.

The latest rivals are more or less equal if you look at power consumption. It is the heat where the newest AMD series disappointed IMO. The 6970 (Sapphire, which is known for its GPU coolers) gets almost as hot as a GTX 480, and the fans are quite a bit louder than the rest of the reviewed cards. Almost every review wrote a part about this or at least mentioned it.

As I said earlier, the real gem here is the 6950, which is by far the best option for CrossFire. For the majority of us mortals who will use single GPU solutions - the GTX 570, which is quiet and on average outperforms its main rival the 6970 in 6 out of 11 tests, and this makes it (for me at least) the best value for money. The GTX 580 and HD 5970 are in a class of their own.

We should also note that it is surprising to see so many reviewers posting such variable results. Neither Nvidia nor AMD has optimized their drivers yet.

T}{OR
12-17-2010, 01:46 PM
Can't wait to get my hands on my little beauty :)

Scan.co.uk 3XS SLIK i7 X58 950 BlackOps Bundle, tailored, comprising:-
Asus Sabertooth motherboard, Intel X58 chipset, Socket 1366, PCI-E 2.0 , DDR3 1866MHz, USB3/SATA 6Gb/s, SATA RAID, ATX
Intel Core i7 950 Bloomfield 45nm, 3.06 GHz, QPI 4.8GT/s, 8MB Cache, 23x Ratio, 130W, Retail, o/c to 4.2GHz
Prolimatech Megahalems Rev B,Super 6 Heatpipe Tower Cooler (Fanless, cooled by case roof fan)
EVGA Superclocked 570GTX
128GB Crucial RealSSD C300, MLC-Flash, SATA3 6Gbps, 2.5" SSD, Read 355MB/s, Write 70MB/s
6GB (3x2GB) Corsair Dominator, DDR3 PC3-12800 (1600), CAS 8-8-8-24, DHX, XMP, New connector, 1.65V
Coolermaster HAF 912 Plus, Black, Mid Tower Case
750W Corsair HX Series Modular PSU
2 x 120mm Akasa Apache Ultra Silent Fan HDB Bearing PWM fan w/4 pin connector & Rubber Pins
Microsoft Windows 7 Home Premium 64 Bit, Operating System, Single, - OEM
Acronis True Image 2010 software
Built and Overclocked to 4.2GHz, Tested and Heat Soaked
1 year on site Warranty plus 1 year rtb

Very nice build. Except I would change the SSD for a Corsair Force series (285/275 R/W) and the Dominators for the latest Mushkin RAM that dominates them in all areas. ;)

JG52Krupi
12-17-2010, 01:46 PM
We should also note that it is surprising to see so many reviewers posting such variable results. Neither Nvidia nor AMD has optimized their drivers yet.

+1 For truth, when they update the drivers that's when it's time to take stock :D

KG26_Alpha
12-17-2010, 01:47 PM
I'll wait for Windows 8 128bit

SoW will be nicely settled in by then and a more civilized bunch of hardware will be out to run it properly :mrgreen:





.

T}{OR
12-17-2010, 01:50 PM
I'll wait for Windows 8 128bit

SoW will be nicely settled in by then and a more civilized bunch of hardware will be out to run it properly :mrgreen:

Do you remember how long it has taken us to switch to x64? :mrgreen: Not to mention the hardware part. There is a saying, if you want to wait - you can wait forever as there will always be better hardware to buy next year. :D

KG26_Alpha
12-17-2010, 01:51 PM
Do you remember how long it has taken us to switch to x64? :)


I was running x64 years ago under Win XP, the most stable system ever for me. :)


Not to mention the hardware part. There is a saying, if you want to wait - you can wait forever as there will always be better hardware to buy next year. :D

I'm not wasting my money on this 64bit stuff, I sell enough of it to know better; that's why I'm waiting for 128bit to throw my money at. :)





.

JAMF
12-17-2010, 01:57 PM
WSGF has their Eyefinity review online:

http://www.widescreengamingforum.com/wiki/AMD_Radeon_6970_6950_-_Featured_Review

Looks like I'm switching to two 6950s in CrossFire. :) Look at those puppies "scale"!

klem
12-17-2010, 02:08 PM
Very nice build. Except I would change the SSD for a Corsair Force series (285/275 R/W) and the Dominators for the latest Mushkin RAM that dominates them in all areas. ;)

There's always a bleeding critic! ;)

Do you know how much angst I've suffered just putting this baby together?

T}{OR
12-17-2010, 02:31 PM
There's always a bleeding critic! ;)

Do you know how much angst I've suffered just putting this baby together?

And there always will be. I just wrote what I read on the net, based on the various reviews.

Take it or leave it, your choice. :)

klem
12-17-2010, 03:23 PM
And there always will be. I just wrote what I read on the net, based on the various reviews.

Take it or leave it, your choice. :)

:)

You may be interested in this.
http://www.anandtech.com/show/3656/corsairs-force-ssd-reviewed-sf1200-is-very-good/1
Both the Crucial and the Corsair come out very well. I was interested in the read speed and gaming performance. Looks like I'll be ok (my new mobo has a 6Gb/s controller).

T}{OR
12-17-2010, 03:34 PM
Already read it mate. That article is almost 8 months old. ;)

Try this one:

http://www.anandtech.com/show/4020/ocz-vertex-plus-preview-introducing-the-indilinx-martini/2

KG26_Alpha
12-17-2010, 03:47 PM
And there always will be. I just wrote what I read on the net, based on the various reviews.

Take it or leave it, your choice. :)


I always find it best to comment on what you have first hand experience of.

As we know, the internet is always 100% factually correct and totally unbiased, so it must all be true.


@ Klem

Looks like a good system so long as it's stress tested :)


.

Triggaaar
12-17-2010, 04:45 PM
Power and Heat: The 570 edges the 6970 out by something like 30w and 10 degrees under load.

You seem to have compared them fairly, but the 570 uses a little more power than the 6970.

AnandTech - Load Power Consumption - Crysis
570 = 361 Watts
6970 = 340 Watts
(I'll ignore FurMark as it's power-limited, but AMD much less so)

Hardware Canucks - 1hr 3DMark Batch size test
570 = 337 Watts
6970 = 364 Watts

That's a 48 Watt swing between those two reviews (21W one way, 27W the other). Watt is that all about?

Now checking Guru3d, which calculates the GPU load:
570 = 213
6970 = 207
And HardOCP, which calculates total system
570 = 454W
6970 = 428W
And BitTech
570 = 330
6970 = 306
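Laying those figures out side by side makes the disagreement easier to see. A minimal sketch using only the numbers quoted above (a positive delta means the 570 drew more than the 6970 in that review):

```python
# Load power figures (watts) as quoted from the reviews above.
# The sites measure differently (GPU only vs. whole system), so only the
# per-review delta between the two cards is really comparable.
reviews = {
    "AnandTech (Crysis, system)":        {"GTX 570": 361, "HD 6970": 340},
    "Hardware Canucks (3DMark, system)": {"GTX 570": 337, "HD 6970": 364},
    "Guru3D (GPU only)":                 {"GTX 570": 213, "HD 6970": 207},
    "HardOCP (system)":                  {"GTX 570": 454, "HD 6970": 428},
    "BitTech (system)":                  {"GTX 570": 330, "HD 6970": 306},
}

for site, watts in reviews.items():
    delta = watts["GTX 570"] - watts["HD 6970"]
    print(f"{site}: 570 is {delta:+d} W relative to the 6970")

# The "48 Watt swing" is the gap between AnandTech (+21 W) and
# Hardware Canucks (-27 W).
```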

JG27CaptStubing
12-17-2010, 04:46 PM
Some very interesting comments coming from both camps... I look at it in much simpler terms.

Regarding dual GPU setups... While I enjoyed the scaling of my SLI setup back in the day, I didn't enjoy the headaches involved every time there was a driver update. Stutters galore in IL2, and even with a lot of tweaking and messing around with the newer Nvidia drivers you still can't entirely get rid of stutters in IL2.

My choice to go with a 580 was simple. I wanted the fastest possible card without the tweaking crap that comes with multi GPU setups. For those trying to save a buck it certainly does scale well and it looks like AMD stepped up with their new rev of drivers and cards.

Sure, you can save a few bucks and get great scaling with multi GPU setups, but it does come at a cost. You will pay for it eventually. I have yet to see Nvidia or ATI come out with perfect drivers.

I'm now running an EVGA 580 and it's been a fantastic NO hassle experience. In terms of SOW I don't think it's going to matter much which camp you're in, though I know for a fact Oleg likes Nvidia.

C_G
12-17-2010, 05:01 PM
Just a bit of speculation here, but I think that in most cases SoW is going to be bottlenecking the CPU, not the GPU.

I think this because there is a lot more going on "under the hood" (i.e. non-visually) in terms of DM than before.
We know that each subsystem of the aircraft is modeled internally for DM purposes.
We also know from his thread on radar a while back, that ground objects are also subject to a sophisticated DM model. IIRC he stated that if the power generator for a radar station (or maybe it was for a searchlight) is knocked out then the radar station (or search light) will not work, though it has not been damaged itself.

I think it's safe to assume that AI routines will be more complex and that FM routines will also.

On the other hand, we also know (iirc- I may be mistaken) that SoW is being designed to utilize all the cores of multicore CPU's, so it may well be that notwithstanding the increase in "under the hood" complexity CPUs will not be the chokepoint and a need for bleeding edge cards will still exist.

This uncertainty is a large part of the reason I'm holding off my upgrades until after SoW is released and we have some definite information on this.

C_G

Coen020
12-17-2010, 05:34 PM
Ok guys, I tried to understand as much of this thread as I possibly could, but without much success I'm afraid.

My only question is: will it make much difference? AMD or Nvidia?

and will SOW run good on my pc?

Processor / CPU - AMD Athlon II X4 635
Mainboard - Gigabyte GA-790XTA-UD4
PSU - Antec 550W Basiq Modular
Memory - 4GB OCZ DDR3 1333MHz
OS - Windows 7 64-bit
VGA - XFX 5850

Triggaaar
12-17-2010, 05:46 PM
My only question is: will it make much difference? AMD or Nvidia?

Honestly, we don't know. These are the facts though:
Oleg has said it will run best with a DX11 card - so both Nvidia and AMD are fine there
Oleg has said that Nvidia have been more helpful so far - this may indicate that drivers will be better for Nvidia, but we are guessing

and will SOW run good on my pc?

Processor / CPU - AMD Athlon II X4 635
Mainboard - Gigabyte GA-790XTA-UD4
PSU - Antec 550W Basiq Modular
Memory - 4GB OCZ DDR3 1333MHz
OS - Windows 7 64-bit
VGA - XFX 5850

Here we have to guess, and it depends what you mean by good. I would be happy to bet that you would need a top end system to run at the highest settings. But your system is quite nice, so just enjoy it until SoW is released, and then we'll know for sure if there are any bits that are holding it back.

C_G
12-17-2010, 05:58 PM
Ok guys, I tried to understand as much of this thread as I possibly could, but without much success I'm afraid.

My only question is: will it make much difference? AMD or Nvidia?

and will SOW run good on my pc?

Processor / CPU - AMD Athlon II X4 635
Mainboard - Gigabyte GA-790XTA-UD4
PSU - Antec 550W Basiq Modular
Memory - 4GB OCZ DDR3 1333MHz
OS - Windows 7 64-bit
VGA - XFX 5850

I would think that so long as you go with an AMD or nVidia card that roughly matches your upper range machine you'll be fine. There may be some bells and whistles option at the very high end that you'll have to decide whether to compromise fps for, but Oleg won't have designed SoW:BoB such that a machine like yours won't run it well.

IMO, the AMD/ATI vs nVidia wars are rather ridiculous as they often come down to a few percentage points of difference in performance. If you look at the fps difference between the two companies in the charts in the first (Tom's) article linked, the difference in performance between same-price level cards is marginal and unlikely to be noticeable in practice (can anyone ACTUALLY tell the difference between 111 fps and 93 fps?).

The bottom line is that you get what you pay for (and the companies price as against their competitor for roughly equivalent performance), with diminishing returns per $ as you get to the "bleeding edge" of new technology.

I don't think going nVidia or going AMD is going to make a huge difference in practice so long as you get roughly equivalent cards.

LoBiSoMeM
12-17-2010, 06:16 PM
It always amazes me why people think in terms of upgrading because some VGA brand "supports" a game's development...

The problem with IL-2 and ATI was the OpenGL drivers. IL-2 was optimized using OpenGL, runs with higher quality under OpenGL, and ATI did some bad work in those drivers, but IL-2 is a "discontinued" product. And TODAY ATI cards with TODAY'S DRIVERS can run IL-2 like NVIDIA cards. IL-2 is "easy" for today's VGAs.

But let's look at ALL OTHER TITLES. Neither ATI nor NVIDIA has "gigantic troubles" running any other title. And all the major titles today run under DirectX. So why in hell do you believe SoW will run so much better on NVIDIA hardware, if SoW will use the DX9-11 APIs?!?! Just because you think ATI didn't take Oleg by the hand and left him to walk in the park alone?

Unbelievable. Any modern DX title will run "better" on a "better" card, ATI or NVIDIA. I always bought cards of both brands just looking at the market at the time, and have had my share of good cards from both brands, and my share of melting cards too. And I never had to change brand to "run better" in some title, NEVER.

Pointless discussion. Read reviews, look at the specs of each card, and let the developers use the cards' potential. It's their work, and Oleg's team will do great with either ATI or NVIDIA; it's just the DirectX API.

And if one of the two companies screws up some driver, that's the way it is. Just remember: some blocky text in IL-2 because of a bad driver is better than a melting card... Do you remember? Both of the major companies make HUGE mistakes...

I'm out of this kind of discussion.

KG26_Alpha
12-17-2010, 06:46 PM
It always amazes me why people think in terms of upgrading because some VGA brand "supports" a game's development...

The problem with IL-2 and ATI was the OpenGL drivers. IL-2 was optimized using OpenGL, runs with higher quality under OpenGL, and ATI did some bad work in those drivers, but IL-2 is a "discontinued" product. And TODAY ATI cards with TODAY'S DRIVERS can run IL-2 like NVIDIA cards. IL-2 is "easy" for today's VGAs.

But let's look at ALL OTHER TITLES. Neither ATI nor NVIDIA has "gigantic troubles" running any other title. And all the major titles today run under DirectX. So why in hell do you believe SoW will run so much better on NVIDIA hardware, if SoW will use the DX9-11 APIs?!?! Just because you think ATI didn't take Oleg by the hand and left him to walk in the park alone?

Unbelievable. Any modern DX title will run "better" on a "better" card, ATI or NVIDIA. I always bought cards of both brands just looking at the market at the time, and have had my share of good cards from both brands, and my share of melting cards too. And I never had to change brand to "run better" in some title, NEVER.

Pointless discussion. Read reviews, look at the specs of each card, and let the developers use the cards' potential. It's their work, and Oleg's team will do great with either ATI or NVIDIA; it's just the DirectX API.

And if one of the two companies screws up some driver, that's the way it is. Just remember: some blocky text in IL-2 because of a bad driver is better than a melting card... Do you remember? Both of the major companies make HUGE mistakes...

I'm out of this kind of discussion.

Because Nvidia have supported IL2 for over 10 years.

T}{OR
12-17-2010, 06:58 PM
I'm now running an EVGA 580 and it's been a fantastic NO hassle experience. In terms of SOW I don't think it's going to matter much which camp you're in, though I know for a fact Oleg likes Nvidia.

Excellent point. No hassle with single GPUs.


Just a bit of speculation here, but I think that in most cases SoW is going to be bottlenecking the CPU, not the GPU.

I think this because there is a lot more going on "under the hood" (i.e. non-visually) in terms of DM than before.
We know that each subsystem of the aircraft is modeled internally for DM purposes.
We also know from his thread on radar a while back, that ground objects are also subject to a sophisticated DM model. IIRC he stated that if the power generator for a radar station (or maybe it was for a searchlight) is knocked out then the radar station (or search light) will not work, though it has not been damaged itself.

I think it's safe to assume that AI routines will be more complex and that FM routines will also.

On the other hand, we also know (iirc- I may be mistaken) that SoW is being designed to utilize all the cores of multicore CPU's, so it may well be that notwithstanding the increase in "under the hood" complexity CPUs will not be the chokepoint and a need for bleeding edge cards will still exist.

This uncertainty is a large part of the reason I'm holding off my upgrades until after SoW is released and we have some definite information on this.

C_G

IIRC, a long time ago when news about SoW started, Oleg confirmed only dual core support. If they implemented support for more than two cores it would be a pleasant surprise. The fact that we will have an x64 .exe is more important than 6+ core support. ;)


It always amazes me why people think in terms of upgrading because some VGA brand "supports" a game's development...

The problem with IL-2 and ATI was the OpenGL drivers. IL-2 was optimized using OpenGL, runs with higher quality under OpenGL, and ATI did some bad work in those drivers, but IL-2 is a "discontinued" product. And TODAY ATI cards with TODAY'S DRIVERS can run IL-2 like NVIDIA cards. IL-2 is "easy" for today's VGAs.

But let's look at ALL OTHER TITLES. Neither ATI nor NVIDIA has "gigantic troubles" running any other title. And all the major titles today run under DirectX. So why in hell do you believe SoW will run so much better on NVIDIA hardware, if SoW will use the DX9-11 APIs?!?! Just because you think ATI didn't take Oleg by the hand and left him to walk in the park alone?

Unbelievable. Any modern DX title will run "better" on a "better" card, ATI or NVIDIA. I always bought cards of both brands just looking at the market at the time, and have had my share of good cards from both brands, and my share of melting cards too. And I never had to change brand to "run better" in some title, NEVER.

Pointless discussion. Read reviews, look at the specs of each card, and let the developers use the cards' potential. It's their work, and Oleg's team will do great with either ATI or NVIDIA; it's just the DirectX API.

And if one of the two companies screws up some driver, that's the way it is. Just remember: some blocky text in IL-2 because of a bad driver is better than a melting card... Do you remember? Both of the major companies make HUGE mistakes...

I'm out of this kind of discussion.

Nvidia (allegedly) has already had SoW code for over a month (so said Oleg) for the purposes of driver optimization. And if you looked at all those reviews you would have seen that certain rival cards like the GTX 570 and HD 6970 run better in some games and worse in others - with a margin of 10 FPS. Not much, but a difference exists nonetheless.

JG27CaptStubing
12-17-2010, 08:16 PM
Another point I wanted to bring up... We are very used to seeing performance in IL-2 with its older, yet updated, OpenGL engine. I can assure you of one thing: the move to DX9 and above will offer much better scaling and visuals. In other words, if IL-2 used DX9 you would already see much better performance relative to its visuals.

What is going to be interesting is to see the impact on the CPU. Getting a game to scale across CPU cores is usually more myth than reality. It may just be that the extra CPU power buys more options rather than more FPS.

Biggs [CV]
12-17-2010, 08:24 PM
One brand or the other is not going to make a huge difference. If Nvidia gets 10 more frames per second, so be it. If you can notice a 10 fps difference, you need a new GPU.

klem
12-17-2010, 09:13 PM
You seem to have compared them fairly, but the 570 uses a little more power than the 6970.

AnandTech - Load Power Consumption - Crysis
570 = 361 W
6970 = 340 W
(I'll ignore FurMark as it's limited, though AMD much less)

Hardware Canucks - 1hr 3DMark Batch size test
570 = 337 Watts
6970 = 364 Watts

That's a 48 Watt swing. Watt is that all about?

Now checking Guru3D, which calculates the card's own power draw:
570 = 213 W
6970 = 207 W
And HardOCP, which measures total system power:
570 = 454 W
6970 = 428 W
And BitTech:
570 = 330 W
6970 = 306 W
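For anyone wondering where the "48 Watt swing" above comes from, here is the arithmetic worked out, using only the figures quoted in this post (a sketch, nothing more):

    # the "48 Watt swing" from the two sets of numbers quoted above
    anand_570, anand_6970 = 361, 340        # AnandTech: 570 draws 21 W more
    canucks_570, canucks_6970 = 337, 364    # Hardware Canucks: 6970 draws 27 W more
    swing = (anand_570 - anand_6970) + (canucks_6970 - canucks_570)
    print(swing)                            # 48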

Yes Triggaaar, it was Hardware Canuck's figures I noted down. I seem to have overlooked the others :|
My brain was probably addled by then :)

For those pointing out the minimal difference between the 570/6950, I can only say I had to decide where to throw my £300. And I don't throw money easily, so I got quite deeply into those small differences :D

And they are right. Between the two, it probably doesn't matter too much which way you jump if you are only considering "my position". High tessellation and resolution needs would have pushed me towards the 6970.

swiss
12-18-2010, 12:12 AM
I'd love to see some kWh usage per month/year for some of those cards. Given the heat they produce they might rival your average TV or small appliance.


be my guest:

http://www.guru3d.com/article/geforce-gtx-580-review/7

less than $10/month
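For a rough sanity check on that "less than $10/month" figure, here's a back-of-the-envelope sketch. The wattage, hours per day and electricity price below are my own assumptions, not Guru3D's exact numbers:

    card_watts = 250          # assumed average draw of a high-end card while gaming
    hours_per_day = 3         # assumed gaming time per day
    price_per_kwh = 0.15      # assumed electricity price in $/kWh
    kwh_per_month = card_watts / 1000 * hours_per_day * 30
    cost_per_month = kwh_per_month * price_per_kwh
    print(kwh_per_month)      # 22.5 kWh per month
    print(cost_per_month)     # about $3.40/month for the card alone

Even doubling the assumed hours or price keeps it comfortably under $10.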

Flanker35M
12-18-2010, 09:53 AM
S!

What I meant by comparing the 980X and 1090T BE is that those are the "top of the line" CPUs each manufacturer has. Now AMD has launched the 1100T BE, which is a bit more expensive than the 1090T. My comparison just showed that I can build a top-of-the-line AMD rig capable of running ANY game well for less money than if I chose Intel/nVidia. If I could pour out money just like that, then sure, I would run an Intel rig, but my hard-earned money is needed to run a family and RL too ;) So AMD was the logical choice for me and has not disappointed me in any game I play :)

As stated above, we seem to cling to IL-2 and its rather old OpenGL engine to judge the capabilities of a GPU. Sure, in the Black Death track there is a difference in FPS, but when actually playing you cannot tell the difference at all. And I have used both brands with IL-2, online and offline. I do not fully trust the benchmarks; I play the games I have and see for myself :) With small tweaks I have gotten them to run as I want, on both brands again ;)

I hope and wish SoW will NOT be optimized for just one brand, just to get a logo spinning or appearing on startup. DirectX is the same for both brands; they just need to get their drivers right. To force a player to change hardware because of some code writing is stupid and short-sighted from any developer. I am pretty sure Oleg & Team have not fallen into this pit.

klem
12-19-2010, 07:43 AM
..............
IIRC long time ago when news about SoW started Oleg confirmed only dual core support. If they implemented support for more than two cores it would be a pleasant surprise. The fact that we will have x64.exe is more important that +6 core support. ;).................

I hope he has by now! I don't know what's involved in making it run on more than two cores, but "flight sims tend to be CPU rather than GPU limited" (a quote I picked up somewhere), so that has to be an important focus for him in a cutting-edge sim. Anyone upgrading for SoW now is going to have more than two cores and will be pretty £!$$*^ off if they can't get the best out of their investment. I know I will be.

Can you imagine Oleg saying (eventually), "best system requirements AMD 3800+ dual core CPU (recommend overclocking)" ?!

btw the 64 exe won't be available for some time and won't be as important as multi-core (another quote I picked up somewhere, please don't shoot the messenger).

Hecke
12-19-2010, 08:05 AM
btw the 64 exe won't be available for some time and won't be as important as multi-core (another quote I picked up somewhere, please don't shoot the messenger).

What the ...? Didn't Oleg see what happens with only 2 GB of Ram at Igromir?

swiss
12-19-2010, 08:17 AM
What the ...? Didn't Oleg see what happens with only 2 GB of Ram at Igromir?

1st: klem is not oleg or part of the dev team
2nd: Oleg programmed it, he probably knows better than you what he does.

:rolleyes:

LoBiSoMeM
12-19-2010, 08:26 AM
People are thinking about ATI/NVIDIA support, and I can only think about full multi-core support and x64 memory addressing.

I dream of the devs working hard on these two points. It's time to go in this direction, and people with less than 4 GB of RAM and fewer than four cores should start thinking about an upgrade...

In sims it will be a must: much more performance for less money, with engines fully optimized for x64 and four or more cores.

klem
12-19-2010, 09:05 AM
1st: klem is not oleg or part of the dev team
2nd: Oleg programmed it, he probably knows better than you what he does.

:rolleyes:

Quite right.
But Oleg did say in a post during the last couple of months that the x64 exe would not be available for a while, as they have more pressing things on their minds. That's all I know.

Then again, it's my understanding that a 32-bit OS supports 3.?? GB of RAM and can assign a full 2 GB of RAM to a 32-bit application. That's much better than trying to share a bare 2 GB of RAM across the system AND the application, as the demo PCs apparently had to.
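For the back-of-the-envelope version of those 32-bit limits, here's a rough sketch of the general Windows behaviour; the amount reserved for devices varies per machine and the 0.75 GiB figure below is just an assumption:

    GiB = 1024 ** 3
    address_space = 2 ** 32              # a 32-bit OS can address 4 GiB in total
    mmio_reserved = int(0.75 * GiB)      # assumed space claimed by video memory/devices; varies per machine
    visible_ram = address_space - mmio_reserved
    per_process_default = 2 * GiB        # default user-mode share for a 32-bit application on Windows
    print(visible_ram / GiB)             # ~3.25 "GB" of RAM actually usable by a 32-bit OS
    print(per_process_default / GiB)     # 2.0 GB available to the application itself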

Hecke
12-19-2010, 11:28 AM
1st: klem is not oleg or part of the dev team
2nd: Oleg programmed it, he probably knows better than you what he does.

:rolleyes:

@ 1st: Bananas are crooked.
@ 2nd: Not only better than me.

:rolleyes:

JAMF
12-19-2010, 11:56 AM
People are thinking about ATI/NVIDIA support, and I can only think about full multi-core support and x64 memory addressing.

I dream of the devs working hard on these two points. It's time to go in this direction, and people with less than 4 GB of RAM and fewer than four cores should start thinking about an upgrade...

In sims it will be a must: much more performance for less money, with engines fully optimized for x64 and four or more cores.

I hope Mr. Maddox at least meant he's programmed for two threads. Since people noticed extra smoothness when they upgraded to 2 cores with IL-2, I'd expect more smoothness. At the least, programming for 2 threads can divide the calculations: one thread for the flight model and one for AI, ballistics, weather, logistics, special effects and other calculations. The other 2 cores/threads on a 4-core CPU can and will be used by the OS for everything running in the background: TeamSpeak/Ventrilo, FRAPS capturing, HD activity and what not.

So one can see that programming for 2 threads will necessitate at least 3 cores for smooth operation. When we look at the Steam hardware survey, 63% have fewer than 3 cores!!! 51% have a CPU speed between 2.3 and 3 GHz. Just 40% have 4 GB or more in their system.
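To illustrate the kind of split described above, here's a minimal two-thread sketch (plain Python, nothing to do with SoW's actual engine): one thread steps a pretend flight model at a fixed rate, the other consumes each new state for "AI/ballistics" style work.

    import threading, queue, time

    state_queue = queue.Queue()
    STOP = object()

    def flight_model():
        state = {"t": 0.0, "alt": 1000.0}
        for _ in range(5):                          # five pretend physics ticks
            state = {"t": state["t"] + 0.02, "alt": state["alt"] - 1.0}
            state_queue.put(dict(state))            # publish the new state to the other thread
            time.sleep(0.02)
        state_queue.put(STOP)

    def world():                                    # AI, ballistics, weather, etc. react to each state
        while True:
            state = state_queue.get()
            if state is STOP:
                break
            print("AI/ballistics update at t=%.2f, alt=%.1f" % (state["t"], state["alt"]))

    fm = threading.Thread(target=flight_model)
    w = threading.Thread(target=world)
    fm.start(); w.start()
    fm.join(); w.join()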

One can't blame 1C for developing the programme for the largest customer base. Going by these numbers you'd get the following minimum recommended system specs:
Minimum - Recommended:
CPU: 3 cores 2.4GHz - 4 cores 3GHz
RAM: 3GB - 8GB
OS: Windows XP - Win7 64-bit
HD space: 10GB - 20GB
VRAM: 512MB SM3.0 - 1.5GB SM3.0
DirectX: 9.0c - 9.0c (*)

(*) Hey, Mr Maddox never said DX11 was needed, only "supported". That's like saying it's programmed for DX8 and will run on higher spec'd systems. :D

[EDIT] To put this back on topic: 60+% of the DX11 GPUs in the survey are AMD, so nVidia isn't likely to have pushed for DX11 implementation.

swiss
12-19-2010, 12:04 PM
You get those Steam numbers because a lot of people (I guess) play on laptops, where dual-cores are still very common.
But then again, the Steam average is useless: flight sims are a niche, and if you play them on a laptop you're not really the average flight sim customer.
;)

JAMF
12-19-2010, 12:14 PM
You get those Steam numbers because a lot of people (I guess) play on laptops, where dual-cores are still very common.
But then again, the Steam average is useless: flight sims are a niche, and if you play them on a laptop you're not really the average flight sim customer.
;)

Sure, but with such a large survey, one can get a good idea of what's out there.

If they want a better grasp of their user base, they can add an information-gathering HW/SW tool to "Patch 4.10" and get that data sent back home.

swiss
12-19-2010, 12:22 PM
Sure, but with such a large survey, one can get a good idea of what's out there.

If they want a better grasp of their user base, they can add an information-gathering HW/SW tool to "Patch 4.10" and get that data sent back home.

A survey of the IL-2 community will do no good - we have lots of people playing on stone-age PCs.
Why? Because IL-2 runs pretty OK on those toasters, so there was no need for a new system.
A survey on the DCS/ED forums would get more realistic results.

Again, a guess.

JAMF
12-19-2010, 01:08 PM
A survey of the IL-2 community will do no good - we have lots of people playing on stone-age PCs.
Why? Because IL-2 runs pretty OK on those toasters, so there was no need for a new system.

And where do you think 90% of the SoW users will come from? You want to know all your customers that will buy your game, not just the ones that had money for upgrades. If it runs only on their bleeding-edge high-end systems, you'll sell maybe 20% (if you're lucky) of what you could have. Mr. Maddox wants to make some money, as 1C:Maddox is not a charity.

swiss
12-19-2010, 01:28 PM
And where do you think 90% of the SoW users will come from? You want to know all your customers that will buy your game, not just the ones that had money for upgrades. If it runs only on their bleeding-edge high-end systems, you'll sell maybe 20% (if you're lucky) of what you could have. Mr. Maddox wants to make some money, as 1C:Maddox is not a charity.

What I said is: we have a lot of people running old systems because, so far, there was no need for a new one. No need... but willing to upgrade.
Setting the minimum specs sky-high is just as wrong as coding for Methuselah systems; with the latter you don't even have to start.

But that was not the point: I just said the Steam average is useless if you want to know your target customers' average system.
The niche for flight sims is IMHO very, very small, and those customers have systems above average.
(Although it can't be that small if there are at least three companies offering joysticks in the $300+ range...)

As for the minimum specs, I would put them at a level that was upper mid-range two years ago (which seems to work, if you look at BF:BC2) - but it's not up to me, and honestly I don't give a flying f***.
The day the game is published we will know what we have to buy, or not.