
IL-2 Sturmovik: Cliffs of Dover Latest instalment in the acclaimed IL-2 Sturmovik series from award-winning developer Maddox Games.

  #121  
Old 11-02-2010, 11:37 PM
Skoshi Tiger Skoshi Tiger is offline
Approved Member
 
Join Date: Nov 2007
Location: Western Australia
Posts: 2,197
Default

Quote:
Originally Posted by Triggaaar View Post
have a better understanding of the game's use of Physx and Tesselation, and whether it matters if we go nvidia or amd (ATI).

Didn't Oleg mention a long while back that they were using their own physics engine????

Quote:
Originally Posted by Triggaaar View Post

Nope, that's another lost soul with his torch out
LOL


Cheers!
  #122  
Old 11-03-2010, 01:09 AM
speculum jockey
Guest
 
Posts: n/a
Default

Quote:
Originally Posted by Hecke View Post
Actually I never tried an ATI, and Oleg saying it will look better on Nvidia doesn't help me change this trend.
If Oleg stated that it would look as good on ATI I would definitely buy the new 6970, because the new GTX 580 stuff from Nvidia seems to be a "fail" again.
Hot, loud, ...

Too bad Sandy Bridge isn't out this year.
I've played 15-20 games that had the big green Nvidia logo at the start and none of them looked any better or any worse on an ATI card. Image quality usually becomes a factor if you're playing a game at 16x AF or 8x FSAA.

You're not going to notice a difference between the two cards until you look at your power consumption and your wallet. The screen is the last place their differences become apparent. I don't want to sound like an ATI fanboy, but their cards have been hitting the performance/price sweet spot a lot better for the past few years than Nvidia has.

If anyone is trying to decide what they should get, go to Tom's Hardware, check out the video card chart for your game of choice, and see the performance stats of all the different cards. Then find the performance bracket you want and check the card prices. Whatever gives you the most FPS at the lowest price is the winner.
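That "most FPS at the lowest price" rule is just a frames-per-dollar comparison. A minimal sketch of it in Python, assuming you've pulled average FPS and prices from a benchmark chart yourself (the card names and numbers below are hypothetical placeholders, not real benchmark results):

```python
def best_value(cards):
    """Return the card with the highest average FPS per unit price."""
    return max(cards, key=lambda c: c["fps"] / c["price"])

# Hypothetical figures -- substitute real ones from a chart like Tom's Hardware.
cards = [
    {"name": "Card A", "fps": 60, "price": 250},  # 0.240 fps per dollar
    {"name": "Card B", "fps": 75, "price": 350},  # ~0.214 fps per dollar
    {"name": "Card C", "fps": 55, "price": 200},  # 0.275 fps per dollar
]

print(best_value(cards)["name"])  # -> Card C
```

Note this deliberately ignores the other factors raised in the thread (driver quality, power draw, vendor-sponsored optimisations), which is exactly why it's only a starting point.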
  #123  
Old 11-03-2010, 04:09 AM
julian265 julian265 is offline
Approved Member
 
Join Date: Aug 2009
Posts: 195
Default

^ Valid points. But there are other things which make me stay with Nvidia. These are things I've experienced or noticed on forums. Your experience may be different, but of course I'm always going to place more weight on my own observations when making decisions.

Compatibility.
In IL2 my 4870 had graphics corruption issues unless I stayed with the 8.9 drivers. The same happened again to 5xxx owners. (then they fixed the drivers, then broke them again in the next version, then fixed...)

In DCS: A-10C (yes, it's a beta, but I still think it's relevant) there are quite a few high-end (i7, etc.) computer owners complaining about terrible frame rates. The link between them? They have late-model ATI cards.

In general, on the forums I frequent, I notice more threads about problems with ATI-based hardware and compatibility than nvidia.

Control panel.
I don't like nvidia's CP much, but the ATI CP, I cannot stand! For example, when using the on-board ATI stuff on a TVPC, I needed to disable over-scan (IIRC). After much fruitless poking around, I found the answer on a forum - you need to press a little, UN-LABELLED button (which did not look like a button!) to get to the relevant section...

Then there's the catalyst AI, which seems to need to be turned off for most of my games, making me wonder why it is there, and enabled by default.

I could put up with the control panel if I were confident about the card's functionality, but my observations about compatibility issues put me off, to the point where I justify the extra cost and (somewhat irrelevant) extra heat/power usage. These are just my observations, no doubt others will see the opposite, or the opposite might be true for a different set of games.
__________________
DIY uni-joint / hall effect sensor stick guide:
http://www.mycockpit.org/forums/cont...ake-a-joystick
  #124  
Old 11-03-2010, 08:07 AM
domian domian is offline
Approved Member
 
Join Date: May 2010
Posts: 17
Default

Quote:
Originally Posted by julian265 View Post
DCS: A-10C (yes, it's a beta, but I still think it's relevant) there are quite a few high-end (i7, etc) computer owners complaining about terrible frame-rates. The link between them? They have late-model ATI cards.
I can confirm that.

But Lock On - the base layer - has run much better on Nvidia cards for years.
  #125  
Old 11-03-2010, 08:43 AM
Hecke
Guest
 
Posts: n/a
Default

do you guys know if ...

an AMD - ATI combination works better than an AMD - Nvidia combo?

or if ...

an Intel - Nvidia combination works better than an Intel - ATI combo?

I'm speaking of stuff with comparable power.
Just wondering, because AMD and Intel are competitors, as are Nvidia and ATI, so they might have intentionally done something to make some combinations not work as well.
  #126  
Old 11-03-2010, 12:37 PM
klem's Avatar
klem klem is offline
Approved Member
 
Join Date: Nov 2007
Posts: 1,653
Default

Quote:
Originally Posted by speculum jockey View Post
If anyone is trying to decide what they should get, go to tom's hardware, check out the video card chart in your game of choice, and see the performance stats of all the different cards.
Well I just did that, and they don't cover the 6000 series yet in their graphics card Best Buy hierarchy. What they do say here...

http://www.tomshardware.com/reviews/...arts,2776.html

is "The Radeon HD 6870 is slower than Radeon HD 5870. Radeon HD 6850 is slower than Radeon HD 5850. It's confusing, we know, but AMD has what it considers a good explanation for the naming scheme." And in the final page's Verdict: "The high-end Radeon HD 5870 and 5970 will be replaced by the “Cayman” and “Antilles” Radeon HD 6900-series before the end of Q4 2010." This is also supported by supposedly leaked AMD information (I won't post the link as there may be legal issues) stating that the ATI "Cayman" is being released in late November (=6970?) and Antilles in December (=6990?).

There's nothing on AMD's site about the 6900 series and they don't answer the phone ("leave a number and we'll call back"), but other Google-guessers are expecting a release in late November and guessing the 6970 to be around a third more in price than the 5870. It's only someone's guesswork though. So I may have to wait?

Back to this Thread, I don't see AMD doing anything more before SoW is released so for me that's part of the system spec puzzle resolved. 6970/6990 will probably be the way to go for max effect unless you want to wait for Nvidia's response. But what do I know?
__________________
klem
56 Squadron RAF "Firebirds"
http://firebirds.2ndtaf.org.uk/



ASUS Sabertooth X58 /i7 950 @ 4GHz / 6Gb DDR3 1600 CAS8 / EVGA GTX570 GPU 1.28Gb superclocked / Crucial 128Gb SSD SATA III 6Gb/s, 355Mb-215Mb Read-Write / 850W PSU
Windows 7 64 bit Home Premium / Samsung 22" 226BW @ 1680 x 1050 / TrackIR4 with TrackIR5 software / Saitek X52 Pro & Rudders
  #127  
Old 11-03-2010, 01:17 PM
kendo65 kendo65 is offline
Approved Member
 
Join Date: May 2008
Posts: 908
Default

Struck by this quote from the TomsHardware article

http://www.tomshardware.com/reviews/...arts,2776.html

"And in the midst of all of that jockeying, there are new games launching that may or may not be under the influence of developers who selectively cooperate with one GPU vendor or the other. These are anticipated games. Games we've wanted to test for some time now. But we face the possibility that one hardware architecture might be highly-optimized, while the other company's driver team still hasn't seen the title running. Now there's a recipe for hard-to-explain benchmark results."

Given Oleg's arranged presentation for nVidia this week (and I believe I remember him saying that he'd made overtures to AMD/ATI but had got no replies or interest...), we may have another situation like with IL-2, where nVidia are the way to go?

Last edited by kendo65; 11-03-2010 at 01:26 PM.
  #128  
Old 11-03-2010, 02:13 PM
Igo kyu's Avatar
Igo kyu Igo kyu is offline
Approved Member
 
Join Date: Sep 2008
Posts: 703
Default

Quote:
Originally Posted by Hecke View Post
do you guys know if ...

an AMD - ATI combination works better than an AMD - Nvidia combo?

or if ...

an Intel - Nvidia combination works better than an Intel - ATI combo?

I'm speaking of stuff with comparable power.
Just wondering, because AMD and Intel are competitors, as are Nvidia and ATI, so they might have intentionally done something to make some combinations not work as well.
Intel is also a competitor of nVidia's. The new Intel processors do not work with nVidia motherboards: PCIe cards still work, but nVidia's integrated graphics don't, only Intel's own. Look at the way the nVidia Ion graphics chip no longer works with the new Intel Atom processor. The Atom and Ion are low-power and no use for SoW, but they are typical of how friendly Intel and nVidia are (not at all friendly).

AMD could make their systems unfriendly to nVidia's graphics cards, but then people might choose graphics cards ahead of processors, so they could lose CPU sales, and it would take extra work, so why bother.
  #129  
Old 11-04-2010, 11:54 PM
Triggaaar Triggaaar is offline
Approved Member
 
Join Date: Sep 2010
Posts: 535
Default

Quote:
Originally Posted by Skoshi Tiger View Post
Didn't Oleg mention a long while back that they were using their own physics engine????
I haven't been here that long (which is just as well, I'd be out of my mind with all the expectation), but if that's the case, thanks for the heads up.

Quote:
Originally Posted by Hecke View Post
do you guys know if ...

an AMD - ATI combination works better than an AMD - Nvidia combo?

or if ...

an Intel - Nvidia combination works better than an Intel - ATI combo?

Just wondering, because AMD and Intel are competitors, as are Nvidia and ATI, so they might have intentionally done something to make some combinations not work as well.
Well, obviously ATI was taken over by AMD, so the new 6xxx series are AMD products and will obviously work well with AMD CPUs. And I'm sure all manufacturers want their products to work well with whatever system their customers may have (I've not seen any reports of collusion).

Quote:
Originally Posted by kendo65 View Post
Struck by this quote from the TomsHardware article

"And in the midst of all of that jockeying, there are new games launching that may or may not be under the influence of developers who selectively cooperate with one GPU vendor or the other."
Yes, there are some games that have received funding from card manufacturers - e.g. Nvidia paid a chunk for Crysis, and no doubt had an input on its development. Nvidia and Ubisoft are bosom buddies - ATI cards were running Assassin's Creed better than Nvidia's thanks to DirectX 10.1, and support for 10.1 was suddenly removed, ruining performance for many ATI owners. There has recently been fighting between AMD (was ATI) and Nvidia over Ubisoft's new HAWX 2 game - before it passed beta it was being used for benchmarking in reviews (not unusual), and was performing better with Nvidia cards. AMD cried foul, and said they'd only be able to work on their drivers once they were given access to the game.

Quote:
Given Oleg's arranged presentation for nVidia this week ( and i believe I remember him saying that he'd made overtures to AMD/ATI but had got no replies or interest...) then we may have another situation like for il-2 where nVidia are the way to go???
Particularly with Ubisoft rumoured to be the publisher outside of Russia, that is a concern. I currently own an Nvidia card, and I'm waiting for this game before upgrading. The fact is that AMD cards are currently better than Nvidia cards, but it's possible the game could be made to suit Nvidia better.

Last edited by Triggaaar; 11-04-2010 at 11:57 PM.
  #130  
Old 11-05-2010, 03:15 AM
LoBiSoMeM LoBiSoMeM is offline
Approved Member
 
Join Date: May 2010
Posts: 963
Default

The sim will run under DX9, 10 or 11.

Only one API is in use at a time, by both NVIDIA and ATI cards. The "tweaks" for one side or the other are minimal.

People are scared for nothing. If some game dev created a game with serious performance issues or visual degradation on one VGA brand, that dev would really be a "genius"...

"Optimized" mostly just means "I received some money from X to put their logo on the splash screen"...

Relax! I will buy an HD 5850 1GB to run SoW BoB.