#121
Quote:
LOL Cheers!
#122
Quote:
You're not going to notice a difference between the two cards until you look at your power consumption and your wallet. The screen is the last place their differences become apparent. I don't want to sound like an ATI fanboy, but their cards have been hitting the performance/price sweet spot a lot better than Nvidia's for the past few years. If anyone is trying to decide what to get, go to Tom's Hardware, check out the video card chart for your game of choice, and look at the performance stats of all the different cards. Find the performance range you want and compare the card prices. Whatever gives you the most FPS at the lowest price is the winner.
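A rough sketch of that FPS-per-dollar comparison, for anyone who wants to run the numbers themselves. The card names, frame rates and prices below are made-up placeholders, not benchmark results - substitute the figures from the chart for your game.

```cpp
// Rank cards by frames-per-dollar. All numbers below are invented
// placeholders -- replace them with real chart data before trusting the order.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Card {
    const char* name;
    double avgFps;    // average FPS in the game you care about
    double priceUsd;  // current street price
};

int main() {
    std::vector<Card> cards = {
        {"Card A", 60.0, 180.0},
        {"Card B", 75.0, 260.0},
        {"Card C", 58.0, 150.0},
    };

    // Highest FPS per dollar first.
    std::sort(cards.begin(), cards.end(), [](const Card& a, const Card& b) {
        return a.avgFps / a.priceUsd > b.avgFps / b.priceUsd;
    });

    for (const Card& c : cards)
        std::printf("%-8s %5.1f FPS  $%6.2f  %.3f FPS/$\n",
                    c.name, c.avgFps, c.priceUsd, c.avgFps / c.priceUsd);
    return 0;
}
```

Of course raw FPS-per-dollar only matters within the performance range you actually want, as the post says - a cheap card that can't hold playable frame rates wins the ratio but loses the game.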
#123
^ Valid points. But there are other things which make me stay with Nvidia. These are things I've experienced or noticed on forums. Your experience may be different, but of course I'm always going to place more weight on my own observations when making decisions.

Compatibility. In IL-2 my 4870 had graphics-corruption issues unless I stayed with the 8.9 drivers. The same happened again to 5xxx owners (then they fixed the drivers, then broke them again in the next version, then fixed them...). In DCS: A-10C (yes, it's a beta, but I still think it's relevant) there are quite a few owners of high-end (i7, etc.) computers complaining about terrible frame rates. The link between them? They have late-model ATI cards. In general, on the forums I frequent, I notice more threads about problems and compatibility with ATI-based hardware than with Nvidia.

Control panel. I don't like Nvidia's CP much, but the ATI CP I cannot stand! For example, when using the on-board ATI graphics on a TVPC, I needed to disable overscan (IIRC). After much fruitless poking around, I found the answer on a forum - you need to press a little, UN-LABELLED button (which did not look like a button!) to get to the relevant section... Then there's Catalyst AI, which seems to need to be turned off for most of my games, making me wonder why it is there and enabled by default.

I could put up with the control panel if I were confident about the card's functionality, but my observations about compatibility issues put me off, to the point where I can justify the extra cost and (somewhat irrelevant) extra heat/power usage. These are just my observations; no doubt others will see the opposite, or the opposite might be true for a different set of games.
__________________
DIY uni-joint / hall effect sensor stick guide: http://www.mycockpit.org/forums/cont...ake-a-joystick
#124
Quote:
But Lock On - the base layer - has run much better on Nvidia cards for years.
#125
Do you guys know if...

an AMD - ATI combination works better than an AMD - Nvidia combo? Or if an Intel - Nvidia combination works better than an Intel - ATI combo? I'm speaking of hardware with comparable power. Just wondering, because AMD and Intel are competitors, as are Nvidia and ATI, so they might have intentionally done something to make certain combinations not work as well.
#126
Quote:
The Tom's Hardware article at http://www.tomshardware.com/reviews/...arts,2776.html says: "The Radeon HD 6870 is slower than Radeon HD 5870. Radeon HD 6850 is slower than Radeon HD 5850. It's confusing, we know, but AMD has what it considers a good explanation for the naming scheme." And in the final-page verdict: "The high-end Radeon HD 5870 and 5970 will be replaced by the “Cayman” and “Antilles” Radeon HD 6900-series before the end of Q4 2010."

That is also supported by supposedly leaked AMD information (I won't post the link as there may be legal issues) stating the ATI "Cayman" is supposedly being released in late November = 6970?, and Antilles in December = 6990? There's nothing on AMD's site about the 6900 series and they don't answer the phone ("leave a number and we'll call back"), but other Google-guessers are expecting release in late November and guessing the 6970 to be around 1/3 more in price than the 5870. It's only someone's guesswork though. So I may have to wait?

Back to this thread: I don't see AMD doing anything more before SoW is released, so for me that's part of the system spec puzzle resolved. 6970/6990 will probably be the way to go for maximum effect, unless you want to wait for Nvidia's response. But what do I know?
__________________
klem 56 Squadron RAF "Firebirds" http://firebirds.2ndtaf.org.uk/ ASUS Sabertooth X58 / i7 950 @ 4GHz / 6Gb DDR3 1600 CAS8 / EVGA GTX570 GPU 1.28Gb superclocked / Crucial 128Gb SSD SATA III 6Gb/s, 355Mb-215Mb Read-Write / 850W PSU / Windows 7 64 bit Home Premium / Samsung 22" 226BW @ 1680 x 1050 / TrackIR4 with TrackIR5 software / Saitek X52 Pro & Rudders
#127
Struck by this quote from the Tom's Hardware article
http://www.tomshardware.com/reviews/...arts,2776.html :

"And in the midst of all of that jockeying, there are new games launching that may or may not be under the influence of developers who selectively cooperate with one GPU vendor or the other. These are anticipated games. Games we've wanted to test for some time now. But we face the possibility that one hardware architecture might be highly-optimized, while the other company's driver team still hasn't seen the title running. Now there's a recipe for hard-to-explain benchmark results."

Given Oleg's arranged presentation for nVidia this week (and I believe I remember him saying that he'd made overtures to AMD/ATI but had got no replies or interest...), we may have another situation like IL-2, where nVidia are the way to go???

Last edited by kendo65; 11-03-2010 at 01:26 PM.
#128
Quote:
AMD could make their systems unfriendly to nVidia's graphics cards, but then people might choose their graphics card ahead of their processor, so AMD could lose CPU sales - and it would take extra work, so why bother?
#129
Last edited by Triggaaar; 11-04-2010 at 11:57 PM.
#130
The sim will run under DX9, 10 or 11.

Only one API is used at a time, whether on NVIDIA or ATI, and the "tweaks" to one side or the other are minimal. People are scared for nothing. If some game dev created a game with serious performance issues or visual degradation on one VGA brand, that dev would really be a "genius"... The "optimized" label is more "I received some money from X to put some logo on the splash screen"... Relax! I will buy an HD 5850 1GB to run SoW: BoB.
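For anyone curious what "runs under DX9, 10 or 11" can look like in practice, here is a minimal sketch of the standard Direct3D 11 feature-level fallback. This is not the sim's actual code - just one common way a single renderer supports DX11-, DX10- and DX9-class cards from either vendor (an engine may also ship a separate native DX9 path for Windows XP).

```cpp
// Not SoW's code: a generic Direct3D 11 device-creation sketch showing how
// one renderer can target DX11-, DX10- or DX9-class hardware via feature levels.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

ID3D11Device*        g_device  = nullptr;
ID3D11DeviceContext* g_context = nullptr;

bool CreateDeviceForBestFeatureLevel() {
    // Requested feature levels, best first; the runtime picks the highest
    // level the installed GPU and driver actually support.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,   // DX11-class card
        D3D_FEATURE_LEVEL_10_1,   // DX10-class card
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,    // DX9-class card
    };
    const UINT numWanted = sizeof(wanted) / sizeof(wanted[0]);

    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_3;
    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter: NVIDIA or ATI alike
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr,                    // no software rasterizer module
        0,                          // no creation flags
        wanted, numWanted,
        D3D11_SDK_VERSION,
        &g_device, &got, &g_context);

    // 'got' now tells the engine which rendering path (shader models,
    // tessellation, etc.) it is allowed to use on this machine.
    return SUCCEEDED(hr);
}
```

Whatever level the runtime returns, both vendors' drivers see the same API calls, which is why per-brand "optimization" mostly comes down to driver work rather than game code.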