PDA

View Full Version : Nvidia, anyone?


Flyby
03-27-2010, 06:32 PM
Last night I read a slew of reviews about the new Nvidia GTX480 and GTX470. Since I plan to use a 24 inch monitor I now figure I can go with an ATi 5850, then later do it in Crossfire. I was hoping to be impressed by the 470, but it's just not fitting my price/performance comfort level. Sadly I'm reading that the prices for ATi's cards will not drop because the performance of Nvidia's new cards don't really overwhelm Ati. Plus, the heat and the power draw for the GTXs put me off (for the small performance advantage they manage).
just my thoughts.
Flyby out

Flanker35M
03-27-2010, 07:44 PM
S!

Will be upgrading my 5870HD to the 2Gb version when available :) About nVidia..everyone can draw their own conclusions and choose whatever card or brand :)

Flyby
03-27-2010, 08:59 PM
hmmm...2 gigs of memory. Are you planning to run a 30 inch monitor? I'd wait to see if that extra memory does any good at lower resolutions. I'm not sure the extra memory will serve you well except at 2560x1600.
Flyby out

=PF=Coastie
03-27-2010, 09:19 PM
Yeah, I can honestly say that I was hoping the Fermi would do much better so that ATI would lower their prices. I was hoping to upgrade my 5770 to a 5850 or better. But I am sure they will come down after their refresh, which will be pretty soon. I am hoping to get a 5850 for the same price the 5770's are now ($160). But I will likely have to wait for next xmas for that.

Flanker35M
03-27-2010, 09:26 PM
S!

ATI should have gone with 2GB from the start. 1GB of memory is nothing these days when looking at games. If I crank up RoF details, nearly 80% is used already, and that does not leave much headroom. So the more the better IMO :)

genbrien
03-27-2010, 09:59 PM
S!

Will be upgrading my 5870HD to the 2Gb version when available :) About nVidia..everyone can draw their own conclusions and choose whatever card or brand :)

I think that the 2GB version of the 5870 will only come in the six-DisplayPort (Eyefinity 6) edition....

Flanker35M
03-27-2010, 10:26 PM
S!

Yes, I read about this in the Sapphire 5870HD 2GB TOXIC review. It sure helps with multiple monitors, but also in a few games that are memory-intensive, like Rise Of Flight etc. So if it's within an affordable price range after trading in or selling my current 5870HD..why not :)

Igo kyu
03-27-2010, 10:46 PM
I dunno.

I'm using one 512MB HD 4870 ATM.

I was thinking of getting a second, but the prices have gone ridiculously high for 512MB ones (1GB ones seem to be much more reasonable, which in general terms is just silly), so I can't really see that working, since the memory is limiting. 2x HD 4850 would mean two new cards, and not that great a performance gain over one HD 4870; besides that, there are problems with IL-2 and Ati cards. I wish the nVidia cards had turned out better/cheaper, but I might end up buying those anyway.

nearmiss
03-28-2010, 01:40 AM
Last night I read a slew of reviews about the new Nvidia GTX480 and GTX470. Since I plan to use a 24 inch monitor I now figure I can go with an ATi 5850, then later do it in Crossfire. I was hoping to be impressed by the 470, but it's just not fitting my price/performance comfort level. Sadly I'm reading that the prices for ATi's cards will not drop because the performance of Nvidia's new cards don't really overwhelm Ati. Plus, the heat and the power draw for the GTXs put me off (for the small performance advantage they manage).
just my thoughts.
Flyby out

You might want to carefully research that 24" monitor thing. I bought a 24" 1920x1200 and I may have missed the mark.

It seems the standard may narrow down to 1920x1080, like HDTV. Also, I have two DVI ports on my video card, but you can't mix and match monitors. The monitors have to be the same resolution. So I'll have to find another 1920x1200.

Sadly, the prices on digital monitors are dropping, except for 1920x1200. I bought mine over a year ago and the price is still the same.

Also, remember that if 3D is on your mind you need a 120Hz monitor.

Good luck

tagTaken2
03-28-2010, 08:39 AM
I wouldn't buy another ATI card, primarily because I found the Catalyst software to be pitiful; I could write a couple of pages about its inadequacies. It even conflicted with my joystick driver software (that took a while to pin down, as it was the first thing I loaded up), plus the psychedelic effect I got in Il-2, plus the fact that almost every game I've bought in the last couple of years has been optimised for nvidia.

AKA_Tenn
03-28-2010, 09:18 AM
ok, one thing to keep in mind is that when you use SLI or Crossfire, you don't get to use the memory from both cards, just the GPUs.. even if you had 3 512MB videocards in your computer crossfired, you'd still only have 512MB of allocatable texture memory...

and whether you choose ATI or nvidia makes no difference, but look at more than a few reviews and benchmarks for all cards within your price range...

unlike a bad movie that a review said was good... videocard reviews with solid benchmarking in multiple games/apps (because it's not just showing you factory specs for the card, it's showing you real-world performance) will give you a good idea of how fast the videocard you want to buy is compared to others in the same price range. Just remember, a Pentium 4 with 256MB of system RAM is not fast enough for a modern videocard... if you're going to buy a high-end videocard, the rest of your system has to be fast enough to make full use of it. I'd say no less than a Core 2 @ 3GHz with 4GB of system memory (I don't know what the AMD equivalent is) for a HD5850/280GTX (just an example)

Flanker35M
03-28-2010, 05:28 PM
S!

This pic has been circulating various places, outbursts of laughs everywhere regardless which brand the user prefers. To lighten up the gloomy wait of TD 4.10 :D

http://img17.imageshack.us/img17/4531/thermi.jpg

Qpassa
03-28-2010, 07:47 PM
S!

This pic has been circulating various places, outbursts of laughs everywhere regardless which brand the user prefers. To lighten up the gloomy wait of TD 4.10 :D

http://img17.imageshack.us/img17/4531/thermi.jpg

Yeah, that's a joke, but this is real :S
http://img.techpowerup.org/100326/Capture598.jpg
http://hardocp.com/images/articles/126962492671BZgJ5ZxI_7_3.png

AKA_Tenn
03-29-2010, 11:31 AM
Yeah, that's a joke, but this is real :S


lol, that's damn funny. Question though... can't a card be designed to run at higher temps? Or is it a given rule that hardware has to stay below a certain "acceptable" temperature, no matter what it was designed to take?

Edit: I use a HD5850, so temperature problems aren't really a problem; just curiosity is the problem :P

Flanker35M
03-29-2010, 12:02 PM
S!

There are certain thresholds electronics and such can take. Looking at the fighterplane equipment, the cooling mechanism is VERY efficient and even more so if the equipment is sensitive to overheat, like a computer array would be for example.

Skoshi Tiger
03-29-2010, 01:28 PM
The higher the temperature, the higher the resistance, and the more power required to make it work. If the temps get too high, they stop working.

It's a bit of a vicious circle. In general, cooler is better.
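A rough back-of-envelope of that feedback loop, using the standard linear temperature coefficient of resistance. The alpha value is roughly copper's; the resistance and temperatures are illustrative numbers, not measured GPU data:

```python
def resistance(r0, alpha, t, t0=25.0):
    """Resistance at temperature t, given resistance r0 at reference t0.

    Linear approximation: R = R0 * (1 + alpha * (t - t0)).
    """
    return r0 * (1 + alpha * (t - t0))

R0 = 1.0        # ohms at 25 C (illustrative)
ALPHA = 0.0039  # per degree C, roughly copper's temperature coefficient

for temp in (25, 60, 95):
    r = resistance(R0, ALPHA, temp)
    # At a fixed current, dissipated power P = I^2 * R rises with R,
    # which heats the part further -- the vicious circle above.
    print(f"{temp:3d} C -> R = {r:.3f} ohm")
```

So a hotter chip really does waste more power in the same wiring, which is one reason cooler is better.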

Flanker35M
03-30-2010, 04:25 PM
S!

The addy told a lot but this graph even more ;) :-D

http://img36.imageshack.us/img36/1957/gpuo.png

Flyby
03-30-2010, 07:16 PM
You might want to carefully research that 24" monitor thing. I bought a 24" 1920x1200 and I may have missed the mark.

It seems the standard may narrow down to 1920x1080, like HDTV. Also, I have two DVI ports on my video card, but you can't mix and match monitors. The monitors have to be the same resolution. So I'll have to find another 1920x1200.

Sadly, the prices on digital monitors are dropping, except for 1920x1200. I bought mine over a year ago and the price is still the same.

Also, remember that if 3D is on your mind you need a 120Hz monitor.

Good luck
S~ nearmiss. As it turns out, my wife's PC has the Asus VK264H monitor. It's 1920x1080. I thought I might buy one for me. It has a 2ms response time and fairly rich colors (deep blacks, imo). Plus the price is dropping all the time: several months ago we paid $219.00 for her monitor. Asus has a 27 inch monitor with the same resolution and 2ms as well. The pixels are going to be bigger, I think, but that makes bandits easier to see, I hope! :D
Flyby out

Qpassa
03-30-2010, 10:01 PM
S!

The addy told a lot but this graph even more ;) :-D

http://img36.imageshack.us/img36/1957/gpuo.png

xDDDDDDDDDDDDDDDDDDDDDD

Flyby
03-30-2010, 10:46 PM
I'm dismayed that the 5850 does so poorly in Flanker35M's graph. :( But I want to see which of these cards does best at making ice cream! ;) BTW, does anyone think either of these cards will be overkill for SoW?
Flyby out

Flanker35M
03-31-2010, 05:51 AM
S!

Flyby, all of those cards will run SoW for sure in DirectX mode. My wild guess is that there is not much tessellation in SoW, as it is pretty much useless in a flight sim. FPS and RPG games etc. are a different matter though, as seen in Aliens vs Predator for example.

As for making ice cream..I think the nVidia's fan is the fastest, so it can at least whip cream, but beware of the heat ;)

Flyby
03-31-2010, 01:06 PM
S!

Flyby, all of those cards will run SoW for sure in DirectX mode. My wild guess is that there is not much tessellation in SoW, as it is pretty much useless in a flight sim. FPS and RPG games etc. are a different matter though, as seen in Aliens vs Predator for example.

As for making ice cream..I think the nVidia's fan is the fastest, so it can at least whip cream, but beware of the heat ;)
Wouldn't tessellation be beneficial for the landscape and the depiction of buildings, especially in low-alt flying? Well, maybe not, unless one is simply cruising along taking in the scenery on a free flight. Might be nice to have it there, though. We have yet to see any screens depicting a DX11 rendering, iirc. But what the heck? I want a really good flight sim. A decent landscape is almost expected these days anyway, as state of the art.
Flyby out

Flanker35M
03-31-2010, 01:12 PM
S!

Tessellation eats resources, and do you want that? ;) I think a flight sim needs good terrain, but fewer technical gimmicks like tessellation, as buildings are really secondary. Do you really notice if a house is tessellated or not when flying at combat altitude, or when whizzing by at low alt trying to shake that bogey? ;)

As an effect it is nice, but IMO it should be used with caution, to add to the game, not just be everywhere even where not needed. It is good to see that DirectX 11 has features, but not many titles even today fully utilize even the older DirectX features ;) ATI also just released full support for OpenGL 4.0, which has a lot of new features, so definitely interesting times ahead.

Wolf_Rider
03-31-2010, 02:05 PM
How I remember, with distaste, all the tessellation techniques and promises, the supposed screenshots (which turned out to be artist concepts) that came with FSX, Vista and DX10.
(too bad the FSX geniuses didn't realise that all the damn trees they put in brought systems to their knees)

Flyby
03-31-2010, 02:58 PM
S!

Tessellation eats resources, and do you want that? ;) I think a flight sim needs good terrain, but fewer technical gimmicks like tessellation, as buildings are really secondary. Do you really notice if a house is tessellated or not when flying at combat altitude, or when whizzing by at low alt trying to shake that bogey? ;)

As an effect it is nice, but IMO it should be used with caution, to add to the game, not just be everywhere even where not needed. It is good to see that DirectX 11 has features, but not many titles even today fully utilize even the older DirectX features ;) ATI also just released full support for OpenGL 4.0, which has a lot of new features, so definitely interesting times ahead.
I've just read that the latest tessellation API (?) isn't really all that visually impressive for the hit it imposes, so I concede your point. I also concede that DX11 may not offer huge improvements over DX10 for gamers either (except maybe impressively exploding bombers, cloud densities, and that fine mist of blood drawn by an explosive cannon shell which has traumatically removed a pilot's arm). As for OpenGL4.0, I'd love to see some combat flight sim use it. I always preferred running IL2 in Opengl versus DirectX.
I'm speaking to you all with a bucket over my head. I just read another review of Nvidia's new cards, and it's another reason to save a few bucks and get an ATI card. I so wanted to have a reason to buy a GTX470. It seems the only reason to buy that card now is to add a little to the coffers of the local utility company.
Flyby out

F19_lacrits
03-31-2010, 03:21 PM
I've just read that the latest tessellation API (?) isn't really all that visually impressive for the hit it imposes, so I concede your point. I also concede that DX11 may not offer huge improvements over DX10 for gamers either (except maybe impressively exploding bombers, cloud densities, and that fine mist of blood drawn by an explosive cannon shell which has traumatically removed a pilot's arm). As for OpenGL4.0, I'd love to see some combat flight sim use it. I always preferred running IL2 in Opengl versus DirectX.
I'm speaking to you all with a bucket over my head. I just read another review of Nvidia's new cards, and it's another reason to save a few bucks and get an ATI card. I so wanted to have a reason to buy a GTX470. It seems the only reason to buy that card now is to add a little to the coffers of the local utility company.
Flyby out

The reason you like flying IL2 in OpenGL is that Oleg and crew focused on OpenGL, and DX8 was just there to please the other crowd.. For SoW the focus is on DX, and I haven't seen anything about Oleg and crew making it for OpenGL.
At least the GTX470 is a lot better price/performance than the GTX480 and can stir things up with the 5870 in certain games and resolutions. If I was an NVIDIA fan with a limited budget I'd look at the GTX470.. But who knows when, or if, the average joe can get his hands on one of these cards? Rumor has it that demand for the first batch of cards is high and shipping volumes are low. They haven't even started selling these new babies; a paper launch!
Whoever is considering a GTX480 should think again and consider the Radeon 5970; same price bracket and roughly the same power bracket.. But the 5970 outperforms the GTX480 and runs circles around it in many games.

Flyby
03-31-2010, 03:45 PM
The reason you like flying IL2 in OpenGL is that Oleg and crew focused on OpenGL, and DX8 was just there to please the other crowd.. For SoW the focus is on DX, and I haven't seen anything about Oleg and crew making it for OpenGL.
At least the GTX470 is a lot better price/performance than the GTX480 and can stir things up with the 5870 in certain games and resolutions. If I was an NVIDIA fan with a limited budget I'd look at the GTX470.. But who knows when, or if, the average joe can get his hands on one of these cards? Rumor has it that demand for the first batch of cards is high and shipping volumes are low. They haven't even started selling these new babies; a paper launch!
Whoever is considering a GTX480 should think again and consider the Radeon 5970; same price bracket and roughly the same power bracket.. But the 5970 outperforms the GTX480 and runs circles around it in many games.
Are you saying I'd have preferred DX8 if Oleg had focused on that, and OpenGL was just there to please the other crowd? Ammagad! I've just been reduced to a mindless gargoyle! :D I live in a Capitalist country. When the GTX470 is finally available, I think the price/performance ratio will give way to fanboy-ism: if you want it, you have to pay a premium price for it (just to have it). It is not so dominant over the 5850 as to be a better deal, imo.
Flyby out

Thunderbolt56
03-31-2010, 06:18 PM
...Asus has a 27 inch monitor with the same resolution with 2ms as well. Pixels are going to be bigger, I think, but that makes bandits easier to see, I hope!


Yes, it does make bandits easier to see on the 27". Not because the pixels are bigger, but because the dot pitch is slightly larger. A pixel is a pixel. That's like saying a pound of feathers is lighter than a pound of rocks. But dot pitch, that's another story.
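A quick back-of-envelope makes the point concrete, using the two monitors discussed above. Pixel (dot) pitch is just the physical diagonal divided by the diagonal pixel count:

```python
import math

def dot_pitch_mm(diag_inches, width_px, height_px):
    """Approximate dot pitch in mm: physical diagonal / diagonal resolution."""
    diag_px = math.hypot(width_px, height_px)
    return diag_inches * 25.4 / diag_px  # 25.4 mm per inch

# 24" 1920x1200 -> about 0.269 mm; 27" 1920x1080 -> about 0.311 mm
print(f'24" 1920x1200: {dot_pitch_mm(24, 1920, 1200):.3f} mm')
print(f'27" 1920x1080: {dot_pitch_mm(27, 1920, 1080):.3f} mm')
```

Same pixel count, but each pixel occupies noticeably more glass on the 27" panel, which is why distant contacts are easier to spot.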

Flanker35M
03-31-2010, 06:47 PM
S!

SoW was originally started in OpenGL I think, but changed to DirectX at some point. I could be wrong, but I got this impression when, ages ago, they talked about re-doing the gfx engine etc. SoW will also support DirectX 9.0c, as we have seen WIP screens stating they were rendered on the lower DX9 path. So this caters for those that do not have, nor want to invest in, DX11 cards yet. A wise choice IMO from Oleg & Team: a broader audience with more options available, yet the same sim for all.

As for tessellation: in AvP it was used here and there and looked nice, but frankly I did not even notice it unless I really checked for it. Sure, Unigine Heaven v2.0 is a great showcase of it, especially for nVidia, but at this stage it is only a synthetic benchmark to me, as there is no AI, physics or other stuff a game has. Just an empty world with a lot of tessellation around.

XFX did drop the nVidia Fermi cards, both the 470GTX and 480GTX, from its lineup for a reason we can only speculate about. They are investing instead in the upcoming ATI cards and refreshes of those already on the market. Discussion is hot all over the place, but speculating is pointless as we do not know the details.

Nevertheless..SoW will be the new benchmark as IL-2 was 10 years ago..and IL-2 still goes strong thanks to continued support from Oleg/TD :)

AKA_Tenn
04-01-2010, 03:21 AM
all tessellation really is... in regards to DX11 anyway... for instance, look at the propeller cap on the P-47 in il2... it's not round, it's like... a hexagon... tessellation would be able to make that round while still using the same space, by basically making polygons foldable/warpable... it would also allow things like panel lines actually looking like panel lines, not just part of the skin...

yeah, it might be a bit of a performance hog, but it'll allow you to make a round object where, without tessellation, there wouldn't be enough memory to do it...
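A toy sketch of that propeller-cap idea (illustrative numbers only, not the actual DX11 pipeline): each subdivision level splits every edge of the hexagon and snaps the new vertices out to the true circle, so the rendered shape gets rounder while the stored base mesh stays a hexagon:

```python
import math

def subdivide_ngon(sides, levels):
    """Vertex count of a regular n-gon after `levels` edge subdivisions."""
    return sides * (2 ** levels)

def max_error(sides):
    """Worst-case gap between a unit circle and its inscribed regular n-gon."""
    return 1 - math.cos(math.pi / sides)

# Start from a hexagonal "propeller cap" and tessellate it finer and finer.
for level in range(4):
    n = subdivide_ngon(6, level)
    print(f"level {level}: {n:3d} vertices, max gap {max_error(n):.4f}")
```

The gap to a true circle shrinks rapidly with each level, which is the visual payoff the post describes, bought at render time instead of with a bigger stored model.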

Skoshi Tiger
04-01-2010, 08:07 AM
I doubt I'll be buying a new graphics card until SOW gets released. So until then the market will have a chance to settle down a bit.

We'll need a few driver revisions before we can see what the Nvidia cards can actually do performance-wise. As I see it, if you want the extra speed you're not really worried about power usage; in realistic terms we're only talking about a few cents an hour in extra power.

As always, if you need the fastest card NOW, you're going to have to pay a premium. (I haven't seen any 480/470 for sale in Australia yet, so really it's a moot point at the moment over here!)

Just have to find a way of squirreling away a few dollars now and again to upgrade for SOW without the trouble and strife knowing!

flyingbullseye
04-01-2010, 10:47 PM
Considering the problems some have had with the new ATI cards in 1946, I wonder if people will have the same problems with Fermi and/or its drivers. I guess it won't be long until we find out.

Flyingbullseye

Flanker35M
04-02-2010, 03:02 PM
S!

I run DirectX 11 games on my ATI 5870HD without a glitch. SoW will be DX11 so...Go figure ;)

Codex
04-02-2010, 10:38 PM
ok, one thing to keep in mind is that when you use SLI or Crossfire, you don't get to use the memory from both cards, just the GPUs.. even if you had 3 512MB videocards in your computer crossfired, you'd still only have 512MB of allocatable texture memory...

and whether you choose ATI or nvidia makes no difference, but look at more than a few reviews and benchmarks for all cards within your price range...

unlike a bad movie that a review said was good... videocard reviews with solid benchmarking in multiple games/apps (because it's not just showing you factory specs for the card, it's showing you real-world performance) will give you a good idea of how fast the videocard you want to buy is compared to others in the same price range. Just remember, a Pentium 4 with 256MB of system RAM is not fast enough for a modern videocard... if you're going to buy a high-end videocard, the rest of your system has to be fast enough to make full use of it. I'd say no less than a Core 2 @ 3GHz with 4GB of system memory (I don't know what the AMD equivalent is) for a HD5850/280GTX (just an example)

Depends on what operating system you're using; Vista and Win7 will allocate all memory, both system RAM and graphics card RAM, as one pool of memory.

Also, to date, all games used for benchmarking have found that it's the graphics card that is the limiting factor; anything from a Phenom II to an Intel Extreme processor has little or no impact on frame rates in modern games when playing at the full resolutions of 24" to 30" monitors.

PA_Willy
04-07-2010, 10:01 AM
ok, one thing to keep in mind is that when you use SLI or Crossfire, you don't get to use the memory from both cards, just the GPUs.. even if you had 3 512MB videocards in your computer crossfired, you'd still only have 512MB of allocatable texture memory...

Exactly. With WHICHEVER OS you use. In CF and SLI, VRAM doesn't increase compared to one of these cards alone.

For example, two 5870 cards (1GB each) in CF will result in 1GB of VRAM available to the system (not 2GB). Each GPU works with its own associated VRAM.

You cannot add them.
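A one-liner sketch of the same point, with illustrative card sizes, assuming resources are fully mirrored across the cards as in standard SLI/CrossFire rendering:

```python
def usable_vram_mb(cards_mb):
    """VRAM a game can address across mirrored SLI/CrossFire cards.

    Textures and buffers are duplicated on every card, so the usable
    pool is bounded by the smallest single card, not the total.
    """
    return min(cards_mb)

print(usable_vram_mb([512, 512, 512]))  # three crossfired 512MB cards
print(usable_vram_mb([1024, 1024]))     # two 1GB 5870s in CF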

Flyby
04-07-2010, 02:13 PM
Exactly. With WHICHEVER OS you use. In CF and SLI, VRAM doesn't increase compared to one of these cards alone.

For example, two 5870 cards (1GB each) in CF will result in 1GB of VRAM available to the system (not 2GB). Each GPU works with its own associated VRAM.

You cannot add them.
Seems inefficient. I wonder if SoW will benefit from a single card with 2GB of RAM versus two separate cards with 1GB each? I'd better read up on how SLI and Crossfire work. Maybe the RAM is shared as in a series circuit?
Flyby out

PA_Willy
04-07-2010, 03:14 PM
Seems inefficient. I wonder if SoW will benefit from a single card with 2GB of RAM versus two separate cards with 1GB each? I'd better read up on how SLI and Crossfire work. Maybe the RAM is shared as in a series circuit?
Flyby out

VRAM is not shared. Each GPU needs its own VRAM to load information. With SLI and CF, you get two GPUs (double the computing power). That's the advantage.

There are two ways they work:

- SFR (split frame rendering): Each GPU renders half a frame (the frame is generated in half the time a single GPU would take).

- AFR (alternate frame rendering): Each GPU renders a frame while the next one is rendered by the other GPU (you get two frames in the time a single GPU produced one).

Either way, each GPU needs its own VRAM to read/write and generate the new frame.

Of course, that's all in theory. You don't get double the frames with SLI/CF, but in some games with decent drivers you can get up to 80% more (usually the figure is nearer 40-50%).

I have got two dual GPU systems (SLI, Nvidia).
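The scaling figures above can be turned into a quick calculation. The 60 fps baseline is a made-up number; the efficiencies are the ones quoted in the post (ideal 100%, good drivers up to ~80%, typical 40-50%):

```python
def dual_gpu_fps(single_gpu_fps, scaling):
    """Effective frame rate with a second GPU.

    `scaling` is the fraction of the second GPU's theoretical
    contribution that the drivers actually deliver (1.0 = perfect).
    """
    return single_gpu_fps * (1 + scaling)

base = 60  # fps on one GPU (illustrative)
for label, s in [("ideal", 1.0), ("good drivers", 0.8), ("typical", 0.45)]:
    print(f"{label}: {dual_gpu_fps(base, s):.0f} fps")
```

So the realistic expectation from a second card is well short of doubling, which matches the 40-50% figure most reviews report.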

Flanker35M
04-07-2010, 03:18 PM
S!

Waiting for the updated 2GB 5870HD, and later CrossFire when SoW is released :) It was interesting to read that two 5850HDs in CrossFire beat the crap out of both the 480GTX and the ATI 5970HD! And the XFire setup costs about the same as one 5970/480GTX, uses less power than either of them, and runs cooler too..huh! Interesting times!

JG27CaptStubing
04-07-2010, 03:53 PM
Yeah, clearly Nvidia missed the boat on this one... Their DX11 part comes out six months after ATI launched theirs. To boot, it draws a ton of power, runs very hot, and is only marginally faster than the 5870.

The only thing that has me interested at this point is the fact that Nvidia supports 3D gaming much better. I tested it out on a buddy's rig and it has a lot of potential for games, especially if SOW supports it. I had a chance to use it with FSX and wow, it really adds to a game.

Meanwhile it looks like I will have to wait for a refresh from Nvidia. This round is a loss.

Flyby
04-07-2010, 05:08 PM
thanks for the tech info, PA_Willy. I must admit I haven't really been a fan of either Crossfire or SLI, because of the low return on investment for the sims I like (having read reports of others who have tried two video cards and not been satisfied with the scaling in sims like IL2 and LockOn).

Still, a 40% improvement might be acceptable if one were to see that in SoW using two 5850s in Crossfire, as Flanker35M suggested. The cost of that setup might be worthwhile! :D

Captn_Stubing, I agree with you about Nvidia. The what-ifs are something to dream about: a 512-bit bus, lower power consumption, and a lower heat signature. That would have made it a killer single-GPU card, far and away better than the 5870. But it just wasn't so. Maybe next time. Meanwhile I'm reading that ATI/AMD is in no hurry to lower the prices of their top 5xxx series. They are just waiting to see how the demand for Nvidia's new cards plays out. I imagine if that demand is underwhelming then people won't get much of a price break from ATI. I was looking forward to some friendly competition making the 5870 more affordable for me.
Flyby out

Flanker35M
04-07-2010, 09:42 PM
S!

Some healthy competition would do the industry good. It seems these two rivals are so concentrated on their "GPU crown hunt" that the average buyer is left out. What use is any kind of technical gimmick or feature if the average joe can not afford, nor get their hands on, these new cards?

I could get a 5870HD here for less than 400€, while the estimated price for the 480GTX for example is 500-600€, in the range of the 5970HD. Add to this that I would have to buy a new GOOD quality PSU for the Fermi, talking about 100€ or even more, and possibly a new case with a built-in wind tunnel for 200€..hmm..not very appealing for the meager FPS gain I would get. Not everyone has a money tree growing in the backyard ;)

Flyby
04-08-2010, 01:25 AM
Flanker35M, I read you loud and clear. If I narrow my GPU focus to SoW, as I assume many here will do, I am always wondering how much GPU will be required. Many have asked Oleg this question; one day we will know the answer. For the sake of the many, I hope nothing more powerful than the 5850 is required to render it in all its glory. I'm sure that card is more agreeable to many more budgets than the 5870 or the GTX480. Further still, I'm hopeful SoW scales well enough to be played satisfactorily on much lesser cards. I personally prefer a card that supports DX11, though I am not sure that DX11 will be a huge factor in SoW, at least not in the short run of that graphics engine. (Who knows? Maybe by the time of SoW_Korea or SoW_VietNam, DX11 will finally dazzle our socks off.) ATI has several less powerful DX11-capable cards. Time will reveal Nvidia's competitive reply. Let's hope so.
Flyby out

Thunderbolt56
04-08-2010, 01:07 PM
Higher power consumption wouldn't be an issue for me if the performance payoffs were there. The wacky temps? A definite no-go for me. It's already hot enough here in Florida.

Hopefully a refresh will bring the temps down a bit and make all the other stuff (i.e. power consumption, Vram frequency and overall performance) better as well (Just like G92 did for G80).

Even with the somewhat lateral performance comparison with the 5870, I'd take the NV offering if those other things weren't there.

Flyby
04-08-2010, 01:57 PM
Here's hoping Nvidia has something up its sleeve that will unleash the 480's full 512-bit bus as well as bring its heat signature down. I think if it can accomplish those two things I would not worry too much about power consumption either.
Flyby out

PA_Willy
04-08-2010, 02:00 PM
You will have to wait for 28nm process to get better Nvidia cards (lower temps and power consumption).

ATI and Nvidia are manufacturing in 40nm at this moment. I hope 28nm will be available in Q4 2010 or maybe 2011.

Regards.

Flanker35M
04-08-2010, 02:48 PM
S!

We have to wait a bit for this new 28nm process it seems; there are difficulties with it. Intel has its own 32nm plants, but you can guess they are not lending them to ATI/nVidia ;)

When scouring the net, it seems this nVidia chip has a very low yield from production; various sources say 20% to less than 50%, which is not good for price and availability.

Let's see what these "GPU crown hunters" come up with this year. Some kind of refresh is coming from both to spice up the race even more. But it seems ATI is quite confident, as they have not lowered prices that much even though nVidia has launched the new cards.

Only one thing is sure..whatever oomph and performance these cards from both brands promise and PR agencies tout..we are the ones paying for it ;) :D

PA_Willy
04-08-2010, 02:56 PM
The chips manufacturer for Nvidia AND ATI is TSMC.

Regards.

Mysticpuma
04-08-2010, 04:39 PM
I was interested to see where Nvidia were going with Fermi, and looking at screen-shots, the technology looks like it has amazing potential.

I also like the fact that there seem to be few issues with Nvidia in IL2 (I had 2x 8800 GTS G92 in SLI).

However, I can't be that exclusive in my gaming, and the 8800s had a poor frame rate at max detail in CoD: World at War, so I looked around for a better card, and at the time I found the Radeon 4890 Vapor-X (1GB) at a good price.

Oh my! What a card! All settings maxed out in CoD, no slow-down, dynamic lighting...amazing!

I just finished Modern Warfare 2 on max settings, with no noticeable slowdown.

The second reason I went for the Vapor-X is temperature and noise. Initially I had a 4870x2 (for about a week, then sold it) and even though that was a beast, so was the noise from the cooling fan! The temps would idle at 86 degrees Celsius!

Noise was the biggest consideration, as the PC is in the living room, and although I sit there with surround-sound headphones on, the family had to listen to the built-in hair-dryer of a fan blasting away, trying to keep the 4870x2 cool!

For this reason, cards that run cool are important to me, and a greater consideration than they are for some others.

Now that's a long way of getting round to the fact that I would love to have one of the cards with Fermi tech, but there is no way I can justify the cost, compared to the performance gain over the 4890 I have, for a new piece of kit that sounds as loud as the 4870x2 did!

Maybe one day I'll get a water-cooling kit and be able to have one of these cards, nice and quiet, in my PC, but for now I can run IL2 at 1920x1200 with AA super-sampled at 24x, AF at 16x, and lots of other effects.

So, for now, no thanks to the latest Nvidia, but maybe one day when they run cooler.

Cheers, MP.

flyingbullseye
04-08-2010, 05:31 PM
Puma, how did you get slowdowns with your SLI rig in CoD:WaW? I have a 3870 running on an FX-60 and never had any slowdowns with max settings (1680x1050). Are you sure there wasn't some conflict going on somewhere?

Back on topic regarding the new NV cards: the mainstream cards look more inviting, especially the GTX 440 and 450, if/when they ever get released. I can't see spending north of $300 at this point for just a GPU.

Flyingbullseye

Flanker35M
04-08-2010, 06:31 PM
S!

TSMC has problems with nVidia Fermi yields, and had some with ATI initially = I had to wait 2 months to get my 5870HD. And the 28nm process has been changed a few times already, so there is a delay for that. Not expecting it to come before SoW. Maybe some small refreshes from both companies, possibly?

Puma, I see your point. My 5870HD is quiet as well, even under load. A Fermi has been measured at some 70dB and that is loud, too loud. You could dry laundry with that fan screaming for help there ;) On paper and in some tests Fermi is impressive, but the gain for the initial price, plus the additional investments needed to actually run it silent and cool = not worth the money IMO.

Anyways, we shall see what happens in the near future with the new processes and all :) Interesting times indeed.

Flyby
04-08-2010, 07:59 PM
So can the new ATI cards display "water=3"? That setting sure looked nice on my old 6800 Ultra. I guess I need to ask Oleg if "water=3" can be displayed in SoW using the red cards.
Flyby out

lbuchele
04-09-2010, 11:40 AM
I´m not writing off Nvidia´s GTX 480 yet.
They are hot, power hungry and noisy, but much less so in real-world conditions; not so different from other high-end video cards of today, with the exception of the temperature, which surpasses all the other brands.
It will be a question of which brand has the more adequate drivers for SOW.

Thunderbolt56
04-09-2010, 12:49 PM
So can the new ATI cards display "water=3"? That setting sure looked nice on my old 6800 Ultra. I guess I need to ask Oleg if "water=3" can be displayed in SoW using the red cards.
Flyby out


"water=3" will probably be a foreign language to SoW. ;)

Flyby
04-09-2010, 04:01 PM
"water=3" will probably be a foreign language to SoW. ;)
quite so.
Flyby out