IL-2 Sturmovik: Cliffs of Dover Latest instalment in the acclaimed IL-2 Sturmovik series from award-winning developer Maddox Games.

  #1  
Old 08-15-2012, 01:12 PM
Walshy Walshy is offline
Approved Member
 
Join Date: Nov 2010
Location: Belfast, Northern Ireland
Posts: 114
Default

Quote:
Originally Posted by Stublerone View Post
Omg, okay. It seems there is bashing going on and you don't want to read everything I have written. My card runs at 1080p with up to 2.7 GB of VRAM in use, at least 2.4 GB.

Some other people report this too, and now we are arguing about whether "crank it up" only means "high". When I say crank up the system, I mean crank it all the way up, not halfway!

I keep seeing our beloved GTX 680 users taking part in the stutter discussions. That is not only because the drivers are bad for it, or do you still believe in Santa Claus? It is a lack of memory! That means you cannot run the game fully cranked up, and you COULD get stutters.

It's crazy that Lynch writes about "pro gamers" and claims I called you all casual gamers. I never said that. You just have to avoid citing games that are deliberately built to run in little RAM so they can serve every system. I never talked about you; I talked about the games you are comparing with CloD.

The question "why does CloD run so badly and Skyrim not?" just fed me up; I cannot believe someone in this community really wrote that. Sorry, but that is sad.

I also never said the 680 is worse. I said that the 7970, with its larger VRAM, should have fewer problems. The claim that even a 670 runs faster than a 7970 is simply not true.
BF3 is an nVidia title, and the 680 was initially not faster there, which was a shocking moment for the nVidia crowd. They have since fixed that with BF3 patches and driver updates. Some of you should be aware of that tactic: work closely with the game developer and squeeze out the competitor. People who follow the ATI/nVidia fight know this has often been the case. Just wanted to mention it.

Noise, heat and an eventual loss of lifetime from overclocking are just weak arguments out of the brochure.

80% of the gamers who decide to buy one of these cards will overclock it. ATI has the better bill of materials, and you will not shorten the lifespan of either card unless you plan to run it for over six years. All cards consume power and nobody buying high end will care; the difference is small and the idle power draw is nearly identical.

I do not want to rehash the whole thing. I stated which card I think is better, and comparing the 670 with the 7970 is nuts. Take the boost out of the 670, or overclock the HD 7970, and the HD 7970 is simply faster!

Btw: I have never heard of ATI users generally having more problems. IL-2 1946 favoured ATI, and CloD in its current state is also an ATI game, if I take a wild shot in the dark. I have never seen a 7970 user taking part in the stutter discussions...

I am no pro gamer, just a guy who has really read a lot about the situation between the 680 and the 7970. Only a few sites switched their brains on and benchmarked the cards realistically. What is realistic? High-end buyers will tweak and overclock, so the only fair comparison is: overclock both cards to their stable maximum and then compare. You will see they are close together, and when it comes to really high resolutions the 680, by its nVidia nature, simply cannot compete any more. High-res is not 1080p, btw...

Just my opinion, but I don't want to argue it further, as it gets boring. Buy a half Kepler, rushed out to limit the damage, and be happy.
Sorry, but this whole rant is nonsense. He claims to have experience on the technical side of computers, building them and understanding the hardware involved, but he clearly doesn't have a clue and isn't very experienced in the matter. He hasn't read any of the serious computing magazines that review the whole industry, or spent time on the technical websites and forums devoted to debating this stuff at length.

Since the new nVidia cards came out, AMD (no longer ATI: AMD bought ATI outright in 2006 for 5.4 billion dollars) has been on the back foot. The new nVidia cards are a complete redesign of the Fermi architecture of the 400 and 500 series cards.

Overclocking kills cards, plain and simple, and his claim that it doesn't shows his complete ignorance of the subject. Overclocking stresses the components and puts them under greater strain, which generates heat, which in turn puts the fans and cooling system under more strain. Run any rig under those conditions, even with water cooling, and eventually those components will wear out; playing demanding games at high settings lowers that threshold even further! The real reason he doesn't want to discuss it any more is that he hasn't got a clue about the subject and doesn't want to show his ignorance.

I've been playing IL-2 from the very beginning, back in the days of the IL-2 Sturmovik beta, which I still have somewhere on disc. The claim that nVidia users had issues with Forgotten Battles and ATI users didn't is complete and utter nonsense. I remember discussions running to hundreds of pages on this forum, and over at the Ubisoft forum, about the graphical issues ATI users hit with every patch, and lengthy threads on how to tweak the config.ini as a workaround!
Remember when Pacific Fighters rolled out and the problems ATI/AMD users had with the new cloud and water settings? Sorry, but don't listen to his half-arsed attempts to sound knowledgeable. From a computer geek and MCSE/ACMT professional! Oh, and Stublerone: MCSE stands for Microsoft Certified Systems Engineer and ACMT stands for Apple Certified Macintosh Technician. Go look them up...
  #2  
Old 08-15-2012, 01:48 PM
Flanker35M Flanker35M is offline
Approved Member
 
Join Date: Dec 2009
Location: Finland
Posts: 1,806
Default

S!

Well, AMD had problems with the original IL-2 that were FIXED! I know that for sure, as I have used both AMD and nVidia in IL-2 since the beta as well. I even tested the beta on TNT cards, Kyro cards etc. to give feedback to Oleg's team. Talk about a swap-o-rama and numb-arse testing!

nVidia and AMD both make good hardware that runs any game today. Even though I could not use Water=4 directly on AMD, there were ways around it, and for me FPS mattered more than slightly shinier water: AMD gave me a solid 60 FPS with VSync, and that was all that mattered at the time. So did the nVidia cards I owned (up to the GTX 580). So better not to say AMD was nothing but trouble with IL-2.
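For anyone who never saw those workarounds: they were plain edits to IL-2's conf.ini rendering section. Something along these lines was typical, though the exact values varied by patch and card, so treat this as illustrative rather than gospel:

```ini
; IL-2 1946 conf.ini excerpt -- rendering section (illustrative values)
[Render_OpenGL]
HardwareShaders=1   ; shader-based water needs this enabled
Water=2             ; dropped from 4 to 2 on cards that choked on shader water
Effects=1
Forest=2
```

Dropping Water from 4 to 2 was the usual trade: less fancy water, stable frame rate.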

What annoys me is the nVidia lobbying: "let us give you some money for development... in exchange we give you code that gimps AMD, and you have to show that TWAT logo spinning at start-up." Every freaking game nV is involved with ships something extra that AMD cannot use; look at Skyrim and some other titles. PhysX is just a gimmick, I won't go into that. AMD is no better in this regard either, so "pot and kettle".

I pondered hard over whether to go for a GTX 680 over my current HD 7970. I didn't, because it offers nothing SIGNIFICANTLY better in image quality or performance in the games I currently play, and those titles never appear in the hardware reviews, go figure. In CoD, for example, AMD and nVidia perform about equally well; if nV offers me only a small increase in FPS, the price difference is too big IMO, even though the card itself is a good one. So I'll stick with the HD 7970 for now; the next time I'll review the need for a new card is when the 8xxx HD and 7xx GTX series come out next year. No need to slap ePeens around over which one is better before then.
  #3  
Old 08-15-2012, 03:23 PM
Baron Baron is offline
Approved Member
 
Join Date: Dec 2007
Posts: 705
Default

Quote:
Originally Posted by Flanker35M View Post
S!

Well, AMD had problems with the original IL-2 that were FIXED! I know that for sure, as I have used both AMD and nVidia in IL-2 since the beta as well. I even tested the beta on TNT cards, Kyro cards etc. to give feedback to Oleg's team. Talk about a swap-o-rama and numb-arse testing!

nVidia and AMD both make good hardware that runs any game today. Even though I could not use Water=4 directly on AMD, there were ways around it, and for me FPS mattered more than slightly shinier water: AMD gave me a solid 60 FPS with VSync, and that was all that mattered at the time. So did the nVidia cards I owned (up to the GTX 580). So better not to say AMD was nothing but trouble with IL-2.

What annoys me is the nVidia lobbying: "let us give you some money for development... in exchange we give you code that gimps AMD, and you have to show that TWAT logo spinning at start-up." Every freaking game nV is involved with ships something extra that AMD cannot use; look at Skyrim and some other titles. PhysX is just a gimmick, I won't go into that. AMD is no better in this regard either, so "pot and kettle".

I pondered hard over whether to go for a GTX 680 over my current HD 7970. I didn't, because it offers nothing SIGNIFICANTLY better in image quality or performance in the games I currently play, and those titles never appear in the hardware reviews, go figure. In CoD, for example, AMD and nVidia perform about equally well; if nV offers me only a small increase in FPS, the price difference is too big IMO, even though the card itself is a good one. So I'll stick with the HD 7970 for now; the next time I'll review the need for a new card is when the 8xxx HD and 7xx GTX series come out next year. No need to slap ePeens around over which one is better before then.
Like I said, I didn't want to start anything, and like you say, the closest thing to the truth is that both the 7970 and the 670 are good cards.

The thing I reacted to was the complete misinformation provided.

And to Stublerone: I really don't know how you manage to use up 2.7 GB of memory. I run at 1920x1080 and my memory usage peaks at 1061 MB (Black Death) with everything set as high as it goes, and I use nVidia's FXAA too, which works like a charm, I'm glad to say.
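For what it's worth, a "peak VRAM" figure like that usually comes from polling a monitoring tool and keeping the maximum reading. A minimal sketch of the idea, assuming you log per-sample memory-used values in MB (e.g. from `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits` on newer drivers, or a GPU-Z log back then); the sample values below are made up for illustration:

```python
# Sketch of how a "peak VRAM" number is taken: poll a monitor and keep the max.
# The sample readings below are invented, not real measurements.

def peak_vram_mb(sample_lines):
    """Return the highest memory reading (MB) seen across poll samples."""
    readings = [int(line.strip()) for line in sample_lines if line.strip()]
    return max(readings) if readings else 0

if __name__ == "__main__":
    # Simulated poll output: one MB value per sample, as the CSV query prints.
    samples = ["812", "977", "1061", "1043", "990"]
    print(peak_vram_mb(samples))  # reports the highest sample
```

The point being: a peak taken this way depends on when you sample, so two people can honestly report quite different numbers for the same mission.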

Last edited by Baron; 08-15-2012 at 03:29 PM.
  #4  
Old 08-15-2012, 04:53 PM
Von Crapenhauser Von Crapenhauser is offline
Approved Member
 
Join Date: Aug 2012
Posts: 69
Default

I'm no expert.

But I got a Jetstream 680 GTX with 2 GB and it runs CloD just fine at max, with AA at 2x.
No stuttering except at first load-up; then it's smooth as silk.

i5 2.9 GHz quad,
Gigabyte GA-H61 chipset,
8 GB 1300 MHz RAM,
Palit Jetstream 680 GTX 2 GB,
Microsoft FF2 stick,
CH Pro pedals,
750 W Icecooler PSU,
2x SATA HDD,
HP DVD 1260 SATA drive,
4x cooling fans.
  #5  
Old 08-16-2012, 09:16 AM
Stublerone Stublerone is offline
Approved Member
 
Join Date: Sep 2011
Posts: 250
Default

Quote:
Originally Posted by Baron View Post
And to Stublerone: I really don't know how you manage to use up 2.7 GB of memory. I run at 1920x1080 and my memory usage peaks at 1061 MB (Black Death) with everything set as high as it goes, and I use nVidia's FXAA too, which works like a charm, I'm glad to say.
Hi Baron, you are right about the whole matter. Both cards are good, but I just wanted to say that VRAM is getting more and more important, and that this is a clear advantage for the ATI, especially at higher resolutions. Think about TVs getting 4K resolutions while PC monitors stay stuck at 1080p, even though we were playing games at higher resolutions 10 years ago. In that case, or with multi-monitor setups, you could run into problems over the next few years. And the fact that SLI does not give you more usable VRAM for loading textures smoothly will not change this either.

But you are right, both cards do a good job.

As for your VRAM figure: tell me your trick! Please have a look at Phat's tests, which are close to mine. It was a thread called "recommended settings..." or similar.

The more VRAM you have, the more of it can potentially be used. That could also be why our reported loads vary.
  #6  
Old 08-15-2012, 09:30 PM
Codex Codex is offline
Approved Member
 
Join Date: Nov 2007
Location: Hoppers Crossing, Vic, Australia
Posts: 624
Default

Quote:
Originally Posted by Walshy View Post
Overclocking stresses the components and puts them under greater strain, which generates heat, which in turn puts the fans and cooling system under more strain. Run any rig under those conditions, even with water cooling, and eventually those components will wear out; playing demanding games at high settings lowers that threshold even further!...
When talking about the current generation of GPUs, that's not necessarily true.

Given that the vendors have released factory "OC" versions of the 7xxx series, and that nVidia will automatically "overclock" its GPUs via the GPU Boost feature, AND still provide a warranty, these new GPUs can clearly be "overclocked" and still remain within their tolerances.

With this round of GPUs, the terms "overclocked" and "superclocked" are a smokescreen; it's marketing, pure and simple. These puppies can go much higher in raw clock speed and heat and still live to fight on.
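Roughly speaking, GPU Boost behaves like a governor: nudge the clock up while power draw and temperature sit inside the card's limits, back off when they don't. A toy model of that idea; every threshold and step size here is invented for illustration, nothing like the real firmware:

```python
def boost_step(clock_mhz, power_w, temp_c,
               base=1006, max_boost=1110, power_cap=170, temp_cap=80):
    """Toy GPU-Boost-style governor: one adjustment step. All limits invented."""
    if power_w < power_cap and temp_c < temp_cap and clock_mhz < max_boost:
        return min(clock_mhz + 13, max_boost)   # headroom left: step up one bin
    if power_w >= power_cap or temp_c >= temp_cap:
        return max(clock_mhz - 13, base)        # over a limit: back off toward base
    return clock_mhz                            # at max boost and within limits: hold
```

Which is exactly why "overclocked" editions stay in warranty: the governor never lets the card run outside the envelope the vendor qualified it for.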
  #7  
Old 08-15-2012, 11:50 PM
Walshy Walshy is offline
Approved Member
 
Join Date: Nov 2010
Location: Belfast, Northern Ireland
Posts: 114
Default

Quote:
Originally Posted by Codex View Post
When talking about the current generation of GPUs, that's not necessarily true.

Given that the vendors have released factory "OC" versions of the 7xxx series, and that nVidia will automatically "overclock" its GPUs via the GPU Boost feature, AND still provide a warranty, these new GPUs can clearly be "overclocked" and still remain within their tolerances.

With this round of GPUs, the terms "overclocked" and "superclocked" are a smokescreen; it's marketing, pure and simple. These puppies can go much higher in raw clock speed and heat and still live to fight on.
Indeed they will, but kept constantly under such stresses they will eventually wear out; that's what I'm saying. Claiming they'll never take damage when overclocked is not what happens in reality...
  #8  
Old 08-16-2012, 01:20 AM
Codex Codex is offline
Approved Member
 
Join Date: Nov 2007
Location: Hoppers Crossing, Vic, Australia
Posts: 624
Default

Quote:
Originally Posted by Walshy View Post
Indeed they will, but kept constantly under such stresses they will eventually wear out; that's what I'm saying. Claiming they'll never take damage when overclocked is not what happens in reality...
Well, nothing is ever designed to last forever.
  #9  
Old 08-16-2012, 08:48 AM
Stublerone Stublerone is offline
Approved Member
 
Join Date: Sep 2011
Posts: 250
Default

Quote:
Originally Posted by Walshy View Post
Indeed they will, but kept constantly under such stresses they will eventually wear out; that's what I'm saying. Claiming they'll never take damage when overclocked is not what happens in reality...
There are still games that force the nVidia cards to run at boost all the time. Misunderstandings happen here. The topic, "better to buy a 7970 or a 670 for THIS game", means to me, in its current state, clearly: buy the HD 7970, as it gives you enough headroom under ALL circumstances! Multi-monitor or other higher resolutions = less chance of stutter caused by VRAM issues. And YES, there are people here reporting stutters on the 680, which is surely mainly caused by the current driver, but the possibility of running out of VRAM is still there.

Walshy, please don't get personal and stop making yourself so big. You can be an expert in whatever you want, but that currently changes nothing in my conclusion, which was meant to give the original poster information for THIS game!

You are right about the problems in IL-2, but nVidia had issues too. Workarounds were normal, especially in those days before the hardware companies offered regular support; updates were less frequent, so you ran into issues more often.

I never meant to say that overclocking does not stress the card, but the BOM of the ATI card is of good quality, and its materials are more expensive than the nVidia's. I am not saying that directly yields better performance, but the card itself is well built. The GPU is "underclocked", as ATI itself implies by revising the cards and naming them the 1 GHz Edition. Neither you nor I can really say anything about lifetime here.

I just don't have a good link to hand, only a print magazine, and I don't care. Some big magazines have already revised their benchmarking methods and admitted that their initial test results were too "pro" nVidia. The results published at the 680's release were somewhat weird, and I only saw one or two websites with reasonable testing. Reasonable, to me, means the following:

nVidia's Boost kicks in whenever it is needed, so in benchmarks it will be active nearly all the time. That means we cannot compare the cards directly. We have to find a solution for this, and since these are high-end cards for enthusiasts who know how to get everything out of them, the fair approach is to take stock-cooled reference cards and overclock each to its stable maximum.

Only a few websites ran this test. The result, comparing an overclocked 680 (I don't recall the exact clocks, but they were higher than the overclocks in some big magazines) against an HD 7970 at 1200/1500, was very interesting: the two were close together, but the overall feature package from ATI was better, so ATI took the lead and the award for the fastest card. You can never say it is the whole truth, but this was the only good benchmarking at release anywhere on the web.
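The method itself boils down to simple arithmetic once both cards sit at their stable maximum: average a few benchmark runs per card and look at the relative gap. A sketch of that calculation; every FPS value below is invented for illustration and taken from no real review:

```python
# Sketch of the "overclock both to stable max, then compare" method:
# average several benchmark runs per card and report the relative difference.
# All FPS values here are invented.

def average(runs):
    return sum(runs) / len(runs)

def relative_gap(a_runs, b_runs):
    """Percent by which card A's average FPS leads (negative = trails) card B's."""
    a, b = average(a_runs), average(b_runs)
    return (a - b) / b * 100.0

if __name__ == "__main__":
    gtx680_oc = [74.0, 76.5, 75.2]   # hypothetical runs, card at stable max OC
    hd7970_oc = [77.1, 78.0, 76.4]   # hypothetical runs, card at stable max OC
    gap = relative_gap(hd7970_oc, gtx680_oc)
    print(f"HD 7970 leads by {gap:.1f}%")  # with these numbers, a small single-digit gap
```

With gaps that small, the "fastest card" verdict ends up resting on the rest of the package (VRAM, features, price), which is exactly the point being argued here.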

"Overclocking leads to a shorter lifetime" is not something you can just assert, as you do not know all the influences and material behaviours. What matters is your warranty: after that period the card can break any day and there is nothing you can do. Manufacturers could even build in lifetime-limited parts that force the card to die after some years. This is known from lamps and mobile phones, which are increasingly designed to fail within a certain timeframe so that new products get sold.

It is just dumb NOT to overclock your card if you need the performance. That is my opinion. Or do you plan to use a card for more than four years? Perhaps your next idea is to undervolt all your components to save equipment lifetime.

BTW, my HD 7970 does not go over 45 degrees when overclocked. The critical, lifespan-reducing temperatures for the components are simply not reached by either nVidia or ATI; you will hit graphics corruption and instability before you exceed the component specs.

Don't take it so seriously. There is no need to talk about arses... Just my opinion. And please don't read between the lines to strengthen your diss. Keep it cool. And to shoot back a bit:

If you worked in my company and I saw you posting your job titles like that, I would fire you. It is an unprofessional attitude; your jobs have nothing to do with graphics cards, and you shouldn't invoke them for a private opinion in a gaming forum unless you officially write for your company. "BTW, I am asdfjkl engineer at rijfndk Ltd. and mine is the longest!" What crap! Really!

Last edited by Stublerone; 08-16-2012 at 09:46 AM.
Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
Copyright © 2007 Fulqrum Publishing. All rights reserved.