Official Fulqrum Publishing forum

Official Fulqrum Publishing forum (http://forum.fulqrumpublishing.com/index.php)
-   IL-2 Sturmovik (http://forum.fulqrumpublishing.com/forumdisplay.php?f=98)
-   -   Hopefully SOW favors nvidia... (http://forum.fulqrumpublishing.com/showthread.php?t=16367)

swiss 09-14-2010 06:18 AM

Hopefully SOW favors nvidia...
 
http://www.kitguru.net/components/gr...illion-crysis/

Feuerfalke 09-14-2010 08:02 AM

I doubt it.

Crysis is a mainstream game and has much higher sales numbers than most (any?) flight sims. The sum might sound large, but it's just a marketing campaign to put the nVidia logo in the startup, and you can bet that many people will buy an nVidia card because they think it means Crysis will run better, just because there's a different label on the graphics card's box.

PE_Tigar 09-14-2010 08:15 AM

I hope SOW doesn't favor any HW brand or make in particular, we should be able to choose freely. The market's already duopolized anyway.

The fact that IL-2 runs better on nVidia (in general) is due to the choice of API (yes, yes, it uses both DX and OGL, but no one in their right mind uses DX in IL-2) and nVidia's better OGL support. Since SOW will be DX only (AFAIK), I expect the cards with better DX11 performance will also perform better in SOW.

Flanker35M 09-14-2010 08:24 AM

S!

I really doubt it. IL-2 favored nVidia but now works with ATI as well, thanks to the AMD driver team and the IL-2 community. IL-2 is OpenGL (an old OGL version, mind you); SoW will use the DirectX API. Will SoW use PhysX or tessellation? I strongly doubt it... where would you need those gimmicks? I see no sensible reason to use that green logo or to optimize for it; both AMD and nVidia run any DX11 game with very good performance.

julian265 09-14-2010 08:28 AM

Quote:

Originally Posted by PE_Tigar (Post 181200)
I hope SOW doesn't favor any HW brand or make in particular, we should be able to choose freely. The market's already duopolized anyway.

+1

swiss 09-14-2010 01:00 PM

Quote:

Originally Posted by Feuerfalke (Post 181197)
I doubt it.

Crysis is a mainstream game and has much higher sales numbers than most (any?) flight sims. The sum might sound large, but it's just a marketing campaign to put the nVidia logo in the startup, and you can bet that many people will buy an nVidia card because they think it means Crysis will run better, just because there's a different label on the graphics card's box.

Actually the point was: If you want to play Crysis2 - you'll have to get an Nvidia.
If you want to play SoW and Crysis, see above.

Maybe I should have written: if SOW favors one, then I hope it's...


The (ex-ATI) Radeon driver team is a joke. I doubt they take anyone from the IL-2 community seriously; otherwise, how could you explain that they FU the game with every new driver release?
They also stole 5 days of my life that I had to spend sorting out system errors caused by their CCC upgrade.
Thanks guys, awesome job!
Never again.
(Ok, I just bought a 4350 for my HTPC, that's a different purpose though)


Btw: Can you see where those two are taking the fight?
If that pays off for NV (ATI lost with Dirt 2), you can expect any major release to favor one card only.
Eventually they'll turn into publishers.

Feuerfalke 09-14-2010 01:39 PM

Quote:

Originally Posted by swiss (Post 181250)
Actually the point was: If you want to play Crysis2 - you'll have to get an Nvidia.

No, it is not. It's what nVidia would like to tell you.

As the article says, Crysis 2 will be optimized for nVidia and support PhysX, but that doesn't mean it won't run on ATI/AMD cards. Actually, PhysX also runs on ATI/AMD cards with no noticeable performance loss - well, it officially ran, until nVidia decided to include a blocking routine that deactivated PhysX as soon as an ATI/AMD card was detected in the PC...

Needless to say, that was a futile attempt. A driver-patch is available on many sites on the internet.



But even if it weren't, as has been stated here many times: PhysX is used primarily for graphical effects. It's not running the fundamental physics engine of the game. And even the money nVidia gave EA wouldn't compensate for the loss of potential sales - how many idiots would buy a $400 nVidia card for a $27 game? LOL

swiss 09-14-2010 02:01 PM

For $2M they sure will try to disadvantage AMD cards. ;)

In fact, this February I spent $170 (260 GTX OC) on a GPU just for IL-2. The game itself I got for free because I downloaded it [Rem: got a real hard copy by now].
It was overkill for sure - but hey, I'd rather spend too much than spend less on a card which can't do the job (in that game or any other).

If I buy a new card, which will be necessary for SOW, I want one of the best - and I want it to run the latest/most popular/my choice of games on "perfect" too.

Qpassa 09-14-2010 02:04 PM

Quote:

Originally Posted by PE_Tigar (Post 181200)
I hope SOW doesn't favor any HW brand or make in particular, we should be able to choose freely. The market's already duopolized anyway.

The fact that IL-2 runs better on nVidia (in general) is due to the choice of API (yes, yes, it uses both DX and OGL, but no one in their right mind uses DX in IL-2) and nVidia's better OGL support. Since SOW will be DX only (AFAIK), I expect the cards with better DX11 performance will also perform better in SOW.

Absolutely. I have an ATI (5870, €330) and I don't want to get bad performance.

Feuerfalke 09-14-2010 03:56 PM

Quote:

Originally Posted by swiss (Post 181269)
For $2M they sure will try to disadvantage AMD cards. ;)

Sometimes I wonder if you guys really don't watch what's going on in the world around you.

They'll just add the label and run a few extra tests on nVidia machines, while nVidia gets the chance to optimize their drivers for the game. When it's released, nVidia miraculously shows higher FPS. Of course every nVidiot will cheer and won't realize that the small boost comes from slightly degraded textures at longer distances and some other deactivated filtering options. After a month the tide turns against nVidia again, and so it goes back and forth...

The exact same procedure as with every single release of a potential blockbuster game. It's just that this time it happened in the summer and somebody didn't have anything else to write an article about. :rolleyes:

robtek 09-14-2010 04:53 PM

Yep, those who chase a hype will never arrive.
I look for the most bang for my buck, and a year ago that was ATI.

Hecke 09-14-2010 05:01 PM

It would be great if Oleg were sponsored and equipped with DirectX 11, AA-capable cards, so he could show us BoB in almost its full glory.

Hunden 09-15-2010 01:36 AM

Quote:

Originally Posted by Feuerfalke (Post 181260)
No, it is not. It's what nVidia would like to tell you.

As the article says, Crysis 2 will be optimized for nVidia and support PhysX, but that doesn't mean it won't run on ATI/AMD cards. Actually, PhysX also runs on ATI/AMD cards with no noticeable performance loss - well, it officially ran, until nVidia decided to include a blocking routine that deactivated PhysX as soon as an ATI/AMD card was detected in the PC...

Needless to say, that was a futile attempt. A driver-patch is available on many sites on the internet.



But even if it weren't, as has been stated here many times: PhysX is used primarily for graphical effects. It's not running the fundamental physics engine of the game. And even the money nVidia gave EA wouldn't compensate for the loss of potential sales - how many idiots would buy a $400 nVidia card for a $27 game? LOL

I'm prepared to spend in the thousands for my crap to run at its best; I have no other hobbies. I guess that makes me retarded. LOL. I spent about 10 US dollars on IL-2 and another 900 or so just on crap to enjoy my 10-dollar investment.

Feuerfalke 09-15-2010 06:00 AM

@ Hecke:
That has nothing to do with the hardware but with the HDR implementation. Depending on how it's modeled, you have to set FSAA and AF in-game.

@ Hunden:
No, that doesn't make you "retarded", but it makes Intel and nVidia rich.

Bobb4 09-15-2010 09:21 AM

Quote:

Originally Posted by Feuerfalke (Post 181404)

@ Hunden:
No, that doesn't make you "retarded", but it makes Intel and nVidia rich.

:grin::grin::grin:

carl 09-15-2010 06:03 PM

"How many idiots would buy a $400 nVidia card for a $27 game?" LOL, I suspect most of us on these forums, when BoB comes out :!:

Blackdog_kt 09-16-2010 02:57 AM

I never buy the current top-of-the-line GPUs; I usually upgrade to stuff in the 150-200 Euro range. The cool thing about frequent upgrades is that the previous series drops in price.

When I got my i7 rig last year I went with an ATI 4870 after many years of nVidia cards because, as Robtek said, it was the best bang for the buck. That particular card was defective and gave up the ghost last Christmas, but thanks to the 3-year warranty I got a 4890 for free, which I still use now.

I think I won't upgrade again for about a year, or until the current DX11 cards drop to the 200 Euro price range, whichever happens first. I don't mind running SoW at less than full detail; as long as SoW at medium looks better than IL-2 at full, I'm satisfied. The good thing is that ATI is gearing up for the 6xxx series GPUs, so I might be able to get a 58xx in about 6 months for a good price.

I don't know about the nVidia 4xx series. They are obviously very powerful GPUs as well, but differences in architecture suggest that unless heavy tessellation is involved, the performance gain compared to ATI cards is not enough to justify the extra price, increased temperatures and power consumption, at least for now. When the Fermi cards were released, their single-GPU cards were priced almost as high and ran almost as hot or hotter than the dual-GPU ATI ones, which begs the question: "If I'm paying dual-GPU money and getting dual-GPU temps and consumption, why not actually buy a dual-GPU card and get the performance benefit that goes with it?" Of course advances have been made since - nVidia released more affordable models and worked on the temperature issues - but I think they were clearly overtaken this season. I hope they bounce back, not because I take any side in the ATI/nVidia fanboy wars, but because competition between them means better prices for me.

Flanker35M 09-16-2010 05:11 AM

S!

What Blackdog said perhaps best reflects the real situation among gamers in general. In my opinion, only a small percentage of gamers have the money to invest in top-end gear. Even here, my bet is on mid-range hardware rather than superduper OC'd stuff. Both AMD and nVidia get most of their bucks from the 100-250 €/USD range; that's the segment that sells most. The enthusiast range, like the AMD HD5890 or nVidia 480GTX, is for the benchers and for growing your ePeen ;)

This year and next will be interesting hardware-wise, as AMD is putting out the new 6xxx series and nVidia has now been launching its mid-range products based on the Fermi. I would be disappointed if SoW favored one brand over the other, as both give bang for the buck. My €.02 :)

WTE_Galway 09-16-2010 05:38 AM

In reality what software developers support in terms of hardware is based almost entirely on how much support, and what development tools and assistance, the respective hardware manufacturers make available at a reasonable cost.

Bearcat 09-16-2010 06:01 AM

I just hope that it looks at least as good as WoP... and delivers the same kinds of frames.. Visually at the moment WoP has no equal in the WWII market.. I am hoping that once SoW comes out that will no longer be the case..

Feuerfalke 09-16-2010 06:35 AM

Quote:

Originally Posted by Bearcat (Post 181657)
I just hope that it looks at least as good as WoP... and delivers the same kinds of frames.. Visually at the moment WoP has no equal in the WWII market.. I am hoping that once SoW comes out that will no longer be the case..

I honestly hope it will look more realistic than WoP's overdone Hollywood effects. I'm not a pilot, but I fly too frequently to think that's anywhere close to realism.

Flanker35M 09-16-2010 06:44 AM

S!

Without going into the WoP effects and such, one must give credit for how well it holds a steady FPS even with a LOT going on and many objects visible. Compare the Berlin map in IL-2 and WoP... which one runs smoother if you set up a similar situation in both?

Again, SoW is a new engine and we have not seen much of it in the form of in-game footage - only a couple of videos, and that's it. Before we can see the game in action it's hard to tell how it will be visually and performance-wise. I am sure Oleg & Team will pull this off as expected and we'll get something to drool over for the next 10+ years.

Blackdog_kt 09-16-2010 09:29 AM

Actually, you guys got me thinking a bit, and it seems to me that we can't examine the graphics in isolation.
I can certainly live with a SoW that looks better than WoP but can't be run at that detail. Let me explain.

If SoW runs at 50% more detail than WoP but our PCs can't take it, then probably most people will fly on reduced settings that could be, let's say, 20% less detail than WoP.

To me, that's not an outright flaw if it's because of all the extra goodies running under the hood of the SoW engine. If I have extra features in campaigns, complex AI logic, improved FM/DM and systems modelling, then I can surely put up with slightly reduced graphics quality.

If the new FM/DM, AI and dynamic weather modules stress my PC too much, I'll turn down the graphics rather than turn down the realism settings. From that point on it's just a case of waiting for stronger PCs to arrive and gradually become affordable.

Just a small reminder, as it seems to me a lot of people will initially be disappointed. I think not many of us will be able to run SoW on more than medium settings when it comes out, but if that's the case it probably won't be because the graphics engine is bad, but because the rest of the game engine is too good for current-day PCs.

Feathered_IV 09-16-2010 09:38 AM

I'm surprised nobody remembers the interview early this year where Oleg said that nVidia has been in continuous contact with him regarding technology and implementation, whereas ATI won't even respond.

You guess which cards will run SoW better.

Flanker35M 09-16-2010 11:25 AM

S!

Valid points, Blackdog. You can't have it all without something suffering. Crank up the graphics to insane levels and you can be sure your computer won't handle the AI/FM/DM and a big gaggle of planes on the screen. I bet we can strike a balance between realism and eye candy. For me, realism comes before eye candy.

Feathered, that's quite a strong statement, as Oleg has said nothing since then. I really wish to see a game without the red or green tag spinning at the beginning. We will see when SoW is out though... until then we can only make wild guesses.

Thunderbolt56 09-16-2010 12:00 PM

Quote:

Originally Posted by PE_Tigar (Post 181200)
The fact that IL-2 runs better on nVidia (in general) is due to the choice of API (yes, yes, it uses both DX and OGL, but no one in their right mind uses DX in IL-2) and nVidia's better OGL support. Since SOW will be DX only (AFAIK), I expect the cards with better DX11 performance will also perform better in SOW.


This^

Though I haven't owned an ATI card since the old original 9700 Pro, and from all indications I'll likely stay with NV, I hope there's really no distinct manufacturer preference.

swiss 09-19-2010 11:50 AM

Quote:

Originally Posted by Hunden (Post 181382)
I'm prepared to spend in the thousands for my crap to run at its best; I have no other hobbies. I guess that makes me retarded. LOL. I spent about 10 US dollars on IL-2 and another 900 or so just on crap to enjoy my 10-dollar investment.

Actually, it was IL-2 that got me into the computer building/modding hobby.
I'm already well into the 4-digit range for my IL-2 stuff, but still have no rudders. ;)

And yes, I don't have any other hobbies either.

kimosabi 09-19-2010 01:30 PM

Quote:

Originally Posted by swiss (Post 181250)

The (ex-ATI) Radeon driver team is a joke. I doubt they take anyone from the IL-2 community seriously; otherwise, how could you explain that they FU the game with every new driver release?
They also stole 5 days of my life that I had to spend sorting out system errors caused by their CCC upgrade.
Thanks guys, awesome job!
Never again.

You must be doing something wrong. ATI does listen, and every driver release since 10.3 has improved my OGL performance in IL-2. 10.8 was the last improvement needed to run PL in IL-2; installing 10.8 finally let me reinstate vertex arrays. On my system they do not FU the game at all - they give it better performance every time they release a new driver. So you must be doing something awfully wrong, my friend.

swiss 09-20-2010 11:14 AM

Quote:

Originally Posted by kimosabi (Post 182711)
You must be doing something wrong. ATI does listen, and every driver release since 10.3 has improved my OGL performance in IL-2. 10.8 was the last improvement needed to run PL in IL-2; installing 10.8 finally let me reinstate vertex arrays. On my system they do not FU the game at all - they give it better performance every time they release a new driver. So you must be doing something awfully wrong, my friend.

Funny, I read about people having problems after installing the latest driver all the time.
Only downgrading or waiting for a newer driver fixes the problem.
Now, I don't care whether they are doing something wrong; the fact is, this kind of trouble seems to be reserved for Radeon owners only.
(I think we even have some in this forum.)

Concerning the 10.8: wait for 10.9 to FU something else. lol

Qpassa 09-20-2010 11:24 AM

Quote:

Originally Posted by swiss (Post 182976)
Funny, I read about people having problems after installing the latest driver all the time.
Only downgrading or waiting for a newer driver fixes the problem.
Now, I don't care whether they are doing something wrong; the fact is, this kind of trouble seems to be reserved for Radeon owners only.
(I think we even have some in this forum.)

Concerning the 10.8: wait for 10.9 to FU something else. lol

10.9 has been released :rolleyes:

robtek 09-20-2010 02:08 PM

I just wish people here would quit bashing whichever brand of graphics card other than the one they have installed.
Nobody gains anything from it, except some really meagre egos. :-D
If all goes right, SoW will run on both brands according to the power of the chip and not the brand name.
Atm, AFAIK, NV runs without problems but is more expensive for the same power,
and ATI/AMD has had some driver glitches with some older games but is less expensive.

swiss 09-20-2010 02:27 PM

Quote:

Originally Posted by Qpassa (Post 182977)
10.9 has been released :rolleyes:

So?

Remember the block text thing?

started with V1.0
-resolved in V1.1
there again in V1.2
-resolved in V1.3
...


I'm not bashing Radeon out of some emotional affection for NV (I still have two ATIs in other systems and like them - there), but because of the simple fact that those guys cause a lot of unnecessary trouble.

For all I care this thread can die.

Qpassa 09-20-2010 02:34 PM

With 10.9, IL-2 works OK. (5870 ATI Radeon)

swiss 09-20-2010 02:40 PM

Sure, and the next version will potentially f*** it up again. That's why I said they don't give a sh1t about us. ;)

Flanker35M 09-20-2010 03:11 PM

S!

Swiss, if IL-2 has worked with 10.8 and 10.9, I see no reason to bash ATI/AMD. If that driver set works, why upgrade to the next one, provided it also works well on the other titles you play? ;) That we even got the drivers to work in the first place is kind of a feat in itself, as IL-2 is over 10 years old and not really one of the hottest releases around. Now, with a working driver set, why complain? NV has its problems as well; just check some forums ;)

Also remember that IL-2 uses a jurassic version of OpenGL, and from the start IL-2 was optimized for NV for various reasons I won't go into here. Had IL-2 been coded in the latest OGL, I am sure there would be far fewer glitches. NV or ATI, both work and deliver performance that fills the needs of a gamer. Not everyone plays these damn shooters pouring in through doors and windows. For me, the ATI 5870HD and nVidia 480GTX have worked as advertised :)


All times are GMT. The time now is 02:44 AM.

Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
Copyright © 2007 Fulqrum Publishing. All rights reserved.