Official Fulqrum Publishing forum

Official Fulqrum Publishing forum (http://forum.fulqrumpublishing.com/index.php)
-   IL-2 Sturmovik: Cliffs of Dover (http://forum.fulqrumpublishing.com/forumdisplay.php?f=189)
-   -   FAQ-QUESTIONS,release date,system specs, for CoD (http://forum.fulqrumpublishing.com/showthread.php?t=16401)

Foo'bar 11-09-2010 02:41 PM

Quote:

Originally Posted by domian (Post 196817)
There is and there will be no planned release date...

You know what? Thank God, I say! ;)

Tte. Costa 11-09-2010 03:48 PM

Quote:

Originally Posted by domian (Post 196817)
I couldn't imagine how the programmers can work without a strict timetable.

I think it's the best way to work: not releasing a game under pressure from the publisher and full of bugs.

addman 11-09-2010 04:42 PM

I think it's very interesting to note how long this game has been in development, considering it's neither CoD nor Duke Nukem (which WILL be released next year by Gearbox, who now own the IP). They (Maddox Games) must have a relatively low burn rate, since the IL-2 series is by no means a triple-A series of games.

Whoever their publisher is, they must have A LOT of confidence in O-Team. Everybody's saying Ubisoft is the publisher, but I find that hard to believe, since Ubisoft stated a while back that their main focus is the console market (piracy) and that rather than bringing out many different IPs they'll be focusing on a smaller range of triple-A titles instead, e.g. Assassin's Creed.

Oleg and his team are a dying breed IMO. Game studios have more money than ever, but the sales of the games not only have to match the expenditures but also produce waaaay more profit if there are to be any sequels/second chances. This is where DLC makes its entrance, methinks; forget about expecting free patches with additional content, those days are over. A lot has happened since the first releases of the IL-2 series. Bug-fixing patches will be free of course, but expect to pay for all the other stuff: planes, map packs etc. Finally Oleg can actually make some money from us cry babies ;)

I'm more than happy to pay for stuff coming out of Oleg's team because I know it's always top quality. These are just some educated or less educated guesses IMHO :)

The Kraken 11-09-2010 05:26 PM

Quote:

Originally Posted by domian (Post 196842)
Sure, but this could become a neverending story, because it is nearly impossible to eliminate all bugs.

I want SoW in good shape too, no question, but I believe Oleg is a perfectionist and the game is in good shape already.

Oleg himself has mentioned several areas that still need to be worked on, and that isn't limited to small and unimportant features. Overall SoW hasn't been in development considerably longer than IL-2 was, but it has a lot more content and detail to cover.

And the business model that requires a mandatory release date independent of the actual state of the software is, fortunately, not the only possible approach.

Richie 11-09-2010 07:43 PM

Quote:

Originally Posted by Gringo (Post 196808)
Hi there!

May I go back to the first post of this thread and the first question:

1. When will BOB SOW be released, or in other words, when can we buy it? :oops:

Thank you very much in advance for a serious answer. :cool:

Best wishes
Gringo

In my opinion spring 2011: February, March or April. I think that's a fair guess.

Hecke 11-09-2010 07:57 PM

"thousands will be playing SoW BoB by October" - Maybe October 2011? ;)

klem 11-09-2010 09:27 PM

Quote:

Originally Posted by T}{OR (Post 194714)
Oleg confirmed there will be an x64 exe. On a similar note, DCS A-10C is an x64 sim, the first ever released. So my bet is that with SoW we will have similar support, for both x32 and x64.

A squadmate just pointed this out from these discussions on 24th September:

Quote:
Originally Posted by steam
Oleg, does this mean that the game will be two versions of 64 and 32 bit?

Currently the answer is no, it doesn't. In future - maybe.

:(

Chivas 11-09-2010 11:19 PM

A developer can have all the work schedules you want, but they are almost impossible to keep.

SOW announced by Oleg in 2003 to be released in 2005.
.....this did not happen as the developer decided to help Luthier build Pacific Fighters into a full addon.

SOW announced by Ubi in 2006 to be released in 2007
.....this did not happen as there seemed to be more money to be gained from further addons to the IL-2 series, e.g. IL-2 1946 etc.

SOW development again started after the release of Il-2 1946 in 2008.
....there were a few setbacks in this period including employee problems on a small team
....it looked as if it could be released by Oct 2010, but anyone who has developed anything understands there are still a lot of minefields to clear.
....now it all depends on how long it will take to clear the beta bugs.

It's certainly been a long time, but it appears the wait will be worth it. Hopefully there will be enough sales to make building further additions to the SOW series worth the time and effort.

Richie 11-09-2010 11:38 PM

DirectX 11 explained well here.
======================


http://www.youtube.com/watch?v=6Wp4Y-u8-Qw

Now that I've checked out Chivas's post, I'm glad it's been delayed and that it's a DX11 game.

major_setback 11-10-2010 10:35 PM

Quote:

Originally Posted by Richie (Post 196969)
DirectX 11 explained well here.
======================


http://www.youtube.com/watch?v=6Wp4Y-u8-Qw

Now that I've checked out Chivas's post, I'm glad it's been delayed and that it's a DX11 game.


Well, from that video (the bit about the cobblestones) Parallax Occlusion Mapping seems like something to look out for! It looks 3D but is easy on computer resources.

http://en.wikipedia.org/wiki/Parallax_occlusion_mapping

"Parallax occlusion mapping (POM) is an enhancement of the parallax mapping technique. Parallax occlusion mapping is used to procedurally create 3D definition in textured surfaces, using a displacement map (similar to a topography map) instead of through the generation of new geometry. This allows developers of 3D rendering applications to add 3D complexity in textures, which correctly change relative to perspective and with self occlusion in real time (Self-shadowing is additionally possible), without sacrificing the processor cycles required to create the same effect with geometry calculations."

http://www.youtube.com/watch?v=gcAsJ...eature=related

More on the side-bar

http://www.youtube.com/watch?v=KKe1p...eature=related
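
For anyone curious how the effect works, here is a rough CPU-side sketch of the ray-marching idea behind parallax occlusion mapping as described in the quote above. It is only an illustration under simplified assumptions: a procedural height map stands in for a real displacement texture, and all the names are made up for the example. Real implementations run this per pixel in a shader.

Code:

#include <cmath>
#include <cstdio>

// Hypothetical height map: 0 = deepest recess, 1 = surface level.
float sampleHeight(float u, float v) {
    return 0.5f + 0.5f * std::sin(u * 40.0f) * std::cos(v * 40.0f);
}

// March along the tangent-space view ray, comparing the ray's depth to the
// height field, and return the parallax-shifted texture coordinates.
void parallaxOcclusionUV(float u, float v,
                         float viewX, float viewY, float viewZ, // z toward viewer
                         float heightScale, int steps,
                         float& outU, float& outV) {
    float layerDepth = 1.0f / steps;                  // depth advanced per step
    float du = -viewX / viewZ * heightScale / steps;  // UV shift per step
    float dv = -viewY / viewZ * heightScale / steps;
    float rayDepth = 0.0f;
    float surfaceDepth = 1.0f - sampleHeight(u, v);   // depth of surface below the top plane
    while (rayDepth < surfaceDepth && rayDepth < 1.0f) { // stop at the first occlusion
        u += du;
        v += dv;
        rayDepth += layerDepth;
        surfaceDepth = 1.0f - sampleHeight(u, v);
    }
    outU = u;
    outV = v;
}

int main() {
    float u, v;
    // A grazing view angle exaggerates the parallax offset.
    parallaxOcclusionUV(0.5f, 0.5f, 0.7f, 0.0f, 0.3f, 0.05f, 32, u, v);
    std::printf("shifted UV: (%.4f, %.4f)\n", u, v);
}

Sampling the colour texture at the shifted UV instead of the original one is what makes flat cobblestones appear to have depth and occlude each other.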

Richie 11-11-2010 12:26 AM

I'm thinking it may have a positive effect on the instrument panel with the framing around the different gauges?

WTE_Galway 11-11-2010 12:49 AM

Quote:

Originally Posted by Richie (Post 197233)
I'm thinking it may have a positive effect on the instrument panel with the framing around the different gauges?

Textured leather comes to mind.

klem 11-11-2010 10:57 PM

Quote:

Originally Posted by Richie (Post 196969)
Directx11 explained well here.
======================

But will SoW be using these features (tessellation, compute shaders, parallax occlusion etc.)?

Does it have to be 'programmed in' by Oleg or does it 'come with DirectX 11'?

Excuse my ignorance, but as you'll know from earlier posts I am divided between ATI's 6970 (capable of tessellation etc.), which may not be out for a while, and the 5870.

phoenix1963 11-12-2010 06:15 AM

Klem, I think all recent ATI cards have tessellation built in, whereas it's only the latest family of NVidia cards that have it.

You are correct, Oleg would have to program DX11 tessellation into the game, rather than using LOD (level of detail) models, which switch between fixed meshes. That's why the Stuka (in particular) jumps in resolution suddenly when you approach it.

Tessellation is a way of offloading increased detail to the graphics card rather than the CPU. But as an NVidia engineer once said to me, at the end of the day the card eventually does a traditional triangle-lighting-pixellation render; it's more of a programming interface and work distribution issue.
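
As a rough illustration of the difference described here, a small sketch follows; it is not SoW code, and the distance bands and factors are invented for the example. The discrete LOD function snaps between premade meshes (the sudden "jump" in resolution), while the tessellation factor varies smoothly with distance and is handled on the GPU.

Code:

#include <algorithm>
#include <cstdio>

// Classic LOD: pick one of a few premade meshes by distance band.
int discreteLOD(float distance) {
    if (distance < 200.0f)  return 0; // highest-poly model
    if (distance < 1000.0f) return 1;
    return 2;                         // lowest-poly model
}

// DX11-style tessellation: a continuous subdivision factor, clamped to a
// hardware maximum, so detail scales smoothly instead of jumping.
float tessellationFactor(float distance, float maxFactor = 64.0f) {
    float t = 200.0f / std::max(distance, 1.0f); // more subdivision up close
    return std::clamp(t * maxFactor, 1.0f, maxFactor);
}

int main() {
    for (float d : {50.0f, 199.0f, 201.0f, 5000.0f})
        std::printf("d=%6.0f  LOD=%d  tess=%5.1f\n",
                    d, discreteLOD(d), tessellationFactor(d));
}

At 199 m vs. 201 m the LOD index jumps from 0 to 1 (a visible pop), while the tessellation factor barely changes.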

I think I remember Oleg saying it wouldn't have tessellation, but I'm not 100% sure. I suggest to you that he'd be ruling out a fair number of NVidia users with older cards if he went for it.

56RAF_phoenix

klem 11-12-2010 11:00 AM

Thanks phoenix

I spoke to ATI and they emphasised that the 6000 series is cheaper than the 5000 and is optimised for tessellation. There was a bit of a language problem, but I think he was saying the 5000s also deliver tessellation (I've seen this in other posts) and he definitely said the game has to be written for it.

I think Oleg has said it would be DirectX 11 (hearsay from other threads), but I believe that won't deliver tessellation on its own.

According to a couple of websites the 5870 is 'faster' than the 6870 except where tessellation is used:

http://www.hardware-infos.com/grafikkarten_charts.php
http://www.tomshardware.com/reviews/...50,2782-7.html
http://www.tomshardware.com/reviews/...arts,2776.html

F19_lacrits 11-12-2010 12:34 PM

Quote:

Originally Posted by klem (Post 197553)
According to a couple of websites the 5870 is 'faster' than the 6870 except where tessellation is used

This is also confirmed by AMD: the tessellation engine in the 68xx is much improved compared to the previous 58xx-series GPUs. It's about twice as fast at lower levels of tessellation, though this advantage goes down with higher levels of tessellation. See this graph from xbitlabs comparing 68xx tessellation vs. 58xx:
http://www.xbitlabs.com/images/video...d6800_tess.png

Though AMD are still behind Nvidia's Fermi GF100 when it comes to tessellation.

You will find that the 6870 and 6850 are not far behind their predecessors overall, and in CrossFire they scale even better than the "old" 58xx. Though they have less raw GPU power, they are better optimised for their task.

Triggaaar 11-12-2010 03:28 PM

Quote:

Originally Posted by phoenix1963 (Post 197510)
Klem, I think all recent ATI cards have tessellation built in, whereas it's only the latest family of NVidia cards that have it.

I think you have this the wrong way around. Nvidia have led ATI/AMD in tessellation for a while, as can be seen in the benchmarks.

JAMF 11-12-2010 04:34 PM

Quote:

Originally Posted by Triggaaar (Post 197681)
I think you have this the wrong way around. Nvidia have led ATI/AMD in tessellation for a while, as can be seen in the benchmarks.

nVidia had a type of tessellation before 2002? ATI had TruForm back then.

K_Freddie 11-14-2010 02:38 PM

That DX11 video is sooooo pathetic.
What did he actually tell us? ... 'Tessellation' :rolleyes:

And what about the rest of DX11? Now that would be informative.
;)

Triggaaar 11-14-2010 05:59 PM

Quote:

Originally Posted by JAMF (Post 197720)
nVidia had a type of tesselation before 2002? Ati had Truform back then.

Did anyone have tesselation that made a positive contribution to any game back then? I'm really thinking of the last batch of cards, the 5xxx series against the 4xx series, and the nvidia cards were better at tesselation.

JAMF 11-14-2010 07:33 PM

Quote:

Originally Posted by Triggaaar (Post 198341)
Did anyone have tesselation that made a positive contribution to any game back then? I'm really thinking of the last batch of cards, the 5xxx series against the 4xx series, and the nvidia cards were better at tesselation.

Tessellation.

And you basically said phoenix1963's claim was wrong, when it was you who was wrong. That's what I was referring to.

The reason there were no positive contributions is that not many game developers made use of it. If you had a Radeon 9500, you would have seen it in Quake, Unreal Tournament, Rainbow Six, Morrowind and some others.

The other point was that AMD is innovating. nVidia isn't. nVidia does the "oooh, let's do that too, but a bit better" a generation after AMD. Same with surround gaming.

JAMF 11-14-2010 08:56 PM

Quote:

Originally Posted by domian (Post 198371)
Flaming fanboy :roll:

Funny that, as I had nothing but nVidia cards till now.

speculum jockey 11-15-2010 01:56 AM

Quote:

Originally Posted by domian (Post 198371)
Flaming fanboy :roll:

It's sort of true. They're usually the ones that look to see what happens, then try to catch up and pass the innovators. Most of their early-2000s innovation came about after acquiring 3DFX, with all their tech and a lot of their staff.

swiss 11-15-2010 05:18 AM

Quote:

Originally Posted by speculum jockey (Post 198405)
It's sort of true. They're usually the ones that look to see what happens, then try to catch up and pass the innovators.

Actually a smart move, economically too.

Flanker35M 11-15-2010 06:24 AM

S!

AMD or nVidia, you cannot go wrong these days. Anything above 60 FPS at your desired screen resolution and details is good IMO. Comparing cards by how they perform in IL-2, for example, is just plain stupid, as we know AMD has some issues with it AND back then IL-2 was optimised for nVidia. I've been using both brands and not a single one of the games I play has performed badly. I prefer AMD because it is quieter, cooler and draws less power for almost equal performance to the green team. For me a few % means squat; I play, I don't live for benchmark scores :D

dduff442 11-15-2010 02:57 PM

AMD and nVidia are both guilty of anti-competitive practices; that's what optimisations for a specific card are all about. Competition in this way, rather than on price, hurts buyers.

PhysX and CUDA are examples of anti-competitive practices as well. Nvidia want gamers to hold on to old CPUs and buy new GPUs to cement their control of the market. CPU power is very cheap compared with GPU power, and physics belongs on the CPU.

dduff

swiss 11-15-2010 03:27 PM

Quote:

Originally Posted by dduff442 (Post 198470)
... Nvidia want gamers to hold on to old CPUs and buy new GPUs

Hold on to old cpus?

Pls explain.

Triggaaar 11-15-2010 03:58 PM

Quote:

Originally Posted by dduff442 (Post 198470)
PhysX and CUDA are examples of anti-competitive practices as well. Nvidia want gamers to hold on to old CPUs and buy new GPUs to cement their control of the market. CPU power is very cheap compared with GPU power, and physics belongs on the CPU.

dduff

Quote:

Originally Posted by swiss (Post 198476)
Hold on to old cpus?

Pls explain.

I believe duff's point is that nVidia want you to become dependent on the GPU for the PhysX elements of a game, which suits the cards they sell more than their competitor's. But it would be more economical for the physics to be handled by the CPU, so users would be better off keeping their CPU up to date, rather than spending more on their nVidia graphics card.

swiss 11-15-2010 04:20 PM

Why should nVidia develop a program which does not use the resources they sell?

There isn't an AMD or Intel PhysX version...

Baron 11-15-2010 04:24 PM

CPUs don't even come close to GPUs' ability to handle PhysX, and probably never will.


That's why nVidia is working so hard to make it a feature in games and nVidia GPUs. The fact that everyone who doesn't buy nVidia is whining about them dealing dirty is, well, childish.

No reason whatsoever why nVidia should just give away features they work hard to develop.

Especially not to a company (read: AMD) who can't even be bothered if it costs them the slightest. (SoW, to name one.)

Blackdog_kt 11-15-2010 08:11 PM

While nVidia is obviously doing what's best for them, us users complaining about it is anything but childish; it's about what's good for us.

For example, if 99% of the games 5 years from now use PhysX and you are forced to buy nVidia cards at grossly inflated prices due to lack of competition, you'll understand why people are complaining now in an effort to steer things the way of the customer while it's still early on ;)

swiss 11-15-2010 08:19 PM

Quote:

Originally Posted by Blackdog_kt (Post 198598)
While nVidia is obviously doing what's best for them, us users complaining about it is anything but childish; it's about what's good for us.

For example, if 99% of the games 5 years from now use PhysX and you are forced to buy nVidia cards at grossly inflated prices due to lack of competition, you'll understand why people are complaining now in an effort to steer things the way of the customer while it's still early on ;)

Isn't it the programmer/studio who decides to use PhysX or not?
So:

1st: blame the programmers
2nd: blame AMD for not having something similar
3rd: blame nvidia only if they OWN the programming studios.

julian265 11-15-2010 08:59 PM

Quote:

Originally Posted by swiss (Post 198603)
Isn't it the programmer/studio who decides to use PhysX or not?
So:

1st: blame the programmers
2nd: blame AMD for not having something similar
3rd: blame nvidia only if they OWN the programming studios.

+1

Also I'd only see a problem if a game NEEDED PhysX, rather than just benefitted from it, which I think is unlikely.

If PhysX became an important factor in the sales of Nvidia cards, the competitors would probably come out with their own solution... I'm not surprised that they haven't yet.

dduff442 11-15-2010 10:59 PM

Well first of all I pointed out that both AMD and nVidia are guilty of anti-competitive practices, so the accusations that I'm part of the fanboisie are misplaced.

Secondly, I'd need to see a lot of evidence before I'd believe GPUs have some inherent advantage over CPUs for physics calculations; physics engines have been incorporated in numerous games for years and hardly any games are CPU-limited on even the most basic machines. IL-2 has its own rigid-body model for crashes, for example, one of many innovations.

People need to move past brand loyalty and see attempts to control the market for what they are: monopoly exploitation that will hurt consumers in the long run. PhysX, CUDA etc are just attempts to balkanise the industry in the exact same way Netscape and Microsoft tried to with the internet. They took open standards like HTML and added proprietary extensions; the idea was that websites would look bad or just be broken on their opponents software. This had nothing to do with helping consumers and everything to do with gaining power over them.

I don't believe that AMD are more innocent than nVidia, it's just that these tricks serve the interests of the dominant player rather than the underdog. Two companies are already insufficient for proper competition. If either gets a lock on the market, everybody loses.

dduff

Igo kyu 11-16-2010 12:17 AM

Quote:

Originally Posted by dduff442 (Post 198632)
Secondly, I'd need to see a lot of evidence before I'd believe GPUs have some inherent advantage over CPUs for physics calculations;

GPUs have advantages over CPUs for processing, full stop. The only advantage CPUs have is that they are optimised for x86 and x64 code, so you need them to run Windows (or Linux, or Apple's OSX).

People used to use multiple CPUs to make fast computers, now they use multiple GPUs.

http://en.wikipedia.org/wiki/GPGPU

CUDA is the nVidia name for it:

http://en.wikipedia.org/wiki/CUDA

The AMD name for it is Stream:

http://en.wikipedia.org/wiki/AMD_Fir...evelopment_Kit
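
To make the GPGPU idea concrete, here is a small sketch of the data-parallel pattern that CUDA and Stream are built around: the same operation applied independently to every element. It is written with ordinary CPU threads, since the vendor APIs differ; a GPU runs the same pattern with thousands of lightweight threads, which is why this kind of work maps well to it and serial, branchy work does not.

Code:

#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = 1 << 20;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), out(n);

    // Each element is independent, so the range can be split freely.
    auto worker = [&](std::size_t begin, std::size_t end) {
        for (std::size_t i = begin; i < end; ++i)
            out[i] = a[i] + b[i];
    };

    const unsigned nThreads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < nThreads; ++t)
        pool.emplace_back(worker, n * t / nThreads, n * (t + 1) / nThreads);
    for (auto& th : pool)
        th.join();

    std::printf("out[0]=%.1f out[n-1]=%.1f\n", out[0], out[n - 1]);
}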

swiss 11-16-2010 08:20 AM

Quote:

Originally Posted by Igo kyu (Post 198652)
GPUs have advantages over CPUs for processing, full stop. The only advantage CPUs have is that they are optimised for x86 and x64 code, so you need them to run Windows (or Linux, or Apple's OSX).

People used to use multiple CPUs to make fast computers, now they use multiple GPUs.

http://en.wikipedia.org/wiki/GPGPU

CUDA is the nVidia name for it:

http://en.wikipedia.org/wiki/CUDA

The AMD name for it is Stream:

http://en.wikipedia.org/wiki/AMD_Fir...evelopment_Kit

http://pressroom.nvidia.com/easyir/c...sp=release_157

dduff442 11-16-2010 09:34 AM

From Wikipedia.

On anti-competitive practices:

Versions 186 and newer of the ForceWare drivers disable PhysX hardware acceleration when a GPU from a different manufacturer, such as AMD, is present in the system.[14] Representatives at Nvidia stated to customers that the decision was made due to development expenses, and for quality assurance and business reasons.[11][15] This decision has caused a backlash from the community that led to the creation of a community patch for Windows 7, circumventing the GPU check in Nvidia's updated drivers. Nvidia also implemented a time bomb in versions 196 and 197 which slowed down hardware-accelerated PhysX and reversed the gravity, leading to unwanted physical effects[16] - which was again remedied by the updated version of the community patch.[17]

On 5 July 2010, Real World Technologies published an analysis[21] of the PhysX architecture. According to this analysis, most of the code used in PhysX applications is based on x87 instructions without any multi-threading optimization. This could cause significant performance drops when running PhysX code on the CPU. The article suggests that a PhysX rewrite using SSE instructions may substantially lessen the performance discrepancy between CPU PhysX and GPU PhysX.

In response to the Real World Technologies analysis, Mike Skolones, product manager of PhysX, said[22] that SSE support has been left behind because most games are developed for consoles first and then ported to the PC. As a result, modern computers run these games faster and better than the consoles even with little or no optimization. Senior PR manager of Nvidia, Bryan Del Rizzo, explained that multi-threading is already available with CPU PhysX 2.x and that it is up to the developer to make use of it. Automatic multi-threading and SSE will be introduced with version 3 of the PhysX SDK.[23]
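
To illustrate what the x87-vs-SSE point above means in practice, here is a hedged sketch of the same loop written scalar (roughly what x87-compiled code does, one float at a time) and with 4-wide SSE intrinsics. The integration step and all names are invented for the example and are not PhysX code.

Code:

#include <cstdio>
#include <xmmintrin.h> // SSE intrinsics

const int N = 1024; // multiple of 4, so the SSE loop needs no tail handling

// Scalar version: one float per iteration.
void integrateScalar(float* pos, const float* vel, float dt) {
    for (int i = 0; i < N; ++i)
        pos[i] += vel[i] * dt;
}

// SSE version: four floats per iteration.
void integrateSSE(float* pos, const float* vel, float dt) {
    __m128 vdt = _mm_set1_ps(dt); // broadcast dt into all four lanes
    for (int i = 0; i < N; i += 4) {
        __m128 p = _mm_loadu_ps(pos + i);
        __m128 v = _mm_loadu_ps(vel + i);
        p = _mm_add_ps(p, _mm_mul_ps(v, vdt));
        _mm_storeu_ps(pos + i, p);
    }
}

int main() {
    float pos[N] = {}, vel[N];
    for (int i = 0; i < N; ++i)
        vel[i] = 0.25f;
    integrateScalar(pos, vel, 0.016f); // both passes applied, so
    integrateSSE(pos, vel, 0.016f);    // pos[0] = 2 * 0.25 * 0.016
    std::printf("pos[0]=%.4f\n", pos[0]);
}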


It's hard to make sense of Mike Skolones' comment that "modern computers run these games faster and better than the consoles" because "most games are developed for consoles first and then ported to the PC".

Some forms of physics modelling are suitable for parallelisation and some are not. I don't recall Havok-based games running into CPU bottlenecks. In fact CPU-limited games are rarer than hen's teeth.

This conversation badly needs to get away from the nVidia vs AMD thing. When Apple were underdogs they complained bitterly about Microsoft's dirty tricks. Now they're on top, they're as bad as Microsoft ever were. Microsoft haven't gotten much better either. Back when Netscape was on top in internet applications, it wrestled with all its might for power over consumers; it only started complaining when it lost out to Microsoft. Such practices are nearly universal among companies that have the power to carry them out.

dduff

dduff442 11-16-2010 09:44 AM

The supercomputers referred to run a restricted set of programmes over and over again. The performance figures quoted apply only to these specially tailored programmes, which are suitable for parallelisation. Many software applications, including numerous physics applications, cannot be parallelised in this way. That many others can is neither here nor there; CPUs don't struggle with modern games.

Reference to supercomputers is in any case of no relevance to a discussion about computer games.

dduff

swiss 11-16-2010 10:00 AM

I still don't get it:
(all from wiki)

Quote:

PhysX is a proprietary realtime physics engine middleware SDK acquired by Ageia (which itself was acquired by Nvidia in February 2008[1]) with the purchase of ETH Zurich spin-off NovodeX in 2004. The term PhysX can also refer to the PPU add-in card designed by Ageia to accelerate PhysX-enabled video games.
Their engine (NV), their company, their rules.

Give them one reason why they should spend a single nickel to optimise it for CPUs they don't sell, or for ATI cards, which are a competitor's.

Quote:

the decision was made due to development expenses, and for quality assurance and business reasons
Then they do what they think is best for their own company.
It's not like NV is an NPO; they paid for it too.
Maybe ATI want to shove some green over?


Now, neither of the two really dominates the market; tell ATI to pull the finger out and give it a go themselves.

also:

Quote:

Nvidia provides both the engine and SDK for free to Windows and Linux users and developers
I'd say this is enough charity. ;)

dduff442 11-16-2010 12:19 PM

Oh God...

You've just ignored most of the points made, as well as the numerous comparisons drawn with other companies that are uncontroversially held to have engaged in uncompetitive practices in the past.

The time bomb is inexcusable: it's just an attempt to make competitors' hardware look like it's malfunctioning, when in fact it's nVidia sabotaging its own equipment.

The disabling driver is equally inexcusable; the people affected own perfectly functioning PhysX-capable nVidia cards, but because the driver detects an AMD card also in the system, it shuts down. This isn't "nVidia's driver, nVidia's rules", it's "gamer's cards, nVidia's rules". People shelled out on secondary nVidia cards only for nVidia to sabotage them after the fact with a driver "update".

A tiny bit of attention paid to nVidia's excuses reveals them to be plain BS in each case.

If you're going to persist with the debate, please address the parallels drawn with the behaviour of Apple, Microsoft and Netscape described above.

dduff

speculum jockey 11-16-2010 01:12 PM

Quote:

Originally Posted by swiss (Post 198687)
I still don't get it:

I'll try and use an analogy so you can understand.

Mastercard give preferential rates to customers who shop at stores that are partners with them. If you use a Visa you will not only NOT get that preferential rate, but it will actually cost you more money than it should.

Shell Oil decides to make cars. If you don't put Shell brand gas in that Shell car, then you will get horrible mileage, due to a design that ensures you only use their brand of gas.

Nvidia includes code that makes games run like crap if you don't use their cards!

K_Freddie 11-16-2010 01:42 PM

Quote:

Originally Posted by speculum jockey (Post 198721)
Nvidia includes code that makes games run like crap if you don't use their cards!

:confused::confused::rolleyes:
MSoft makes DirectX 11 (DX11) and freely distributes it.
NVidia, Radeon, etc. make drivers that connect their cards to the lower interface of DX11.
Game developers mostly use the top interface of DX11 to talk to any card, or connect directly to the card's driver itself.
The same idea applies to PhysX and other types of interfaces.

What you're saying above is nonsense, unless of course you've installed an NVidia driver for a Radeon/ATI/etc. card... well, what more can I say??
:grin:

speculum jockey 11-16-2010 02:22 PM

Quote:

Originally Posted by domian (Post 198729)
You compare apples to oranges. That's complete and utter bullshit.

Sure, in your opinion SolarWorld should give their solar panels away for free to every person, because we all want to protect the environment. :roll:

This is in relation to driver releases 196 and 197, with regard to games using the PhysX programming and non-NV cards.

Baron 11-16-2010 02:25 PM

Quote:

Originally Posted by speculum jockey (Post 198721)

Nvidia includes code that makes games run like crap if you don't use their cards!


You really believe that?

Maybe NVidia should start developing ATI's drivers too?

Maybe NVidia includes code that makes games run as well as possible if you use their cards? (Shocker, I know.)

Blackdog_kt 11-16-2010 02:55 PM

Quote:

Originally Posted by swiss (Post 198603)
Isn't it the programmer/studio who decides to use PhysX or not?
So:

1st: blame the programmers
2nd: blame AMD for not having something similar
3rd: blame nvidia only if they OWN the programming studios.

I don't really disagree with that, but that's just one side of the coin. The point stands that if we are ever left with a single GPU brand, a lot of things we take for granted will become prohibitive in price.

I think that currently nVidia's lead is a perceived one and not a real one, a lead that's mainly in the marketing department. That's why I object to such practices: if they sell enough of a product that technically needs improvement, they will be less inclined to improve it.

It's not like I'm an nVidia hater either; up till my current PC all I've ever had was nVidia cards. However, I have no brand loyalty whatsoever. I pay good money to these people and I expect the product to suit me; if it doesn't, then too bad for them. However, the reason I can do this is because there is a competitor. I wouldn't be able to if there wasn't one.

In simple terms, a spinning logo during game start-up, or all the hype about a technology that's still in its early stages and used in a handful of games (tessellation), doesn't equal true technological benefits for me that will justify their prices.

I'd rather they used some of that money to improve their manufacturing techniques, bring down the cost per unit and lower their wattage and heat signature than buy advertising space for a logo in as many games as possible. Then they would be more competitive, their products would be even better and we would all benefit from it due to the price wars with ATI.

As it is now, ATI has been selling at the prices they are simply because they know it doesn't make sense to buy a single-core card that costs almost as much as their dual-core flagship model, runs hotter, draws similar or more watts, and delivers comparable performance. If they were feeling threatened they would have cut prices earlier.



EDIT:


Quote:

Originally Posted by dduff442 (Post 198632)
Well first of all I pointed out that both AMD and nVidia are guilty of anti-competitive practices
[...........]
People need to move past brand loyalty and see attempts to control the market for what they are: monopoly exploitation that will hurt consumers in the long run. PhysX, CUDA etc are just attempts to balkanise the industry in the exact same way Netscape and Microsoft tried to with the internet. They took open standards like HTML and added proprietary extensions; the idea was that websites would look bad or just be broken on their opponents software. This had nothing to do with helping consumers and everything to do with gaining power over them.

I don't believe that AMD are more innocent than nVidia, it's just that these tricks serve the interests of the dominant player rather than the underdog. Two companies are already insufficient for proper competition. If either gets a lock on the market, everybody loses.

dduff

That's exactly the point really.

A company develops software to further their own hardware sales: good.

A company actually spends money and time on sabotaging their own hardware if a competitor's hardware is also present in the system: downright unacceptable, and worth a big fat "screw you" to them next time I decide to buy :-P

addman 11-16-2010 03:03 PM

Haven't read the whole thread, but I'd like to comment on the whole "SoW will be Nvidia-optimised etc." subject. No game developer in their right mind will make a game that runs better on one brand of VGA cards than another. It doesn't make financial sense; if you want to reach as many customers as possible, you have to make the game work equally well on all the different brands. Maybe 10 years ago it was different, when Nvidia all but owned the VGA market, but these days, when it's so evenly spread, you have to satisfy ALL customers. Many games have an ATI or Nvidia stamp, but they work just as well on both in most cases; it's mostly just a marketing ploy. This is just my own humble opinion though :-)

Igo kyu 11-16-2010 03:05 PM

Quote:

Originally Posted by Baron (Post 198735)
You really believe that?

...

Maybe NVidia includes code that makes games run as well as possible if you use their cards? (Shocker, I know.)

No, they turn their cards off if there's a non-nVidia card present, so that you can't use a Radeon for graphics and a GeForce for CUDA, which, if they didn't turn their card off, you could do. CUDA is currently better than AMD's Stream apparently, so it could make sense to try that, if you could; but no, if you want CUDA, you have to use an nVidia card for graphics. Which means, if you have an expensive AMD card already, you can't get a cheap GeForce as a physics co-processor. You could get a cheap Radeon, but it probably won't work as well as a physics co-processor as a GeForce would, and the programming interface for the Radeon physics set-up is probably different too, meaning more work for developers to use it. Years from now the AMD way, which is more about complying with industry standards, may turn out to be better, but at this point in time, as I understand it, the proprietary CUDA interface is leading.

Triggaaar 11-16-2010 03:16 PM

Quote:

Originally Posted by Baron (Post 198735)
You really believe that?

Yes. Some of us do believe that many of these companies do whatever they can to maximise their profits. Companies, as mentioned by duff, such as Microsoft, Apple, nVidia, AMD, Netscape etc.

In the example of nVidia, they have paid developers to optimise their code to suit nVidia cards, and it's suggested that they also (and this doesn't require much imagination for any of the companies listed) do extra work to prevent the games working well on a competitor's card. This is not surprising. It makes their card look good, and leads to more sales and bigger profits, which is really the only thing most (all) of these companies care about.

Quote:

Maybe NVidia should start developing ATI's drivers too?
Maybe you misunderstand the accusations, which are not made solely at nVidia.

As explained above, this isn't even just about making games work badly for customers of the competition: some people bought nVidia cards for PhysX, and nVidia stopped those cards working if the customer also owned an ATI card.

klem 11-16-2010 04:06 PM

This is all a bit too deep for me.

Is anyone suggesting that SoW is being developed to deliver maximum benefit only if Nvidia's cards are used whilst ATI users have to settle for second best?

brando 11-16-2010 04:30 PM

Perhaps the Lucid Hydra chip, as seen on the new Asus Crosshair IV Extreme mobo, will go some way towards levelling this bumpy field. (Not that I'm recommending that particular board, which is aimed mainly at the overclocking market.) Once they have the software sorted, it should be possible to combine green and red cards, hopefully exploiting the advantages of each type. Using the CUDA and PhysX features was mentioned in the review I read @ http://www.pureoverclock.com/review.php?id=1134&page=1

Triggaaar 11-16-2010 06:03 PM

Quote:

Originally Posted by klem (Post 198751)
This is all a bit too deep for me.

Is anyone suggesting that SoW is being developed to deliver maximum benefit only if Nvidia's cards are used whilst ATI users have to settle for second best?

No. Nvidia own some games, and make large donations to others, and they are particularly affected. SoW is not one of them. It's just a discussion about what graphics card companies do, and what is good and bad for the consumer etc.

I understand that Nvidia have been helpful to the SoW development team, but we don't have any information to suggest the game will run better on Nvidia cards (other than the fact that they may have worked on their drivers). Oleg has said the game will run best on a DX11-capable card.

swiss 11-16-2010 06:46 PM

Quote:

Originally Posted by Triggaaar (Post 198747)
In the example of nVidia, they have paid developers to optimise their code to suit nVidia cards, and it's suggested that they also (and this doesn't require much imagination for any of the companies listed) do extra work to prevent the cards working well on a competitors card. This is not surprising. It makes their card look good, and leads to more sales and bigger profits, which is really the only thing most (all) of these companies care about.


No kidding?
Why else would I found a company, other than to make as much money as possible?
Turn the world into a better place?
That's the domain of NPOs and publicly funded institutes.

I have an ATI onboard card and an nVidia PCI card. I didn't know, and never expected, that I could install both drivers on the same machine and expect it to run properly.
Can I?
Speaking of which, in this combo I couldn't run SLI, but there's a feature called Hybrid CrossFire.
I can't use that. DAMN ATI for not making Hybrid CrossFire compliant with my NV card.


Do you guys actually realize that if the two acted the way you wish, we'd be left with a single company? If everything runs as well on one card as on the other, all that's left is the hardware; why should they invest huge sums to develop it separately if in the end it doesn't make any difference?
A merger in this case makes waaaay more sense...

Conclusion: as long as they are giving each other sh1t, we have a perfectly working market with competition.
Once that stops, then you have to be scared.

carl 11-16-2010 07:59 PM

I have an old GeForce4 MX4000 with 64 MB RAM on board. Do you think I will be able to run this game on lowish settings, or should I upgrade to an AMD(?) card to get good framerates?
Please advise me, as I believe the game is coming out soon and I'm worried my PC may struggle a bit.
As a side note, do you think it will be Windows 98 compatible, as I would hate to have to upgrade to that XP thingy.
cheers

dduff442 11-16-2010 08:02 PM

Quote:

Originally Posted by swiss (Post 198800)
No kidding?
Why else would I found a company, other than to make as much money as possible?
Turn the world into a better place?
That's the domain of NPOs and publicly funded institutes.

I have an ATI onboard card and an nVidia PCI card. I didn't know, and never expected, that I could install both drivers on the same machine and expect it to run properly.
Can I?
Speaking of which, in this combo I couldn't run SLI, but there's a feature called Hybrid CrossFire.
I can't use that. DAMN ATI for not making Hybrid CrossFire compliant with my NV card.


Do you guys actually realize that if the two acted the way you wish, we'd be left with a single company? If everything runs as well on one card as on the other, all that's left is the hardware; why should they invest huge sums to develop it separately if in the end it doesn't make any difference?
A merger in this case makes waaaay more sense...

Conclusion: as long as they are giving each other sh1t, we have a perfectly working market with competition.
Once that stops, then you have to be scared.

Your post exhibits a contempt for fair argument.

What you refer to as "Hybrid CrossFire" doesn't exist; it would require a new API written from scratch at a cost of millions.

People who bought a cheap auxiliary nVidia card for PhysX support, on the other hand, had a working system, at an actual profit to nVidia. It's not that some extra work was required of nVidia; quite the opposite. They foisted a driver "update" onto their own customers that disabled certain functionality of hardware users had bought and paid for.

You simply can't be taken seriously if you're going to consciously engage in disingenuous nonsense like this. It's furthermore notable that you've ignored nearly all the points raised earlier.

dduff

Triggaaar 11-16-2010 08:30 PM

Quote:

Originally Posted by swiss (Post 198800)
No kidding?
Why else would I found a company, other than to make as much money as possible?
Turn the world into a better place?

Well, some who start companies might want to keep within the law (there are laws on monopolies and fair trade), and some might want to treat customers well, particularly if that is likely to repay them in the future.

klem 11-16-2010 09:58 PM

Quote:

Originally Posted by Triggaaar (Post 198791)
No. Nvidia own some games, and make large donations to others, and they are particularly affected. SoW is not one of them. It's just a discussion about what graphics card companies do, and what is good and bad for the consumer etc...................... Oleg has said the game will run best on a DX11 capable card.

Thanks for that Triggaaar

klem 11-16-2010 10:01 PM

Quote:

Originally Posted by carl (Post 198821)
I have an old GeForce4 MX4000 with 64 MB RAM on board. Do you think I will be able to run this game on lowish settings, or should I upgrade to an AMD(?) card to get good framerates?
Please advise me, as I believe the game is coming out soon and I'm worried my PC may struggle a bit.
As a side note, do you think it will be Windows 98 compatible, as I would hate to have to upgrade to that XP thingy.
cheers

carl, I'm surprised it's still running IL-2. Do you have the settings turned right down?

No-one can say about SoW yet, but I would be very surprised if you didn't need a major upgrade (and you may as well go to Windows 7), but wait and see :)

dduff442 11-16-2010 10:06 PM

Quote:

Originally Posted by domian (Post 198825)
@ dduff442

o man

Your mindset is one of the reasons why Ireland is on the rocks!

A low blow, and one that would be offensive if it wasn't so deeply stupid.

Triggaaar 11-16-2010 10:09 PM

Quote:

Originally Posted by carl (Post 198821)
I have an old GeForce4 MX4000 with 64 MB RAM on board. Do you think I will be able to run this game on lowish settings, or should I upgrade to an AMD(?) card to get good framerates?
Please advise me, as I believe the game is coming out soon and I'm worried my PC may struggle a bit.
As a side note, do you think it will be Windows 98 compatible, as I would hate to have to upgrade to that XP thingy.

Quote:

Originally Posted by klem (Post 198857)
carl, I'm surprised it's still running IL-2. Do you have the settings turned right down?

No-one can say about SoW yet, but I would be very surprised if you didn't need a major upgrade (and you may as well go to Windows 7), but wait and see :)

I think Carl is having a little joke :)

swiss 11-17-2010 08:46 AM

Quote:

Originally Posted by Triggaaar (Post 198829)
Well, some who start companies might want to keep within the law (there are laws on monopolies and fair trade), and some might want to treat customers well, particularly if that is likely to repay them in the future.

I can't see a monopoly anywhere here; it's a duopoly.
That is already the lowest level of competition. The last thing I as a customer want is those two companies working together, which might already be the case on pricing; they just need one person from each company to meet for lunch.
Does 'cartel' ring a bell? In which market situation would you expect one?

PS:
Quote:

and some might want to treat customers well
Well, in fact they do. They just force you to make a decision for one or the other, but isn't that what competition is all about?

JG52Uther 11-17-2010 09:38 AM

Did anyone notice the title of this thread:
FAQ-QUESTIONS,release date,system specs, for SOW

swiss 11-17-2010 10:02 AM

Quote:

Originally Posted by JG52Uther (Post 198931)
Did anyone notice the title of this thread:
FAQ-QUESTIONS,release date,system specs, for SOW

Then this thread should have been locked from the beginning; we can't expect any of those questions to be answered before release.

:rolleyes:

Richie 11-17-2010 11:42 PM

Has anybody been following this GTX 460 Hawk? What a bargain. If you have to spend 1200 dollars on an i7 975, why not see if this thing will work for 270 dollars instead of spending another 600 on a video card?

http://www.youtube.com/watch?v=SkrZPra2scg

Triggaaar 11-18-2010 09:40 AM

Quote:

Originally Posted by Richie (Post 199071)
Has anybody been following this GTX 460 Hawk? What a bargain. If you have to spend 1200 dollars on an i7 975, why not see if this thing will work for 270 dollars instead of spending another 600 on a video card?

To be honest, if you're spending that much on an i7 975, you probably want a faster graphics card. Have you got any benchmarks for the Hawk?

Richie 11-18-2010 10:12 AM

Did very well.

http://www.youtube.com/watch?v=Ay0gG...eature=channel

Richie 11-18-2010 10:16 AM

I'll get it and if it's not so hot I'll get a 470 but I bet it will do well. I'll get a wackier one later.

Triggaaar 11-18-2010 12:50 PM

So it's pretty good for a 460, but it's still only a 460. Probably a bit quicker than a 6850 and a bit slower than a 6870, and priced in between the two (at least here in the UK: £146, £164, £183).

Unfortunately we're still waiting to learn what SoW really wants, but I don't think it will run at maximum quality at a high res with high fps on a 460, so if you had a six-core i7, I wouldn't skimp so much on the graphics card.

swiss 11-18-2010 02:17 PM

Quote:

Originally Posted by Richie (Post 199145)
I'll get it and if it's not so hot I'll get a 470 but I bet it will do well. I'll get a wackier one later.

The 470 is probably the only card without any right to exist.
If you really want to go for the middle, wait for the 570.

klem 11-18-2010 06:43 PM

Quote:

Originally Posted by Triggaaar (Post 198860)
I think Carl is having a little joke :)

Doh!

Richie 11-18-2010 06:49 PM

I was listening to a review of a 470: "Don't handle the back of it... IT WILL BURN YOU", lol. I guess they all get warm.

Qpassa 11-18-2010 07:32 PM

AMD have done a 5000 series, cool and powerful.
The 5870 is a nice video card.

julian265 11-18-2010 10:50 PM

Quote:

Originally Posted by JG52Uther (Post 198931)
Did anyone notice the title of this thread:
FAQ-QUESTIONS,release date,system specs, for SOW

I noticed that the thread was an attempt to contain unanswerable questions...

Baron 11-19-2010 12:23 AM

Quote:

Originally Posted by Richie (Post 199244)
I was listening to a review of a 470: "Don't handle the back of it... IT WILL BURN YOU", lol. I guess they all get warm.


EVGA 470 running the RoF demo, everything maxed out at 1680x1050, online for 90 min: fan speed 39%, temp 59 degrees, core usage 57%. Idle temp with 30% fan (silent): 33 degrees.


Don't believe everything you read on the net; "reviewers" getting those insane temps and wattages would probably do better as busboys.

Richie 11-19-2010 01:03 AM

It's a 480, but you're right, things seem OK; still, I'd rather get the other even if I had the money.

http://www.youtube.com/user/Hardware...25/Jh1HQ64HcmE

WTE_Galway 11-19-2010 01:41 AM

Over the years I have found Tom's charts fairly useful:

http://www.tomshardware.com/charts/2...1200,2491.html

louisv 11-19-2010 01:35 PM

Getting a GTX 580 through my EVGA GTX 480 step-up program... can't wait.
Cooler, faster and quieter. The 480 is a good 6°C cooler than in the reviews (25°C in here) and the "noise" problem is greatly exaggerated. The 580 is what the 480 should have been.

F19_lacrits 11-19-2010 01:58 PM

Quote:

Originally Posted by swiss (Post 198502)
Why should nVidia develop a program which does not use the resources they sell?

There isn't an AMD or Intel PhysX version...

AMD has had Havok for some time now. This is another engine for simulated physics. Though AMD has not really done much with it, they are now looking at doing physics in OpenCL. I wouldn't go jumping up and down over it, though; it remains to be seen whether AMD will make a difference this time around or whether this will fall flat too. AMD wants an open-source physics solution that anyone is able to develop on.

Scott@bjorn3d 11-20-2010 10:34 AM

Well my GTX 580 will be here next week. I think that should handle the game well. I know my 480 is handling anything I have thrown at it while also running the Folding@Home GPU3 client.

jf1981 11-20-2010 11:40 AM

Hi,

As of now, do we have a clearer idea of approximately when the game will be available?

Regards

Jean-François

baronWastelan 11-21-2010 09:14 AM

Quote:

Originally Posted by jf1981 (Post 199719)
Hi,

As of now, do we have a clearer idea of approximately when the game will be available?

Regards

Jean-François

6 mos - 1 yr from now, best estimate. Cheers and welcome to the forum! Bonjour!

JAMF 11-21-2010 10:24 AM

Quote:

Originally Posted by F19_lacrits (Post 199415)
AMD has had HAVOK for some time now.. This is another code for simulated physics. Though AMD has not really done much with it, they are now looking at doing physics in Open-CL.. Though I wouldn't go jumping up and down for it, it remains to see if AMD will make a difference this time around or if this will fall flat too. AMD wants an open source physics code for any one to be able to develope on..

Intel bought Havok from the developer. Most here saw it for the first time when they played Half-Life 2.

Triggaaar 11-21-2010 10:26 AM

Quote:

Originally Posted by baronWastelan (Post 199879)
6 mos - 1 yr from now, best estimate.

I think that's a fairly pessimistic estimate; it really shouldn't take a year from when it was demonstrated to the press. I'd estimate about 4-7 months.

The Kraken 11-21-2010 10:58 AM

Quote:

Originally Posted by Triggaaar (Post 199892)
I think that's a fairly pessimistic estimate; it really shouldn't take a year from when it was demonstrated to the press. I'd estimate about 4-7 months.

The Russian publisher says "Spring", which is the most official information we have so far short of educated guesses.

Aquarius 11-24-2010 05:02 AM

Video from Igromir
 
Hello guys, I found this quite interesting video of SoW from the event where Oleg gave the presentation. Btw, check the related videos...

I didn't know where to put it for you, so I hope I haven't done something wrong...

Enjoy these 10 minutes of quite annoying music, but with some interesting and funny moments (total noobs trying to fly :))

http://www.youtube.com/watch?v=AyTMj...eature=related

regards

edit: My opinion is that we should be happy if the proclaimed summer 2011 date is true...

Skoshi Tiger 11-24-2010 08:14 AM

Quote:

Originally Posted by Aquarius (Post 200466)
Hello guys, I found this quite interesting video of SoW from the event where Oleg gave the presentation. Btw, check the related videos...

...
edit: My opinion is that we should be happy if the proclaimed summer 2011 date is true...

There's some interesting footage in that mini-update you've supplied!

It must have been fun going to the expo!

Thank you very much!


One thing that I noticed in the footage of the water is that there are plenty of waves, but I didn't see any swell.

I wonder if waves will be a factor when we (I!!!!!) ditch in the water. If we hit the face of a wave, will our plane nose in rather than skim along the surface?

Cheers!

Aquarius 11-24-2010 09:12 AM

Glad to hear it's useful :)

About the waves... I think if Oleg and team create more sizes of waves (bigger ones), then the waves could be a factor when hitting them.

I was wondering about the liquid on the windscreen; I can't tell you the exact minute, somewhere in the middle of the video... it looks like Predator's blood to me... any other suggestions?

Blackdog_kt 11-24-2010 12:51 PM

I've seen that video before in the Igromir thread, but it's probably the best and longest video from the expo, so thanks for linking it again.

I haven't re-watched it yet, but if I remember correctly that liquid you are talking about is engine oil getting splashed on the windscreen.

peterwoods@supanet.com 11-24-2010 05:05 PM

"Predator's Blood"
 
There is certainly engine oil on the windscreen at 5:10, but there is also the green liquid referred to as "Predator's blood" by Aquarius. Could this be engine coolant? The Mk XII Merlin used a 70/30% water/glycol mix, which improved cooling and also avoided the risk of fire from the pure glycol used previously. The bright, clear green colour is achieved by adding potassium permanganate to the glycol to dissuade fools from drinking it. Ethylene glycol is toxic, and ingestion can result in death.

machoo 11-28-2010 10:23 AM

When is the release date? Quit screwing around with "we don't know". I heard 2008, 2009, October 2010; November 2010 was the CUT-OFF date before the project would not be profitable.

If not now, when?

JG52Uther 11-28-2010 10:39 AM

Quote:

Originally Posted by machoo (Post 201373)
When is the release date? Quit screwing around with "we don't know". I heard 2008, 2009, October 2010; November 2010 was the CUT-OFF date before the project would not be profitable.

If not now, when?

I assume you don't read the posts in this forum? Latest info is spring 2011.

Tree_UK 11-28-2010 12:40 PM

Quote:

Originally Posted by JG52Uther (Post 201374)
I assume you don't read the posts in this forum? Latest info is spring 2011.

Which means that a few people in here are going to be eating their own shorts! :grin::grin: Zapitista, hope you're hungry, buddy :grin::grin:

II/JG54_Emil 11-28-2010 02:26 PM

Quote:

Originally Posted by JG52Uther (Post 201374)
I assume you don't read the posts in this forum? Latest info is spring 2011.

Oh man, we've been hearing this for years now.

LukeFF 11-29-2010 06:17 AM

Quote:

Originally Posted by II/JG54_Emil (Post 201401)
Oh man, we've been hearing this for years now.

Hardly so, and even if it was true, what would it matter? Do you have a financial stake in SoW's marketing success?

Triggaaar 11-29-2010 08:59 AM

Quote:

Originally Posted by LukeFF (Post 201502)
Hardly so, and even if it was true, what would it matter? Do you have a financial stake in SoW's marketing success?

Presumably not, but he's a customer who feels frustrated with the ongoing release-date mystery, which is understandable.

Chivas 11-29-2010 06:00 PM

It's been a long wait, but understandable if you consider all the pitfalls in developing such a complex game with a small development team. We all have a stake in SoW's marketing success; financially no, but if we ever want to see the continuation of the SOW series, BOB will have to be a success. Those not bothering to invest in BOB and waiting for a Pacific addon had better think again.

ElAurens 11-29-2010 09:53 PM

What Chivas said, times 100000000000000000000000000000000.

If the SoW series does not progress beyond BoB, we will be stuck with IL-2. And I'm sorry gents, but IL-2 is long in the tooth and showing its age, badly. Don't get me wrong, I still love the old girl, but all the mod band-aids in the world are not going to get around the fact that the sim is running on borrowed time.

Triggaaar 12-03-2010 03:19 PM

OMG, just looked through the maps and mission building info in the 3rd of December update, and I'm finding the waiting so hard. Please publishers, hurry up with an estimated release date.

Richie 12-04-2010 07:35 AM

It's soon enough for us old guys; it's not years anymore. Spring 2011 is a pretty good guess.

RaVe 12-06-2010 01:27 PM

SoW can't be modded

Is it true that SoW can't, cannot, won't be modded for online play?
I really hope not. It would be a great disappointment if SoW can't be modded.
Is there any word on this from OM?
I'm not looking for a debate; a simple yes or no will suffice.


Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
Copyright © 2007 Fulqrum Publishing. All rights reserved.