I think it's very interesting to note how long this game has been in development, considering it's neither CoD nor Duke Nukem - which WILL be released next year by Gearbox, who now owns the IP. They (Maddox Games) must have a relatively low burn rate, since the IL-2 series is by no means a triple-A series of games.
Whoever their publisher is, they must have A LOT of confidence in O-Team. Everybody's saying Ubisoft is the publisher, but I find that hard to believe, since Ubisoft stated a while back that their main focus is the console market (piracy), and that rather than bringing out many different IPs they'll concentrate on a smaller range of triple-A titles, e.g. Assassin's Creed. Oleg and his team are a dying breed IMO; game studios have more money than ever, but sales not only have to match the expenditure, they also have to produce waaaay more profit if there are to be any sequels/second chances. This is where DLC makes its entrance, methinks - forget about expecting free patches with additional content, those days are over. A lot has happened since the first releases of the IL-2 series. Bug-fixing patches will be free of course, but expect to pay for all the other stuff: planes, map packs etc. Finally Oleg can actually make some money from us cry babies ;) I'm more than happy to pay for stuff coming out of Oleg's team because I know it's always top quality. These are just some educated or less educated guesses IMHO :)
Quote:
And the business model that requires a mandatory release date independent of the actual state of the software is, fortunately, not the only possible approach.
"thousands will be playing SoW BoB by October" - Maybe October 2011? ;)
Quote:
Originally Posted by steam: "Oleg, does this mean that the game will come in two versions, 64-bit and 32-bit?"
Currently the answer is no, it doesn't. In future - maybe. :(
A developer can have all the work schedules you want, but they are almost impossible to keep.
SOW announced by Oleg in 2003, to be released in 2005.
.....this did not happen, as the developer decided to help Luthier build Pacific Fighters into a full add-on.
SOW announced by Ubi in 2006, to be released in 2007.
.....this did not happen, as there seemed to be more money to be gained from further add-ons to the Il-2 series, i.e. IL-2 1946 etc.
SOW development started again after the release of Il-2 1946 in 2008.
.....there were a few setbacks in this period, including employee problems on a small team.
.....it looked as if it could be released by Oct 2010, but anyone who has developed anything understands there are still a lot of minefields to clear.
.....now it all depends on how long it will take to clear the beta bugs.
It's certainly been a long time, but it appears the wait will be worth it. Hopefully there will be enough sales to make building further additions to the SOW series worth the time and effort.
Directx11 explained well here.
======================
http://www.youtube.com/watch?v=6Wp4Y-u8-Qw
Now that I've checked out Chias's post I'm glad it's been delayed and that it's a DX11 game.
Quote:
Well, from that video (the bit about the cobblestones), Parallax Occlusion Mapping seems something to look out for! Looks 3D - but is easy on computer resources.
http://en.wikipedia.org/wiki/Parallax_occlusion_mapping
"Parallax occlusion mapping (POM) is an enhancement of the parallax mapping technique. Parallax occlusion mapping is used to procedurally create 3D definition in textured surfaces, using a displacement map (similar to a topography map) instead of through the generation of new geometry. This allows developers of 3D rendering applications to add 3D complexity in textures, which correctly change relative to perspective and with self occlusion in real time (self-shadowing is additionally possible), without sacrificing the processor cycles required to create the same effect with geometry calculations."
http://www.youtube.com/watch?v=gcAsJ...eature=related
More on the side-bar
http://www.youtube.com/watch?v=KKe1p...eature=related
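To make the wiki description a bit more concrete, here's a rough CPU-side sketch of the ray march POM performs. Purely illustrative: real POM runs in a pixel shader, and every name and constant below is invented for the example.
Code:
// A toy procedural height field standing in for the displacement map
// (1.0 = surface top, 0.0 = deepest point).
#include <cmath>
#include <cstdio>

float sampleHeight(float u, float v) {
    return 0.5f + 0.5f * std::sin(u * 20.0f) * std::cos(v * 20.0f);
}

// March the view ray through texture space until it dips below the height
// field; the (u,v) where it hits is used for the colour lookup, which is
// what makes flat cobblestones appear 3D without any extra geometry.
void parallaxOcclusion(float u, float v,
                       float viewX, float viewY, float viewZ,
                       float *outU, float *outV) {
    const int   steps       = 32;     // more steps = fewer stair artifacts
    const float heightScale = 0.05f;  // how deep the illusion goes
    const float layerDepth  = 1.0f / steps;
    // Texture-space offset per layer, larger at oblique viewing angles.
    const float du = -viewX / viewZ * heightScale * layerDepth;
    const float dv = -viewY / viewZ * heightScale * layerDepth;

    float depth = 0.0f;  // how far below the surface top we have marched
    while (depth < 1.0f && depth < 1.0f - sampleHeight(u, v)) {
        u += du;
        v += dv;
        depth += layerDepth;
    }
    *outU = u;
    *outV = v;
}

int main() {
    float u = 0.0f, v = 0.0f;
    parallaxOcclusion(0.5f, 0.5f, 0.3f, 0.2f, 0.9f, &u, &v);
    std::printf("shifted texture lookup: (%.3f, %.3f)\n", u, v);
    return 0;
}
The self-shadowing mentioned in the quote works the same way: a second march from the hit point towards the light decides whether the point is shadowed by the height field.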
I'm thinking it may have a positive effect on the instrument panel with the framing around the different gauges?
Quote:
Does it have to be 'programmed in' by Oleg, or does it 'come with DirectX11'? Excuse my ignorance, but as you'll know from earlier posts I am torn between ATI's 6970 (capable of tessellation etc.), which may not be out for a while, and the 5870.
Klem - I think all recent ATI cards have tessellation built in, whereas it's only the latest family of NVidia cards that have it.
You are correct, Oleg would have to program DX11 tessellation into the game, rather than using the LOD (level of detail) system which switches between models. That's why the Stuka (in particular) jumps in resolution suddenly when you approach it. Tessellation is a way of offloading increased detail to the graphics card rather than the CPU. But as an NVidia engineer once said to me, at the end of the day the card still does a traditional triangle-lighting-pixelation render - it's more of a programming interface and work-distribution issue. I think I remember Oleg saying it wouldn't have tessellation, but I'm not 100% sure. I suggest he'd be ruling out a fair number of NVidia users with older cards if he went for it.
56RAF_phoenix
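For anyone wondering what the difference actually looks like in code, here's a toy sketch contrasting the two approaches. The distances, thresholds and names are invented - nothing here comes from SoW or any real engine.
Code:
#include <algorithm>
#include <cstdio>

// Classic LOD: the engine swaps whole pre-built models at fixed distances,
// which is why detail "pops" when you cross a threshold (the Stuka jump).
int selectLodModel(float distanceMeters) {
    if (distanceMeters < 200.0f) return 0;  // full-detail mesh
    if (distanceMeters < 800.0f) return 1;  // reduced mesh
    return 2;                               // distant silhouette
}

// DX11 tessellation: the hull shader emits a continuous factor instead and
// the GPU subdivides the coarse mesh itself, so detail scales smoothly with
// distance and the CPU never touches the extra triangles.
float tessellationFactor(float distanceMeters) {
    const float maxFactor = 64.0f;  // the DX11 hardware maximum
    float t = 1.0f - std::min(distanceMeters / 1000.0f, 1.0f);
    return std::max(1.0f, maxFactor * t * t);
}

int main() {
    for (float d : {100.0f, 199.0f, 201.0f, 900.0f}) {
        std::printf("%6.0f m -> LOD model %d, tessellation factor %5.1f\n",
                    d, selectLodModel(d), tessellationFactor(d));
    }
    return 0;
}
Note how the LOD model index jumps between 199 m and 201 m while the tessellation factor changes only slightly - that jump is exactly the pop-in phoenix describes.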
Thanks phoenix
I spoke to ATI and they emphasised that the 6000 series is cheaper than the 5000 and is optimised for tessellation. There was a bit of a language problem, but I think he was saying the 5000s also deliver tessellation (I've seen this in other posts), and he definitely said the game has to be written for it. I think Oleg has said it would be DirectX11 (hearsay from other threads), but I believe that won't deliver tessellation on its own.
According to a couple of websites the 5870 is 'faster' than the 6870 except where tessellation is used:
http://www.hardware-infos.com/grafikkarten_charts.php
http://www.tomshardware.com/reviews/...50,2782-7.html
http://www.tomshardware.com/reviews/...arts,2776.html
Quote:
http://www.xbitlabs.com/images/video...d6800_tess.png
.. Though AMD are still behind nVidia's Fermi GF100 when it comes to tessellation. You will find that the 6870 and 6850 are not far behind their predecessors overall, and in CrossFire they scale even better than the "old" 58xx.. though they have less raw "GPU" power, they are better optimised for their task.
That Dx11 video is sooooo pathetic.
What did he actually tell us? .... 'Tessellation' :rolleyes: And what about the rest of DX11... Now that would be informative. ;)
Quote:
And you basically said phoenix1963's claim was wrong, when it was you who was wrong. That's what I was referring to. The reason there were no positive contributions is that not many game developers made use of it. If you had a Radeon 9500, you would have seen it in Quake, Unreal Tournament, Rainbow Six, Morrowind and some others.
The other point was that AMD is innovating; nVidia isn't. nVidia does the "oooh, let's do that too, but a bit better" a generation after AMD. Same with surround gaming.
S!
AMD or nVidia, you can't go wrong these days. Anything above 60 FPS at your desired screen resolution and detail level is good IMO. Comparing how cards perform in IL-2, for example, is just plain stupid, as we know AMD has some issues with it AND back then IL-2 was optimised for nVidia. I've been using both brands and not a single game has performed badly in the games I play. I prefer AMD because it is quieter, cooler and draws less power for almost equal performance to the green team. For me a few % means squat; I play, I don't live for benchmark scores :D
AMD and nVidia are both guilty of anti-competitive practices -- that's what optimisations for a specific card are all about. Competition in this way rather than on price hurts buyers.
PhysX and CUDA are examples of anti-competitive practices as well. Nvidia want gamers to hold on to old CPUs and buy new GPUs to cement their control of the market. CPU power is very cheap compared with GPU power, and physics belongs on the CPU.
dduff
Quote:
Please explain.
Why should nVidia develop a program which does not use the resources they sell?
There isn't an AMD or Intel PhysX version...
CPUs don't even come close to GPUs' ability to handle PhysX, and probably never will.
That's why NVidia is working so hard to make it a feature in games and NVidia GPUs. The fact that everyone who doesn't buy NVidia whines about them dealing dirty is, well, childish. There is no reason whatsoever why NVidia should just give away features they work hard to develop. Especially not to a company (read AMD) who can't even be bothered if it costs them the slightest. (SoW to name one)
While nVidia is obviously doing what's best for them, us users complaining about it is anything but childish; it's about what's good for us.
For example, if 99% of games 5 years from now use PhysX and you are forced to buy nVidia cards at grossly inflated prices due to lack of competition, you'll understand why people are complaining now, in an effort to steer things the customer's way while it's still early on ;)
Quote:
So:
1st: blame the programmers
2nd: blame AMD for not having something similar
3rd: blame nvidia only if they OWN the programming studios.
Quote:
Also I'd only see a problem if a game NEEDED PhysX, rather than just benefitted from it, which I think is unlikely. If PhysX became an important factor in the sales of Nvidia cards, the competitors would probably come out with their own solution... I'm not surprised that they haven't yet.
Well, first of all I pointed out that both AMD and nVidia are guilty of anti-competitive practices, so the accusations that I'm part of the fanboisie are misplaced.
Secondly, I'd need to see a lot of evidence before I'd believe GPUs have some inherent advantage over CPUs for physics calculations; physics engines have been incorporated in numerous games for years, and hardly any games are CPU-limited on even the most basic machines. Il-2 has its own rigid-body model for crashes, for example, one of many innovations.
People need to move past brand loyalty and see attempts to control the market for what they are: monopoly exploitation that will hurt consumers in the long run. PhysX, CUDA etc. are just attempts to balkanise the industry in the exact same way Netscape and Microsoft tried to with the internet. They took open standards like HTML and added proprietary extensions; the idea was that websites would look bad or just be broken on their opponents' software. This had nothing to do with helping consumers and everything to do with gaining power over them.
I don't believe that AMD are more innocent than nVidia; it's just that these tricks serve the interests of the dominant player rather than the underdog. Two companies are already insufficient for proper competition. If either gets a lock on the market, everybody loses.
dduff
Quote:
People used to use multiple CPUs to make fast computers; now they use multiple GPUs.
http://en.wikipedia.org/wiki/GPGPU
CUDA is the nVidia name for it: http://en.wikipedia.org/wiki/CUDA
The AMD name for it is Stream: http://en.wikipedia.org/wiki/AMD_Fir...evelopment_Kit
From Wikipedia.
On anti-competitive practices:
"Versions 186 and newer of the ForceWare drivers disable PhysX hardware acceleration when a GPU from a different manufacturer, such as AMD, is present in the system.[14] Representatives at Nvidia stated to customers that the decision was made due to development expenses, and for quality assurance and business reasons.[11][15] This decision has caused a backlash from the community that led to the creation of a community patch for Windows 7, circumventing the GPU check in Nvidia's updated drivers. Nvidia also implemented a time bomb in versions 196 and 197 which slowed down hardware-accelerated PhysX and reversed the gravity, leading to unwanted physical effects[16] - which was again remedied by the updated version of the community patch.[17]
On 5 July 2010, Real World Technologies published an analysis[21] of the PhysX architecture. According to this analysis, most of the code used in PhysX applications is based on x87 instructions without any multi-threading optimization. This could cause significant performance drops when running PhysX code on the CPU. The article suggests that a PhysX rewrite using SSE instructions may substantially lessen the performance discrepancy between CPU PhysX and GPU PhysX. In response to the Real World Technologies analysis, Mike Skolones, product manager of PhysX, said[22] that SSE support has been left behind because most games are developed for consoles first and then ported to the PC. As a result, modern computers run these games faster and better than the consoles even with little or no optimization. Senior PR manager of Nvidia, Bryan Del Rizzo, explained that multi-threading is already available with CPU PhysX 2.x and that it is up to the developer to make use of it. Automatic multi-threading and SSE will be introduced with version 3 of the PhysX SDK.[23]"
It's hard to make sense of Mike Skolones' comment that "modern computers run these games faster and better than the consoles" because "most games are developed for consoles first and then ported to the PC". Some forms of physics modelling are suitable for parallelisation and some are not. I don't recall Havok-based games running into CPU bottlenecks. In fact, CPU-limited games are rarer than hen's teeth.
This conversation badly needs to get away from the nVidia vs AMD thing. When Apple were underdogs they complained bitterly about Microsoft's dirty tricks. Now they're on top, they're as bad as Microsoft ever were. Microsoft haven't gotten much better either. Back when Netscape was on top in internet applications, it wrestled with all its might for power over consumers; it only started complaining when it lost out to Microsoft. Such practices are nearly universal among companies that have the power to carry them out.
dduff
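To illustrate the x87-vs-SSE point in that analysis: the same vector arithmetic can be written one float at a time (roughly what the legacy scalar path pays) or four floats per instruction with SSE intrinsics. A minimal sketch with made-up array names - this is not actual PhysX code.
Code:
#include <xmmintrin.h>  // SSE intrinsics
#include <cstdio>

// One addition per loop iteration.
void addScalar(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

// Four additions per loop iteration (n assumed to be a multiple of 4).
void addSse(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }
}

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[8];
    addSse(a, b, out, 8);
    std::printf("%g %g ... %g\n", out[0], out[1], out[7]);
    return 0;
}
Which is why the article argues a CPU PhysX rewritten with SSE (plus multi-threading) would close much of the gap to the GPU path.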
The supercomputers referred to run a restricted set of programmes over and over again. The performance figures referred to apply only to these specially tailored programmes which are suitable for parallelisation. Many software applications, including numerous physics applications, cannot be parallelised in this way. That many others can is neither here nor there -- CPUs don't struggle with modern games.
Reference to supercomputers is in any case of no relevance to a discussion about computer games.
dduff
I still don't get it (all from wiki):
Give them one reason why they should spend a single nickel optimising it for CPUs they don't sell, or for ATI cards, which is a competitor.
It's not like NV is an NPO - they paid for it too. Maybe ATI wants to shove some green over? Neither of the two is really dominating the market right now; tell ATI to pull the finger out and give it a go themselves.
Oh God...
You've just ignored most of the points made, as well as the numerous comparisons drawn with other companies that are uncontroversially held to have engaged in anti-competitive practices in the past.
The time bomb is inexcusable -- it's just an attempt to make competitors' hardware look like it's malfunctioning, when in fact it's nVidia sabotaging its own equipment. The disabling driver is equally inexcusable; the people affected own perfectly functioning PhysX-capable nVidia cards, but because the driver detects an AMD card also in the system, it shuts down. This isn't "nVidia's driver, nVidia's rules", it's "gamer's cards, nVidia's rules". People shelled out on secondary nVidia cards only for nVidia to sabotage them after the fact with a driver "update".
A tiny bit of attention paid to nVidia's excuses reveals them to be plain BS in each case. If you're going to persist with the debate, please address the parallels drawn with the behaviour of Apple, Microsoft and Netscape described above.
dduff
Quote:
Mastercard gives preferential rates to customers who shop at stores that are partners with them. If you use a Visa you will not only NOT get that preferential rate, it will actually cost you more money than it should.
Shell Oil decides to make cars. If you don't put Shell-brand petrol in that Shell car, you will get horrible mileage, by design, to ensure you only use their brand of fuel.
Nvidia includes code that makes games run like crap if you don't use their cards!
Quote:
MSoft makes DirectX11 (DX11) and freely distributes it. NVidia, ATI/Radeon, etc. make drivers that connect their cards to the lower interface of DX11. Game developers mostly use the top interface of DX11, which talks to any card, or connect directly to the card's driver itself. The same idea applies to PhysX and other types of interface.
What you're saying above is nonsense, unless of course you've installed an NVidia driver for a Radeon/ATI/etc. card... well, what more can I say?? :grin:
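That layering is visible in the API itself. Here's a minimal sketch of how a game asks the vendor-neutral DX11 runtime for a device, with Windows routing the call to whichever driver (nVidia, AMD, ...) owns the adapter. Error handling trimmed; illustrative only, not SoW's actual code.
Code:
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device        *device  = nullptr;
    ID3D11DeviceContext *context = nullptr;
    D3D_FEATURE_LEVEL    level;

    // A null adapter means "whatever card is installed" - the game never
    // has to know the brand; the driver behind the adapter handles it.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0,              // accept any feature level (11.0, 10.1, ...)
        D3D11_SDK_VERSION, &device, &level, &context);

    if (SUCCEEDED(hr)) {
        context->Release();
        device->Release();
    }
    return 0;
}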
Quote:
You really believe that? Maybe NVidia should start developing ATI's drivers too? Maybe NVidia includes code that makes games run as well as possible if you use their cards? (shocker, I know)
Quote:
I think that currently nVidia's lead is a perceived one and not a real one, a lead that's mainly in the marketing department. That's why I object to such practices: if they sell enough of a product that needs improvement technically, they will be less inclined to improve it.
It's not like I'm an nVidia hater either; up till my current PC all I've ever had was nVidia cards. However, I have no brand loyalty whatsoever. I pay good money to these people and I expect the product to suit me; if it doesn't, then too bad for them. However, the reason I can do this is that there is a competitor. I wouldn't be able to if there wasn't one.
In simple terms, a spinning logo during game start-up, or all the hype about a technology that's still in its early stages and used in a handful of games (tessellation), doesn't equal true technological benefits for me that will justify their prices. I'd rather they used some of that money to improve their manufacturing techniques, bring down the cost per unit and lower their wattage and heat signature than buy advertising space for a logo in as many games as possible. Then they would be more competitive, their products would be even better, and we would all benefit from it due to the price wars with Ati.
As it is now, Ati has been selling at their current prices simply because they know it doesn't make sense to buy a single-core card that costs almost as much as, is hotter than, draws similar or more watts than, and delivers comparable performance to their dual-core flagship model. If they were feeling threatened they would have cut prices earlier.
EDIT:
A company develops software to further their own hardware sales: good.
A company actually spends money and time on sabotaging their own hardware if a competitor's hardware is also present in the system: downright unacceptable, and worth a big fat "screw you" to them next time I decide to buy :-P
Haven't read the whole thread, but I'd like to comment on the whole "SoW will be Nvidia optimized etc." subject. No game developer in their right mind will make a game that runs better on one brand of VGA card than another. It doesn't make financial sense; if you want to reach as many customers as possible, you have to make the game work equally well on all brands. Maybe 10 years ago it was different, when Nvidia all but owned the VGA market, but these days when it's so evenly spread you have to satisfy ALL customers. Many games have an ATi or Nvidia stamp, but they work just as well on both in most cases; it's mostly just a marketing ploy. This is just my own humble opinion though :-)
Quote:
In the example of nVidia, they have paid developers to optimise their code to suit nVidia cards, and it's suggested that they also (and this doesn't require much imagination for any of the companies listed) do extra work to stop the game working well on a competitor's card. This is not surprising. It makes their card look good, and leads to more sales and bigger profits, which is really the only thing most (all) of these companies care about.
As explained above, this is even just about making games work badly for customers of the competition - some people bought nVidia cards for PhysX, and nVidia stopped those cards working if the customer also owned an ATI card.
This is all a bit too deep for me.
Is anyone suggesting that SoW is being developed to deliver maximum benefit only if Nvidia's cards are used, whilst ATI users have to settle for second best?
Perhaps the Lucid Hydra chip, as seen on the new Asus Crosshair IV Extreme mobo, will go some way towards levelling this bumpy field. (Not that I'm recommending that particular board, which is aimed mainly at the over-clocking market.) Once they have the software sorted, it should be possible to combine green and red cards, hopefully exploiting the advantages of each type. Using the CUDA and PhysX features was mentioned in the review I read @ http://www.pureoverclock.com/review.php?id=1134&page=1
Quote:
I understand that Nvidia have been helpful to the SoW development team, but we don't have any information to suggest the game will run better on nVidia cards (other than the fact that they may have worked on their drivers). Oleg has said the game will run best on a DX11-capable card.
Quote:
No kidding? Why else would I found a company, other than to make as much money as possible? Turning the world into a better place? That's the domain of NPOs and publicly funded institutes.
I have an ATI onboard card and an nVidia PCI card - I didn't know, and never expected, that I could install both drivers on the same machine and expect it to run properly. Can I? Speaking of which - in this combo I can't run SLI, but there's a feature called Hybrid CrossFire. I can't use that. DAMN ATI for not making Hybrid CrossFire compliant with my NV card.
Do you guys actually realise that if the two acted the way you wish, we'd be left with a single company? If everything ran as well on one card as on the other, all that's left is the hardware - why should they invest huge sums to develop it separately if in the end it doesn't make any difference? A merger in this case makes waaaay more sense....
Conclusion: as long as they are giving each other sh1t, we have a perfectly working market with competition. Once that stops, then you have to be scared.
I have an old GeForce4 MX4000 with 64 MB of RAM on board. Do you think I will be able to run this game on lowish settings, or should I upgrade to an AMD card to get good framerates?
Please advise me, as I believe the game is coming out soon and I'm worried my PC may struggle a bit. As a side note, do you think it will be Windows 98 compatible, as I would hate to have to upgrade to that XP thingy.
cheers
Quote:
What you refer to as "Hybrid CrossFire" doesn't exist; it would require a new API from scratch at a cost of millions. People who bought a cheap auxiliary nVidia card for PhysX support, on the other hand, had a working system, at an actual profit to nVidia. It's not that some extra work was required from nVidia -- quite the opposite. They foisted a driver "update" onto their own customers that disabled functionality of hardware users had bought and paid for.
You simply can't be taken seriously if you're going to consciously engage in disingenuous nonsense like this. It's furthermore notable that you've ignored nearly all the points raised earlier.
dduff
Quote:
No-one can say about SoW yet, but I would be very surprised if you didn't need a major upgrade (and you may as well go to Windows 7) - but wait and see :)
Quote:
That is already the lowest level of competition - the last thing I as a customer want is those two companies working together, which might already be the case on pricing: they just need one person from each company to meet for lunch. Cartel rings a bell? In which market situation would you expect one?
Did anyone notice the title of this thread:
FAQ-QUESTIONS,release date,system specs, for SOW
Quote:
:rolleyes:
Has anybody been following this GTX 460 Hawk? What a bargain. If you have to spend 1200 dollars on an i7 975, why not see if this thing will work for 270 dollars, instead of spending another 600 on a video card.
http://www.youtube.com/watch?v=SkrZPra2scg |
I'll get it and if it's not so hot I'll get a 470 but I bet it will do well. I'll get a wackier one later.
So it's pretty good for a 460, but it's still only a 460. Probably a bit quicker than a 6850 and a bit slower than a 6870, and priced in between the two (at least here in the UK - £146, £164, £183).
Unfortunately we're still waiting to learn what SoW really wants, but I don't think it will run at maximum quality at a high res with high fps on a 460, so if you had a 6-core i7, I wouldn't skimp so much on the GC.
Quote:
If you really want to go for the middle, wait for the 570.
I was hearing a review of a 470: "Don't handle the back of it... IT WILL BURN YOU" ..lol. I guess they all get warm.
AMD have done a 5000 series, cool and powerful.
The 5870 is a nice video card.
Quote:
EVGA 470 running the RoF demo, everything maxed out at 1680x1050, online for 90 min: fan speed 39%, temp 59 degrees, core usage 57%. Idle temp with 30% fan (silent): 33 degrees.
Don't believe everything you read on the net; "reviewers" getting those insane temps and wattages would probably do better as busboys.
It's a 480, but you're right, things seem OK; still, I'd rather get the other even if I had the money.
http://www.youtube.com/user/Hardware...25/Jh1HQ64HcmE
Over the years I have found Tom's charts fairly useful:
http://www.tomshardware.com/charts/2...1200,2491.html
Getting a GTX580 through my EVGA GTX480 step-up program... can't wait.
Cooler, faster and quieter. The 480 runs a good 6C cooler than in the reviews (25C in here), and the "noise" problem is greatly exaggerated. The 580 is what the 480 should have been.
Well my GTX 580 will be here next week. I think that should handle the game well. I know my 480 is handling anything I have thrown at it while also running the Folding@Home GPU3 client.
Hi,
As of now, do we have a clearer idea of approximately when the game will be available?
Regards,
Jean-François
Video from Igromir
Hello guys, I found this quite interesting video of SoW from the event where Oleg gave the presentation. Btw, check the related videos...
I didn't know where to put it, so I hope I haven't done something wrong... enjoy these 10 minutes of quite annoying music, but with some interesting and funny moments (total noobs trying to fly :))
http://www.youtube.com/watch?v=AyTMj...eature=related
regards
edit: My opinion is that we should be happy if the proclamation of summer 2011 is true...
Quote:
It must have been fun going to the expo! Thank you very much!
One thing I noticed in the footage of the water is that there are plenty of waves, but I didn't see any swell. I wonder if waves will be a factor when we (I!!!!) ditch in the water. If we hit the face of a wave, will our plane nose in rather than skim along the surface?
Cheers!
Glad to hear it's useful :)
About the waves... I think if Oleg and team create more sizes of waves (bigger ones), then the waves could be a factor when hitting them.
I was thinking about the liquid on the windscreen; I can't tell you the exact minute, somewhere in the middle of the video... it looks like Predator's blood to me... any other suggestions?
I've seen that video before in the Igromir thread, but it's probably the best and longest video from the expo, so thanks for linking it again.
I haven't re-watched it yet, but if I remember correctly the liquid you are talking about is engine oil getting splashed on the windscreen.
"Predator's Blood"
There is certainly engine oil on the windscreen at 5:10, but there is also the green liquid referred to as "Predator's Blood" by Aquarious. Could this be engine coolant? The Mk XII Merlin used a 70/30% water/glycol mix which improved cooling and also avoided the risk of fire from pure glycol used previously. The bright clear green colour is achieved by adding potasium permanganate to the the glycol to persuade fools from drinking it. Ethylene glycol is toxic, and ingestion can result in death.
|
When is the release date? Quit screwing around with "we don't know". I heard 2008, 2009, October 2010; November 2010 was the CUT-OFF date before the project would not be profitable.
If not now, when?
It's been a long wait, but understandable if you consider all the pitfalls in developing such a complex game with a small development team. We all have a stake in SoW's marketing success - not financially, but if we ever want to see the continuation of the SoW series, BoB will have to be a success. Those not bothering to invest in BoB and waiting for a Pacific add-on had better think again.
What Chivas said, times 100000000000000000000000000000000.
If the SoW series does not progress beyond BoB, we will be stuck with IL2. And I'm sorry gents, but IL2 is long in the tooth and is showing its age, badly. Don't get me wrong, I still love the old girl, but all the mod band-aids in the world are not going to get around the fact that the sim is running on borrowed time.
OMG, just looked through the maps and mission building info in the 3rd of December update, and I'm finding the waiting so hard. Please publishers, hurry up with an estimated release date.
It's soon enough for us old guys; it's not years anymore. Spring 2011 is a pretty good guess.
SoW can't be modded
Is it true that SoW can't, cannot, won't be modded for online play?
I really hope it can't. It would be a great disappointment if SoW could be modded. Is there any word on this from OM? I'm not looking for a debate; a simple Yes or No will suffice.