Official Fulqrum Publishing forum > Fulqrum Publishing > IL-2 Sturmovik

IL-2 Sturmovik: the famous combat flight simulator.
  #1  
Old 01-18-2010, 10:46 AM
dduff442
Approved Member
 
Join Date: Jan 2010
Location: Ireland
Posts: 114
SoW:BoB 64-bit/Multicore optimised?

Hi All,

Will SoW:BoB take advantage of 64-bit, quad-core architectures, hyper-threading etc.?

The PC gaming industry has gotten seriously out of whack. $800 twin-GPU setups are common, but only a tiny handful of games use more than two CPU cores or more than 2GB of RAM, or are properly multi-threaded. A great opportunity for deep AI and immersive, interactive, open worlds is being totally ignored by the industry. This is all the stranger given that RAM and CPUs are relatively cheap compared with high-end graphics cards.

I'm hoping SoW:BoB will make use of the PC features that will be mainstream in the coming years. FSX is the only game that does at the moment (seriously, this is a fact). SoW:BoB is ideal for hyper-threading, and the 16GB of RAM the Home Premium editions of Win 7/Vista allow would permit environments of incredible detail, along with intensive layered AI processing etc.

There's a lot of expectation out there for BoB, and I'm no different. I held off buying a new PC until its release became imminent. (I'm very reassured by the busy look of your offices, BTW!) A bit of info about the processing capabilities of BoB at this point might whet people's appetites even more....

Regards,
dduff442
(have played online as Kestrel666)
  #2  
Old 01-18-2010, 11:07 AM
Tree_UK
Guest
 
Posts: n/a

I think it may be some time yet before we know what system spec will be required to run the sim in all its glory. To date we haven't seen anything that would tax a modern system.
  #3  
Old 01-18-2010, 08:06 PM
flyingbullseye
Approved Member
 
Join Date: Jul 2008
Posts: 185

Quote:
Originally Posted by Tree_UK View Post
to date we haven't seen anything that would tax any modern system.
Crysis will.

Flyingbullseye
  #4  
Old 01-18-2010, 08:55 PM
Qpassa
Approved Member
 
Join Date: Jan 2010
Location: Valladolid-Spain-EU
Posts: 700

Also FSX with a lot of extras, like add-on airports: Barajas (Madrid), Heathrow (London), etc.
  #5  
Old 01-18-2010, 08:59 PM
airmalik
Approved Member
 
Join Date: Feb 2009
Posts: 150

Quote:
Originally Posted by flyingbullseye View Post
Crysis will.

Flyingbullseye
I'm pretty sure he meant that he hasn't seen anything from the SoW updates that would challenge a modern system. I agree.
  #6  
Old 01-19-2010, 03:18 AM
flyingbullseye
Approved Member
 
Join Date: Jul 2008
Posts: 185

Ahh, got ya. In that case then I agree with Tree.

Flyingbullseye
  #7  
Old 01-19-2010, 11:41 AM
Flyby
Approved Member
 
Join Date: Oct 2007
Posts: 701

I hope SoW:BoB does support 64-bit multicore processors. I don't think we've seen anything yet from Oleg that would tax such a system, but perhaps that should worry us a bit? Dynamic clouds and weather, water, complex AI, plus improvements to FMs and DMs (and the FMB, if it's in there) may tax a multi-core system. I think it's mostly the behind-the-scenes stuff that will put a hurt on even a modern system. I recall that IL-2 taxed my old 2.8GHz P4 and 6800 Ultra in lots of ways: couldn't have too much flak, had to avoid flying over big cities, couldn't have too many paratroops in the air at once, or watch the slide show. I know, mere speculation until O.M. speaks. Fair enough.
Flyby out
__________________
the warrior creed: crap happens to the other guy!
  #8  
Old 01-19-2010, 09:44 PM
mazex
Approved Member
 
Join Date: Oct 2007
Location: Sweden
Posts: 1,342

Quote:
Originally Posted by dduff442 View Post
Will SoW:BoB take advantage of 64-bit, quad-core architecture, hyperthreading etc? [...]
Well, you describe the "strange" scenario well above: how come so few games use multiple cores effectively but rely on beefy GPUs? I guess you are not a programmer? It's really hard to use many threads effectively in an application whose main render loop runs the screen updates at a very high, steady speed (many threads doing the rendering is even worse, though in some cases possible). When you have to synchronize/join data for that render thread with a number of separate other threads (AI, network, loading stuff etc.), you really have to watch your step. The threads running on different cores live their own lives (and should do so), so the synchronization has to be efficient so that threads don't end up waiting on other threads (producing stutters and lag). Getting a bunch of threads to dance together while the main render thread runs at a steady 60 loops/sec is like a tango with four wives, without annoying any one of them.

EDIT: It's naturally the holy grail to be able to write the optimal game engine using all the cores available in a system... Even writing a game that runs two threads well can break programmers' and companies' backs. F4/MicroProse, anyone?
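The hand-off mazex describes, a sim/AI thread feeding a render loop that must never block, can be sketched with a one-slot queue where the producer always overwrites stale state and the renderer reuses the last frame if nothing new is ready. This is a minimal illustrative sketch, not how SoW:BoB actually works; the `Snapshot` class and names are invented:

```python
import queue
import threading

class Snapshot:
    """Immutable world state handed from the sim thread to the renderer."""
    def __init__(self, tick, positions):
        self.tick = tick
        self.positions = positions

def sim_worker(out_q, ticks):
    """Simulation/AI thread: publishes snapshots at its own pace."""
    x = 0.0
    for tick in range(ticks):
        x += 1.0  # stand-in for physics/AI work
        # Drop the stale snapshot if the renderer hasn't taken it yet;
        # the producer never blocks waiting for the consumer.
        try:
            out_q.get_nowait()
        except queue.Empty:
            pass
        out_q.put(Snapshot(tick, {"spitfire": x}))

def render_frame(q, last_snap):
    """Render loop: take the freshest snapshot if one is ready, otherwise
    re-render the previous state; no waiting on the sim thread, no stutter."""
    try:
        return q.get_nowait()
    except queue.Empty:
        return last_snap

q = queue.Queue(maxsize=1)
t = threading.Thread(target=sim_worker, args=(q, 100))
t.start()
t.join()
snap = render_frame(q, None)
```

The point of the one-slot queue is exactly the synchronization cost mazex mentions: neither side ever sleeps on the other, at the price of the renderer sometimes drawing a slightly old frame.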

Last edited by mazex; 01-19-2010 at 09:52 PM.
  #9  
Old 01-19-2010, 09:52 PM
Tree_UK
Guest
 
Posts: n/a
Default

Quote:
Originally Posted by mazex View Post
Getting a bunch of threads to dance together while the main render thread runs at a steady 60 loops/sec is like a tango with four wifes - without annoying any one of them
LOL, got to be quote of the year, nice one Mazex S!
  #10  
Old 01-20-2010, 12:46 AM
dduff442
Approved Member
 
Join Date: Jan 2010
Location: Ireland
Posts: 114

Quote:
Originally Posted by mazex View Post
Well, you describe the "strange" scenario well above - how come that so few games use multiple cores effectively but rely on beefy GPU:s? [...]
This is all just nonsense. The reasons games don't employ computing power effectively are:

A) The development cycle of nearly all PC games is tied to that of console games. Consoles focus almost totally on graphical bells and whistles to the exclusion of all else.

B) Game devs don't profit (or rather, don't feel they will) from a focus on long-term, evolving codebases. Graphics engines etc. are stable technology. Individual game engines, on the other hand, are throwaway crap for the most part, with few going through more than two or three iterations. If Blah I is a classic, Blah II will be derivative, and Blah III is sure to add loads of half-assed junk options, as the devs know the wheezing wreck of a codebase is already beyond rescue. Most codebases are dead already once the original design team is dissolved. This hasn't been the 1C:Maddox way in the past, and I hope it doesn't go down this route now.

C) The development cycle of most games has little to do with engineering. It nearly always starts with the concept art, followed by rounds of meetings by the suits as the company considers distribution, market segment issues, the company's portfolio of other titles etc. The result is never something anyone might love: it's something random disinterested businessmen think is okay, the lowest common denominator creatively speaking.

D) The actual coders are only involved at a low level and only get their say after (C). 'Game engines', where they exist at all, cover graphics and (at best) some scattered elements of the modeled environment. The wheel is reinvented for each new release as far as gameplay is concerned. Why do you think gameplay in FPS games has barely moved forward since Thief in 1998?

Multi-threading across multiple cores needn't even be all that efficient on modern PCs to still make a big difference to gameplay. Not all processing needs to be done on a frame-by-frame basis. For example, top-level AI heuristics could provide direction to lower-level manoeuvre AI, which could in turn guide AI behaviour on each frame. Synchronisation of the first two strands isn't critical, and memory bandwidth issues etc. would not be significant.
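The layered-AI idea above can be sketched even without threads: a slow "strategic" layer that only runs every N frames feeds a cheap per-frame manoeuvre step, and because the slow layer tolerates being a few frames behind, it could just as well run on another core. A minimal sketch; the function, numbers, and names are invented for illustration:

```python
def run_tiered_ai(frames, strategy_period=30):
    """Two AI layers at different rates: an expensive planner runs once
    every strategy_period frames, a cheap steering step runs every frame.
    The layers need no tight per-frame synchronisation."""
    target = 0.0
    position = 0.0
    trace = []
    for frame in range(frames):
        if frame % strategy_period == 0:
            # Infrequent high-level decision (stand-in for heuristics,
            # threat evaluation, route planning...).
            target = position + 10.0
        # Per-frame manoeuvre step: ease toward the current target.
        position += (target - position) * 0.1
        trace.append(position)
    return trace

trace = run_tiered_ai(60)
```

Because the planner's output (the target) is just data read by the fast loop, stale values only make the AI slightly sluggish rather than wrong, which is why the synchronisation between the layers isn't critical.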

Developer commitment to high-quality engineering and a sustainable codebase can be very profitable as the Oracle example cited elsewhere proves. The reason it features so little in the gaming industry is that it no more guarantees profitable games than do good lighting and camerawork in the movies, so it takes a back seat to the often stupid ideas of the business 'creatives'. This attitude is short-sighted, however.

A stable, tightly-knit dev team committed to incrementally improving a sustainable product could blow away the competition in many areas of gaming. Flight sims are evidence of this. Image processing and compression tech are other good examples. Look at the humble jpeg, basically the technology that made the internet possible. JPEG (and MPEG, MP3, MP4 and other derivatives) will be 18 this year. That sort of thing is real technology, not the throwaway stuff pumped out by the games industry.

Regards,
dduff