So, Nehalem is OK?
I've been reading a few of the reviews of the Nehalem processors today, and it seems that for only a billion dollars a good system can be had. The processors aren't terribly expensive, but the motherboards and the RAM seem to cost enough. Some sites are now saying that games will be GPU-bound (here we go again with that). But I'm wondering what you guys think about a Nehalem system to run SoW_BoB? Perhaps by the time it comes out the prices on these new components will drop down a bit. I'll start selling blood now so I can save up enough cash! :)
Flyby out |
In the review I read they were talking about a price of $2500AUD for the i7 CPU.
That's fairly expensive in my book. (My E8500, which was fairly new when I got it, cost $390AUD.) Its performance dominates in multi-threaded applications, though on a single core the review said it was only marginally better than an E8600 at stock speeds (Cinebench benchmark: 4250 points vs 4150). http://www.atomicmpc.com.au/Review/1...7-extreme.aspx |
hey Tiger, (Skoshi means "little", correct?)
I'm not able to pay such a high price for the top-line Nehalem. Hell, my budget keeps slipping back anyway! :D But one day when I can afford a system, I think I'll want an Intel quad that can at least be overclocked to 3.6GHz. That way I can run my favorite combat sims that only run on one core, while being able to run SoW_BoB somewhat decently (assuming it will make use of four cores). I'm actually hoping Nehalem will force down the price of Penryn quads. Yet the high cost of Nehalem components may keep prices for Penryns high as people opt for more affordable systems. We'll see. Flyby out |
The core i7 looks to be a solid advancement in CPU performance and technology. I remember paying close to $400USD for a P4 3.4 a few years ago and did it gladly. Cost is relative, but with the fast processors out right now at comparatively low prices many will still wait until mid-late 2009 to upgrade. I know I will. Not because of the cost but because everything I play right now runs butter-smooth on my current rig.
As far as BoB:SoW is concerned, consider that it's supposedly still close to 12 months from prospective release and consider all the tech advancements that will take place in that time. Also consider that it will likely still be too much for most machines at release to run on "high" settings which is nothing new. I plan on building a new rig in April/May next year simply because that's my cycle, but even so, chances are it'll still be a bit sluggish for titles slated for release Q4 '09. |
Good feedback T-bolt,
But it's hard to imagine that Oleg's sim, which has been in development for so long now, might stress systems built in the time frame for your next upgrade. I understand that in times gone by people were burned at the stake for practicing the kind of witchcraft he must be using to write code for Sow! Flyby out |
Maximum PC review
Here is the link to Maximum PC's review of what they know about the Nehalem i7 processor. It seems like Intel is dusting off some old ideas with Hyper-Threading, doing away with the FSB, and moving the memory controller on-die, looking more like an AMD chip. I like what I have read so far, but this chip on an Intel mobo will not support SLI with Nvidia GPUs. Hopefully by the time SoW hits the streets, an Nvidia chipset to support SLI will be available on other mobos.
Enjoy: http://www.maximumpc.com/article/fea...cpu?page=0%2C1 |
Quote:
Over time the prices will come down (world economic crisis permitting), and the PCs that will be mainstream by the time BoB comes out will leave mine for dust! |
Nehalem links
Quote:
|
The i7 is 25% faster than the Core 2 Duo clock for clock... so yeah, it's more than OK for this sim... and possibly SoW.
As for SoW pushing the i7... it will probably bring it to its knees if all the bells and whistles are turned on... remember, they're building a sim meant to last 5-10 years.... Read here: http://www.tomshardware.com/reviews/...alem,2057.html http://www.tomshardware.com/reviews/...ming,2061.html http://www.tomshardware.com/reviews/...e-i7,2063.html |
The review below sort of says that we'll only see the benefit of the i7 with high-performance SLI or Crossfire graphics setups!
http://www.atomicmpc.com.au/Feature/...rformance.aspx Using single-card graphics on overclocked systems, you get comparable results with a QX9650 CPU. (It isn't an exhaustive test.) Is there any reason to go SLI or Crossfire if your monitor can't display really high resolutions? Playing IL2 1946 on my 22" LCD at 1650x1080, I get about 61fps most of the time, dropping down to about 22 in the Black Death track when one of the airfields gets attacked. I suppose it all depends how the applications are coded and optimised. I've been running dual-core and multiprocessor PCs for a while and I really can't think of many programs that actually use the technology to its full advantage. (It's depressing having a dual-processor rig and only playing games that use one! :( ) I'm sort of hoping titles like BoB will show us what these puppies can really do! I've been waiting a while! |
hey Skosh,
Quote:
I also agree with you about monitors and dual GPUs. Unless it's a humongous monitor, like a 30-inch being played at 2560x, and you're trying to get a game like Crysis to run smooth at that rez, I think a good single GPU with 1GB of RAM, like the GTX 280, is more than enough card for lesser monitors; certainly enough for a single 22-inch monitor anyway. Be mindful that not all 1GB GPUs are created equal either. One day I'd like to get a nice speedy quad Penryn. Maybe the prices will continue to drop, or I can trade in blood? I'm thinking a single GPU on a P45 chipset, and a quad that I can easily overclock to, oh, 3.6GHz (on air cooling). OK. I need to shut up now. Opinions expressed here are not my own as I am a mindless minion of the Evil Empire. Flyby out |
Quote:
In short, you guys are both right about the resolution scaling. Unless you're running 1600x and above, I don't really see the point of going SLI or a super high-end video card, but here is the caveat. When it comes to scaling we are somewhat misled by looking at the FPS numbers. Guys like ATI and Nvidia want you to see the high numbers, but that's only part of the whole CPU/GPU equation. For most games it's not about the high number. It really comes down to the MINIMUM FPS you experience in a game. That's the real kicker, isn't it? We couldn't care less when things are running above 60fps, but we all see it when it's less. This is really driven by all the systems and subsystems of a computer. Nothing new here.

SLI does scale quite nicely when you're dealing with games that are heavy on the GPU side, which quite frankly is the majority of games. Sims are typically CPU-bound. But it would be interesting to see how well a sim would fly if it took advantage of the number-crunching capabilities of a GPU.

Moving forward: I went from a single 8800GT to SLI running 1920x1200 and it literally doubled my performance. I could then turn on AA and AF and even take advantage of the higher terrain setting. I've turned off my other card to see what the difference was, and the sim crawled. Also, having gone SLI I've had zero problems running some of the latest and greatest games, with the exception of one. Crysis was the only one where SLI didn't improve anything. That could be somewhat driver-related, but it really didn't do much. Games like COD4 doubled in FPS, so I think it's a matter of games that take advantage of SLI or CF. My 2 pennies |
good points, Cap'n. Some games do scale well with either SLI or Crossfire. But some don't, and that's my complaint about the multi-GPU technology. I concede the point that minimum fps is the more important issue. Combat flight sims, I also agree, are (now) mainly CPU-bound, what with complex AI, FM, etc. The performance pendulum seems to have swung away from modern GPUs. It would be great if some of the calculations could be handed off to the GPU, if that would make a difference in the performance of our beloved flight sims. Only F4 seems to be able to make use of multiple cores. Black Shark has been released in Russia, and the word seems to be that a fast CPU makes a lot of difference, and that's an old DX8-modified-to-DX9 graphics engine.
So I guess I come full circle in this discussion. I thus relent. Modern GPUs are not the issue for combat flight sims on the (22-inch) average monitor. It's the CPU, and the code being written, or not written, by combat sims to take advantage of multiple cores. Glad I still have F4! But I'm looking forward to SoW_BoB. Guess I'll have to contract out for a liquid nitrogen storage tank in the basement! Flyby out :D |
To utilize multiple CPUs, the task you are programming must meet the requirements for parallelization. Each part of the task must be independent of the other parts and of their results. It sounds simple, until you have to implement it in something like a simulator, where most calculations must not only use each other's results, but also must be completed in a particular order. It is not a problem of laziness; it is a problem of science.
P.S.: Falcon 4 uses multiple CPUs not for the FM, but for the dynamic campaign (unit movement etc.). While you are happily flying in your player bubble, all the AI-controlled units outside it play on without you. In other words, the player and those units are independent, until they come into contact. |
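The distinction above — independent work parallelizes, order-dependent work doesn't — can be sketched in a few lines. This is a hypothetical illustration, not anything from Oleg's or Falcon 4's actual code; the function names and numbers are made up:

```python
from concurrent.futures import ThreadPoolExecutor

def update_campaign_unit(unit):
    # Each off-map campaign unit reads and writes only its own state,
    # so units are independent and can be updated on any core, in any
    # order (the Falcon 4 "bubble" idea described above).
    return {"id": unit["id"], "pos": unit["pos"] + unit["speed"]}

def integrate_flight_model(state, steps):
    # Each physics step needs the result of the previous step, so this
    # loop is inherently sequential: extra cores cannot shorten it.
    for _ in range(steps):
        state = state * 0.99 + 1.0
    return state

units = [{"id": i, "pos": float(i), "speed": 1.0} for i in range(4)]
with ThreadPoolExecutor() as pool:
    moved = list(pool.map(update_campaign_unit, units))
print([u["pos"] for u in moved])        # independent work: parallel-safe
print(integrate_flight_model(0.0, 3))   # order-dependent work: serial
```

The campaign updates can be handed to a thread pool without changing the result; the flight-model loop cannot, which is exactly why a sim's core loop resists parallelization.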
I would prefer the next AMD Phenom II 940 with 4x 3GHz.
This will be fast enough for everything, and I guess it will be much cheaper than the new Intel system. The new 45nm Opterons (server CPUs) are faster than the Intel Xeons at the same clock in the newest benchmarks (VMmark, e.g.). I am sure that nobody needs more power. The Phenom II will be out in January next year. |
Quote:
Flyby out |
Flight Sim fans bitching about upcoming games requiring hardware they personally cannot afford is actually a bit odd.
It is a bit like a car enthusiast wanting to ban the next model of Ferrari and trying to force everyone to drive something similar to the second car they just bought on ebay out of some idea of "fairness". |
Quote:
|
Although Oleg said that SoW will utilize multicore CPUs, it's not clear if he meant dual-core or quad. It's very hard to develop a sim that will use a quad-core at full power. I believe that an E8400 will still be enough for very decent simming (+ an appropriate GeForce, of course). A 4x 3GHz CPU seems to me like overkill.
|
well if not a quad core...
Quote:
|
More gigahertz is going to be a necessity? How much "more"? Even the best CPU today can't go much over 4GHz, and a game that won't run decently on most mid-range machines with vital features on would be a marketing disaster.
The future lies in multicore CPUs, that's for sure, but SoW development started sometime in 2005, and it is hard to believe that Oleg targeted the "recommended specs" at a Core i7 Extreme OC'd to >4GHz.. |
Quote:
|
I get your point; in SoW there will probably be a lot of "hidden" features, or at least "hooks" for them for further development, that will be activated as needed. So there is no single PC config for the whole SoW lifetime, just as with the IL2 series.
The question now is: what rig as a starter? Buying a quad-core prematurely might be a costly mistake while not being future-proof in any sense; maybe Oleg will tell us more in May. |
Quote:
Flyby out |
Quote:
Now that he's had that success he will build an engine that can be extended, or a better way to put it, scale over time. Developers make products, nothing more. He will build it, and if there is a level of success you will see addons and support. I think you guys are a bit too worried about the hardware specs. If the game lasts as long as IL2 did, then all this discussion is a waste of time. PCs are going to be different 8 years from now. |
Quote:
Exactly... but more like 2 years. |
Hardware! Hardware! Hardware! (I don't think this is Kansas, Toto) :D
|
Missing bottleneck....
In my feeble mind the missing link is the monitor. Consider that our current fastest LCD monitors cannot, in effect, properly present much more than about a 60Hz refresh rate. To my thinking, then, any time FPS is above this number you will not see it, because the LCD monitor cannot update the screen more than 60 or 75 times a second. In the remaining discussion I will refer to 60Hz, as I cannot see the difference between 75 and 60 on my "2ms" Viewsonic, which is highly rated by those that have the gear to rank the responsiveness of LCDs. Note that as of yet even the best "2ms" LCDs are closer to 8 or 10ms on average, since not all colors are as responsive as the fastest.

For those of you out there that have a "2ms" monitor and are showing 60 FPS at high detail in your app/game, just go disable VSYNC and see if your FPS are significantly higher. When I do this my FPS jump from 60 to over 1xx in some of the more taxing games. Watch something like a plane or car move across the screen. When comparing 60 FPS vs 100 FPS, does the object look like a solid moving object, or does it appear to ghost/flicker/not be a solid focused object as it moves? My observation is that at 100 FPS the object does not appear any more solid/focused than at 60; the objects/game are not any smoother. At these higher FPS you may instead see a byproduct of the fact that the LCD cannot refresh fast enough (above 60 FPS): monitor artifacts like texture smearing/tearing. This is why, for a given game and hardware, it is often suggested that we switch VSYNC on; note that when you do so your FPS drop to either 60 or 75. In contrast, for LCDs 30 vs 60 FPS is noticeably different, since the LCD can effectively double its screen writing/refreshing when going from 30 to 60 within its technological capacity.

Do the above with a good CRT monitor that truly refreshes at 60, 75, 80, or above, and you will see the differences, as the CRT can truly present these higher FPS. Unfortunately, today's "2ms" LCD technology cannot present your eyeballs the benefit of hardware that can process higher FPS... UNLESS YOUR LCD CAN SHOW IT! I won't even start to address input lag... this only adds to my observations and conclusion above.

You may wish to research: refresh rate; VSYNC; screen tearing; CRT refresh rates; LCDs and how they are very different from CRTs; GTG (grey to grey); on-off-on response; input lag; Pixelanne (I think that is the name) - a great little program used to compare the refresh rates/responsiveness of both CRT and LCD monitors. I cannot locate this as my spelling may be wrong. Please correct me if you know.
All these contribute to my thoughts on the "Missing Bottleneck" (limitations of current LCD technology).... my opinion, posted as it might be of value to one or two of you who might prefer not to waste money on higher FPS that you never see/enjoy. |
Spudley, good posting there. I have been looking at monitors trying to divine what brand/model I should get. I'll miss the crt's capability in some ways but not the footprint or the weight. My wife's pc still has a crt, which I am using.
As I understand input lag, isn't that a technology issue for some LCD monitors? I mean, I think some monitors suffer input lag worse than others when displaying other than the native resolution. Isn't that where the problem lies: resizing to a different resolution causes input lag? In addition to GTG, isn't there a similar measurement for black to white? I suppose anyone who can afford one might invest in a new LCD monitor capable of 120Hz. I'm not talking about an LCD TV, but a PC monitor. I think I read about one somewhere recently. At any rate, I agree with your premise that the monitor can be the missing bottleneck. I'll have to choose wisely. Flyby out |
Quote:
It's the minimum frame rates that we notice and complain about. Even so, with each new generation of hardware that comes our way, most of us will be cranking up the detail level higher and higher until we get unacceptable frame rates, and then ease off a bit. I view that as just a fact of life, and one of the things that makes it good to be alive now and be in a position to take advantage of it. |
Quote:
Well written, but I think you're missing a very important point about any game when it comes to FPS. What really matters in a game is the LOW number. Past some point the difference between 60 and 85 doesn't matter all that much. It's when you see the game dip to 20 FPS that you really notice. The key to any good game is the low number. |
...research....
CAPN STUBING:
True, the lowest FPS during gameplay can easily be a deal-breaker. FLYBY: Hope my ramblings help.... "...In addition to GTG, isn't there a similar measurement for black to white?" >> It's been a while since I did my homework to learn about all this; I suggest you search for the third-party testers who actually have the fancy equipment, present the results of the various analyses/comparisons of monitors, and give an informed explanation of the variables and terms. LCD and CRT pixels work totally differently. Current LCD technology cannot hold a candle to a decent CRT of days gone by when it comes to gaming. An example of what appears to be a very good/informative type of metrics to consider: http://www.xbitlabs.com/articles/mon..._17.html#sect0 Kudos to XBITLABS, as I have always found their shootouts/analyses of video cards and monitors the most objective and thorough - supported by hi-tech analysis! I recall that Tom's Hardware might also have some thorough technical analysis. Be advised that I purchased a Viewsonic VX922 (rated 2ms) but it probably averages around 6ms (I don't recall), and I am happy with it but miss my CRT for gaming :(. "I suppose anyone who can afford one might invest in a new LCD monitor capable of 120Hz..." >> From the initial snippets I read about this monitor, which is NOT available: I do NOT believe it will be anywhere close to 120; I bet it will still be the 60 variety with some new gimmick/twist/alternating vertical interlacing/something (sales puffing - misrepresentation). The same way these advertised 2ms LCDs don't present an average responsiveness anywhere near that. They never clearly define what the 2ms refers to; it might be only the one color gamut/hue/frequency/whatever of thousands which is the most responsive. "I'll have to choose wisely...." >>> Precisely; there is a lot of subjective "textual spewage", a.k.a.
BS, floating about. I prefer objective technical analysis; I try to select the best on paper, then compare them side by side and let my eyes decide. YMMV. Be advised that a typically good LCD can appear shoddy due to a bad new cable, poor quality control, or handling; some off the line are A++ and some are not. Hope to fly with you guys some time in the UBI lobby; watch for me and give me a hollar! |
perhaps here for more info?
maybe this site can be of use when trying to find an adequate monitor: http://www.widescreengamingforum.com....php/Main_Page
Flyby out |
So, Nehalem is OK? (course correction back to the OP)
Buying a Core 2 Duo/Quad system now locks you into tech that is at end of life. There won't be any significantly faster/better CPUs in this line, or socket, coming out in the months ahead. In this regard Nehalem is the way to go, but it's worth waiting a few months for one if you can.

As for dual/quad processing: being a software developer who's been developing multi-threaded, multi-processor, and multi-host applications for years now, I can't imagine Oleg's team developing a sim that does not fully take advantage of the computational environment that multi-core CPUs provide. By this I mean that the sim could be designed to scale across as many cores as are available. As an example, consider that each AI aircraft (or online player, for that matter) were a separately schedulable execution thread (that's the way I'd design it, and I've done this exact sort of thing with analytic sims professionally). With proper design, each of these threads would drift off to the least busy core to execute, and thus more cores give you that much more AI computational bandwidth.

And finally, a word to those who think vsync is for sissies: you are deluding yourselves. Why? A computer creates animation in the exact same manner as a film movie, i.e. animation is a series of still pictures shown at a fixed rate which, when viewed by the human eye, appear to create an image that moves. That fixed rate for a computer is the maximum refresh rate of your monitor. When vsync is turned off, the computer is allowed to display one of the still pictures on the monitor while it is still filling in that same picture, thus creating one or more tear lines in the displayed image (the previous image is overwritten during the fill operation). And even if you are not seeing tearing, the max rate that the picture changes on your monitor is its max refresh rate: all those FPS that exceed your monitor's refresh rate are NEVER even displayed!
What vsync does is lock the update of a picture such that it won't be displayed until the update is complete; but there is also more than one picture buffer, so one is being displayed while one or more are being updated. But hey, that warm fuzzy feeling of seeing that FRAPS display of 140 fps (and don't forget that hefty average) on your 75Hz monitor can't be beat... right? Edit: Potentially insulting slurs removed (see below) :oops: |
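The arithmetic behind that last point fits in a few lines. A back-of-the-envelope sketch (the helper function is made up for illustration, using the 140fps/75Hz numbers from the post above):

```python
def frame_stats(rendered_fps, refresh_hz):
    # The monitor shows at most `refresh_hz` pictures per second;
    # anything rendered beyond that is overwritten in the buffer
    # before it ever reaches the screen.
    shown = min(rendered_fps, refresh_hz)
    never_displayed = max(0, rendered_fps - refresh_hz)
    return shown, never_displayed

# 140 fps in FRAPS on a 75Hz panel: 75 frames shown, 65 never seen.
print(frame_stats(140, 75))
```

Below the refresh rate every rendered frame counts (`frame_stats(50, 75)` wastes nothing), which is why the minimum FPS, not the FRAPS average, is the number that matters.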
did he say that?
Quote:
|
Hi all
Didn't see it here so this is very relevant : http://www.simhq.com/_technology2/technology_111a.html Good news Cheers gprr |
hmmmm I posted that link to the DCS Black Shark forum. Don't know how I forgot to post it here. Good looking out, gprr. :D I came away from the article with an impression of the i7 processors... well, this is a quote from the article, and I share this viewpoint: "Despite the platform costs of upgrading to Core i7, Intel has engineered a CPU design with such massive parallelism that the PC community could be waiting years before a game developer truly takes advantage of its potential. In the immortal words of Ferris Bueller, “It is so choice. If you have the means, I highly recommend picking one up.”" I interpret this as saying the i7 is not the best bang-for-buck system just yet, performance-wise. Check out this report from HardOCP: http://www.hardocp.com/article.html?...hlbnRodXNpYXN0
In some of the tested games (though not all of them) even the E8500 is competitive at the 1900x resolution. In Lost Planet the E8500 draws even with the fastest i7 processor at 2560x1600. Of course my only concern is how well a processor performs in combat flight sims. IL2 seemed to do better with the i7-920 than with the older QX9770, but the lower resolution, 1280x1024, seems like a CRT monitor rather than even a 19-inch LCD monitor. I assume most fliers are using LCD monitors nowadays, so SimHQ's test is not so relevant in that regard, imho. I'd like to see a true combat flight sim test of the i7 at LCD resolutions, using the i7-940 that more people can afford rather than the $1000.00 USD i7-965. The X58 motherboard prices will drop only when the manufacturers release mid-level boards. We'll see what happens, eh? Flyby out |
The thing is that the IL-2 series, and undoubtedly the Storm of War series, will be heavily CPU-reliant as well as memory- and graphics-reliant. Most games are heavy on graphics, or graphics and memory, but generally less so on the CPU. So the i7's raw power, quad cores, and high-speed triple-channel memory seem like a good match. But only if Oleg can successfully build some parallelism into the Storm of War engine.
|
i7 PREMIUM PRICE = SO WHAT DO YOU GET ?
FLYBY,
Was reading some other analysis earlier; sorry, I don't recall the link. They took an i7 920 2.66GHz ($300 US) and compared, in effect, single-, dual-, triple-, and quad-core processors' impact on FPS. They disabled enough cores before each benchmark run to be using four, three, two, or one cores. Kinda simple and sweet analysis. Seemingly across the board they saw significant gains from single to dual. From dual to triple they observed insignificant gains. From three to four cores there was no incremental increase in output.

Interestingly, you can purchase an E8200 for $150 US. I don't recall the difference in FSB or onboard cache between the two chips. Let's assume that the E8200 is equivalent to running the i7 on two cores. Consider that the i7 costs 2x that of the E8200. Now add to this premium the fact that i7-compatible mobos will run an approx $100 premium. Let's leave RAM out of the picture for the moment. So what do you get for that additional $250? Right now, from the third-party analysis I have read.... no "significant" performance differences that you can touch/feel. You are on the next generation of socket pin configuration, which may or may not have any real value. You are buying the current cutting edge of CPU configuration, which always costs a hefty premium for those who value the bragging rights. These rights are of value to some, so this is not meant as a pejorative. Might this be why AMD came out with their triple cores instead of four?

The choice one might make would depend on the specs of the current system: is it inadequate? Is it just time for something new? Do I need to burn some cash? Can I wait six months to see what my options are to upgrade? If I choose to wait a bit, will I be happy to buy the i7 920 (currently $300) or Q9400 (currently $290) at half its current premium price - equivalent to what an E8200 costs now (currently $160)? Consider waiting another few months, and whilst you wait, read all the comparisons that more informed brains write up.
And remember to throw the costs into the mix, as well as the other variables that you feel are of value. The longer you wait, the more you will save on the specific components and the more comfortable you will be with your decision. << Good sleep insurance << Once you buy, do it, build it up, post up pics of the creation, and don't look at the prices till the next time the upgrade bug bites. Oh, but by all means give us an objective in-depth write-up of whatever choice you make. PEACE |
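The shape of that core-scaling result (big gain from one core to two, almost nothing from three to four) is what Amdahl's law predicts when only part of each frame's work runs in parallel. A quick sketch; the 60% parallel fraction is purely an assumed number for illustration, not from the benchmark:

```python
def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: the serial part of the work caps total speedup,
    # no matter how many cores are added.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Assuming ~60% of a game's frame time can run in parallel:
for cores in (1, 2, 3, 4):
    print(cores, round(amdahl_speedup(0.6, cores), 2))
```

With that assumption the speedups come out to roughly 1.0, 1.43, 1.67, and 1.82 for one through four cores: a big first step and diminishing returns after, matching the test described above.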
Hi Spudley,
Good points you make. I'm actually going to wait another few months (since I've been waiting for quite some time for my $$ to get right anyway). I'm looking to finance a new build, not an upgrade, so waiting is in my interest. By the time I'm ready to spend, I'm hopeful that prices will drop, Win7 will be out, and mainstream X58 mobos will be out, not to mention new GPUs like the RV870. So, I'm not looking to build a monster, but surely a capable system that might be closer to Oleg's dream system rather than further away from it. ;) Flyby out |
I'm afraid that prices of i7 setups won't drop as quickly as we assume, and what's more, don't forget that flight sims are real-time simulations and it's very hard to implement parallel processing. Maybe Oleg will use a Falcon4-like "bubble" concept for massive campaigns, but I still don't believe that more than 2 cores are going to be used at full steam.
And in Q3 '09 Intel plans to introduce the LGA1156 platform (Lynnfield), based on the i7 architecture but with only 2 cores and a locked clock, so it won't be easy to OC! Also, AMD is putting all its strategy into the Phenom II four-core CPUs; the new two-core Athlons will be for the value market only. Maybe we are facing bleak times for PC gamers: multicores are not for games (and maybe never will be), and future two-cores maybe won't be any more powerful than today's E8xxx... So my bet is on a highly OC'd E8400 (8600), but maybe Oleg will surprise us all.... |
It would be nice to know more from Oleg on this subject. I wonder if he realizes the dilemma?
Flyby out |
Quote:
Everyone knows that over time hardware will become cheaper and more powerful. (Current economic meltdown permitting!) If you want the best experience playing a sim that won't be released for another year and you can afford to wait, then wait! Three or so months ago I HAD to upgrade my computer, so I worked out a budget and fitted into that, trying to plan ahead for future upgrades. If I went through the process today, the PC I'd have built would be nothing like the one I made. In those 3 months we've had a whole range of new technology introduced: new graphics chipsets, solid state drives, a new family of CPUs and motherboards, triple-channel DDR3 memory. In a year's time, who knows what our PCs will be like? If you NEED to buy a computer now, try to make choices that will allow you an upgrade path. Just think, by the time SOW is released you may not even be interested in flight sims! Who knows? |
hi Skosh,
I don't think it would be irresponsible for Oleg to give a bit more guidance on system requirements at all. What's wrong with saying a Conroe-series CPU would be more than sufficient? Or a Penryn? Or a Nehalem? I'm waiting a bit longer before building a system, but I had not planned to wait until next May, when requirements are supposed to come out. So how to plan for an upgrade path when there is no target? Even Rise of Flight has posted system requirements. Surely Oleg has an idea. I know he won't recommend a Cray, but... Flyby out |
Quote:
Well, a couple of reasons. Firstly, "traditionally" the development of hardware has followed the much-misquoted "Moore's Law", in which every 18 months or so the number of transistors in an integrated circuit doubles while their price drops by about half. So far this has roughly corresponded to the performance of processors, memory, etc.

Secondly, Intel has been following their tick-tock cycle. We've only just had the 'tock', which was the introduction of the i7 architecture on the 45nm silicon process; in the next year or so we'll have the next 'tick', which will implement the new architecture on 32nm silicon. (And I bet AMD will be trying their hardest to outdo Intel!)

So far, from what I've been able to see, programs aren't pushing the new i7 chips to their limits because the code hasn't been optimised for the new architecture. That takes time for developers to come to terms with. Who knows what sort of performance will be reached? I certainly don't. It basically comes down to this: in a year's time the CPUs and memory available will be more powerful and cheaper, have more cores, etc.

Oleg has already said that they are making a sim that will be around for a long time, hopefully as long as IL2. I'm sure I read somewhere that they are adding features that won't possibly run on current PCs, and closer to release they'll be deciding which ones to turn off to get the sim to run at acceptable frame rates and gameplay. Now if Oleg at this stage said the game will run on 'X', and a person runs out and buys 'X' now, and when the time comes to play the game that person doesn't think 25fps is 'acceptable', they'd be blaming Oleg for screwing the pooch with his machine specs and we'd never hear the last of it. Also, that PC that cost, let's say, 2000 rubles today would only be worth 1300 rubles in a year's time. Then there'd be a 30-page thread saying Oleg robbed them.
This far away from release, I wouldn't be releasing machine specs. Cheers! (Sorry for my disjointed argument; my wife's run away to the shops and left me with whinging children!!!!!!!!!!! I'm sure they can't be mine!!!!!!) |
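That 18-month rule of thumb can be turned into a toy projection. Purely illustrative: the doubling period, the starting transistor count, and the price are assumptions, and Moore's law is an observation, not a guarantee:

```python
def moores_law_projection(transistors, price, months, cycle_months=18.0):
    # Rule of thumb from the post above: every ~18 months transistor
    # counts double while equivalent-hardware prices roughly halve.
    periods = months / cycle_months
    return transistors * 2 ** periods, price / 2 ** periods

t, p = moores_law_projection(731e6, 1000.0, 36)  # a ~731M-transistor CPU
print(int(t / 1e6), round(p))  # after 3 years: ~2924M transistors, ~$250
```

Which is the poster's point in numbers: wait a year or two and the same money buys a very different machine.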
Hey Skoshi, I got your point, but still, Oleg should at least tell us, for example, "Yes, it will be optimized for multicore CPUs" or "Sorry lads, maybe in the future; you'll be fine with a two-core at 4GHz"... something like that.
|
Quote:
I would like to know if the problems with SLI will be sorted out in SOW (one possible upgrade for me) and if PhysX will be supported. How much memory? Will I get by with my current CPU & motherboard (overclocked, of course!)? But until the sim is actually released and we get to see it in action, how will we actually know for certain? |
Quote:
Hope the kids don't wear you out! :D Flyby out. BTW, how a sim runs with either Crossfire or SLI is not a function of the sim, but rather of the GPU's technology. There have been some interesting articles around about a new technology that allows multiple GPUs to scale nearly 100%. It's called Hydra; it's been tested but is still in development. Neither ATI nor Nvidia seem inclined to fix their multi-GPU tech. It's really sad too, because people spend their hard-earned bucks to buy two GPUs because of the implied promise that two work twice as fast as one (universally), and that's just not true. Which is also why some sims run better in SLI (or Crossfire) than others. IMO, Hydra may represent what ATI and Nvidia should have been about all along: true multi-GPU scaling no matter what the sim. |
Quote:
There is some amazing technology being released and I guess it's a bit frustrating when it doesn't realise its promised potential or is so new that there are no applications that use it fully. Cheers and all the best! |
the best back atcha! ;)
Flyby out |
Black death track results:
For those with interest:
Gigabyte EP35-DS4, BIOS ver F3, E8400 (2x 3.0 GHz), 2 GB HyperX DDR2 1333 RAM, BFG 8800 GT OC 512 MB, 625/900/1560 default (this default GT OC is already approx 5%). VSYNC on at 60 (max). Using Gigabyte Easy Tune Pro 5.0 to perform the overclocks, confirmed with GPU-Z. Il-2 2.04. Video Mode: ALL HIGHEST OPTIONS. Video Option: 1280 x 960 x 32. A couple FPS change is really meaningless, as I did not run each test 10 times and take an average; with multiple runs at the same settings one might see a two FPS change between runs.

RESULTS FOR BLACK DEATH (min:sec — FPS)
1. All stock CPU and vid card: 1:07 — 56, 1:45 — 49
2. CPU @ 3.3 GHz: 1:07 — 54, 1:45 — 49
3. CPU @ 3.3 GHz + vid card 630/930: 1:07 — 54, 1:45 — 49
4. CPU @ 4.0 GHz + vid card 640/950: 1:07 — 52, 1:45 — 49

CONCLUSION: A 17% CPU OC combined with a total 7% vid card OC (over the non-OC version) seems to have no benefit. I used GPU-Z to confirm the OC values. Undoubtedly there will be significant increases in heat buildup, reduced theoretical life expectancy of the chip, and increased energy consumption. Interestingly, from the above metrics my FPS appears to drop when OC'ing. I certainly was not expecting that; I was expecting either no change or a few frames trending upward. Note that I have another system (old specs) which was an AMD (single core) 2.2 GHz + 6800 Ultra. Stepping up to my current PC specs was significant in terms of visuals/smoothness/etc. The MINIMUM FPS in IL-2 were significantly improved (perhaps 2x if my memory serves). After reading the data on many of the new-tech CPUs/vid cards/memory, it is of no use or value for me to spend any more money to improve my gaming. I am perfectly satisfied with my current rig's gaming capacity. My min FPS are sufficiently close to 60 (my LCD's max). All the i7 mobos have had significant problems as shipped to consumers; this new platform will need at least a year to mature into a viable/reliable/stable option. 
On paper the technology of the i7 shows a significant potential step up, but in my opinion it is not ready for market nor my $$$ (this is pretty typical). My testing, for your benefit, at no cost to you. I also play LOCKON and BF2. Note that LOCKON is also a huge resource hog, as is IL-2. I have OC'd before and recall similar non-benefits for LOCKON as presented above for IL-2. Now I will take the CPU and vid card back to their native settings. If I can just get my buddy SimDee to recover from his latest flight experiences with me on his 6, perhaps I will continue to hone my pitiful flight skills. Peace |
Then there's SoW_BoB, which may be coded to work with multi-core CPUs. But perhaps your GPU was becoming a bottleneck during your overclocking, or maybe the RAM? I don't know, but it looks like the CPU was waiting more and more as it was overclocked. Too bad I'm not an expert on such things, so that's just my take on your test data. Or maybe it's the PCI-e 1.0 P35 chipset with limited PCI-e bandwidth, compared to, say, a P45, which supports PCI-e 2.0 with greater bandwidth pass-through? Maybe I'm an optimist? ;)
At any rate, are those the average FPS for the BD track? Pretty damn solid if they are, if you ask me. Well done! :D Flyby out |
Lowest fps...
At 1min 7 sec and 1 min 45 seconds into the track I see my lowest FPS in the Death Track with the default views and the settings as indicated in the post. So those are the lowest FPS at the two respective times. Note that VSYNC caps at 60.
Unknown which if any one particular component is the sole bottleneck. When I build em I do my research to ensure I don't make the mistake of not properly matching the capacity of the critical components. Otherwise the upgrade/build is a bit pointless. :) |
rgrt, Spudley,
I appreciate your work and the fact that you aren't charging for it! :D I do have a request: could you run a similar test using the Kamikaze track? On my last system that track proved a tougher ride than the BD track. That system had HyperX DDR400 RAM, an Intel 2.8 HT processor, and an Nvidia 6800 Ultra on an Abit IS7 mobo @ 800 FSB. BD dragged min FPS down to 13, while Kamikaze went down to 7 FPS near the end. It'd be nice to see what a modern system can do with that puppy. ;) Flyby out |
kamikaze
Hey FLEAFLY,
Your wish is my command. Unfortunately I have deleted Kamikaze.trk, as well as all other tracks except the BD track. I cannot locate it on the CDs and will not reinstall just to get it. If you can show me where I can just download it, I would be happy to do so and perform the test for you. NO COST! PEACE |
Fleafly? Uh well..nevermind. ;)
Flyby out |
flyby results
OK,
Running Kamikaze on 4.04M with the following results. All settings MAXED, including Landscape = Perfect: primarily 60 FPS +/-, a few drops to 37, with the lowest FPS circa 1:4x, where it reads 25 FPS once or twice. All settings MAXED, except Landscape = Excellent (I prefer this setting, since on Perfect the water and reflections look a bit surrealistic and detract from my visuals): primarily 60 FPS +/-, with the lowest FPS again circa 1:4x, where it drops to 41. Hope this info is useful. YOURS TRULY, TESTPILOT |
Spudley, you are indeed a test pilot. Thanks for going through the trouble, man. Your rig shows how far things have come since the days of Kamikaze chugging down to 7 FPS on my defunct system. I don't recall the time events you quoted, but I'll say that towards the end of the track is where I saw the lowest FPS, when the Zero was crashing into the carrier, iirc. That explosion was an FPS-killer. So yes, your test was useful to me, TESTPILOT!
thanks!! Flyby out |
Just came across this little but nevertheless telling snippet from Oleg's room:
"There will be many surprises in time. Like with Il-2 we put in engine many things that will be open later later depending of middle PC power on the market. With 4.09 we will stop any work with Il-2. Really we did it already... just waiting finalization of new maps." So it indicates that a high-powered Nehalem/Extreme quad edition PC might be just a waste of money, that a SoW "entry level" rig based on a dual-core Exxxx will be more than enough for decent gameplay, and that even the most powerful quad won't be enough once all those "things are open later". |
i7 vs CORE 2 QUADS vs e8600
http://techreport.com/articles.x/15818/6
FLY: Some interesting comparisons, granted they didn't test a flight sim like LOCKON or IL-2. Thought you might find them useful. PEACE |
thanks for the link SPUDLEY,
I'd actually read that article (using my secret identity on TechReport). But it was good to read the conclusion again, as it points out a major dilemma for anyone, such as I, considering a new gaming PC. That dilemma is the selection of either a dual-core or a quad-core CPU, and the dependency of that decision on what the gaming future holds. More to the point, I can't see sufficiently far into the future to know if there are any combat flight sims on the horizon that might help with that selection. One exception: the Rise of Flight WW1 air combat sim has released its recommended system specs on its website: http://www.riseofflight.com/en/Gameinfo.html I'm not sure of the release date, but here is a YouTube video on the progress of the sim: http://de.youtube.com/watch?v=rtMw02fqr_g ***hint cough***Oleg!***cough***hint. Well, I've got time on my hands anyway, as I am continually plagued with setbacks that keep me from financing a new system. But that's beside the point. Thanks again! Flyby out |
Quote:
Flyby out edit: then there's this link that explains it all: http://www.oled-display.net/ I bet these things will be EXTREMELY expensive! |
oled'S oREOs .....
Someone mentioned this relationship:
OLEDs > LCDs > CRTs. I looked up the OLED link and there were discussions regarding brightness and contrast, but no mention of refresh rates. Remember, the problem with LCDs is refresh rates, so I see nothing of interest here. Did I miss something of interest to the gaming community? PEACE |
Quote:
Advantages: The radically different manufacturing process of OLEDs lends itself to many advantages over flat-panel displays made with LCD technology. Since OLEDs can be printed onto any suitable substrate using an inkjet printer or even screen-printing technologies,[33] they can theoretically have a significantly lower cost than LCDs or plasma displays. Printing OLEDs onto flexible substrates opens the door to new applications such as roll-up displays and displays embedded in fabrics or clothing. OLEDs enable a greater range of colors, brightness, and viewing angle than LCDs, because OLED pixels directly emit light. OLED pixel colors appear correct and unshifted even as the viewing angle approaches 90 degrees from normal. LCDs use a backlight and cannot show true black, while an "off" OLED element produces no light and consumes no power. Energy is also wasted in LCDs because they require polarizers. OLEDs also have a faster response time than standard LCD screens: whereas a standard LCD currently has an average response time of 4-8 milliseconds, an OLED can have a response time of less than 0.01 ms. This should be good enough, methinks. ;) |
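A quick back-of-the-envelope check on those response-time figures. This is rough arithmetic only: the fastest frame cadence a panel could fully resolve is about one over its response time, ignoring refresh rate and input lag, so treat it as an upper bound rather than a spec:

```python
# Rough upper bound on resolvable frames per second implied by a panel's
# pixel response time (ms). Ignores refresh rate, scaler lag, etc.

def max_resolvable_fps(response_time_ms: float) -> float:
    return 1000.0 / response_time_ms

print(max_resolvable_fps(4.0))   # 4 ms LCD figure quoted above
print(max_resolvable_fps(0.01))  # 0.01 ms OLED figure quoted above
```

Even the 4 ms LCD figure works out to far more than a 60 Hz refresh can deliver, which is why the practical LCD complaint in the thread is really about refresh rate and ghosting rather than raw response time.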
codex,
Phew!! One can only hope that OLEDs can be tested soon and their gaming secrets revealed. I may have to start selling store-bought teeth to raise enough cash to buy one, though! :lol: Gotta love new technology. I guess I can stop hunting for a good refurbished 24" CRT! :D Flyby out |
Quote:
http://www.sonystyle.com/webapp/wcs/...52921665327724 http://www.oled-info.com/unitedkeys-...eyboard-review |
yeah, just a bit too soon, old bean. I found this too: http://www.jr.com/sony/pe/SON_XEL1/
Still, I won't have the money to buy a 24-inch OLED monitor any time soo...wait, there's a bank just up the street, and I know a very good getaway driver. ;) Flyby out |
Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
Copyright © 2007 Fulqrum Publishing. All rights reserved.