Posted 09-24-2011, 07:14 PM by mazex

Quote:
Originally Posted by pupo162
I'm starting to feel like this is a hoax.

I just tried multiple values for those settings, including the 77/55 and the old 24/16... truth is, performance stays exactly the same.

I really don't know how this is possible...
Well, read my post a few pages back. If you set an invalid combination like color 24 and stencil 16, my guess is that execution lands in the "catch all" part of the code (the default: section of the switch statement that initializes the D3D device). That fallback is probably a "low" setting like 16-bit with no stencil at all, which would give you more performance...
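To make the idea concrete, here is a minimal sketch of the kind of fallback logic I'm guessing at. This is purely hypothetical illustration, not the game's actual code; the function name and encoding of the settings pair are mine:

```cpp
#include <utility>

// Hypothetical sketch: a switch-based format selection with a "catch all"
// default. An unexpected combination (e.g. color 24 / stencil 16) falls
// through to default: and silently gets a low 16-bit/no-stencil mode.
std::pair<int, int> pickFormat(int colorBits, int stencilBits) {
    switch (colorBits * 100 + stencilBits) {
        case 3208: return {32, 8};   // 32-bit color, 8-bit stencil
        case 2408: return {24, 8};   // 24-bit color, 8-bit stencil
        case 1600: return {16, 0};   // 16-bit color, no stencil
        default:   return {16, 0};   // invalid combo -> cheap low default
    }
}
```

With logic like this, entering the "invalid" 24/16 pair would behave exactly like asking for 16/0, which would explain the performance gain people are seeing.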

EDIT: The best way to test whether my statement above holds is naturally to use an out-of-bounds value like you did. If you get better performance with a setting like color=0 and stencil=0 than with the default 24/8, then my theory holds. It seems you did exactly that, so the question becomes what it defaults to: 16/0? In that case you should notice a narrower color spectrum in the game alongside the increased performance. As I see it, there is no option in the game settings to use 16-bit color instead of 32-bit, so maybe you found the way to get it. Using color=16 and stencil=0 feels better to me than relying on an unsupported format... but is 16/0 even supported? We are only guessing.
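One way to reason about which pairs are even plausible: map the requested bits onto the standard Direct3D 9 depth-stencil formats. The sketch below is my own lookup, covering only the common D3DFORMAT values; real code would confirm support at runtime against the adapter with IDirect3D9::CheckDepthStencilMatch rather than a hard-coded table:

```cpp
#include <string>

// Map a (depth bits, stencil bits) request to the usual D3D9 depth-stencil
// format name, or "unsupported" when no standard format matches.
// Notably, 24/16 is not a standard D3D9 combination, which fits the theory
// that such a request falls back to some engine/driver default.
std::string depthStencilFormat(int depthBits, int stencilBits) {
    if (depthBits == 16 && stencilBits == 0) return "D3DFMT_D16";
    if (depthBits == 24 && stencilBits == 0) return "D3DFMT_D24X8";
    if (depthBits == 24 && stencilBits == 8) return "D3DFMT_D24S8";
    if (depthBits == 32 && stencilBits == 0) return "D3DFMT_D32";
    if (depthBits == 15 && stencilBits == 1) return "D3DFMT_D15S1";
    return "unsupported";  // e.g. 24/16 or 0/0
}
```

So 16/0 does correspond to a real format (D3DFMT_D16), while 24/16 does not, which is consistent with the fallback idea above.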
