Quote:
Originally Posted by pupo162
I'm starting to feel like it's a hoax.
I just tried multiple values for those settings, including the 77/55 and the old 24/16... truth is, performance is the same stable thing.
I really don't know how this is possible...
Well, read my post a few pages back. If you set an invalid combination like color 24 and stencil 16, I guess it falls back to some default setting in the "catch all" part of the logic in the code (the default: branch of the switch statement that initializes the D3D device). That is probably a "low" setting, say 16-bit with no stencil at all, which would give you more performance... The sketch below shows the kind of thing I mean.
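To be clear, this is only a hypothetical sketch of how such init code could look, not the game's actual code; the function name and the way the two values are combined are my own invention. The point is just that any combination the switch doesn't recognize drops into the default: branch, which here picks the cheapest formats.

```cpp
// Hypothetical sketch (NOT the game's real code): mapping the "color" and
// "stencil" config values onto D3D9 back-buffer / depth-stencil formats.
// Unknown combinations (77/55, 0/0, 24/16, ...) fall into the default: branch,
// which in this sketch means 16-bit color and 16-bit depth with no stencil.
#include <windows.h>
#include <d3d9.h>

void PickFormats(int colorBits, int stencilBits,
                 D3DFORMAT* backBuffer, D3DFORMAT* depthStencil)
{
    switch (colorBits * 100 + stencilBits)      // encode the pair for one switch
    {
    case 2408:                                  // 24/8: the usual default
        *backBuffer   = D3DFMT_X8R8G8B8;
        *depthStencil = D3DFMT_D24S8;
        break;
    case 1600:                                  // 16/0: explicit low setting
        *backBuffer   = D3DFMT_R5G6B5;
        *depthStencil = D3DFMT_D16;
        break;
    default:                                    // "catch all" for invalid combos
        *backBuffer   = D3DFMT_R5G6B5;          // 16-bit color
        *depthStencil = D3DFMT_D16;             // 16-bit depth, no stencil
        break;
    }
}
```

If it really works like that, feeding the game garbage values would land you in the default: branch and quietly give you the cheap formats, which would explain the extra performance.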
EDIT: The best way to test whether my statement above holds is naturally to use some out-of-bounds value, like you did. If you get better performance with a setting like color=0 and stencil=0 than with the default 24/8, then my theory holds. It seems you did exactly that, so the question is what it defaults to: 16/0? In that case you should notice a narrower color range (visible banding) in the game along with the increased performance. As I see it there is no option in the game settings to use 16-bit color instead of 32-bit, so I guess you found the way. Using color=16 and stencil=0 feels better to me than relying on an unsupported format... but is 16/0 even supported? We are only guessing.
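One way to stop guessing about whether a 16-bit color mode with a D16 depth buffer (no stencil) is even supported by your card is to ask the D3D9 runtime directly. This is a standalone test sketch of mine using the standard capability calls (CheckDeviceType, CheckDeviceFormat, CheckDepthStencilMatch), assuming a full-screen 16-bit display mode; it has nothing to do with the game's own code.

```cpp
// Standalone check: does the driver support a full-screen 16-bit (R5G6B5)
// mode with a D16 depth buffer and no stencil? Link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { printf("D3D9 not available\n"); return 1; }

    const D3DFORMAT display = D3DFMT_R5G6B5;   // 16-bit full-screen display mode
    const D3DFORMAT back    = D3DFMT_R5G6B5;   // 16-bit color back buffer
    const D3DFORMAT depth   = D3DFMT_D16;      // 16-bit depth, no stencil

    HRESULT hrType  = d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                           display, back, FALSE);
    HRESULT hrDepth = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                             display, D3DUSAGE_DEPTHSTENCIL,
                                             D3DRTYPE_SURFACE, depth);
    HRESULT hrMatch = d3d->CheckDepthStencilMatch(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                  display, back, depth);

    printf("16-bit back buffer: %s\n", SUCCEEDED(hrType)  ? "supported" : "no");
    printf("D16 surface:        %s\n", SUCCEEDED(hrDepth) ? "supported" : "no");
    printf("R5G6B5 + D16 match: %s\n", SUCCEEDED(hrMatch) ? "supported" : "no");

    d3d->Release();
    return 0;
}
```

If all three checks pass, then 16/0 is at least a legal combination on your hardware, and the only remaining question is whether the game's config parser actually honors it or just falls into the default case anyway.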