pupo162 (Approved Member), 09-24-2011, 07:50 PM

Quote:
Originally Posted by mazex
Well, read my post a few pages back. If you set an invalid combination like color 24 and stencil 16, I guess it falls through to some default setting in the "catch-all" part of the logic (the default: section of the switch statement that initializes the D3D device). It's probably a low setting like 16-bit with no stencil at all, thereby giving you more performance...

EDIT: The best way to test whether my statement above holds is naturally to use some out-of-bounds value like you did above. If you get better performance with a setting like color=0 and stencil=0 than with the default 24/8, then my theory holds? It seems like you did that, so the question is what it defaults to. 16/0? In that case you should notice a narrower color spectrum in the game along with the increased performance. As I see it there is no option in the game settings to use 16-bit color instead of 32-bit, so I guess you found the way? Using color=16 and stencil=0 feels better to me than using an unsupported format... But is 16/0 even supported? We are only guessing.
My issue is that when I reverted back to 24/8, the good performance stayed.

Something like:

24/8 - bad performance --> 32/16 - awesome --> 24/8 - still awesome.

That's what is bugging me.
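
For what it's worth, mazex's "catch-all" theory would look roughly like this in the device-init code. None of us have the game's source, so this is only a guess at the structure: the function name, the way the pair is encoded, and the exact format choices are all made up for illustration (the D3DFMT_* enums themselves are real Direct3D 9 formats):

Code:
#include <d3d9.h>

// Hypothetical sketch of the fallback logic mazex describes, NOT the
// actual game code: an unrecognised color/stencil pair falls through
// to the switch default and silently gets a cheap 16-bit, no-stencil mode.
static void PickFormats(int colorBits, int stencilBits,
                        D3DFORMAT* backBuffer, D3DFORMAT* depthStencil)
{
    switch (colorBits * 100 + stencilBits) {   // encode the pair for one switch
        case 3208:                             // 32-bit color, 8-bit stencil
            *backBuffer   = D3DFMT_X8R8G8B8;
            *depthStencil = D3DFMT_D24S8;
            break;
        case 2408:                             // the game's default 24/8
            *backBuffer   = D3DFMT_X8R8G8B8;   // 24-bit color stored in 32 bits
            *depthStencil = D3DFMT_D24S8;
            break;
        case 1600:                             // 16-bit color, no stencil
            *backBuffer   = D3DFMT_R5G6B5;
            *depthStencil = D3DFMT_D16;
            break;
        default:                               // "catch-all" for invalid pairs
            *backBuffer   = D3DFMT_R5G6B5;     // like 32/16 or 0/0: drop to a
            *depthStencil = D3DFMT_D16;        // cheap 16/0 mode -> faster, but
            break;                             // a narrower color spectrum
    }
}

If the default branch really picks something like D3DFMT_D16 with no stencil, that would explain the speed-up, but not why the speed-up survives going back to 24/8, which is exactly my problem.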