  #4  
Old 09-25-2011, 12:32 AM
mazex
Approved Member
 
Join Date: Oct 2007
Location: Sweden
Posts: 1,342

A side note... As I understand it, what happens if you have too little precision in your depth/z-buffer, especially when rendering scenes with lots of objects spread out over a large depth range like in CoD? You get flickering, especially in shadows, because there isn't enough precision to tell which pixel should be shown (i.e. the shadow or the grass below it) - so every other frame the winner alternates between the grass and the shadow. Recognize that from somewhere?
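
Just to put some rough numbers on that, here is a little C++ sketch of how big one depth-buffer step gets at distance with a standard perspective projection (the near/far/distance values are made up purely for illustration):

[CODE]
// Rough sketch: with a standard perspective projection (near plane n, far
// plane f), eye-space distance z maps to a depth value d = (f/(f-n)) * (1 - n/z).
// One depth-buffer step (~1/2^bits) then covers roughly z^2*(f-n)/(f*n*2^bits)
// metres of eye space. The n/f/z values below are assumed, not from the game.
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main()
{
    const double n = 0.5;     // near plane in metres (assumed)
    const double f = 20000.0; // far plane in metres (assumed)
    const double z = 2000.0;  // distance to the surfaces being compared

    for (int bits : {16, 24, 32}) {
        double step = std::ldexp(1.0, -bits);            // one depth step, 1/2^bits
        double dz   = z * z * (f - n) / (f * n) * step;  // eye-space size of that step at z
        std::printf("%2d-bit depth: one step at %.0fm is ~%.4fm\n", bits, z, dz);
    }
    // If the shadow and the grass under it are closer together than that step,
    // they land in the same depth value and the "winner" flips between frames.
    return 0;
}
[/CODE]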

Anyway - being an old programmer working with server-side software and not games, I have only made a few attempts at writing DirectX games for fun back in the DirectX 7-9 days. The new stuff in DX10-11 is way too new for me, and besides, I never really got the hang of all the 3D buffers and how to use them best etc... It worked rather well anyway with "standard" settings when you didn't have to render millions of objects.

EDIT: But to my knowledge, a depth buffer is normally 32 bits per pixel, where you usually use 24 bits for depth and 8 for stencil, even though you can use all 32 bits for depth (would be nice for CoD's shadows?). But then you lose the stencil buffer... And having 32-bit depth and a 16-bit stencil should not be possible, but maybe in DX10-11? And do you need a 16-bit stencil anyway? Different graphics hardware also has different support for this, so that could explain the differing results. Anyway - a larger depth buffer should improve quality, but not performance?
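
For what it's worth, in D3D11 that trade-off boils down to which texture format you pass when you create the depth-stencil buffer - a minimal sketch, assuming a device and back-buffer size already exist and skipping all error handling:

[CODE]
// Minimal D3D11 sketch of the depth/stencil format choice. "device", "width"
// and "height" are assumed to exist already; error handling is left out.
#include <d3d11.h>

ID3D11DepthStencilView* CreateDepthBuffer(ID3D11Device* device,
                                          UINT width, UINT height,
                                          bool use32BitDepth)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width     = width;
    desc.Height    = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    // D24_UNORM_S8_UINT = the "standard" 24-bit depth + 8-bit stencil.
    // D32_FLOAT = all 32 bits for depth, but no stencil at all.
    // (DX10/11 also have DXGI_FORMAT_D32_FLOAT_S8X24_UINT: 32-bit float depth
    //  plus an 8-bit stencil - there are no 16-bit stencil formats though.)
    desc.Format = use32BitDepth ? DXGI_FORMAT_D32_FLOAT
                                : DXGI_FORMAT_D24_UNORM_S8_UINT;
    desc.SampleDesc.Count = 1;
    desc.Usage     = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_DEPTH_STENCIL;

    ID3D11Texture2D* tex = nullptr;
    device->CreateTexture2D(&desc, nullptr, &tex);

    ID3D11DepthStencilView* dsv = nullptr;
    device->CreateDepthStencilView(tex, nullptr, &dsv);
    tex->Release(); // the view holds its own reference to the texture
    return dsv;
}
[/CODE]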
__________________
i7 2600k @ 4.5 | GTX580 1.5GB (latest drivers) | P8Z77-V Pro MB | 8GB DDR3 1600 Mhz | SSD (OS) + Raptor 150 (Games) + 1TB WD (Extra) | X-Fi Fatality Pro (PCI) | Windows 7 x64 | TrackIR 4 | G940 Hotas

Last edited by mazex; 09-25-2011 at 12:39 AM.