A side note... As I understand it, what happens if you have too little precision in your depth/z buffer, especially when rendering scenes with lots of objects spread out over a large depth range like in CoD? You get flickering (z-fighting). Especially in shadows, since there isn't enough precision to tell which surface should win at a given pixel (the shadow or the grass below it) - so every other frame the winner alternates: grass, shadow, grass, shadow. Recognize that from somewhere?
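Just to put rough numbers on it, here's a little back-of-the-envelope C++ sketch (my own, not from any engine) of how coarse the depth steps get at distance with a standard D3D-style perspective projection. The near/far planes and bit counts are made-up example values, and it treats every format as fixed-point for simplicity (real 32-bit depth is normally floating-point):

#include <cmath>
#include <cstdio>

// Roughly: how big a step in eye-space depth (world units) one depth-buffer
// increment covers at distance z, for a D3D-style projection with near plane n
// and far plane f. d(z) = f*(z - n) / (z*(f - n)), so dd/dz = f*n / (z*z*(f - n)).
double depthStep(double z, double n, double f, int bits)
{
    double increment = 1.0 / std::pow(2.0, bits);   // one step in the buffer
    return increment * (z * z * (f - n)) / (f * n);
}

int main()
{
    const double n = 0.1, f = 10000.0;  // big outdoor depth range, CoD-style
    for (int bits : { 16, 24, 32 })
        std::printf("%2d-bit depth: step at z=500 is ~%g world units\n",
                    bits, depthStep(500.0, n, f, bits));
    // 16 bits gives steps tens of units wide at z=500 -> heavy z-fighting;
    // 24 bits brings it down to ~0.15 units, 32 bits to ~0.0006 units.
}

So with a huge depth range and not enough bits, two surfaces that are a few centimetres apart can land in the same depth bucket, and then it's basically a coin toss which one gets drawn each frame.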
Anyway - being an old programmer working with server-side software and not games, I have only made a few attempts at writing DirectX games for fun back in the DirectX 7-9 days. The new stuff in DX10-11 is way too new for me, and besides I never really got the hang of all the 3D buffers and how to use them best etc... It worked rather well anyway with "standard" settings when you didn't have to render millions of objects.
EDIT: But to my knowledge, a depth buffer is usually 32 bits wide, where you normally use 24 bits for depth and 8 for stencil, even though you can use all 32 bits for depth (would be nice for CoD's shadows?). But then you lose the stencil buffer... And having 32-bit depth plus a 16-bit stencil should not be possible, but maybe in DX10-11? And do you need a 16-bit stencil anyway? Different graphics hardware also has different support for this, so that could explain the differing results. Anyway - a larger depth buffer should improve quality, but not performance?
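On the DX10-11 question, here's a hedged sketch of how you'd pick the depth/stencil format when creating the buffer in D3D11 - I haven't actually shipped anything on that API, so treat it as illustrative; the CreateDepthBuffer helper and its parameters are just names I made up for the example:

#include <d3d11.h>

HRESULT CreateDepthBuffer(ID3D11Device* device, UINT width, UINT height,
                          ID3D11Texture2D** tex, ID3D11DepthStencilView** dsv)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    // The usual choice: 24-bit depth + 8-bit stencil.
    desc.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    // Alternatives: DXGI_FORMAT_D32_FLOAT (all 32 bits for depth, no stencil)
    // or DXGI_FORMAT_D32_FLOAT_S8X24_UINT (32-bit float depth plus 8-bit
    // stencil, padded to 64 bits) - so in D3D10/11 you can in fact keep both.
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_DEPTH_STENCIL;

    HRESULT hr = device->CreateTexture2D(&desc, nullptr, tex);
    if (FAILED(hr)) return hr;
    return device->CreateDepthStencilView(*tex, nullptr, dsv);
}

Which of those formats are actually fast (or supported at all for multisampling etc.) still depends on the hardware, which fits with people seeing different results on different cards.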