Technical threads: All discussions about technical issues
#1
If this is the case, then turn vsync back on but force triple buffering ON in
ATI Tray Tools: Game Profiles >> create a profile for Launcher.exe >> Direct3D tweaks tab >> check "Force to use triple buffers" >> save the profile. The epilepsy filter is not a good solution, imho.
#2
I will try ATT, and RadeonPro if that doesn't work.
I have tried both applications before, however, and done everything suggested, even uninstalling everything, installing just the Radeon driver, and operating it with ATT. But that leaves CrossFire disabled, since the latest version of ATT is broken in that regard. It doesn't seem to fare well with my system, unfortunately. Perhaps the solution is to test with a monitor capable of 120 Hz? I will keep fiddling; sooner or later I will perhaps get it into an operational state.
__________________
Gigabyte MA790FX5-UDP, Phenom II X4 955 BE 3.4 GHz, 8 GB DDR3-1333, RAID 0 array, 2x XFX Radeon HD 6870 1 GB
#3
I have monitored CoD using well in excess of 2 GB of memory on my card, and it climbs slowly but surely ever higher. I believe there is a memory leak in CoD.
Ataros's suggestions do nothing for me that I can notice.

Last edited by icarus; 01-26-2012 at 09:57 PM.
#4
Interesting stuff, Ataros.
It seems triple buffering has undergone some changes since Vista/Win 7:

Q: Can anyone let me know whether the triple buffering and vsync in the Nvidia Control Panel work for DirectX games?
A: The 'triple buffering' option does affect rendering behavior within the modern DirectX APIs. The 'vertical synchronization' control is less cooperative: in Windows XP it applies to both the OpenGL and DirectX 9 APIs; in Windows Vista and Windows 7 it only affects the OpenGL APIs.

http://forums.nvidia.com/index.php?showtopic=173860
#5
Quote:
#6
Quote:

However, there are many different opinions and discussions on other forums like Guru3D, etc. Some say only OpenGL lines of code are present inside the drivers. Another opinion is that the application itself must have triple buffering enabled, and some applications enable it by default when the user switches vsync on inside the app (but not in the drivers). Some apps, like Left 4 Dead and BF3, have triple buffering as a separate checkbox in settings.

We do not know how CloD is programmed and need to benchmark it with various settings to find out. IIRC, when I switched vsync ON in the drivers I had some issues. Maybe the in-game CloD vsync uses TB. ATI Catalyst still lists triple buffering inside the OpenGL section only.

I run with vsync off to be on the safe side. Having TrackIR smoothing set to max helps me avoid tearing. It still happens, but on very rare occasions. Reducing settings also helps with it.

For NV users, to be on the safe side I would install D3D Overrider to force triple buffering ON and test it. It is free: download and install a recent RivaTuner, find the D3D Overrider executable inside the install folder, and copy it somewhere. You can then uninstall RivaTuner; D3D Overrider does not require RivaTuner to run.

Last edited by Ataros; 01-27-2012 at 11:11 AM.
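To illustrate why forcing triple buffering matters once vsync is on, here is a toy Python simulation (my own sketch, not tied to any real driver or game API) of a 60 Hz display driven by a GPU that needs 20 ms per frame. With double buffering the GPU stalls at every swap, so the frame rate collapses to 30 fps; a third buffer lets it keep rendering and the rate stays near 50 fps:

```python
import math

REFRESH_MS = 1000.0 / 60.0   # one vblank interval at 60 Hz
RENDER_MS = 20.0             # GPU time per frame (slower than one refresh)

def simulate(buffers, frames=600):
    """Toy model of vsync'd presentation. Returns average presented fps.

    buffers == 2 (double buffering): after finishing a frame the GPU has
    no free back buffer, so it stalls until the swap at the next vblank
    before it can start the next frame.
    buffers == 3 (triple buffering): a spare back buffer lets the GPU
    begin the next frame immediately after finishing one.
    """
    t = 0.0
    presented = []
    for _ in range(frames):
        t += RENDER_MS                                   # frame finishes here
        vblank = math.ceil(t / REFRESH_MS) * REFRESH_MS  # shown at next vblank
        presented.append(vblank)
        if buffers == 2:
            t = vblank               # stall: wait for the buffer swap
    span = presented[-1] - presented[0]
    return 1000.0 * (len(presented) - 1) / span

print(round(simulate(2)), round(simulate(3)))  # prints: 30 50
```

The numbers match the classic rule of thumb: with vsync and only two buffers, any frame that misses a vblank costs a whole extra refresh interval, quantizing fps down to 60/2 = 30.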
#7
Quote:

1. Vsync disabled (pre-rendered frames = 1, Nvidia CP).
2. Vsync enabled (pre-rendered frames = 1, Nvidia CP).
3. Vsync enabled + triple buffering (pre-rendered frames = 3, Nvidia CP).
4. Vsync enabled + triple buffering forced via D3D Overrider.

I expected to see differences in performance but didn't, except for screen tearing in 1 above. In fact, just having vsync enabled as in 2 seemed just as good as 3 and 4, and as good as 1 but without the screen tearing...
__________________
MP ATAG_EvangelusE. AMD A8-5600K quad core 3.6 GHz, Win 7 64-bit, 8 GB RAM, GTX 660 Ti 2 GB VRAM, FreeTrack, X52, Asus 23" monitor.
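When comparing configurations like the four above, average fps alone can hide stutter. If a benchmarking tool can export per-frame times, a short script can summarize them; this is a generic sketch of my own (the "1% low" metric is a common convention, not something from this thread):

```python
def summarize_frametimes(frame_ms):
    """Given per-frame render times in milliseconds, return a pair:
    (average fps, '1% low' fps computed from the slowest 1% of frames)."""
    ordered = sorted(frame_ms)
    avg_fps = 1000.0 * len(ordered) / sum(ordered)
    worst = ordered[int(len(ordered) * 0.99):] or ordered[-1:]  # slowest 1%
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

# 99 smooth frames at ~16.7 ms plus one 33.3 ms hitch: the average
# barely moves, but the 1% low drops to ~30 fps and exposes the stutter.
avg, low = summarize_frametimes([16.7] * 99 + [33.3])
```

Running each of the four vsync/TB configurations through the same pass and comparing both numbers shows differences that "felt the same" subjectively.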
#8
Quote:

I mention some tips on running a test here: http://forum.1cpublishing.eu/showpos...01&postcount=2

To make sure performance is not capped by your CPU alone, you can try running the same test in a small window on low settings. If this does not give you a big fps increase, then the CPU is the bottleneck.

PS. Is it necessary to change pre-rendered frames? IIRC I always have them on auto/default. Increasing them causes input lag; reducing them reduces performance. I would change it only if I had a very old card. But again, for nVidia it may be different.

Last edited by Ataros; 01-27-2012 at 07:54 PM.