Just saw a review of a new kind of anti-aliasing method nVidia have released.
http://www.hardocp.com/article/2011/...ing_technology
To sum it up: you get an anti-aliasing effect equal to or even better than 'normal' 4X AA, at roughly half the performance cost.
Very simply put, this is achieved by applying the anti-aliasing as a post-processing pass over the whole finished frame, instead of applying it to the edges of the 3-D objects as they're rendered.
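To make the idea concrete, here's a toy sketch of a whole-frame post-process pass: find pixels with high local luminance contrast and blend them towards their neighbours. This is not nVidia's actual algorithm (their shader is far more sophisticated); the function name, threshold, and blend weight are all made up for illustration.

```python
import numpy as np

def postprocess_aa(image, threshold=0.1):
    """Toy whole-frame anti-aliasing pass (illustrative only).

    image: float array of shape (H, W, 3), values in [0, 1].
    Where local luminance contrast exceeds `threshold`, the pixel is
    blended with the average of its 4 neighbours, softening jaggies.
    """
    # Perceptual luminance of each pixel.
    luma = image @ np.array([0.299, 0.587, 0.114])

    # Pad so border pixels have 4 neighbours too.
    lp = np.pad(luma, 1, mode="edge")
    ip = np.pad(image, ((1, 1), (1, 1), (0, 0)), mode="edge")

    # Local contrast: max - min luminance over the cross-shaped neighbourhood.
    stack = np.stack([lp[1:-1, 1:-1], lp[:-2, 1:-1], lp[2:, 1:-1],
                      lp[1:-1, :-2], lp[1:-1, 2:]])
    contrast = stack.max(axis=0) - stack.min(axis=0)

    # Average colour of the 4 neighbours.
    neighbours = (ip[:-2, 1:-1] + ip[2:, 1:-1] +
                  ip[1:-1, :-2] + ip[1:-1, 2:]) / 4.0

    # Blend only where an edge was detected; flat areas pass through untouched.
    edge = (contrast > threshold)[..., None]
    return np.where(edge, 0.5 * image + 0.5 * neighbours, image)
```

Because the pass only sees the final pixels, it doesn't care how the frame was produced, which is why it's so much cheaper than sampling every polygon edge multiple times.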
AMD/ATI have developed something similar, but it's more of a frame-rate killer and only works on AMD/ATI video cards.
This nVidia version works with both nVidia and AMD/ATI video cards, but to be most effective the game has to be coded to support it. I think, though, all that means is the developers have to make sure the anti-aliasing is applied only to the 3-D part of the game, so it doesn't blur 2-D elements like cross-hairs or HUD text that are drawn on top of the main image. It's probably even possible to force the anti-aliasing on games that haven't been coded for it (at the cost of some blurry 2-D elements), but that's yet to be seen.
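As a rough sketch of what "coding for it" might amount to: run the smoothing pass over the whole frame, then composite the original pixels back wherever the 2-D overlay lives, so cross-hairs and HUD text stay sharp. The box blur standing in for the real AA pass, and every name here, are invented for illustration.

```python
import numpy as np

def toy_aa_pass(frame):
    """Stand-in for a real post-process AA pass: a 3x3 box blur."""
    fp = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode="edge")
    acc = np.zeros_like(frame)
    for dy in range(3):
        for dx in range(3):
            acc += fp[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    return acc / 9.0

def finish_frame(frame, hud_mask):
    """Smooth the whole frame, then restore the original pixels
    wherever the 2-D overlay mask is set (cross-hairs, HUD text)."""
    smoothed = toy_aa_pass(frame)
    return np.where(hud_mask[..., None], frame, smoothed)
```

In a real engine this would happen on the GPU, but the ordering is the point: the 3-D scene gets filtered, the 2-D overlay doesn't.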
Anyway, just thought I'd mention it. Looks like it might be a better way of getting rid of the jaggies in Cliffs Of Dover than whatever it is they're using (or not using) at the moment. Something to add to the list, perhaps.