Transparency Anti-aliasing and Video Processing
NVIDIA's transparency anti-aliasing uses sub-pixel samples to anti-alias alpha textures. G70 supports it using both multi-sampling (extra depth samples) and super-sampling (extra texture samples), driven by the alpha information attached to the texture. The number of samples taken depends on the base anti-aliasing mode you're using at the time: if you've chosen 4x anti-aliasing in your game, the transparency AA algorithm takes 4 extra depth samples for the alpha texture. Here's how it looks, firstly without transparency AA, on an alpha-textured chain-link fence (everyone uses chain-link fences for this, so be warned!). All images are clickable for lossless PNG versions. Notice the significant texture aliasing visible on the links. Turn on super-sampling transparency AA and most of the aliasing artifacts disappear.
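To see why the extra samples help, here's a minimal sketch (not NVIDIA's actual hardware algorithm) of the difference between a plain alpha test and a 4x super-sampled one. The `alpha_at` texture lookup and the sample positions are illustrative assumptions: the wire edge sits at u = 0.5 inside the pixel.

```python
# Sketch of super-sampled transparency AA, for illustration only.
# alpha_at() and the sample grid are hypothetical, not the G70's real layout.

def alpha_at(u, v):
    """Hypothetical alpha lookup: opaque wire on the left half of the pixel."""
    return 1.0 if u < 0.5 else 0.0

ALPHA_REF = 0.5  # alpha-test threshold

def alpha_test(u, v):
    """Plain alpha test: one sample, binary pass/fail -> hard, aliased edge."""
    return 1.0 if alpha_at(u, v) >= ALPHA_REF else 0.0

# 4x super-sampling: alpha-test each sub-sample and average the results,
# so an edge crossing the pixel yields fractional coverage instead of 0 or 1.
SAMPLES_4X = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

def transparency_aa(u, v):
    passed = sum(alpha_test(u + du - 0.5, v + dv - 0.5)
                 for du, dv in SAMPLES_4X)
    return passed / len(SAMPLES_4X)

# A pixel centred exactly on the wire edge: the single test fails outright,
# while the super-sampled version reports the edge as half-covered.
print(alpha_test(0.5, 0.5))       # 0.0
print(transparency_aa(0.5, 0.5))  # 0.5
```

The averaged coverage is what smooths the links in the screenshots: edge pixels blend towards the background instead of flipping between fully opaque and fully transparent.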
The difference output of the two images shows you which parts of the image the super-sampling is working on. You'll need to click the image to see it properly; resizing the image loses the difference detail.
To measure the performance hit, I recorded a short demo in Half-Life 2 inside the prison section of Nova Prospekt, where there's more chain-link fencing (and hence more alpha texturing) than you know what to do with, then benchmarked it with transparency AA on and off.
The performance hit averages about 12 percent over the demo, in line with NVIDIA's claimed average of between 10 and 15 percent.
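The hit is just the relative drop in average frame rate with the feature toggled. The frame rates below are hypothetical, picked only to show the arithmetic behind a ~12 percent figure; the article reports the percentage, not the raw numbers.

```python
# Illustrative numbers only -- the review quotes the percentage, not the fps.
fps_aa_off = 100.0  # transparency AA disabled (assumed)
fps_aa_on = 88.0    # transparency AA enabled (assumed)

hit_pct = (fps_aa_off - fps_aa_on) / fps_aa_off * 100
print(f"Performance hit: {hit_pct:.0f}%")  # Performance hit: 12%
```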
Video Processing
Following ATI's announcement at Computex that unannounced hardware would accelerate the decode of H.264 video, a format more commonly known as MPEG-4 AVC, NVIDIA have been keen to say that they'll have H.264 support sometime in 2005 on all their hardware with working PureVideo silicon. Playing back 1080p content on the FX and 7800 GTX test platform shows around 45 to 55% CPU usage. The GPU's doing something, but nothing a 6600 GT can't do, for example. It appears that NVIDIA haven't spent much, if any, of G70's transistor budget on silicon used just to process video. H.264 support seems to be something that'll be accelerated by fragment programs on G70 and other NVIDIA hardware, rather than by the dedicated decode hardware ATI appear to possess. With 3D speed ever increasing, and even massively powerful CPUs like the 2800MHz FX becoming the limitation for new single boards, never mind SLI, improvements in image quality and video processing are what's needed next.
Video quality appears unchanged compared to NV43 and the other NVIDIA GPUs with a fixed video processor, which is slightly disappointing.
With H.264 about to become the dominant video format in common use, especially since it's the native format for Sony's PSP hand-held gaming console and the video format for both HD-DVD and Blu-ray, spending some time on GPU decode outside of the fragment hardware seems like a prudent thing to do. We'll see.
Look out for an article on H.264 and other PC video formats to be used in HD content delivery and high-bitrate, high-compression rate portable video in due course.