Unless you have been living under a large rock and are a self-confessed technophobe, NVIDIA's launch of the impressive GeForce GTX 680 graphics card will most certainly have grabbed your attention.
Righting many of the wrongs present in the Fermi architecture and catching up with AMD on a bunch of multimedia features, GTX 680 is the fastest single-GPU graphics card this side of anywhere. It's very quiet to boot, and the well-heeled enthusiast should put it at the top of their shortlist for the next graphics-card upgrade.
But we don't recommend the GTX 680 without reservations. NVIDIA has made a number of sensible choices with respect to how the GTX 680's guts are constructed, though the decision to drop the memory bus width to 256 bits and opt for 2GB of super-fast GDDR5 is a potential cause for concern if you like to run with multiple screens - read lots of pixels - and game with copious amounts of image quality.
GTX 680's memory bandwidth is the same as the last-generation GTX 580's, give or take a little, and the 2GB framebuffer is, obviously, smaller than the 3GB offered by AMD on its price-comparable Radeon HD 7970 GPU. The question is, does the GTX 680 hold up when tasked to run three screens - and, remember, it can do so from a single card now - knowing that it carries both a framebuffer-capacity and a memory-bandwidth deficit compared to AMD's single-GPU champ, HD 7970?
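The bandwidth comparison follows directly from each card's bus width and effective memory data rate. A rough sketch, using nominal reference-board figures (partner cards may clock slightly differently):

```python
# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * effective
# data rate per pin (Gbps). Clock figures are nominal reference-board
# specs, assumed here for illustration.
cards = {
    "GTX 680 (256-bit, 6.0 Gbps GDDR5)": (256, 6.0),
    "GTX 580 (384-bit, 4.0 Gbps GDDR5)": (384, 4.0),
    "HD 7970 (384-bit, 5.5 Gbps GDDR5)": (384, 5.5),
}

for name, (bus_bits, gbps) in cards.items():
    bandwidth = bus_bits / 8 * gbps  # bytes per transfer * rate = GB/s
    print(f"{name}: {bandwidth:.1f} GB/s")
```

On these numbers the GTX 680's narrower bus is offset by faster memory against the GTX 580, but the HD 7970's wide bus and quick GDDR5 leave it comfortably ahead on paper.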
We put each high-end card into our test-rig and connected it up to three BenQ EW2430 full-HD 24in monitors, resulting in an overall 5,760x1,080 resolution. Both cards are easy to set up, and NVIDIA's procedure even shows you, via an on-screen graphic, that the preferred setup for three-screen gaming is achieved by using both DVI ports and the HDMI output.
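As a quick sanity check on the pixel load that span represents, a minimal sketch:

```python
# Three full-HD panels side by side: total rendered pixels per frame.
width, height, screens = 1920, 1080, 3
span_width = width * screens             # 5,760 pixels across
total_pixels = span_width * height       # pixels per frame
megapixels = total_pixels / 1_000_000
print(f"{span_width}x{height} = {total_pixels:,} pixels (~{megapixels:.2f} MP)")
```

That works out to roughly three times the load of a single 1080p screen, before any anti-aliasing multiplies the memory cost further.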
This article is not an evaluation of the minutiae of each company's multi-monitor technology, but the headline difference is worth noting. AMD's Eyefinity uses a Single Large Surface (SLS), which turns the three monitors into one rectangular slab of a screen, complete with full-width taskbar, whereas NVIDIA's approach simply extends the desktop across the screens. There's no gaming penalty when following NVIDIA's path, but folk who want genuine three-screen real estate for 2D productivity are best served by AMD's tech, we feel.
What we care about is how the two monster cards perform when hit with the deadly concoction of over 6MP of load allied to our high-end settings. The full system spec. can be found here.