What would happen if you created a software wrapper that allowed a system without a graphics card to render DirectX 10 visuals on a CPU?
The folks at Microsoft decided to find out and developed WARP10 (Windows Advanced Rasterisation Platform 10), a software component to be used in Windows 7.
WARP10, a software rasteriser, allows DirectX rendering to take place on the CPU, letting users take advantage of DirectX functionality when a GPU isn't present. The idea itself isn't anything new and, while WARP10 does achieve its goal, performance is severely limited.
GPUs have the distinct advantage of dedicated graphics architecture, and features such as texturing units aren't available on today's CPUs. Similarly, a CPU's available bandwidth is far lower than that of a high-end graphics card.
Nonetheless, Microsoft found that WARP10 was able to run DirectX applications such as Crysis - a demanding 3D game - without any GPU at all. The performance results, however, highlight the strain placed upon the CPU. At a low resolution of 800x600, a high-end 3GHz Intel Core i7 processor managed an average of only 7.36 frames per second - higher than Intel's integrated graphics, mind you, but still far too low to worry any dedicated graphics card.
Hardware | Avg FPS | Min FPS | Max FPS
---|---|---|---
Core i7 8 Core @ 3.0GHz | 7.36 | 3.46 | 15.01
Penryn 4 Core @ 3.0GHz | 5.69 | 2.49 | 10.95
Penryn 2 Core @ 3.0GHz | 3.48 | 1.35 | 6.61
Phenom 9550 4 Core @ 2.2GHz | 3.01 | 0.53 | 5.46
NVIDIA 8800 GTS | 84.80 | 60.78 | 130.83
NVIDIA 8400 GS | 33.89 | 21.22 | 51.82
ATI 3400 | 37.18 | 22.97 | 59.77
Intel DX10 Integrated | 5.17 | 1.74 | 16.22
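To put those figures in perspective, here's a quick back-of-the-envelope comparison using the average FPS numbers from the table. The figures themselves are Microsoft's; the ratios are just illustrative arithmetic in a short Python sketch:

```python
# Average FPS figures quoted in the table above (Crysis at 800x600).
avg_fps = {
    "Core i7 8 Core @ 3.0GHz (WARP10)": 7.36,
    "Intel DX10 Integrated": 5.17,
    "NVIDIA 8800 GTS": 84.80,
}

# How far ahead is a proper GPU? (illustrative ratios, not part of the report)
warp = avg_fps["Core i7 8 Core @ 3.0GHz (WARP10)"]
print(f"8800 GTS vs WARP10 on Core i7: {avg_fps['NVIDIA 8800 GTS'] / warp:.1f}x faster")
print(f"WARP10 on Core i7 vs Intel integrated: {warp / avg_fps['Intel DX10 Integrated']:.1f}x faster")
```

In other words, even a last-generation dedicated card is over an order of magnitude quicker than WARP10 on the fastest CPU tested.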
So, if its performance is so severely limited, what exactly is its purpose? Well, there are a few suggestions floating about. The first is that WARP10 will allow Microsoft to make its Windows 7 requirements a whole lot simpler, as a GPU may no longer be required in order to attach the "Windows 7 Capable" sticker.
There could be simpler uses, too. What would a user do if a dedicated GPU in a system were to fail? With WARP10, there's a fallback, and a user could continue to use the system without the GPU. There's a problem with this theory, though. WARP10 might take over graphics responsibilities without kicking up much of a fuss, but it'd still need a video output in order to display anything - and that output is typically provided by the integrated graphics or the dedicated card itself.
It seems as though there's no real purpose for WARP10, at least not yet. What Microsoft has done is demonstrate that DirectX visuals can be achieved without a GPU, albeit at a sluggish rate. It won't by any means have the likes of AMD and NVIDIA worried anytime soon, but looking forward there's something else that's technically similar to WARP10 - Intel's Larrabee.
Larrabee, the codename given to Intel's forthcoming GPU, takes a software-driven approach to rendering. If a CPU with 8 virtual cores can achieve DirectX 10 framerates of around 7fps, what might a many-core GPU with texture sampling units be able to throw out?
The line between GPUs and CPUs continues to blur, and WARP10 is an interesting development. If you're intrigued, you can read more about it in an in-depth guide available at the MSDN Library.