Software rendering delivers? Microsoft WARP
I assume a few people have heard of SwiftShader, which basically implements Direct3D 9 in software and lets people on crappy Intel GMA 950 based chipsets play more games. Pretty cool, except for the minor fact that real GPUs will provide far better 3D performance and should be used if possible. Now, Microsoft has apparently made their own software renderer called “WARP”, which outdoes SwiftShader in feature set (Direct3D 10.1, and possibly support for more SSE instructions) and probably in speed as well, since it supports more robust pixel shaders and such. Gaming performance is still poor except on the highest-end CPUs, though: even an Intel “Gulftown” Core i7 with six hyperthreaded cores will most likely only manage Crysis (or Metro 2033) at low settings and resolution.
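For the curious, opting into WARP is just a matter of the driver type you pass at device creation; no other application changes are needed. A minimal sketch using the Direct3D 11 API (which exposes WARP alongside the 10.1 path) might look like this — Windows-only, and you'd link against d3d11.lib:

```cpp
// Sketch: create a Direct3D 11 device on the WARP software rasterizer
// instead of the GPU. Windows-only; link against d3d11.lib.
#include <d3d11.h>
#include <cstdio>

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL level;

    // D3D_DRIVER_TYPE_WARP asks the runtime for its built-in software
    // rasterizer; D3D_DRIVER_TYPE_HARDWARE would use the real GPU.
    HRESULT hr = D3D11CreateDevice(
        nullptr,                  // default adapter (ignored for WARP)
        D3D_DRIVER_TYPE_WARP,
        nullptr,                  // no external software rasterizer DLL
        0,                        // no creation flags
        nullptr, 0,               // let the runtime pick a feature level
        D3D11_SDK_VERSION,
        &device, &level, &context);

    if (SUCCEEDED(hr)) {
        std::printf("WARP device created, feature level 0x%x\n", level);
        context->Release();
        device->Release();
    }
    return 0;
}
```

Everything downstream of device creation (shaders, draw calls) is unchanged, which is what makes the idea of switching renderers on the fly plausible at all.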
This is apparently designed to provide basic graphics capability when you have a crappy GPU, or a broken one, or whatnot. But the idea of shifting the burden from the GPU back to the CPU gives me an idea… you know the latest batch of high-end graphics cards from our friends at NVidia? They have absolutely atrocious power efficiency, and since only hardcore people tend to buy them, those systems usually have a fairly hardcore CPU to match. The thing is, even the highest-end CPUs consume nowhere near as much power under heavy load as those GPUs do. What I’m thinking is that we use WARP to let the GPU idle more often, saving power during low-to-moderate loads like video playback or older games. Obviously, we’d have to test this out, but it could be interesting, and it would offer a competitor/companion to stuff like NVidia’s Optimus technology for laptops.