People usually refer to Direct3D as DirectX, but I prefer to say Direct3D, because Direct3D is just one of the DirectX components. Only a few of those components are left today: DirectX Graphics (Direct3D), DirectX Audio (DirectMusic + DirectSound), and DirectInput. Yes, that's all, and I am talking about DirectX 9.
Why am I writing about this topic? Because there is an issue. Within our Vizky application, the one I am working on (it's my main job here at Vizrt Thailand), my task is to build a Direct3D renderer that renders the scene equivalently to our OpenGL renderer. If you look into the Direct3D pipeline and compare it with OpenGL's, you will see some differences.
The vital one is that Direct3D adds specular lighting after the texture operations are done, while OpenGL does it the other way around. So to make it work as it should, I have to write shader code, and that's not fun. And think about a machine that cannot run shaders. Bad luck! In that case I can do nothing but let Direct3D do what it does.
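To make the ordering difference concrete, here is a minimal sketch of the two pipelines, assuming a single modulating texture stage. The function names and sample values are my own illustration, not real API calls:

```python
def d3d_fixed_function(diffuse, specular, texel):
    """Direct3D fixed function: the texture stage blends the lit diffuse
    color first, then specular is added on top afterwards."""
    return diffuse * texel + specular

def opengl_single_color(diffuse, specular, texel):
    """OpenGL default (single-color mode): specular is summed into the
    vertex color before texturing, so the texture darkens it too."""
    return (diffuse + specular) * texel

# Same inputs, different results, because of where the add happens:
print(d3d_fixed_function(0.5, 0.25, 0.5))   # 0.5
print(opengl_single_color(0.5, 0.25, 0.5))  # 0.375
```

Note that OpenGL can optionally be switched to add specular after texturing (separate specular color mode), which is one reason the two renderers have to be matched so carefully.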
Now the mystery comes again, this time in the Direct3D pipeline, when I want automatic spherical texture coordinate generation combined with a texture translation via the texture matrix. The math is easy enough and the Direct3D setup is straightforward, but it does not work. Yesterday I worked on the problem for a whole day until I found a solution on www.gamedev.net; in the end it came down to the Direct3D documentation and how good my English skills are. Direct3D performs the vertex computation internally because it is hard-wired into the GPU, so there is not much I can do beyond calling the APIs that Direct3D provides. The problems arise when I don't understand the APIs well enough to use them correctly, or when there is garbage in the documentation. A shader could be a solution, but again, what about old machines?
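The classic gotcha here, as I understand it, is where the translation goes in the texture matrix: with 2D texture coordinates, the fixed-function pipeline effectively multiplies (u, v, 1, 0) by the matrix and keeps the first two results, so the translation must sit in the third row (_31/_32), not the fourth row (_41/_42) where it would go for positions. This little sketch simulates that behavior (the helper names are my own, and the implicit (u, v, 1, 0) extension is my reading of the docs, not verified against the API):

```python
def d3d_transform_2d(u, v, m):
    """Simulate the fixed-function 2D texture transform: extend the
    coordinate to (u, v, 1, 0), multiply by the 4x4 row-major matrix m,
    and keep the first two outputs."""
    vec = (u, v, 1.0, 0.0)  # third slot is 1, so row 3 acts as the translation row
    return tuple(sum(vec[i] * m[i][c] for i in range(4)) for c in range(2))

def translate_wrong(du, dv):
    """Translation in _41/_42, as for position vertices -- ignored here,
    because the fourth input component is 0."""
    return [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [du, dv, 0, 1]]

def translate_right(du, dv):
    """Translation in _31/_32 -- picked up via the implicit 1 in slot 3."""
    return [[1, 0, 0, 0], [0, 1, 0, 0], [du, dv, 1, 0], [0, 0, 0, 1]]

print(d3d_transform_2d(0.5, 0.5, translate_wrong(0.25, 0.125)))  # (0.5, 0.5) - no shift
print(d3d_transform_2d(0.5, 0.5, translate_right(0.25, 0.125)))  # (0.75, 0.625) - shifted
```

So the matrix that "obviously" should translate the coordinates does nothing, while the one with the offsets a row higher works, which is exactly the kind of thing that is easy to miss in the documentation.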
The next generation of 3D rendering on the GPU will let you do almost every 3D operation yourself, including vertex transformation. I am not saying that the current generation of GPUs doesn't allow that, but because of legacy issues, shipping only one solution would not be a good way to go. But once we move to the ideal next generation (get rid of the legacy issues, or no longer have to support old machines), then all I do is write everything in shaders and it will work, with nothing hidden in the GPU (well, some of it still will be). Whether it is Intel's Larrabee or an Nvidia or AMD (ATI) GPU, these architectures will make my life easier!