Cg vs. glsl vs. ARB for GPGPU
Posted: Mon Jun 30, 2008 12:21 am
Hey!
I've been hacking around with GPGPU just for fun, starting by trying to speed up a naive Bayes classifier.
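For context, the per-example computation a shader would evaluate is just a per-class sum of log-probabilities, which maps naturally onto a fragment program. Here is a minimal CPU-side sketch (the class/feature counts and random likelihoods are made-up toy values, not from any real dataset):

```python
import numpy as np

# Hypothetical toy model: 3 classes, 4 binary features.
rng = np.random.default_rng(0)
log_prior = np.log(np.array([0.5, 0.3, 0.2]))           # log P(c), shape (3,)
log_like = np.log(rng.uniform(0.1, 0.9, size=(3, 4)))   # log P(f|c), shape (3, 4)

x = np.array([1, 0, 1, 1])  # one observed binary feature vector

# Per-class score: log P(c) + sum of log P(f|c) over features present in x.
# On the GPU, the log-likelihood table would live in a texture and each
# fragment would accumulate this sum for one (example, class) pair.
scores = log_prior + (log_like * x).sum(axis=1)
predicted = int(np.argmax(scores))
```

The point is that the inner loop is a branch-free multiply-accumulate, which is exactly the kind of kernel that even the non-branching ARB fragment programs can express.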
It looks like I have two portable options: NVIDIA's Cg and GLSL (originally proposed by 3Dlabs). The ARB assembly programs (ARB_fragment_program) are portable too, but I'm not sure about them because they lack branching.
I'm not thinking about optimizations yet (I don't even know what's possible); if I go the Cg/GLSL route, I assume the compiler would handle them for me.
Does anyone have experience with either (or both)? Which one looks more promising with respect to automatic code generation from an AST of operations? Or is good old ARB assembly the best choice for automatic code gen?
Thanks in advance
--
prashant