C#: Perform Operations on GPU, not CPU (Calculate Pi) [closed]
Question
I've recently read a lot about software (mostly scientific/math and encryption related) that moves part of its calculations onto the GPU, yielding a 100- to 1000-fold (!) speedup for supported operations.
Is there a library, API, or other way to run something on the GPU via C#? I'm thinking of a simple Pi calculation. I have a GeForce 8800 GTX, if that's relevant at all (I'd prefer a card-independent solution, though).
Solution
It's a very new technology, but you might investigate CUDA. Since your question is tagged C#, here is a .NET wrapper.
As a bonus, it appears that your 8800 GTX supports CUDA.
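To make concrete what you'd be offloading: a Monte Carlo Pi estimate is embarrassingly parallel, which is exactly the shape GPU wrappers like the one above accept. Here is a minimal CPU-side sketch using `Parallel.For` (the class and method names are my own, not from any GPU library); each parallel body would become a GPU thread under CUDA.

```csharp
using System;
using System.Threading.Tasks;

static class PiEstimator
{
    // Monte Carlo estimate of Pi: the fraction of random points that land
    // inside the unit quarter-circle approaches Pi/4. The per-thread loop
    // below is the data-parallel part a GPU kernel would take over.
    public static double Estimate(long samplesPerThread, int threads)
    {
        long[] hits = new long[threads];
        Parallel.For(0, threads, t =>
        {
            var rng = new Random(t);   // per-thread RNG with a fixed seed
            long inside = 0;
            for (long i = 0; i < samplesPerThread; i++)
            {
                double x = rng.NextDouble();
                double y = rng.NextDouble();
                if (x * x + y * y <= 1.0) inside++;
            }
            hits[t] = inside;          // one slot per thread: no locking
        });
        long total = 0;
        foreach (long h in hits) total += h;
        return 4.0 * total / (samplesPerThread * (double)threads);
    }
}
```

With 800,000 samples this typically lands within a couple of decimal places of Pi; accuracy improves with the square root of the sample count, which is why throwing GPU-scale parallelism at it helps.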
OTHER TIPS
You might want to look at this question.
You're probably looking for Accelerator, but if you're interested in game development in general, I'd suggest you take a look at XNA.
CUDA.NET should be exactly what you're looking for, and it seems to support your specific graphics card.
You can access the latest Direct3D APIs from .NET using the Windows API Code Pack. Direct3D 11 comes with Compute Shaders. These are roughly comparable to CUDA, but work also on non-NVIDIA GPUs.
Note that Managed DirectX and XNA are limited to the Direct3D 9 feature set, which is somewhat difficult to use for GPGPU.
There is a set of .NET bindings for Nvidia's CUDA API called CUDA.net. You can refer to the reference guide for some sample C# code.
The preferred way to access your co-processor (GPU) would be OpenCL, so that your code stays portable to ATI cards, but I believe that may require additional coding, and I'm not sure how much support OpenCL has on the .NET platform.
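The portability point is worth illustrating: OpenCL kernels are compiled from source at runtime, so a .NET host program just carries the kernel as a string and hands it to whichever OpenCL binding you choose (the host-side setup varies per binding and is omitted here). Below is a sketch of a kernel computing terms of the Leibniz series for Pi, with an equivalent CPU reference you could check GPU results against; the class and identifier names are illustrative, not from any particular library.

```csharp
using System;

static class LeibnizPi
{
    // OpenCL C kernel source held as a string; an OpenCL binding compiles it
    // at runtime, so the same code runs on NVIDIA and ATI hardware.
    // Each work-item computes one term of Pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
    public const string KernelSource = @"
__kernel void leibniz_terms(__global float* terms)
{
    int i = get_global_id(0);
    float sign = (i % 2 == 0) ? 1.0f : -1.0f;
    terms[i] = sign / (2.0f * i + 1.0f);
}";

    // CPU reference of the same computation, useful for validating the
    // summed GPU output. Error of the partial sum is roughly 1/terms.
    public static double Estimate(int terms)
    {
        double sum = 0.0;
        for (int i = 0; i < terms; i++)
            sum += ((i % 2 == 0) ? 1.0 : -1.0) / (2.0 * i + 1.0);
        return 4.0 * sum;
    }
}
```

On the GPU side you'd still need a reduction (summing the `terms` buffer), which most OpenCL tutorials cover as their second example.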
If you want to use C++, here's a quick overview on how to get some sample code compiling with Visual Studio.
FYI: Accelerator (http://research.microsoft.com/en-us/projects/Accelerator/) worked well in a couple of tests I ran.