Question

How do I use OpenCL (for GPU compute) with Hadoop?

My data set resides in HDFS. I need to compute 5 metrics, 2 of which are compute-intensive. I want to compute those 2 metrics on the GPU using OpenCL, and the remaining 3 metrics with Java MapReduce code on Hadoop.

How can I pass data from HDFS to the GPU? In other words, how can my OpenCL code access data stored in HDFS?

How can I trigger OpenCL code from my Java MapReduce code?

It would be great if someone could share a sample code.

No accepted solution

Other suggestions

One option is JogAmp's JOCL, which lets you invoke OpenCL from Java and is essentially a wrapper over the native OpenCL libraries. The flow is: first read the data using the Java/Hadoop libraries, transfer it into CLBuffers (Java objects wrapping the buffers used to communicate with OpenCL), copy the buffers to the GPU, invoke the kernel, and copy the results back from the GPU into your buffers. Check the JOCL examples.
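The flow above can be sketched roughly as follows. This is only a sketch, not a drop-in solution: the class name, the file layout (a count followed by raw floats), and the `square` kernel are made up for illustration, and it assumes the Hadoop client and JOCL jars are on the classpath and that a working OpenCL driver is installed.

```java
import java.nio.FloatBuffer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import com.jogamp.opencl.CLBuffer;
import com.jogamp.opencl.CLCommandQueue;
import com.jogamp.opencl.CLContext;
import com.jogamp.opencl.CLKernel;
import com.jogamp.opencl.CLProgram;
import static com.jogamp.opencl.CLMemory.Mem.READ_ONLY;
import static com.jogamp.opencl.CLMemory.Mem.WRITE_ONLY;

public class HdfsToGpu {

    // Hypothetical kernel: squares each input value (a stand-in for a real metric).
    private static final String KERNEL_SRC =
        "kernel void square(global const float* in, global float* out, int n) {\n" +
        "    int i = get_global_id(0);\n" +
        "    if (i < n) out[i] = in[i] * in[i];\n" +
        "}";

    public static void main(String[] args) throws Exception {
        // 1. Read the data from HDFS using the Hadoop client API.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        float[] data;
        try (FSDataInputStream in = fs.open(new Path(args[0]))) {
            int n = in.readInt();                 // assumed layout: count, then floats
            data = new float[n];
            for (int i = 0; i < n; i++) data[i] = in.readFloat();
        }

        // 2. Hand the data to the GPU through JOCL.
        CLContext context = CLContext.create();
        try {
            CLCommandQueue queue = context.getMaxFlopsDevice().createCommandQueue();
            CLProgram program = context.createProgram(KERNEL_SRC).build();

            CLBuffer<FloatBuffer> inBuf  = context.createFloatBuffer(data.length, READ_ONLY);
            CLBuffer<FloatBuffer> outBuf = context.createFloatBuffer(data.length, WRITE_ONLY);
            inBuf.getBuffer().put(data).rewind();

            CLKernel kernel = program.createCLKernel("square")
                                     .putArgs(inBuf, outBuf).putArg(data.length);

            // Copy to the GPU, run the kernel, copy the results back (blocking read).
            queue.putWriteBuffer(inBuf, false)
                 .put1DRangeKernel(kernel, 0, data.length, 0)
                 .putReadBuffer(outBuf, true);

            float[] results = new float[data.length];
            outBuf.getBuffer().get(results);
            // ... feed `results` back into the rest of the MapReduce job
        } finally {
            context.release();
        }
    }
}
```

To call this from a mapper or reducer, you would run the same JOCL steps inside `map()`/`reduce()` (or in `setup()`, creating the context once per task) instead of a standalone `main`.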

Another alternative is the Aparapi library. There, the data-processing kernel is a plain Java method (with some restrictions), and the framework translates the Java bytecode to OpenCL at runtime, so the OpenCL part is hidden from the programmer. Of course, not everything can be translated from Java to OpenCL; check their examples to see what is supported.
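An Aparapi kernel looks like the sketch below: a plain Java `run()` method that the framework converts to OpenCL, falling back to a Java thread pool when it cannot. Assumptions: the Aparapi jar is on the classpath, and the package is `com.aparapi` (older AMD releases used `com.amd.aparapi`); the squaring computation again stands in for a real metric.

```java
import com.aparapi.Kernel;
import com.aparapi.Range;

public class AparapiSquare {
    public static void main(String[] args) {
        // In the real job this array would be filled from HDFS, as in the JOCL case.
        final float[] in  = {1f, 2f, 3f, 4f};
        final float[] out = new float[in.length];

        Kernel kernel = new Kernel() {
            @Override
            public void run() {
                // This method is translated from bytecode to an OpenCL kernel.
                int i = getGlobalId();
                out[i] = in[i] * in[i];
            }
        };

        // Launch one work-item per element; Aparapi handles buffer transfers.
        kernel.execute(Range.create(in.length));
        kernel.dispose();
    }
}
```

Note the restrictions: the kernel body may only use primitive types and arrays of primitives, no object allocation or most library calls, since those cannot be expressed in OpenCL.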

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow