Question

My goal is to convert image pixels with a value of 255 to 0, i.e. remove all the pure white pixels. Here is my code in Python using OpenCV and PyOpenCL:

import os
import glob
import cv2 as cv
import numpy as np
import pyopencl as cl

def filter_image( ):

    platforms = cl.get_platforms()
    devices = platforms[0].get_devices( cl.device_type.ALL )
    context = cl.Context( [devices[0]] )
    cQ = cl.CommandQueue( context )
    kernel = """
        __kernel void filter( global uchar* a, global uchar* b ){
                int y = get_global_id(0);
                int x = get_global_id(1);

                int sizex = get_global_size(1);

                if( a[ y*sizex + x ] != 255 )
                        b[ y*sizex + x ] = a[ y*sizex + x ];
            }"""

    program = cl.Program( context, kernel ).build()

    for i in glob.glob("*.png"):

        image = cv.imread( i, 0 )        
        b = np.zeros_like( image, dtype = np.uint8 )
        rdBuf = cl.Buffer( 
                context,
                cl.mem_flags.READ_ONLY | cl.mem_flags.COPY_HOST_PTR,
                hostbuf = image
                          )

        wrtBuf = cl.Buffer( 
                context,
                cl.mem_flags.WRITE_ONLY,
                b.nbytes
                          )

        program.filter( cQ, image.shape, None, rdBuf, wrtBuf ).wait()
        cl.enqueue_copy( cQ, b, wrtBuf ).wait()
        cv.imshow( 'a', b )
        cv.waitKey( 0 )

def Filter( ):
    os.chdir('D:\image')
    filter_image( )
    cv.destroyAllWindows()

The problem I am facing is that, with the loop as written above, the logic works only for the first image: the white pixels are removed from the first image, but the subsequent images come out identical to their inputs, with no effect on pixels whose value is 255. This should be simple, yet I am unable to find a solution.

Kindly help me in resolving this problem.

Thank you.

Solution

In your kernel, you never write to image b when the corresponding pixel in image a is white, so that output pixel is left with whatever happened to be in the buffer. Change the body to something like the following:

b[y * sizex + x] = (a[y * sizex + x] == 255) ? 0 : a[y * sizex + x];

This sets the pixel in image b to zero when the pixel in image a is white, and copies it unchanged otherwise. This also explains why only the first image appeared to work: the initial contents of a `WRITE_ONLY` buffer are undefined, and while the first allocation may happen to be zero-filled, later allocations can reuse device memory still holding stale data, so the pixels your kernel skipped showed old values instead of zeros. Also consider doing this kind of manipulation in place, so that only one buffer is needed.
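As a quick sanity check of the fixed per-pixel logic (`b = (a == 255) ? 0 : a`), here is an equivalent NumPy expression you can run on a small hypothetical image without any OpenCL device:

```python
import numpy as np

# Hypothetical 2x2 grayscale image containing two pure-white pixels.
a = np.array([[10, 255],
              [255, 42]], dtype=np.uint8)

# Same ternary logic as the corrected kernel:
# white pixels (255) become 0, all others are copied unchanged.
b = np.where(a == 255, 0, a).astype(np.uint8)

print(b)
# → [[10  0]
#    [ 0 42]]
```

Comparing the kernel's output against an expression like this for each image is an easy way to confirm the loop now behaves correctly on every iteration, not just the first.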

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow