I am new to iPhone programming and was wondering how to apply filters through Core Image. Also, which filter in the Core Image filter list produces a night-vision effect? I saw the list in the documentation but was not able to determine which filter looks like night vision.

Thanks in advance

Solution

You can convert your image to the YCbCr color space and use the Y (luma) channel in place of the green channel of the RGB image you display, the same way suggested here. The result is an image where the brighter a pixel is, the greener it appears, and vice versa. In real night-vision imagery the very brightest pixels wash out to white rather than pure green; you can approximate that by deriving the R and B values from a nonlinear function of Y. For example:

// Assuming 8-bit interleaved triples; Y is the first byte of each YCbCr pixel.
for (int i = 0; i < image.size; i += 3)
{
    uint8_t Y = YCbCrImage[i];
    // Log curve rescaled into 0..255 so bright pixels tend toward white
    uint8_t low = (uint8_t)(255.0 * log(Y + 1.0) / log(256.0));
    RGBImage[i]     = low;  // R
    RGBImage[i + 1] = Y;    // G carries the full luma
    RGBImage[i + 2] = low;  // B
}

Cheers
