Before I even get to the core issue here, I should point out that GPUImageAdaptiveThresholdFilter already converts its input to grayscale as a first step, so the -grayImage: code in the above is unnecessary and will only slow things down. You can remove all of that code and pass your input image directly to the adaptive threshold filter.
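For illustration, a minimal sketch of feeding the image straight into the filter might look like the following (inputImage is a placeholder for your source UIImage, and the exact output-capture method can vary between GPUImage versions):

```objc
#import "GPUImage.h"

// inputImage is your source UIImage (placeholder name).
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageAdaptiveThresholdFilter *thresholdFilter = [[GPUImageAdaptiveThresholdFilter alloc] init];

// No separate grayscale pass is needed; the filter does that conversion internally.
[stillImageSource addTarget:thresholdFilter];
[stillImageSource processImage];

UIImage *thresholdedImage = [thresholdFilter imageFromCurrentlyProcessedOutput];
```

This is just the standard still-image filtering pattern; check the version of GPUImage you're on for the current name of the image-capture method.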
What I believe is the problem here is a recent set of changes to the way GPUImagePicture pulls in image data. Images whose widths aren't a multiple of 8 pixels appear to end up looking like the above when imported. Some fixes for this have been proposed, but if the latest code from the repository (not CocoaPods, which is often out of date relative to the GitHub repository) still behaves this way, some more work may need to be done.
In the meantime, when you crop your images, do so to a width that is a multiple of 8 pixels and you should see the correct result.