Question

I'm using OpenCV 2.4.8 on Android via JNI.

I open the camera with VideoCapture and I want to record the video. I have each frame in a cv::Mat and it appears on the screen correctly.

But when I try to open the VideoWriter, it always returns false.

// Camera resolution is 640x480 and that is fine.
_camera_resolution = calc_optimal_camera_resolution(u.name, 640, 480);

// Store on the sdcard; I have the permission in AndroidManifest.xml.
const char * videoName = "/sdcard/videoTest.avi";

// Segmentation fault 11! Does this method not work on Android? So it is commented out.
//const int ex = static_cast<int>(_reader.get(CV_CAP_PROP_FOURCC));

// Segmentation fault 11 here too, so also commented out.
//const double fps = _reader.get(CV_CAP_PROP_FPS);

// Try to open the writer
_isRecording = _writer.open(videoName, -1, 30, _camera_resolution, true);

// isOpened() always returns false
if (!_writer.isOpened())
{
    LOGE("rec - Error opening video writer");
}
else
{
    LOGD("rec - Video writer opened in startRecording");
}

I have tried the following FOURCC codes:

CV_FOURCC('M', 'J', 'P', 'G') and CV_FOURCC('M', 'P', '4', 'V') // Neither works!

I have also tried different fps values: 15.0, 30.0, ...

The camera resolution seems fine, because the value is correct when I print it.

Why is the writer not opening correctly?

Solution

As far as I know, OpenCV4Android does not support video reading and writing. Try rebuilding your OpenCV with an encoder option (such as WITH_FFMPEG=YES or WITH_VFW=YES).
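
To confirm that the rebuilt library really picked up an encoder backend, you can dump the build information at runtime and look at the "Video I/O" section. Below is a minimal sketch using the OpenCV Java bindings (assuming your bindings expose Core.getBuildInformation(); the class name and log tag are arbitrary):

import org.opencv.core.Core;

import android.util.Log;

public final class OpenCvBuildCheck {
    // Logs the compile-time configuration of the loaded native OpenCV library.
    // After a rebuild with WITH_FFMPEG=YES (or WITH_VFW=YES), the "Video I/O"
    // section of this dump should list that backend as available.
    public static void logVideoIoSupport() {
        Log.d("OpenCvBuildCheck", Core.getBuildInformation());
    }
}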

Or save a sequence of images and then encode a video from that sequence in Java code. I tried the proposition below (ref):

// Note: imports are omitted as in the original sample; all classes come from JCodec
// (RgbToYuv420, Picture, ColorSpace, AWTUtil, MP4Muxer, Brand, TrackType,
// CompressedTrack, H264Encoder, H264Utils, MP4Packet, SeekableByteChannel, NIOUtils).
public void imageToMP4(BufferedImage bi) throws IOException {
    // A transform to convert RGB to YUV colorspace
    RgbToYuv420 transform = new RgbToYuv420(0, 0);

    // A JCodec native picture that will hold the source image in YUV colorspace
    Picture toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);

    // Perform the conversion
    transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);

    // Output channel for the MP4 file (the original snippet leaves 'sink' undefined;
    // the path here is just an example)
    SeekableByteChannel sink = NIOUtils.writableFileChannel(new File("/sdcard/videoTest.mp4"));

    // Create MP4 muxer
    MP4Muxer muxer = new MP4Muxer(sink, Brand.MP4);

    // Add a video track
    CompressedTrack outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);

    // Create an H.264 encoder (the original snippet referenced an undefined 'rc' rate
    // control; the no-arg constructor uses the default one)
    H264Encoder encoder = new H264Encoder();

    // Allocate a buffer that will hold an encoded frame
    ByteBuffer _out = ByteBuffer.allocate(bi.getWidth() * bi.getHeight() * 6);

    // Allocate storage for SPS/PPS; they need to be stored separately in a special place of the MP4 file
    List<ByteBuffer> spsList = new ArrayList<ByteBuffer>();
    List<ByteBuffer> ppsList = new ArrayList<ByteBuffer>();

    // Encode the image into an H.264 frame; the result is stored in the '_out' buffer
    ByteBuffer result = encoder.encodeFrame(_out, toEncode);

    // Based on the frame above, form a correct MP4 packet
    H264Utils.encodeMOVPacket(result, spsList, ppsList);

    // Add the packet to the video track
    outTrack.addFrame(new MP4Packet(result, 0, 25, 1, 0, true, null, 0, 0));

    // Push the saved SPS/PPS to a special storage in the MP4
    outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

    // Write the MP4 header and finalize the recording
    muxer.writeHeader();

    // Close the output channel
    NIOUtils.closeQuietly(sink);
}

You can download the JCodec library from the project web site or via Maven; for the latter, add the snippet below to your pom.xml:

<dependency>
    <groupId>org.jcodec</groupId>
    <artifactId>jcodec</artifactId>
    <version>0.1.3</version>
</dependency>

Android: Android users can use something like the code below to convert an Android Bitmap object to the JCodec native format:

public static Picture fromBitmap(Bitmap src) {
    Picture dst = Picture.create(src.getWidth(), src.getHeight(), ColorSpace.RGB);
    fromBitmap(src, dst);
    return dst;
}

public static void fromBitmap(Bitmap src, Picture dst) {
    int[] dstData = dst.getPlaneData(0);
    int[] packed = new int[src.getWidth() * src.getHeight()];

    src.getPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(), src.getHeight());

    for (int i = 0, srcOff = 0, dstOff = 0; i < src.getHeight(); i++) {
        for (int j = 0; j < src.getWidth(); j++, srcOff++, dstOff += 3) {
            int rgb = packed[srcOff];
            dstData[dstOff]     = (rgb >> 16) & 0xff;
            dstData[dstOff + 1] = (rgb >> 8) & 0xff;
            dstData[dstOff + 2] = rgb & 0xff;
        }
    }
}
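
On Android there is no BufferedImage, so the conversion helper above is what feeds the RgbToYuv420 / H264Encoder / MP4Muxer pipeline from the first snippet. The following is only a rough sketch of how the pieces fit together, not part of the original answer; the method name, output path and the fixed 25 fps are my own placeholders:

// Hypothetical glue code: encodes a list of Android Bitmaps into one MP4 using the
// JCodec classes from the snippets above (imports again omitted).
public void bitmapsToMP4(List<Bitmap> frames) throws IOException {
    SeekableByteChannel sink = NIOUtils.writableFileChannel(new File("/sdcard/videoTest.mp4"));
    MP4Muxer muxer = new MP4Muxer(sink, Brand.MP4);
    CompressedTrack outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);
    H264Encoder encoder = new H264Encoder();
    RgbToYuv420 transform = new RgbToYuv420(0, 0);

    List<ByteBuffer> spsList = new ArrayList<ByteBuffer>();
    List<ByteBuffer> ppsList = new ArrayList<ByteBuffer>();

    int frameNo = 0;
    for (Bitmap bmp : frames) {
        // Bitmap -> RGB Picture (helper above) -> YUV420 Picture for the encoder
        Picture rgb = fromBitmap(bmp);
        Picture yuv = Picture.create(bmp.getWidth(), bmp.getHeight(), ColorSpace.YUV420);
        transform.transform(rgb, yuv);

        // Encode and wrap as an MP4 packet; pts/duration assume 25 fps
        ByteBuffer buf = ByteBuffer.allocate(bmp.getWidth() * bmp.getHeight() * 6);
        ByteBuffer result = encoder.encodeFrame(buf, yuv);
        H264Utils.encodeMOVPacket(result, spsList, ppsList);
        outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
        frameNo++;
    }

    // SPS/PPS and the MP4 header are written once, after all frames
    outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));
    muxer.writeHeader();
    NIOUtils.closeQuietly(sink);
}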

Other tips

('M','J','P','G') is the only supported by android while using .avi ext. Most important of all is to #include stdio.h, without this you wont be able to open VideoWriter video

cv::VideoWriter writer;
writer.open("your_mp4_file_path",
            cv::VideoWriter::fourcc('H', '2', '6', '4'),
            15,                     // frame rate
            cv::Size(720, 1280),
            true);
writer << mat_frame;

// remember to call writer.release() when finished

Android OpenCV 4.5.2, built with FFmpeg + OpenH264, works well for me.
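
For reference, if you drive recording from Java instead of JNI, the same thing with the OpenCV 4.x Java bindings looks roughly like this (a sketch only; it assumes your OpenCV Android build ships the videoio module with an H.264-capable backend, and the Recorder class, path and resolution are placeholders):

import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.videoio.VideoWriter;

public class Recorder {
    private VideoWriter writer;

    // Path and frame size are examples; use your app's storage location and camera resolution.
    public boolean start(String path, int width, int height) {
        writer = new VideoWriter(path,
                VideoWriter.fourcc('H', '2', '6', '4'),
                15.0,                      // frame rate
                new Size(width, height),
                true);                     // colour frames
        return writer.isOpened();          // false usually means the backend/codec is missing
    }

    public void write(Mat frame) {
        writer.write(frame);
    }

    public void stop() {
        writer.release();                  // finalizes the file
    }
}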
