I use FFmpegFrameRecorder to record video from a byte[] yuvFrame, but the recorded output seems delayed. This is what I do for every frame:
byte[] bytes = new byte[yuvFrame.length];
// Copy the Y plane, then interleave the chroma planes in NV21 order (V, U)
System.arraycopy(y, 0, bytes, 0, y.length);
for (int i = 0; i < u.length; i++) {
    bytes[y.length + (i * 2)] = nv[i];
    bytes[y.length + (i * 2) + 1] = nu[i];
}
// Wrap the NV21 data in a single-channel IplImage (height * 3/2 rows)
opencv_core.IplImage yuvImage1 = opencv_core.IplImage.create(width, height * 3 / 2, IPL_DEPTH_8U, 1);
yuvImage1.getByteBuffer().put(bytes);
// Convert NV21 -> BGR
opencv_core.IplImage bgrImage = opencv_core.IplImage.create(width, height, IPL_DEPTH_8U, 3);
cvCvtColor(yuvImage1, bgrImage, CV_YUV2BGR_NV21);
// Convert to a Bitmap, then to a Frame, and record it
Bitmap modifiedBitmap = IplImageToBitmap(bgrImage);
org.bytedeco.javacv.AndroidFrameConverter converter = new AndroidFrameConverter();
Frame modifiedFrame = converter.convert(modifiedBitmap);
recorder.record(modifiedFrame);
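
In case it helps to isolate the delay, below is a rough sketch of the same record step done without the per-frame Bitmap/AndroidFrameConverter round-trip: the BGR IplImage is handed to the recorder through OpenCVFrameConverter.ToIplImage, the images and converter are allocated once and reused, and each frame gets a wall-clock timestamp (the same pattern the JavaCV RecordActivity sample uses to keep recorded frames from falling behind real time). The Nv21Recorder class, recordNv21 method, and startTime field are names I made up for illustration, and the imports assume the 1.4-style org.bytedeco.javacpp packages that the snippet above appears to use.

import static org.bytedeco.javacpp.opencv_core.*;
import static org.bytedeco.javacpp.opencv_imgproc.*;

import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameRecorder;
import org.bytedeco.javacv.OpenCVFrameConverter;

// Hypothetical helper: reuses the images and the converter across frames and
// converts the BGR IplImage directly to a Frame, skipping the Bitmap step.
class Nv21Recorder {
    private final FFmpegFrameRecorder recorder;
    private final IplImage yuvImage;   // width x (height * 3/2), 1 channel, holds the NV21 data
    private final IplImage bgrImage;   // width x height, 3 channels
    private final OpenCVFrameConverter.ToIplImage toFrame = new OpenCVFrameConverter.ToIplImage();
    private final long startTime = System.currentTimeMillis(); // assumed wall-clock reference

    Nv21Recorder(FFmpegFrameRecorder recorder, int width, int height) {
        this.recorder = recorder;
        this.yuvImage = IplImage.create(width, height * 3 / 2, IPL_DEPTH_8U, 1);
        this.bgrImage = IplImage.create(width, height, IPL_DEPTH_8U, 3);
    }

    void recordNv21(byte[] nv21) throws FrameRecorder.Exception {
        // Copy the already-interleaved NV21 buffer into the preallocated YUV image
        yuvImage.getByteBuffer().put(nv21);

        // NV21 -> BGR, then wrap the IplImage as a Frame (no Bitmap round-trip)
        cvCvtColor(yuvImage, bgrImage, CV_YUV2BGR_NV21);
        Frame frame = toFrame.convert(bgrImage);

        // Keep the recorder's clock in step with wall-clock time; frames recorded
        // without timestamps can end up written "behind" real time, which shows
        // up as delayed-looking output.
        long t = 1000L * (System.currentTimeMillis() - startTime); // microseconds
        if (t > recorder.getTimestamp()) {
            recorder.setTimestamp(t);
        }
        recorder.record(frame);
    }
}

Usage would be one Nv21Recorder created after recorder.start(), then recordNv21(bytes) called once per camera frame with the NV21 byte array built as above.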