
Comments (4)

wysaid commented on August 16, 2024

Sure, change the code and call this function:
https://github.com/wysaid/android-gpuimage-plus/blob/master/library/src/main/java/org/wysaid/nativePort/CGEFrameRecorder.java#L27

Or, you can extend the CameraRecordGLSurfaceView, and call your own record function, and pass the right bitrate.
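A minimal sketch of the second suggestion, assuming the base view exposes a protected CGEFrameRecorder field named mFrameRecorder and that CGEFrameRecorder provides the three-argument startRecording(fps, bitRate, filename) overload linked above (both assumptions taken from the linked source, not verified against every library version):

```java
import android.content.Context;
import android.util.AttributeSet;
import android.util.Log;

import org.wysaid.view.CameraRecordGLSurfaceView;

// Hypothetical subclass: reuses the recorder that the base class already
// created and initialized in its GL callbacks, instead of constructing a
// new CGEFrameRecorder by hand.
public class HighBitrateRecordView extends CameraRecordGLSurfaceView {

    public HighBitrateRecordView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Custom record entry point that forwards an explicit bitrate.
    public void startRecordingWithBitrate(final int fps, final int bitRate,
                                          final String filename) {
        // queueEvent runs the block on the GL thread, where the native
        // recorder lives; calling it from another thread is likely to fail.
        queueEvent(new Runnable() {
            @Override
            public void run() {
                if (mFrameRecorder == null ||
                        !mFrameRecorder.startRecording(fps, bitRate, filename)) {
                    Log.e(LOG_TAG, "start recording failed!");
                }
            }
        });
    }
}
```

The key difference from the code below is that no new CGEFrameRecorder is constructed; the one owned by the base view is reused.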

from android-gpuimage-plus.

DannyFilo commented on August 16, 2024

Hi, many thanks for your suggestion.

I implemented ExtendedCameraRecordGLSurfaceView. The code compiles, but I have one issue with the initialisation of private CGEFrameRecorder mFrameRecorder = new CGEFrameRecorder();. startRecording(fps, bitRate, recordFilename) always returns false, so I hit the Log.i(LOG_TAG, "start recording failed!") branch in the short code below and the camera records nothing.

I'd appreciate any suggestion on how to correctly initialise mFrameRecorder, as you're the master of this package :)

Full code of ExtendedCameraRecordGLSurfaceView.


import android.Manifest;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.content.Context;
import android.content.ContextWrapper;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.hardware.Camera;
import android.net.Uri;
import android.os.Build;
import android.util.AttributeSet;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;

import androidx.annotation.NonNull;

import androidx.annotation.Nullable;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import androidx.core.view.ViewCompat;

import org.wysaid.common.Common;
import org.wysaid.myUtils.FileUtil;
import org.wysaid.myUtils.ImageUtil;
import org.wysaid.nativePort.CGEFrameRecorder;
import org.wysaid.nativePort.CGENativeLibrary;
import org.wysaid.view.CameraRecordGLSurfaceView;

import io.flutter.plugin.common.BinaryMessenger;
import io.flutter.plugin.common.MethodCall;
import io.flutter.plugin.common.MethodChannel;
import io.flutter.plugin.common.PluginRegistry;
import io.flutter.plugin.platform.PlatformView;

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Objects;


import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;

public class ExtendedCameraRecordGLSurfaceView extends CameraRecordGLSurfaceView {

   private AudioRecordRunnable mAudioRecordRunnable;
   private Thread mAudioThread;
   private final Object mRecordStateLock = new Object();
   private boolean mShouldRecord = false;

   private CGEFrameRecorder mFrameRecorder = new CGEFrameRecorder();

   public ExtendedCameraRecordGLSurfaceView(Context context, AttributeSet attrs) {
       super(context, attrs);
   }

   public void startRecording(int fps, int bitRate, String recordFilename, MethodChannel.Result result) {
       
       Log.i(LOG_TAG, "I'm in startRecording: " + recordFilename);
       // Note: this Runnable is executed synchronously on the calling
       // thread; it is not queued to the GL thread via queueEvent().
       new Runnable() {
           @Override
           public void run() {
               Log.i(LOG_TAG, "I'm in run");
               if (mFrameRecorder == null) {
                   Log.i(LOG_TAG, "Error: startRecording after release!!");
                   result.success(false);
                   return;
               }

               if (!mFrameRecorder.startRecording(fps, bitRate, recordFilename)) {
                   Log.i(LOG_TAG, "start recording failed!");
                   result.success(false);
                   return;
               }
               Log.i(LOG_TAG, "glSurfaceView recording, file: " + recordFilename);
               synchronized (mRecordStateLock) {
                   mShouldRecord = true;
                   mAudioRecordRunnable = new AudioRecordRunnable(new StartRecordingCallback() {
                       @Override
                       public void startRecordingOver(boolean success) {
                           result.success(success);
                       }
                   });
                   if (mAudioRecordRunnable.audioRecord != null) {
                       mAudioThread = new Thread(mAudioRecordRunnable);
                       mAudioThread.start();
                   }
               }
           }
       }.run();
   }

   private interface StartRecordingCallback {
       void startRecordingOver(boolean success);
   }

   private class AudioRecordRunnable implements Runnable {
       int bufferSize;
       int bufferReadResult;
       public AudioRecord audioRecord;
       public volatile boolean isInitialized;
       private static final int sampleRate = 44100;
       ByteBuffer audioBufferRef;
       ShortBuffer audioBuffer;
       StartRecordingCallback recordingCallback;

       private AudioRecordRunnable(StartRecordingCallback callback) {
           recordingCallback = callback;
           try {
               bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               Log.i(LOG_TAG, "audio min buffer size: " + bufferSize);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
               audioBufferRef = ByteBuffer.allocateDirect(bufferSize * 2).order(ByteOrder.nativeOrder());
               audioBuffer = audioBufferRef.asShortBuffer();
           } catch (Exception e) {
               if (audioRecord != null) {
                   audioRecord.release();
                   audioRecord = null;
               }
           }

           if (audioRecord == null && recordingCallback != null) {
               recordingCallback.startRecordingOver(false);
               recordingCallback = null;
           }
       }

       public void run() {
           android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
           this.isInitialized = false;

           if (this.audioRecord == null) {
               recordingCallback.startRecordingOver(false);
               recordingCallback = null;
               return;
           }

           while (this.audioRecord.getState() == AudioRecord.STATE_UNINITIALIZED) {
               try {
                   Thread.sleep(100L);
               } catch (InterruptedException localInterruptedException) {
                   localInterruptedException.printStackTrace();
               }
           }
           this.isInitialized = true;

           try {
               this.audioRecord.startRecording();
           } catch (Exception e) {
               if (recordingCallback != null) {
                   recordingCallback.startRecordingOver(false);
                   recordingCallback = null;
               }
               return;
           }

           if (this.audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
               if (recordingCallback != null) {
                   recordingCallback.startRecordingOver(false);
                   recordingCallback = null;
               }
               return;
           }

           if (recordingCallback != null) {
               recordingCallback.startRecordingOver(true);
               recordingCallback = null;
           }

           while (true) {
               synchronized (mRecordStateLock) {
                   if (!mShouldRecord)
                       break;
               }

               audioBufferRef.position(0);
               bufferReadResult = this.audioRecord.read(audioBufferRef, bufferSize * 2);
               if (mShouldRecord && bufferReadResult > 0 && mFrameRecorder != null &&
                       mFrameRecorder.getTimestamp() > mFrameRecorder.getAudioStreamtime()) {
                   audioBuffer.position(0);
                   mFrameRecorder.recordAudioFrame(audioBuffer, bufferReadResult / 2);
               }
           }
           this.audioRecord.stop();
           this.audioRecord.release();
           Log.i(LOG_TAG, "Audio thread end!");
       }
   }
}


I call it in the following way from my project (mCameraView is an instance of ExtendedCameraRecordGLSurfaceView):

    mCameraView.startRecording(30, 1650000, recordFilename, new MethodChannel.Result() {
        @Override
        public void success(Object result) {
            Log.i(LOG_TAG, "Start recording OK");
        }

        @Override
        public void error(String errorCode, String errorMessage, Object errorDetails) {
            Log.i(LOG_TAG, "Start recording failed");
        }

        @Override
        public void notImplemented() {
            // todo
        }
    });
   


wysaid commented on August 16, 2024

Please provide a fork of this repo that reproduces the problem.


DannyFilo commented on August 16, 2024

Hi, I have already described the problem above.

As per your advice, I extended CameraRecordGLSurfaceView as ExtendedCameraRecordGLSurfaceView, and I am calling my own startRecording(int fps, int bitRate, String recordFilename, MethodChannel.Result result) method so that I can increase the bitrate and fps of camera recordings on the new private CGEFrameRecorder mFrameRecorder = new CGEFrameRecorder(); instance.

The only issue is that mFrameRecorder.startRecording(fps, bitRate, recordFilename) always returns false, probably because of incorrect initialisation of CGEFrameRecorder.

I guess initialising the instance as CGEFrameRecorder mFrameRecorder = new CGEFrameRecorder(); alone is not enough to get a working mFrameRecorder object.

So I'd appreciate a hint on how to obtain a correctly initialised CGEFrameRecorder object whose startRecording does not return false.

Many, many thanks

