google / grafika
Grafika test app
License: Apache License 2.0
/**
 * Computes the data buffer offset for the next place to store data.
 * <p>
 * Equal to the start of the previous packet's data plus the previous packet's length.
 */
private int getHeadStart() {
    if (mMetaHead == mMetaTail) {
        // list is empty
        return 0;
    }
    final int dataLen = mDataBuffer.length;
    final int metaLen = mPacketStart.length;
    int beforeHead = (mMetaHead + metaLen - 1) % metaLen;
    return (mPacketStart[beforeHead] + mPacketLength[beforeHead] + 1) % dataLen;
}
Why the "+1" in the return statement? Is this intentional?
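To make the question concrete, here is a minimal, runnable sketch (with hypothetical names, not grafika's code) of the two variants of the offset arithmetic. Without the "+1", the next packet starts immediately after the previous one; with it, every packet is followed by a one-byte gap. Whether that padding byte is deliberate is exactly what is being asked.

```java
// Hypothetical illustration of the circular-buffer arithmetic in question.
public class HeadStartSketch {
    // Next write offset = previous start + previous length, modulo buffer size.
    static int nextHeadStart(int prevStart, int prevLength, int dataLen) {
        return (prevStart + prevLength) % dataLen;        // packets packed tightly
    }

    // The "+1" variant from the snippet above: leaves a one-byte gap per packet.
    static int nextHeadStartWithGap(int prevStart, int prevLength, int dataLen) {
        return (prevStart + prevLength + 1) % dataLen;
    }

    public static void main(String[] args) {
        // A 100-byte buffer; the previous packet occupied bytes 90..97 (length 8).
        System.out.println(nextHeadStart(90, 8, 100));         // 98
        System.out.println(nextHeadStartWithGap(90, 8, 100));  // 99
    }
}
```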
I want to record a 640x640 video. I tried to set the video size to 640x640 in CircularEncoder and ContinuousCapture, but the video comes out stretched. What do I need to change to achieve that? The video is also rotated; how can I handle the rotation?
Thanks,
Amlan.
Please excuse my poor English.
I tried full-screen recording from CameraCaptureActivity, but I got poor-quality output.
How can I get better-quality output?
http://stackoverflow.com/questions/36804487/implement-grafika-for-gpuimage-camera-live-filter
@fadden please help with this asap.
Hello again,
This isn't really a bug per se; I just wanted to ask why the preview in the show + capture activity seems deformed on a Nexus 7. I'm asking here so that others can see this as well.
It's like the preview has been stretched slightly, from a square to a rectangle.
Is this caused by the openCamera(1280, 720);
line in the onResume part of the activity?
Is there a way to query the device/screen's optimal resolution to avoid such distortions?
Thank you :)
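One common approach (a sketch, not grafika's actual code) is to pick, from the camera's supported preview sizes, the one whose aspect ratio is closest to the display surface's. The selection logic itself is plain Java; the `int[]{w, h}` pairs below stand in for the `Camera.Size` entries returned by `getSupportedPreviewSizes()`, and the method name `pickClosest` is made up for this example.

```java
import java.util.Arrays;
import java.util.List;

// Sketch of aspect-ratio-based preview-size selection.
public class PreviewSizeSketch {
    // Returns the supported {width, height} whose aspect ratio best matches
    // the surface's, minimizing visible stretching.
    static int[] pickClosest(List<int[]> supported, int surfaceW, int surfaceH) {
        double target = (double) surfaceW / surfaceH;
        int[] best = null;
        double bestDiff = Double.MAX_VALUE;
        for (int[] size : supported) {
            double diff = Math.abs((double) size[0] / size[1] - target);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = size;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<int[]> sizes = Arrays.asList(
                new int[]{1280, 720}, new int[]{800, 600}, new int[]{640, 480});
        // A 4:3 surface gets a 4:3 preview size instead of a stretched 16:9 one.
        System.out.println(Arrays.toString(pickClosest(sizes, 1024, 768)));
    }
}
```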
Cool project, first of all. :) I find it very useful.
However, whenever I build and launch it on a Nexus 4 with 4.4.2 or a Moto X with 4.3, it crashes with the following.
Is this a bug, or am I doing something wrong?
03-28 14:04:44.331: E/WindowManager(1241): android.view.WindowLeaked: Activity com.android.grafika.MainActivity has leaked window com.android.internal.policy.impl.PhoneWindow$DecorView{52837850 V.E..... R....... 0,0-729,276} that was originally added here
03-28 14:04:44.331: E/WindowManager(1241): at android.view.ViewRootImpl.<init>(ViewRootImpl.java:348)
03-28 14:04:44.331: E/WindowManager(1241): at android.view.WindowManagerGlobal.addView(WindowManagerGlobal.java:248)
03-28 14:04:44.331: E/WindowManager(1241): at android.view.WindowManagerImpl.addView(WindowManagerImpl.java:69)
03-28 14:04:44.331: E/WindowManager(1241): at android.app.Dialog.show(Dialog.java:286)
03-28 14:04:44.331: E/WindowManager(1241): at android.app.AlertDialog$Builder.show(AlertDialog.java:951)
03-28 14:04:44.331: E/WindowManager(1241): at com.android.grafika.ContentManager.prepareContent(ContentManager.java:126)
03-28 14:04:44.331: E/WindowManager(1241): at com.android.grafika.ContentManager.createAll(ContentManager.java:113)
03-28 14:04:44.331: E/WindowManager(1241): at com.android.grafika.MainActivity.onCreate(MainActivity.java:121)
03-28 14:04:44.331: E/WindowManager(1241): at android.app.Activity.performCreate(Activity.java:5231)
03-28 14:04:44.331: E/WindowManager(1241): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1087)
03-28 14:04:44.331: E/WindowManager(1241): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2159)
03-28 14:04:44.331: E/WindowManager(1241): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2245)
03-28 14:04:44.331: E/WindowManager(1241): at android.app.ActivityThread.access$800(ActivityThread.java:135)
03-28 14:04:44.331: E/WindowManager(1241): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1196)
03-28 14:04:44.331: E/WindowManager(1241): at android.os.Handler.dispatchMessage(Handler.java:102)
03-28 14:04:44.331: E/WindowManager(1241): at android.os.Looper.loop(Looper.java:136)
03-28 14:04:44.331: E/WindowManager(1241): at android.app.ActivityThread.main(ActivityThread.java:5017)
03-28 14:04:44.331: E/WindowManager(1241): at java.lang.reflect.Method.invokeNative(Native Method)
03-28 14:04:44.331: E/WindowManager(1241): at java.lang.reflect.Method.invoke(Method.java:515)
03-28 14:04:44.331: E/WindowManager(1241): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:779)
03-28 14:04:44.331: E/WindowManager(1241): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:595)
03-28 14:04:44.331: E/WindowManager(1241): at dalvik.system.NativeStart.main(Native Method)
Hello Fadden,
I asked a question about real-time-blur on Stack Overflow and was delighted when you responded with this Grafika tool.
My only issue is that the blur is not as strong as I'd like it to be, and I've been playing around with this:
case CameraCaptureActivity.FILTER_BLUR:
    programType = Texture2dProgram.ProgramType.TEXTURE_EXT_FILT;
    kernel = new float[] {
            1f/16f, 2f/16f, 1f/16f,
            2f/16f, 4f/16f, 2f/16f,
            1f/16f, 2f/16f, 1f/16f };
    break;
but I can't quite figure out what to change in that 3x3 blur kernel (or whatever those fractions represent) to make the blur stronger.
Any idea(s)?
Thanks again!
Steve
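For reference, the fractions above are a 3x3 convolution kernel whose weights sum to 1 (a small Gaussian: 1-2-1 weighting). Making the weights more uniform, in the limit a box blur with every weight 1/9, spreads each pixel's influence further and looks blurrier; applying the filter for several passes also strengthens it. Below is a sketch with a hypothetical `normalize()` helper that rescales any weights so they sum to 1, which keeps overall image brightness unchanged.

```java
// Sketch: building a stronger (flatter) 3x3 blur kernel for the filter above.
public class KernelSketch {
    // Rescale weights so they sum to 1 (preserves brightness).
    static float[] normalize(float[] weights) {
        float sum = 0f;
        for (float w : weights) sum += w;
        float[] out = new float[weights.length];
        for (int i = 0; i < weights.length; i++) {
            out[i] = weights[i] / sum;
        }
        return out;
    }

    public static void main(String[] args) {
        // A flatter blur than the 1-2-1 Gaussian: a box blur, all weights equal.
        float[] box = normalize(new float[]{1, 1, 1, 1, 1, 1, 1, 1, 1});
        System.out.println(box[0]);  // each weight becomes 1/9
    }
}
```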
Hi, I am using the double-decode example to load videos into a TextureView.
The code works most of the time, but for some mp4 files it crashes, giving only:
libc: Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 7998
The same code works for WebM files and most mp4 files. The place where I expect it to fail is:
extractor = new MediaExtractor();
extractor.setDataSource(sourceFile);
in MoviePlayer [L:113]
Any hint on how to work around this, or what the problem might be?
I tried VideoEncoderCore.java to do hardware encoding on an SM-N9100 (Galaxy Note 4), but I got the following error:
09-10 17:28:39.809: D/VideoEncoderCore(15144): format: {height=480, width=640, bitrate=500000, mime=video/avc, frame-rate=20, i-frame-interval=1, color-format=2130708361}
09-10 17:28:39.809: I/ACodec(15144): [] Now uninitialized
09-10 17:28:39.809: I/OMXClient(15144): Using client-side OMX mux.
09-10 17:28:39.929: I/ACodec(15144): [OMX.qcom.video.encoder.avc] Now Loaded
09-10 17:28:39.929: E/ACodec(15144): onConfigureComponent mime.c_str() = video/avc
09-10 17:28:39.929: E/ACodec(15144): [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -2147483648
09-10 17:28:39.929: W/ACodec(15144): do not know color format 0x7fa30c04 = 2141391876
09-10 17:28:39.929: W/ACodec(15144): do not know color format 0x7f000789 = 2130708361
09-10 17:28:39.939: I/ACodec(15144): [OMX.qcom.video.encoder.avc] setupVideoEncoder succeeded
09-10 17:28:39.939: W/ACodec(15144): do not know color format 0x7f000789 = 2130708361
09-10 17:28:39.939: I/ACodec(15144): [OMX.qcom.video.encoder.avc] Now Loaded->Idle
09-10 17:28:39.949: I/ACodec(15144): [OMX.qcom.video.encoder.avc] Now Idle->Executing
09-10 17:28:39.949: I/ACodec(15144): [OMX.qcom.video.encoder.avc] Now Executing
09-10 17:28:39.959: D/EglCore(15144): EGLContext created, client version 3
09-10 17:28:39.979: D/Texture2dProgram(15144): Created program 6 (TEXTURE_EXT)
09-10 17:28:39.979: D/VideoEncoderCore(15144): drainEncoder(false)
09-10 17:28:39.999: D/VideoEncoderCore(15144): drainEncoder(false)
09-10 17:28:40.009: D/VideoEncoderCore(15144): drainEncoder(false)
09-10 17:28:40.029: D/VideoEncoderCore(15144): drainEncoder(false)
09-10 17:28:40.029: D/VideoEncoderCore(15144): encoder output format changed: {height=480, width=640, csd-1=java.nio.ByteArrayBuffer[position=0,limit=8,capacity=8], mime=video/avc, what=1869968451, csd-0=java.nio.ByteArrayBuffer[position=0,limit=17,capacity=17]}
I want to play video in slow and fast motion.
I added a video from the SD card, but there is no audio in it.
Now I want to play the video with audio.
How can I do that?
Some sort of latency exerciser would be handy. Examples include "pointer trails" to see how far behind the touch input we get, and a big sliding box that tries to move with your finger. For the latter I'm thinking of something like Google Maps but with just a simple texture instead of network tiles.
The goal is to have a test app that shows the benefits of DispSync and eglPresentationTimeANDROID.
Background: http://stackoverflow.com/questions/26317132/minimize-android-glsurfaceview-lag/
Hi All,
I am trying to play a recorded video in my app; please check the steps below:
/storage/emulated/0/Android/data/com.test/files/video-1.mp4
--- The preview fragment appears and shows the recorded video; a piece of the fragment code is below:
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int i, int i1) {
    SpeedControlCallback callback = new SpeedControlCallback();
    Surface surface = new Surface(surfaceTexture);
    MoviePlayer player = null;
    try {
        player = new MoviePlayer(
                new File(getContext().getExternalFilesDir(null), mFilepathToShare),
                surface, callback); // line 81
    } catch (IOException ioe) {
        surface.release();
        return;
    }
    adjustAspectRatio(player.getVideoWidth(), player.getVideoHeight());
    mPlayTask = new MoviePlayer.PlayTask(player, this);
    mPlayTask.setLoopMode(true);
    mPlayTask.execute();
}
Everything is OK; the video plays successfully in an infinite loop.
I press the 'back' button to leave the preview fragment (everything is still OK).
--- After that I try to repeat step 1: record another video (press the button to start recording, and press it again a few seconds later).
The path of the new video is
/storage/emulated/0/Android/data/com.test/files/video-2.mp4
Expected result: the preview fragment appears and shows the second video (as it did in step 2).
Actual result: RuntimeException.
The second time, I hit the RuntimeException on the following line:
player = new MoviePlayer(new File(getActivity().getExternalFilesDir(null), mFilepathToShare), surface, callback);
Log message below:
java.lang.RuntimeException: No video track found
in /storage/emulated/0/Android/data/com.test/files/video-2.mp4
at com.test.grafika.MoviePlayer.<init>(MoviePlayer.java:117)
at com.test.preview.ui.PreviewFragment.onSurfaceTextureAvailable(PreviewFragment.java:81)
The first time, extractor.getTrackCount() returned 1, but the second time it returns 0.
Both files are OK and play successfully in the default video app.
Device: Samsung Galaxy S5.
OS version: Android 5.0.
The MoviePlayer class works fine for a video loaded from the SD card, but it fails with the following error when I try to load a video from a URL. One of the URLs I tried is:
http://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4
I am running the code on Android M.
I/MediaHTTPConnection: proxyName: 0.0.0.0 0
13110-14961/com.android.grafika E/NuCachedSource2: source returned error -1, 0 retries left
13110-13110/com.android.grafika E/QComExtractorFactory: Sniff FAIL :: coundn't pull enough data for sniffing
13110-13110/com.android.grafika E/Grafika: Unable to play movie
java.io.IOException: Failed to instantiate extractor.
at android.media.MediaExtractor.nativeSetDataSource(Native Method)
at android.media.MediaExtractor.setDataSource(MediaExtractor.java:182)
at com.android.grafika.MoviePlayer.<init>(MoviePlayer.java:113)
at com.android.grafika.PlayMovieActivity.clickPlayStop(PlayMovieActivity.java:189)
at java.lang.reflect.Method.invoke(Native Method)
at android.view.View$DeclaredOnClickListener.onClick(View.java:4450)
at android.view.View.performClick(View.java:5201)
at android.view.View$PerformClick.run(View.java:21163)
at android.os.Handler.handleCallback(Handler.java:746)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:148)
at android.app.ActivityThread.main(ActivityThread.java:5443)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:728)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:618)
Some excellent research by Petros Douvantzis determined that there is a race in the texture handling in "show + capture camera". The source of the problem is the use of a shared EGL context -- the camera frames are converted to an "external" texture in one context, but rendered from another.
This discussion on the WebGL mailing list discusses the matter. Appendix C in the OpenGL ES spec, and section 2.4 in the EGL 1.5 spec, explain the expected behavior.
For correct behavior, the application code must ensure mutually exclusive access during the texture update, and needs to issue GL commands that effectively provide memory barriers. On the producer side, updateTexImage() must be followed by glFinish(), with the two wrapped in a synchronized block or the write op of a read/write lock. On the consumer side, the texture must be re-bound before being drawn to effect the memory barrier, and the rendering operation must be synchronized or within the read op of a read/write lock.
These operations will potentially stall the involved threads, reducing throughput. It's better, and simpler, to use a single EGLContext, and call updateTexImage() from the thread that does the rendering. It's possible to use a single context with GLSurfaceView by attaching the SurfaceTexture to the GLSurfaceView's context with the appropriate API calls. The "show + capture camera" demo should be updated to use this approach, and avoid shared contexts altogether.
The other Activities in Grafika use SurfaceView and a single context, and are not affected by this issue.
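The read/write-lock discipline described above can be sketched in plain Java. In this sketch the GL calls are replaced by counters so the lock structure itself is runnable; in real code the producer would call updateTexImage() and then glFinish() under the write lock, and the consumer would re-bind the texture and issue its draw calls under the read lock. Class and method names here are made up for illustration.

```java
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

// Sketch of the producer/consumer locking pattern for shared-context
// texture updates. Counters stand in for the GL work.
public class TextureLockSketch {
    private final ReadWriteLock lock = new ReentrantReadWriteLock();
    int updates = 0;
    int draws = 0;

    // Producer: latch a new camera frame into the texture.
    void produceFrame() {
        lock.writeLock().lock();
        try {
            // updateTexImage();  // latch the new frame
            // glFinish();        // make the update visible to the other context
            updates++;
        } finally {
            lock.writeLock().unlock();
        }
    }

    // Consumer: render from the texture.
    void drawFrame() {
        lock.readLock().lock();
        try {
            // glBindTexture(GL_TEXTURE_EXTERNAL_OES, texId);  // re-bind: memory barrier
            // ... issue draw calls ...
            draws++;
        } finally {
            lock.readLock().unlock();
        }
    }
}
```

Multiple consumers may hold the read lock concurrently, but the producer's write lock excludes them all during the update, which is the mutual exclusion the note calls for.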
Hi,
I tried to find an answer and looked through the Android code on grepcode, so just to be 100% sure I will ask here. Is it possible to adapt your DoubleDecodeActivity approach to have two Activities and share video between them, i.e. call recreateView() from another Activity? I suppose not. I've tried it: your approach works perfectly within the same Activity with rotation etc., but it doesn't work when I try to attach a TextureView from another Activity. I get:
09-01 20:24:57.703 21115-21151/simpleApp E/GLConsumer: [unnamed-21115-0] updateAndRelease: GLConsumer is not attached to an OpenGL ES context
I saw that MediaPlayer allows attaching some other TextureView, but I didn't go through the code to check whether it's possible with multiple Activities.
Could you be so kind as to give a hint on how this can be achieved (preserving the video stream across Activities)?
Thanks in advance.
I am not able to play sound in the video. I have used DoubleDecodeActivity to show videos in a TextureView.
Is there a solution for this?
I want to play four copies of the same video at different angles.
Is it possible to get a frame from a VideoSurfaceView with grafika?
I am unable to apply the emboss, sharpen, blur, and edge-detection filters to the full preview.
If anyone has an idea about this, please let me know.
With the Camera2 API being the way to go in the future, are there any plans to update this repo to use it? I would really look forward to using the ContinuousCapture feature in the future as well.
Hello :)
Just a small suggestion.
In Texture2dProgram, in the draw method, before it starts drawing it should do
GLES20.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
to clear the buffer and not leave residue from previous draws.
When I launch Grafika, a dialog appears saying "Failed to generate content. Some features may be unavailable. Can't use input surface with software codec: OMX.google.h264.encoder".
Then I click the OK button and enter the continuous capture activity, and the app crashes.
My device information is as follows:
Model: Lenovo A560
Android version: 4.3
I am sure this is the key log:
06-29 21:27:50.267 E/OMXMaster(17347): A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
06-29 21:27:50.267 I/SoftAVCEncoder(17347): Construct SoftAVCEncoder
06-29 21:27:50.267 D/ResourceManager(17347): findUseCaseAndSetParameter - mime=video/avc,componentName=OMX.google.h264.encoder,isDecoder=0
06-29 21:27:50.267 D/ResourceManager(17347): findUseCaseAndSetParameter-useCase =,useCaseFlag = 0, codecFlags = 0
06-29 21:27:50.267 D/ResourceManager(17347): mime = video/avc, componentName = OMX.google.h264.encoder, isDecoder = 0
06-29 21:27:50.267 D/ResourceManager(17347): software video useCase =
06-29 21:27:50.267 I/ACodec (17347): setupVideoEncoder succeeded
06-29 21:27:50.267 E/OMXNodeInstance(17347): createInputSurface requires AndroidOpaque color format
06-29 21:27:50.267 E/ACodec (17347): [OMX.google.h264.encoder] onCreateInputSurface returning error -38
06-29 21:27:50.267 W/MediaCodec(17347): createInputSurface failed, err=-38
06-29 21:27:50.277 D/KeyguardUpdateMonitor( 1249): sendKeyguardVisibilityChanged(true)
06-29 21:27:50.277 D/KeyguardUpdateMonitor( 1249): handleKeyguardVisibilityChanged(1)
06-29 21:27:50.277 D/AndroidRuntime(17347): Shutting down VM
06-29 21:27:50.277 W/dalvikvm(17347): threadid=1: thread exiting with uncaught exception (group=0x415ac8b0)
06-29 21:27:50.297 E/AndroidRuntime(17347): FATAL EXCEPTION: main
06-29 21:27:50.297 E/AndroidRuntime(17347): java.lang.IllegalStateException
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.media.MediaCodec.createInputSurface(Native Method)
06-29 21:27:50.297 E/AndroidRuntime(17347): at com.android.grafika.CircularEncoder.<init>(CircularEncoder.java:124)
06-29 21:27:50.297 E/AndroidRuntime(17347): at com.android.grafika.ContinuousCaptureActivity.surfaceCreated(ContinuousCaptureActivity.java:383)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.SurfaceView.updateWindow(SurfaceView.java:571)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.SurfaceView.access$000(SurfaceView.java:86)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.SurfaceView$3.onPreDraw(SurfaceView.java:175)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:833)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:1860)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1004)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:5481)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.Choreographer$CallbackRecord.run(Choreographer.java:749)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.Choreographer.doCallbacks(Choreographer.java:562)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.Choreographer.doFrame(Choreographer.java:532)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:735)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.os.Handler.handleCallback(Handler.java:730)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.os.Handler.dispatchMessage(Handler.java:92)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.os.Looper.loop(Looper.java:137)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.app.ActivityThread.main(ActivityThread.java:5136)
06-29 21:27:50.297 E/AndroidRuntime(17347): at java.lang.reflect.Method.invokeNative(Native Method)
06-29 21:27:50.297 E/AndroidRuntime(17347): at java.lang.reflect.Method.invoke(Method.java:525)
06-29 21:27:50.297 E/AndroidRuntime(17347): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:737)
06-29 21:27:50.297 E/AndroidRuntime(17347): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:553)
06-29 21:27:50.297 E/AndroidRuntime(17347): at dalvik.system.NativeStart.main(Native Method)
06-29 21:27:50.317 I/ActivityManager( 1249): Notify an ApplicationCrash
Dear fadden, how can I fix this issue?
When I try to open TextureFromCameraActivity, my app stops working, and this is the main reason:
java.lang.RuntimeException: Fail to connect to camera service
at this line in TextureFromCameraActivity:
mCamera = Camera.open(i);
I use MediaCodec to record video; when I call mMuxer.stop(), I always get "Failed to stop the muxer". Why?
Hi fadden,
I noticed you released some sample code on bigflake.com/mediacodec; one sample is DecodeEditEncodeTest.java.
I am studying this file and have one question: I cannot find the definitions of InputSurface and OutputSurface.
Could you also share that part of the sample code? Thanks a lot.
Hey @fadden or @google-admin !
I am trying to run the ContinuousCaptureActivity app (Show + Capture camera) on my Nexus 6 (and Nexus 5X), and after about 6-7 minutes the preview always gets stuck on the last frame and stays there. Make sure not to click "start recording"; just watch the preview.
Narrowing it down, it's obviously not the TextureMovieEncoder (since I'm not recording yet), and onDrawFrame is still getting called, and the same with onFrameAvailable, so either the frames from the camera are getting stuck or the OpenGL texture is getting stuck.
I am running a Nexus 6 with Android 6.0.1 (API 23), and there are also no significant logs that I can see.
This might also be relevant to
#36
How do I set a key frame when encoding video with MediaCodec?
When I click the "Start recording" button in CameraCaptureActivity, the crash always happens.
My phone is a HUAWEI P6-U06, SDK 19.
Here is the log:
06-12 10:22:37.030: E/AndroidRuntime(6301): FATAL EXCEPTION: TextureMovieEncoder
06-12 10:22:37.030: E/AndroidRuntime(6301): Process: com.android.grafika, PID: 6301
06-12 10:22:37.030: E/AndroidRuntime(6301): java.lang.RuntimeException: eglCreateContext: EGL error: 0x3009
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.gles.EglCore.checkEglError(EglCore.java:370)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.gles.EglCore.<init>(EglCore.java:126)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder.prepareEncoder(TextureMovieEncoder.java:385)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder.handleStartRecording(TextureMovieEncoder.java:312)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder.access$0(TextureMovieEncoder.java:309)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder$EncoderHandler.handleMessage(TextureMovieEncoder.java:281)
06-12 10:22:37.030: E/AndroidRuntime(6301): at android.os.Handler.dispatchMessage(Handler.java:102)
06-12 10:22:37.030: E/AndroidRuntime(6301): at android.os.Looper.loop(Looper.java:136)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder.run(TextureMovieEncoder.java:248)
06-12 10:22:37.030: E/AndroidRuntime(6301): at java.lang.Thread.run(Thread.java:841)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 1 is not owned by the client (state=3)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 3 is not owned by the client (state=0)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 0 is not owned by the client (state=0)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 1 is not owned by the client (state=3)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 2 is not owned by the client (state=0)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 3 is not owned by the client (state=0)
Does this phone not support OpenGL ES? Is there any way to avoid this?
I'd appreciate your answer.
EGL14.eglReleaseThread() causes the application to terminate instantly on Android 4.4 devices:
https://github.com/google/grafika/blob/master/src/com/android/grafika/gles/EglCore.java#L191
Simply commenting the call out appears to resolve all issues; I'm unsure whether this is causing any leakage.
I am having an issue getting to 30 fps (1280x720) on my device. The MediaRecorder API is capable of this frame rate, but the camera capture sample peaks around 19 fps. Smaller resolutions seem to do better. Is this expected? Where in the code would I look for optimizations (if you think there may be some)?
The README contains: "Experiment with alternatives to glReadPixels(). Add a PBO speed test. (Doesn't seem to be a way to play with eglCreateImageKHR from Java.)"
But it seems this has not been added yet.
I am trying to use https://github.com/CyberAgent/android-gpuimage for some image-processing tasks. There seems to be a speed bottleneck at https://github.com/CyberAgent/android-gpuimage/blob/febdf4900b437b2069661fd371d5469196e26f18/library/src/jp/co/cyberagent/android/gpuimage/PixelBuffer.java#L194
If there were a faster alternative to glReadPixels, it would greatly improve the image-processing time.
I have run your project (https://github.com/google/grafika). In this project I played a video in slow motion by changing the FixedPlaybackRate value in the "Play video (TextureView)" activity:
SpeedControlCallback callback = new SpeedControlCallback();
callback.setFixedPlaybackRate(1);
This plays the whole video slowly. What I actually need is to play the video slowly only for selected intervals, not the entire video. Is it possible to set the regions (intervals)?
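setFixedPlaybackRate() applies a single rate to the whole clip, but the same idea can be extended by choosing the rate from each frame's presentation time. A sketch of just the selection logic is below; the interval bounds, rates, and method name are made up for this example, and in real code the chosen rate would feed into the inter-frame delay that SpeedControlCallback computes.

```java
// Sketch: pick a playback rate per frame from its presentation time, so only
// a chosen interval plays in slow motion. Values are arbitrary examples.
public class IntervalRateSketch {
    // Slow-motion interval: 2s..5s into the clip, in microseconds
    // (the unit MediaExtractor uses for presentation timestamps).
    static final long SLOW_START_US = 2_000_000L;
    static final long SLOW_END_US = 5_000_000L;

    // Frames inside the interval play at 10 fps; everything else at 30 fps.
    static int rateFor(long presentationTimeUs) {
        if (presentationTimeUs >= SLOW_START_US && presentationTimeUs < SLOW_END_US) {
            return 10;  // slow motion
        }
        return 30;      // normal speed
    }
}
```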
I'm grateful to have found such good content.
I want to know whether it is possible to change the video speed and then produce a new video file (at the different speed).
Many thanks.
In ContinuousCaptureActivity, when I press the power button and then press it again, continuous capture no longer works. The reason seems to be that the SurfaceView has not been destroyed, so surfaceCreated is not called again; we need to redo that initialization after onResume if the surface is still alive.
Hi,
One of the features in the to-do list is "Use virtual displays to record app activity." Can you please let me know if this is even possible today? There is a comment at http://www.bigflake.com/screenrecord/ that "There's no API to allow an application to record itself." I am trying to record a WebView of a Cordova app.
Thanks,
Rishi
Thanks for this great repository of Android Java media samples.
Is there any way to capture the output of a WebGL canvas element to feed into, e.g., the Record GL app (https://github.com/google/grafika/blob/master/src/com/android/grafika/RecordFBOActivity.java)? We need to do this with high performance, not just something that involves string encoding, etc.
Any ideas?
Hi all, I am using the movie player to play a list of videos. If I play videos continuously for a day, we hit a memory issue.
I/ActivityManager( 439): Low on memory:
I/ActivityManager( 439): ntv N 630053 kB: mediaserver (109) native
I/ActivityManager( 439): ntv N 3631 kB: zygote (107) native
I/ActivityManager( 439): ntv N 1248 kB: surfaceflinger (106) native
I/ActivityManager( 439): ntv N 977 kB: drmserver (108) native
I/ActivityManager( 439): ntv N 531 kB: netd (102) native
This causes our application to go down.
I also have this trace (in a loop) in the log:
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
Any idea?
All the best!
Hello.
We are developing an app using Grafika.
However, a crash occurs in the following part on some devices.
Is the problem a bug in the device or its OS?
Is there a way to fix it?
[crash log]
12-07 18:49:58.398 22288 24445 F libc : Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 24445 (TextureMovieEnc)
12-07 18:49:58.479 476 476 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
12-07 18:49:58.483 476 476 F DEBUG : Build fingerprint: 'google/bullhead/bullhead:6.0/MDA89E/2296692:user/release-keys'
12-07 18:49:58.483 476 476 F DEBUG : Revision: 'rev_1.0'
12-07 18:49:58.484 476 476 F DEBUG : ABI: 'arm'
12-07 18:49:58.485 476 476 F DEBUG : pid: 22288, tid: 24445, name: TextureMovieEnc
12-07 18:49:58.485 476 476 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
12-07 18:49:58.576 476 476 F DEBUG : r0 00000000 r1 00000000 r2 d60c6670 r3 00000001
12-07 18:49:58.576 476 476 F DEBUG : r4 d5fad5c0 r5 dc896000 r6 00000006 r7 00000000
12-07 18:49:58.577 476 476 F DEBUG : r8 00000002 r9 d60cb800 sl d2b70480 fp 00000000
12-07 18:49:58.577 476 476 F DEBUG : ip 00000000 sp d4a8f338 lr e9d9ca7d pc e9d9cad6 cpsr a00e0030
12-07 18:49:58.617 476 476 F DEBUG :
12-07 18:49:58.617 476 476 F DEBUG : backtrace:
12-07 18:49:58.618 476 476 F DEBUG : #00 pc 000c7ad6 /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlBindTexture(unsigned int, unsigned int)+361)
12-07 18:49:58.618 476 476 F DEBUG : #1 pc 7442301d /data/dalvik-cache/arm/system@[email protected] (offset 0x1ec4000)
12-07 19:19:48.344 28977 32344 F libc : Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 32344 (TextureMovieEnc)
12-07 19:19:48.436 476 476 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
12-07 19:19:48.438 476 476 F DEBUG : Build fingerprint: 'google/bullhead/bullhead:6.0/MDA89E/2296692:user/release-keys'
12-07 19:19:48.438 476 476 F DEBUG : Revision: 'rev_1.0'
12-07 19:19:48.438 476 476 F DEBUG : ABI: 'arm'
12-07 19:19:48.438 476 476 F DEBUG : pid: 28977, tid: 32344, name: TextureMovieEnc
12-07 19:19:48.438 476 476 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
12-07 19:19:48.734 476 476 F DEBUG : r0 00000000 r1 00000000 r2 00000000 r3 00000006
12-07 19:19:48.734 476 476 F DEBUG : r4 00000001 r5 d768e090 r6 00000000 r7 d7743540
12-07 19:19:48.734 476 476 F DEBUG : r8 00000002 r9 d4520000 sl 00000001 fp d5ea00c0
12-07 19:19:48.734 476 476 F DEBUG : ip 00000005 sp d7335120 lr e9e3cdc1 pc e9e3cdc8 cpsr 800e0030
12-07 19:19:48.752 476 476 F DEBUG :
12-07 19:19:48.752 476 476 F DEBUG : backtrace:
12-07 19:19:48.753 476 476 F DEBUG : #00 pc 00167dc8 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::UpdateTextureSampler(EsxSamplerDesc const_, A4xTextureObject const_, A4xSamplerObject const_)+187)
12-07 19:19:48.753 476 476 F DEBUG : #1 pc 001680a3 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateTexSamplersCommon(A4xProgram_, int, EsxBitField96_)+302)
12-07 19:19:48.753 476 476 F DEBUG : #2 pc 001683cb /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateTexSamplers()+58)
12-07 19:19:48.754 476 476 F DEBUG : #3 pc 00166ab5 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateState(EsxDrawDescriptor const_)+1564)
12-07 19:19:48.754 476 476 F DEBUG : #4 pc 00166e7d /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::HwValidateGfxState(EsxDrawDescriptor const_)+4)
12-07 19:19:48.754 476 476 F DEBUG : #5 pc 000e6f0b /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::ValidateGfxState(EsxDrawDescriptor const_)+442)
12-07 19:19:48.754 476 476 F DEBUG : #6 pc 000dc509 /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::DrawArraysInstanced(EsxPrimType, int, unsigned int, unsigned int)+112)
12-07 19:19:48.755 476 476 F DEBUG : #7 pc 000c8fff /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlDrawArrays(unsigned int, int, int)+62)
12-07 19:19:48.755 476 476 F DEBUG : #8 pc 744237a5 /data/dalvik-cache/arm/system@[email protected] (offset 0x1ec4000)
12-07 20:17:23.719 4545 21239 F libc : Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 21239 (TextureMovieEnc)
12-07 20:17:23.801 476 476 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
12-07 20:17:23.803 476 476 F DEBUG : Build fingerprint: 'google/bullhead/bullhead:6.0/MDA89E/2296692:user/release-keys'
12-07 20:17:23.804 476 476 F DEBUG : Revision: 'rev_1.0'
12-07 20:17:23.804 476 476 F DEBUG : ABI: 'arm'
12-07 20:17:23.804 476 476 F DEBUG : pid: 4545, tid: 21239, name: TextureMovieEnc
12-07 20:17:23.804 476 476 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
12-07 20:17:23.833 476 476 W debuggerd: type=1400 audit(0.0:3436): avc: denied { search } dev="dm-2" ino=384999 scontext=u:r:debuggerd:s0 tcontext=u:object_r:app_data_file:s0:c512,c768 tclass=dir permissive=0
12-07 20:17:23.870 476 476 F DEBUG : r0 00000000 r1 00000000 r2 00000000 r3 00000001
12-07 20:17:23.870 476 476 F DEBUG : r4 d6ac3200 r5 00000000 r6 00000001 r7 d7d7e25c
12-07 20:17:23.871 476 476 F DEBUG : r8 00000001 r9 d6517000 sl d7d2bd60 fp d65192a4
12-07 20:17:23.871 476 476 F DEBUG : ip 00000005 sp d3053d10 lr e9e31285 pc e9e31290 cpsr 800e0030
12-07 20:17:23.951 476 476 F DEBUG :
12-07 20:17:23.951 476 476 F DEBUG : backtrace:
12-07 20:17:23.952 476 476 F DEBUG : #00 pc 0015c290 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateYuvConversionConstants(A4xProgram const*, int)+307)
12-07 20:17:23.952 476 476 F DEBUG : #1 pc 0015c5df /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateNamedUniformConstants()+570)
12-07 20:17:23.952 476 476 F DEBUG : #2 pc 001667d7 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateState(EsxDrawDescriptor const*)+830)
12-07 20:17:23.953 476 476 F DEBUG : #3 pc 00166e7d /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::HwValidateGfxState(EsxDrawDescriptor const*)+4)
12-07 20:17:23.943 476 476 W debuggerd: type=1400 audit(0.0:3437): avc: denied { read } for name="kgsl-3d0" dev="tmpfs" ino=8778 scontext=u:r:debuggerd:s0 tcontext=u:object_r:gpu_device:s0 tclass=chr_file permissive=0
12-07 20:17:23.953 476 476 W debuggerd: type=1400 audit(0.0:3438): avc: denied { read } for name="kgsl-3d0" dev="tmpfs" ino=8778 scontext=u:r:debuggerd:s0 tcontext=u:object_r:gpu_device:s0 tclass=chr_file permissive=0
12-07 20:17:23.953 476 476 F DEBUG : #4 pc 000e6f0b /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::ValidateGfxState(EsxDrawDescriptor const*)+442)
12-07 20:17:23.953 476 476 F DEBUG : #5 pc 000dc509 /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::DrawArraysInstanced(EsxPrimType, int, unsigned int, unsigned int)+112)
12-07 20:17:23.953 476 476 F DEBUG : #6 pc 000c8fff /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlDrawArrays(unsigned int, int, int)+62)
12-07 20:17:23.953 476 476 F DEBUG : #7 pc 744237a5 /data/dalvik-cache/arm/system@[email protected] (offset 0x1ec4000)
F/libc ( 4117): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 4459 (TextureMovieEnc)
I/DEBUG ( 355): *** *** *** *** *** *** *** *** *** *** *** *** *** *** ** ***
I/DEBUG ( 355): Build fingerprint: 'google/shamu/shamu:5.1.1/LMY47Z/1860966:user/release-keys'
I/DEBUG ( 355): Revision: '33696'
I/DEBUG ( 355): ABI: 'arm'
I/DEBUG ( 355): pid: 4117, tid: 4459, name: TextureMovieEnc
I/DEBUG ( 355): signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
I/DEBUG ( 355): r0 9baa2600 r1 00000000 r2 00000000 r3 00000000
I/DEBUG ( 355): r4 00000001 r5 9bff3100 r6 9baa2600 r7 aecc1e5c
I/DEBUG ( 355): r8 9be42f28 r9 00000001 sl 9bff3100 fp 9be42f00
I/DEBUG ( 355): ip 9bb30e00 sp 93f7b6a0 lr aac04477 pc aac04240 cpsr 200b8c30
I/DEBUG ( 355):
I/DEBUG ( 355): backtrace:
I/DEBUG ( 355): #00 pc 00100240 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxResource::GfxMem(unsigned int) const+7)
I/DEBUG ( 355): #1 pc 00100473 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxResource::UpdateGfxMemReference(EsxCmdMgr*, unsigned int, EsxAccessType)+18)
I/DEBUG ( 355): #2 pc 00100537 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxResource::UpdatePackedGfxMemReference(EsxCmdMgr*, EsxSubResourceRange const*, EsxAccessType)+170)
I/DEBUG ( 355): #3 pc 0014b357 /system/vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateTexSamplers()+434)
I/DEBUG ( 355): #4 pc 00149e97 /system/vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateState(EsxDrawDescriptor const*)+1394)
I/DEBUG ( 355): #5 pc 0014a0a1 /system/vendor/lib/egl/libGLESv2_adreno.so (A4xContext::HwValidateGfxState(EsxDrawDescriptor const*)+8)
I/DEBUG ( 355): #6 pc 00114235 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxContext::ValidateGfxState(EsxDrawDescriptor const*)+420)
I/DEBUG ( 355): #7 pc 00117211 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxContext::DrawArraysInstanced(EsxPrimType, int, unsigned int, unsigned int)+92)
I/DEBUG ( 355): #8 pc 000b2e81 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlDrawArrays(unsigned int, int, int)+52)
I/DEBUG ( 355): #9 pc 000e54eb /system/vendor/lib/egl/libGLESv2_adreno.so (EsxGlApiParamValidate::GlDrawArrays(EsxDispatch, unsigned int, int, int)+58)
I/DEBUG ( 355): #10 pc 000a9a61 /system/vendor/lib/egl/libGLESv2_adreno.so (glDrawArrays+44)
I/DEBUG ( 355): #11 pc 00b1260b /data/dalvik-cache/arm/system@[email protected]
I/ThermalEngine( 367): ACTION: CPU - Setting CPU[0] to 1190400
I/ThermalEngine( 367): ACTION: CPU - Setting CPU[1] to 1190400
I/ThermalEngine( 367): ACTION: CPU - Setting CPU[2] to 1190400
I/ThermalEngine( 367): ACTION: CPU - Setting CPU[3] to 1190400
F/libc (13050): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 15568 (TextureMovieEnc)
I/DEBUG ( 651): *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
I/DEBUG ( 651): UUID: 212c74fe-e554-4ff9-b0e1-3f5f6ad4ecc6
I/DEBUG ( 651): Build fingerprint: 'docomo/SO-03H/SO-03H:5.1.1/32.0.B.0.426/2049702380:user/release-keys'
I/DEBUG ( 651): Revision: '0'
I/DEBUG ( 651): ABI: 'arm'
I/DEBUG ( 651): pid: 13050, tid: 15568, name: TextureMovieEnc
I/DEBUG ( 651): signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
I/DEBUG ( 651): r0 00000000 r1 00000001 r2 00000006 r3 00000000
I/DEBUG ( 651): r4 dee9b000 r5 dec1a600 r6 dece7b70 r7 ded85240
I/DEBUG ( 651): r8 00008d65 r9 00000006 sl dee97f00 fp 00000001
I/DEBUG ( 651): ip ef712f90 sp daf107d8 lr 00000000 pc ef57f2c6 cpsr a00e0030
I/DEBUG ( 651):
I/DEBUG ( 651): backtrace:
I/DEBUG ( 651): #00 pc 000a82c6 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlBindTexture(unsigned int, unsigned int)+341)
I/DEBUG ( 651): #1 pc 000a05bb /system/vendor/lib/egl/libGLESv2_adreno.so (glBindTexture+26)
I/DEBUG ( 651): #2 pc 00b2459d /system/framework/arm/boot.oat
Hi,
I am running grafika's MultiSurfaceActivity and Record GL on Android Lollipop 5.1 on a Freescale Sabre SD board.
All the tests play smoothly as long as the screen is not touched, but as soon as I start scrolling the notification bar down from the top and keep doing that for a while, the fps drops to 35-40.
I have run the same test on KitKat 4.4.2 and JB 4.2.2, and both work fine.
The same behaviour occurs when playing an MP4 from Gallery: the video stutters and lags badly while I play with the notification bar.
Can you share your thoughts on this? I also had to change the BufferQueue implementation to async mode to achieve over 60 fps.
Regards,
Gurtaj
Hi, thanks for this useful, awesome example!
I want to draw a bitmap image onto my recorded video, in place of the box drawn in TextureMovieEncoder (like a watermark on the video, not on the camera preview).
private void drawBox() {
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
Bitmap waterMark = BitmapFactory.decodeResource(mContext.getResources(), R.drawable.ic_launcher);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, waterMark, 0);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
waterMark.recycle();
}
Why does this not work, and how can I make it work?
It fails only on a Nexus 9; a Nexus 7 works well.
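The snippet above only creates, binds, and uploads the texture; nothing appears in the frame until a textured quad is actually drawn with a draw call each frame, using a regular GL_TEXTURE_2D shader program (the camera frames themselves use GL_TEXTURE_EXTERNAL_OES, so the watermark needs its own program). One piece of that drawing step can be sketched in isolation; the helper below is hypothetical, not grafika API, and computes the normalized-device-coordinate rectangle for a watermark anchored in the bottom-right corner:

```java
/** Hypothetical helper, not grafika code: NDC rectangle for a watermark quad. */
public class WatermarkRect {
    /**
     * Returns {left, bottom, right, top} in normalized device coordinates
     * (-1..1) for a watermark of wmW x wmH pixels placed in the bottom-right
     * corner of a surfW x surfH surface, inset by a pixel margin.
     */
    public static float[] bottomRightNdc(int surfW, int surfH, int wmW, int wmH, int margin) {
        float right = 1f - 2f * margin / surfW;   // 1 pixel = 2/surfW in NDC
        float bottom = -1f + 2f * margin / surfH;
        float left = right - 2f * wmW / (float) surfW;
        float top = bottom + 2f * wmH / (float) surfH;
        return new float[] {left, bottom, right, top};
    }
}
```

The four values would feed a vertex buffer for the watermark quad, drawn after the camera frame with blending enabled.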
For the "GL Recorder" activity, can the rendering be done in native C++ while using the rest of the implementation as-is?
E.g.:
GLES20.glBindFramebuffer(GL_FRAMEBUFFER, javaFBO);
nativeRender(...);
GLES20.glBindFramebuffer(GL_FRAMEBUFFER, 0);
drawToScreen();
Great job on this sandbox, learning a lot from it!
Thanks,
Cristina
I record videos with the capture activity and then play them back, which works fine. I want to do some video manipulation based on their frame rate, but it doesn't work because most of the videos report wrong frame rates. I am using a Nexus 5 running 4.4.2 with the latest grafika code.
I've just started Android programming. I tried to review your CameraCaptureActivity code.
I don't know how to record both the video and the filters to a file.
Could you give me a hint?
If I comment out line 121, ContentManager.getInstance().createAll(this); (where the app crashes), the list menu appears correctly. If I then choose "Show + capture camera",
it shows the preview correctly. I then press the "Record" button, get the following exception, and the app crashes.
03-28 14:10:16.726: I/OMXClient(1469): Using client-side OMX mux.
03-28 14:10:16.726: I/SoftAVCEncoder(1469): Construct SoftAVCEncoder
03-28 14:10:16.726: I/ACodec(1469): setupVideoEncoder succeeded
03-28 14:10:16.726: E/OMXNodeInstance(1469): createInputSurface requires AndroidOpaque color format
03-28 14:10:16.726: E/ACodec(1469): [OMX.google.h264.encoder] onCreateInputSurface returning error -38
03-28 14:10:16.726: W/MediaCodec(1469): createInputSurface failed, err=-38
03-28 14:10:16.726: W/dalvikvm(1469): threadid=12: thread exiting with uncaught exception (group=0xa4bfd648)
03-28 14:10:16.730: E/AndroidRuntime(1469): FATAL EXCEPTION: TextureMovieEncoder
03-28 14:10:16.730: E/AndroidRuntime(1469): java.lang.IllegalStateException
03-28 14:10:16.730: E/AndroidRuntime(1469): at android.media.MediaCodec.createInputSurface(Native Method)
03-28 14:10:16.730: E/AndroidRuntime(1469): at com.android.grafika.VideoEncoderCore.<init>(VideoEncoderCore.java:79)
03-28 14:10:16.730: E/AndroidRuntime(1469): at com.android.grafika.TextureMovieEncoder.prepareEncoder(TextureMovieEncoder.java:381)
03-28 14:10:16.730: E/AndroidRuntime(1469): at com.android.grafika.TextureMovieEncoder.handleStartRecording(TextureMovieEncoder.java:312)
03-28 14:10:16.730: E/AndroidRuntime(1469): at com.android.grafika.TextureMovieEncoder.access$0(TextureMovieEncoder.java:309)
03-28 14:10:16.730: E/AndroidRuntime(1469): at com.android.grafika.TextureMovieEncoder$EncoderHandler.handleMessage(TextureMovieEncoder.java:281)
03-28 14:10:16.730: E/AndroidRuntime(1469): at android.os.Handler.dispatchMessage(Handler.java:99)
03-28 14:10:16.730: E/AndroidRuntime(1469): at android.os.Looper.loop(Looper.java:137)
03-28 14:10:16.730: E/AndroidRuntime(1469): at com.android.grafika.TextureMovieEncoder.run(TextureMovieEncoder.java:248)
03-28 14:10:16.730: E/AndroidRuntime(1469): at java.lang.Thread.run(Thread.java:841)
03-28 14:10:16.734: D/dalvikvm(1469): GC_FOR_ALLOC freed 110K, 7% free 4238K/4548K, paused 3ms, total 3ms
I have been at this for 2 days and can't run your app. It fails with an "Unable to generate content" dialog, and the logcat output below shows why. My device is a Samsung tablet running CyanogenMod 12.1 (Lollipop 5.1).
Any ideas?
04-17 15:11:12.552 4577-4606/com.android.grafika E/MPEG4Writer﹕ timestampUs 125000 < lastTimestampUs 375000 for Video track
04-17 15:11:12.557 2184-2474/? W/GraphicBufferSource﹕ Dropped back down to Loaded without Executing
04-17 15:11:12.598 2184-2184/? E/BufferQueueProducer﹕ [GraphicBufferSource] cancelBuffer: BufferQueue has been abandoned
04-17 15:11:12.602 4577-4596/com.android.grafika D/MPEG4Writer﹕ Video track stopping
04-17 15:11:12.602 4577-4596/com.android.grafika D/MPEG4Writer﹕ Video track source stopping
04-17 15:11:12.602 4577-4596/com.android.grafika D/MPEG4Writer﹕ Video track source stopped
04-17 15:11:12.602 4577-4596/com.android.grafika D/MPEG4Writer﹕ Stopping writer thread
04-17 15:11:12.602 4577-4605/com.android.grafika D/MPEG4Writer﹕ 0 chunks are written in the last batch
04-17 15:11:12.603 4577-4596/com.android.grafika D/MPEG4Writer﹕ Writer thread stopped
04-17 15:11:12.605 4577-4596/com.android.grafika W/Grafika﹕ Failed while generating content
java.lang.IllegalStateException: Failed to stop the muxer
at android.media.MediaMuxer.nativeStop(Native Method)
at android.media.MediaMuxer.stop(MediaMuxer.java:225)
at com.android.grafika.GeneratedMovie.releaseEncoder(GeneratedMovie.java:145)
at com.android.grafika.MovieEightRects.create(MovieEightRects.java:74)
at com.android.grafika.ContentManager.prepare(ContentManager.java:154)
at com.android.grafika.ContentManager.access$000(ContentManager.java:39)
at com.android.grafika.ContentManager$GenerateTask.doInBackground(ContentManager.java:235)
at com.android.grafika.ContentManager$GenerateTask.doInBackground(ContentManager.java:203)
at android.os.AsyncTask$2.call(AsyncTask.java:292)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
at java.lang.Thread.run(Thread.java:818)
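The key line in the log is `timestampUs 125000 < lastTimestampUs 375000`: MPEG4Writer rejects presentation timestamps that go backwards, and stopping the muxer afterwards throws the IllegalStateException above. As a defensive sketch (an assumption about the cause, not grafika's actual fix), timestamps can be forced to be strictly increasing before they reach the muxer:

```java
/** Illustrative guard, not grafika code: keep muxer timestamps strictly increasing. */
public class MonotonicPts {
    private long lastUs = -1;

    /** Returns a timestamp guaranteed to be strictly greater than the previous one. */
    public long next(long ptsUs) {
        if (ptsUs <= lastUs) {
            ptsUs = lastUs + 1;  // nudge a late frame forward instead of handing the muxer a regression
        }
        lastUs = ptsUs;
        return ptsUs;
    }
}
```

Nudging a late timestamp forward keeps the muxer from aborting, though the real fix is to generate correct presentation times in the first place.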
I want to switch between the front and back cameras in CameraCaptureActivity
(src/com/android/grafika/CameraCaptureActivity.java), as follows:
public boolean switchCamera() {
releaseCamera();
mGLView.onPause();
if (mReqCameraId == Camera.CameraInfo.CAMERA_FACING_BACK) {
mReqCameraId = Camera.CameraInfo.CAMERA_FACING_FRONT;
} else {
mReqCameraId = Camera.CameraInfo.CAMERA_FACING_BACK;
}
openCamera(mReqCameraId);
mGLView.onResume();
mGLView.queueEvent(new Runnable() {
@Override
public void run() {
mRenderer.setCameraPreviewSize(mCameraPreviewWidth, mCameraPreviewHeight);
}
});
return true;
}
It works, but the FOV changes when I switch back to the camera that was launched first. It seems the frame has been cropped.
What did I miss when switching between the front and back cameras?
Thanks.
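Front and back cameras often report different preview sizes and aspect ratios, so after a switch the incoming-frame transform has to be recomputed for the new camera; reusing the old scale makes the image look cropped or stretched. As an illustrative sketch (a hypothetical helper, not grafika code), a center-crop fit can be computed like this:

```java
/** Hypothetical helper, not grafika code: scale factors for a center-crop fit. */
public class CropScale {
    /**
     * Returns {sx, sy}: how much to scale a camW x camH preview so it fills a
     * viewW x viewH view while preserving aspect ratio (excess is cropped).
     */
    public static float[] centerCrop(int camW, int camH, int viewW, int viewH) {
        float camAspect = camW / (float) camH;
        float viewAspect = viewW / (float) viewH;
        float sx = 1f, sy = 1f;
        if (camAspect > viewAspect) {
            sx = camAspect / viewAspect;   // camera wider than view: crop left/right
        } else {
            sy = viewAspect / camAspect;   // camera taller than view: crop top/bottom
        }
        return new float[] {sx, sy};
    }
}
```

Recomputing these factors with the new camera's preview size after every switch keeps the apparent FOV consistent with what each camera actually delivers.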
Hi,
I want to play a video file and duplicate the rendered view, eventually with different filters. If I understand correctly, I can do this with one SurfaceView and several quads, but it may be difficult to keep a flexible layout with other components in between and to manage several shaders.
In other words, is there a way to have two SurfaceViews pointing at the same video texture or MediaPlayer, each with a different shader for a different filter effect?
Thanks for your help. Best, Thibaut
Why not separate the input loop and the output loop into different threads, to avoid starvation or delays on either the input or the output side? With separate threads, we could call dequeueInputBuffer/dequeueOutputBuffer with a long timeout.
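That two-thread structure can be sketched in plain Java. The queue below merely stands in for the codec's buffer pool; in a real implementation each loop would instead block in dequeueInputBuffer/dequeueOutputBuffer with a long timeout:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

/** Threading-shape sketch only: the queue is a stand-in for MediaCodec's buffers. */
public class TwoLoopSketch {
    public static int run(final int frames) {
        final BlockingQueue<Integer> codec = new ArrayBlockingQueue<>(4); // small buffer pool
        final int[] drained = {0};
        Thread input = new Thread(() -> {
            try {
                for (int i = 0; i < frames; i++) codec.put(i);   // "queueInputBuffer"
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        Thread output = new Thread(() -> {
            try {
                for (int i = 0; i < frames; i++) { codec.take(); drained[0]++; }  // "dequeueOutputBuffer"
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        input.start();
        output.start();
        try {
            input.join();
            output.join();
        } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        return drained[0];
    }
}
```

Because each side blocks independently, a slow consumer never stalls the producer beyond the pool depth, which is exactly the decoupling the question is after.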
Hello,
I am developing an app based on ContinuousCaptureActivity.
I added filter effects to ContinuousCaptureActivity (filter code taken from CameraCaptureActivity), and the following exception occurs when the filter is changed:
05-28 01:26:18.116: E/AndroidRuntime(1476): java.lang.RuntimeException: glUseProgram: glError 0x501
Is this because a SurfaceView is used? Do you have any solutions?
Thank you.
I notice there is a "double decode" activity in Grafika. A feature I need is to play multiple videos simultaneously, say 4 or 6, as if we had a single player, so the players are effectively glued together. Is it possible to achieve that? In other words, can the approach be generalized to N players, or does it only work for 2?
Note that in my case each of the four videos is a tile of the original video, so you see a single video if the sync is good. Could you add a sample for this scenario?
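One common way to generalize to N players (a sketch, not something grafika currently ships) is to drive every decoder from a single shared clock and release each decoded frame only when its presentation time is due:

```java
/** Hypothetical shared-clock gate for N synchronized players, not grafika API. */
public class SharedClockGate {
    private final long startNanos;

    public SharedClockGate(long startNanos) {
        this.startNanos = startNanos;
    }

    /** True when a frame with the given presentation time (us) should be shown now. */
    public boolean shouldRelease(long ptsUs, long nowNanos) {
        long elapsedUs = (nowNanos - startNanos) / 1000;  // wall-clock time since playback start
        return ptsUs <= elapsedUs;
    }
}
```

Each of the N players would consult shouldRelease before releaseOutputBuffer(..., true), so all tiles advance against the same elapsed time and stay in lockstep as long as no decoder falls behind.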