
Grafika

Welcome to Grafika, a dumping ground for Android graphics & media hacks.

Grafika is:

  • A collection of hacks exercising graphics features.
  • An SDK app, developed for API 18 (Android 4.3). While some of the code may work with older versions of Android, only sporadic work is done to support them.
  • Open source (Apache 2 license), copyright by Google. So you can use the code according to the terms of the license (see "LICENSE").
  • A perpetual work-in-progress. It's updated whenever the need arises.

However:

  • It's not stable.
  • It's not polished or well tested. Expect the UI to be ugly and awkward.
  • It's not intended as a demonstration of the proper way to do things. The code may handle edge cases poorly or not at all. Logging is often left enabled at a moderately verbose level.
  • It's barely documented.
  • It's not part of the Android Open Source Project. We cannot accept contributions to Grafika, even if you have an AOSP CLA on file.
  • It's NOT AN OFFICIAL GOOGLE PRODUCT. It's just a bunch of stuff that got thrown together on company time and equipment.
  • It's generally just not supported.

To some extent, Grafika can be treated as a companion to the Android System-Level Graphics Architecture document. The doc explains the technology that the examples rely on, and uses some of Grafika's activities as examples. If you want to understand how the code here works, start by reading that.

There is some overlap with the code on http://www.bigflake.com/mediacodec/. The code there largely consists of "headless" CTS tests, which are designed to be robust, self-contained, and largely independent of the usual app lifecycle issues. Grafika is a conventional app, and makes an effort to handle app issues correctly (like not doing lots of work on the UI thread).

Features are added to Grafika as the need arises, often in response to developer complaints about correctness or performance problems in the platform (either to confirm that the problems exist, or demonstrate an approach that works).

There are two areas where some amount of care is taken:

  • Thread safety. It's really easy to get threads crossed in subtly dangerous ways when working with the media classes. (Read the Android SMP Primer for a detailed introduction to the problem.) GL/EGL's reliance on thread-local storage doesn't help. Threading issues are frequently called out in comments in the source code.
  • Garbage collection. GC pauses cause jank. Ideally, none of the activities will do any allocations while in a "steady state". Allocations may occur while changing modes, e.g. starting or stopping recording.
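The garbage-collection rule above boils down to: allocate scratch buffers when entering a mode, never in the per-frame path. A minimal illustration (hypothetical names, not Grafika code):

```java
// Sketch of an allocation-free steady state: the scratch array is created
// once, and the per-frame method only reads and writes it -- no "new" calls,
// so the GC has nothing to collect while animating.
class SteadyState {
    private final float[] mScratch = new float[16];   // allocated once, at setup

    // Per-frame work; reuses mScratch instead of allocating a fresh array.
    float frameSum(int frame) {
        for (int i = 0; i < mScratch.length; i++) {
            mScratch[i] = frame + i;
        }
        float sum = 0f;
        for (float v : mScratch) sum += v;
        return sum;
    }
}
```

Allocations that happen only on mode changes (start/stop recording, activity restart) are acceptable; it's the steady-state render loop that must stay allocation-free.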

All code is written in the Java programming language -- the NDK is not used.

The first time Grafika starts, two videos are generated (gen-eight-rects, gen-slides). If you want to experiment with the generation code, you can cause them to be re-generated from the main activity menu ("Regenerate content").

Current features

* Play video (TextureView). Plays the video track from an MP4 file.

  • Only sees files in /data/data/com.android.grafika/files/. All of the activities that create video leave their files there. You'll also find two automatically-generated videos (gen-eight-rects.mp4 and gen-slides.mp4).
  • By default the video is played once, at the same rate it was recorded. You can use the checkboxes to loop playback and/or play the frames at a fixed rate of 60 FPS.
  • Uses a TextureView for output.
  • Name starts with an asterisk so it's at the top of the list of activities.
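The fixed-rate playback option described above amounts to ignoring each frame's recorded timestamp and presenting frame n at n/fps seconds instead. A minimal sketch of that timestamp substitution (hypothetical names, not Grafika's actual SpeedControlCallback):

```java
// Fixed-rate playback clock: presentation time of frame n at a fixed fps,
// in microseconds, independent of the timestamps recorded in the file.
class FixedRateClock {
    static long ptsUs(int frameIndex, int fps) {
        return frameIndex * 1_000_000L / fps;
    }
}
```

At 60 FPS this spaces frames 16666 microseconds apart, so frame 60 lands exactly at the one-second mark.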

Continuous capture. Stores video in a circular buffer, saving it when you hit the "capture" button. (Formerly "Constant capture".)

  • Currently hard-wired to try to capture 7 seconds of video from the camera at 6 Mbps, preferably 15fps 720p. That requires a buffer size of about 5MB.
  • The time span of frames currently held in the buffer is displayed. The actual time span saved when you hit "capture" will be slightly less than what is shown, because the output has to start on a sync frame, and sync frames are configured to appear once per second.
  • Output is a video-only MP4 file ("constant-capture.mp4"). Video is always 1280x720, which usually matches what the camera provides; if it doesn't, the recorded video will have the wrong aspect ratio.
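The ~5MB figure follows from the bitrate if the quoted rate is read as 6 megabits/sec (6 megabytes/sec would need ~42MB). A worked calculation (illustrative helper, not Grafika code):

```java
// Circular-buffer sizing: bits/sec divided by 8 gives bytes/sec,
// times the desired time span gives the bytes the ring buffer must hold.
class CircularBufferSizing {
    static int requiredBytes(int bitsPerSecond, int seconds) {
        return bitsPerSecond / 8 * seconds;
    }
}
```

requiredBytes(6_000_000, 7) is 5,250,000 bytes, i.e. about 5MB, matching the figure above.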

Double decode. Decodes two video streams side-by-side to a pair of TextureViews.

  • Plays the two auto-generated videos. Note they play at different rates.
  • The video decoders don't stop when the screen is rotated. We retain the SurfaceTexture and just attach it to the new TextureView. Useful for avoiding expensive codec reconfigures. The decoders do stop if you leave the activity, so we don't tie up hardware codec resources indefinitely. (It also doesn't stop if you turn the screen off with the power button, which isn't good for the battery, but might be handy if you're feeding an external display or your player also handles audio.)
  • Unlike most activities in Grafika, this provides different layouts for portrait and landscape. The videos are scaled to fit.

Hardware scaler exerciser. Shows GL rendering with on-the-fly surface size changes.

  • The motivation behind the feature this explores is described in a developer blog post: http://android-developers.blogspot.com/2013/09/using-hardware-scaler-for-performance.html
  • You will see one frame rendered incorrectly when changing sizes. This is because the render size is adjusted in the "surface changed" callback, but the surface's size doesn't actually change until we latch the next buffer. This is straightforward to fix (left as an exercise for the reader).

Live camera (TextureView). Directs the camera preview to a TextureView.

  • This comes more or less verbatim from the TextureView documentation.
  • Uses the default (rear-facing) camera. If the device has no default camera (e.g. Nexus 7 (2012)), the Activity will crash.

Multi-surface test. Simple activity with three overlapping SurfaceViews, one marked secure.

  • Useful for examining HWC behavior with multiple static layers, and screencap / screenrecord behavior with a secure surface. (If you record the screen one of the circles should be missing, and capturing the screen should just show black.)
  • If you tap the "bounce" button, the circle on the non-secure layer will animate. It will update as quickly as possible, which may be slower than the display refresh rate because the circle is rendered in software. The frame rate will be reported in logcat.

Play video (SurfaceView). Plays the video track from an MP4 file.

  • Works very much like "Play video (TextureView)", though not all features are present. See the class comment for a list of advantages to using SurfaceView.

Record GL app. Simultaneously draws to the display and to a video encoder with OpenGL ES, using framebuffer objects to avoid re-rendering.

  • It can write to the video encoder three different ways: (1) draw twice; (2) draw offscreen and blit twice; (3) draw onscreen and blit framebuffer. #3 doesn't work yet.
  • The renderer is triggered by Choreographer to update every vsync. If we get too far behind, we will skip frames. This is noted by an on-screen drop counter and a border flash. You generally won't see any stutter in the animation, because we don't skip the object movement, just the render.
  • The encoder is fed every-other frame, so the recorded output will be ~30fps rather than ~60fps on a typical device.
  • The recording is letter- or pillar-boxed to maintain an aspect ratio that matches the display, so you'll get different results from recording in landscape vs. portrait.
  • The output is a video-only MP4 file ("fbo-gl-recording.mp4").
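The every-other-frame feeding mentioned above reduces a ~60fps display rate to a ~30fps recording; a trivial sketch of that decision (hypothetical names, not Grafika's actual code):

```java
// Feed only even-numbered display frames to the encoder: half the frames,
// so a 60fps display produces a ~30fps recording.
class EncoderFeed {
    static boolean shouldEncode(int frameIndex) {
        return (frameIndex & 1) == 0;
    }

    // How many of the first displayFrames frames reach the encoder.
    static int encodedCount(int displayFrames) {
        int n = 0;
        for (int i = 0; i < displayFrames; i++) {
            if (shouldEncode(i)) n++;
        }
        return n;
    }
}
```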

Record Screen using MediaProjectionManager. Records the screen to a movie using the MediaProjectionManager. This API requires API level 23 (Marshmallow) or greater.

Scheduled swap. Exercises a SurfaceFlinger feature that allows you to submit buffers to be displayed at a specific time.

  • Requires API 19 (Android 4.4 "KitKat") to do what it's supposed to. The current implementation doesn't really look any different on API 18 to the naked eye.
  • You can configure the frame delivery timing (e.g. 24fps uses a 3-2 pattern) and how far in advance frames are scheduled. Selecting "ASAP" disables scheduling.
  • Use systrace with tags sched gfx view --app=com.android.grafika to observe the effects.
  • The moving square changes colors when the app is unhappy about timing.

Show + capture camera. Attempts to record at 720p from the front-facing camera, displaying the preview and recording it simultaneously.

  • Use the record button to toggle recording on and off.
  • Recording continues until stopped. If you back out and return, recording will start again, with a real-time gap. If you try to play the movie while it's recording, you will see an incomplete file (and probably cause the play movie activity to crash).
  • The recorded video is scaled to 640x480, so it will probably look squished. A real app would either set the recording size equal to the camera input size, or correct the aspect ratio by letter- or pillar-boxing the frames as they are rendered to the encoder.
  • You can select a filter to apply to the preview. It does not get applied to the recording. The shader used for the filters is not optimized, but seems to perform well on most devices (the original Nexus 7 (2012) being a notable exception). Demo here: http://www.youtube.com/watch?v=kH9kCP2T5Gg
  • The output is a video-only MP4 file ("camera-test.mp4").
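The aspect-ratio fix suggested above can be sketched as a pure "fit inside, preserving aspect" computation (illustrative code, not Grafika's; names are hypothetical). The resulting rectangle would be used as the viewport when rendering frames to the encoder surface:

```java
// Letter-/pillar-box calculation: scale the source to fit inside the
// destination while preserving aspect ratio, then center it.
class BoxFit {
    // Returns {x, y, width, height} of the destination viewport.
    static int[] fit(int srcW, int srcH, int dstW, int dstH) {
        float scale = Math.min((float) dstW / srcW, (float) dstH / srcH);
        int w = Math.round(srcW * scale);
        int h = Math.round(srcH * scale);
        return new int[] { (dstW - w) / 2, (dstH - h) / 2, w, h };
    }
}
```

For example, fitting 1280x720 camera frames into a 640x480 encoder surface yields a 640x360 image with 60-pixel letterbox bars top and bottom, instead of a squished full-frame image.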

Simple Canvas in TextureView. Exercises software rendering to a TextureView with a Canvas.

  • Renders as quickly as possible. Because it's using software rendering, this will likely run more slowly than the "Simple GL in TextureView" activity.
  • Toggles the use of a dirty rect every 64 frames. When enabled, the dirty rect extends horizontally across the screen.

Simple GL in TextureView. Demonstrates simple use of GLES in a TextureView, rather than a GLSurfaceView.

  • Renders as quickly as possible. On most devices it will exceed 60fps and flicker wildly, but in 4.4 ("KitKat") a bug prevents the system from dropping frames.

Texture from Camera. Renders Camera preview output with a GLES texture.

  • Adjust the sliders to set the size, rotation, and zoom. Touch anywhere else to center the rect at the point of the touch.

Color bars. Displays RGB color bars.

OpenGL ES Info. Dumps version info and extension lists.

  • The "Save" button writes a copy of the output to the app's file area.

glTexImage2D speed test. Simple, unscientific measurement of the time required to upload a 512x512 RGBA texture with glTexImage2D().

glReadPixels speed test. Simple, unscientific measurement of the time required for glReadPixels() to read a 720p frame.
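Both speed tests follow the same simple pattern: time N iterations with System.nanoTime() and report the average. A generic version of that harness (illustrative; a stand-in Workload replaces the GL call, which needs a real device and GL context):

```java
// Minimal timing harness in the spirit of the speed tests above:
// run the workload N times and return the average cost in microseconds.
class SpeedTest {
    interface Workload { void run(); }

    static double averageMicros(Workload w, int iterations) {
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            w.run();
        }
        return (System.nanoTime() - start) / 1000.0 / iterations;
    }
}
```

As the "unscientific" caveat suggests, a real benchmark would also warm up the workload and discard outliers before averaging.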

Known issues

  • Nexus 4 running Android 4.3 (JWR67E): "Show + capture camera" crashes if you select one of the filtered modes. Appears to be a driver bug (Adreno "Internal compiler error").

Feature & fix ideas

In no particular order.

  • Stop using AsyncTask for anything where performance or latency matters.
  • Add a "fat bits" viewer for camera (single SurfaceView; left half has live camera feed and a pan rect, right half has 8x pixels)
  • Change the "Simple GL in TextureView" animation. Or add an epilepsy warning.
  • Cross-fade from one video to another, recording the result. Allow specification of the resolution (maybe QVGA, 720p, 1080p) and generate appropriately.
  • Add features to the video player, like a slider for random access, and buttons for single-frame advance / rewind (requires seeking to nearest sync frame and decoding frames until target is reached).
  • Convert a series of PNG images to video.
  • Play continuous video from a series of MP4 files with different characteristics. Will probably require "preloading" the next movie to keep playback seamless.
  • Experiment with alternatives to glReadPixels(). Add a PBO speed test. (Doesn't seem to be a way to play with eglCreateImageKHR from Java.)
  • Do something with ImageReader class (req API 19).
  • Figure out why "double decode" playback is sometimes janky.
  • Add fps indicator to "Simple GL in TextureView".
  • Capture audio from microphone, record + mux it.
  • Enable preview on front/back cameras simultaneously, display them side-by-side. (This appears to be impossible except on specific devices.)
  • Add a test that renders to two different TextureViews using different EGLContexts from a single renderer thread.

Contributors

claywilkinson, fadden, jason-cooke, paleozogt


Issues

Retain the same SurfaceTexture across Activities

Hi,

I tried to find an answer and looked through the Android code on grepcode. So, just to be 100% sure, I'll ask here. Is it possible to adapt your DoubleDecodeActivity approach to have two Activities and share the video between them, i.e. call recreateView() from another Activity? I suppose not. I've tried, and your approach works perfectly within the same activity with rotation etc., but it doesn't work when I try to attach a TextureView from another Activity. I get:

09-01 20:24:57.703 21115-21151/simpleApp E/GLConsumer: [unnamed-21115-0] updateAndRelease: GLConsumer is not attached to an OpenGL ES context

I saw that MediaPlayer allows attaching some other TextureView, but I didn't go through the code to check whether it's possible with multiple Activities.

Could you be so kind as to give a hint on how this can be achieved (preserving the video stream across Activities)?

Thanks in advance

Alternatives to glReadPixels

Readme contains : Experiment with alternatives to glReadPixels(). Add a PBO speed test. (Doesn't seem to be a way to play with eglCreateImageKHR from Java.)

but it seems this has not been added yet.

I am trying to use https://github.com/CyberAgent/android-gpuimage for some image processing tasks. There seems to be a speed bottleneck at https://github.com/CyberAgent/android-gpuimage/blob/febdf4900b437b2069661fd371d5469196e26f18/library/src/jp/co/cyberagent/android/gpuimage/PixelBuffer.java#L194
If there are faster alternatives to glReadPixels, it would greatly improve the image processing time.

RuntimeException: No video track found in .mp4 file

Hi All,
I am trying to play recorded video in my app, please check the steps below:

  1. Open the main screen of the app to record a video (the video is recorded successfully and put in the app folder)
    ---press the 'record button' on screen, wait a few seconds, and press it again
  2. After step 1 we have .mp4 file located in app directory. file path:

/storage/emulated/0/Android/data/com.test/files/video-1.mp4

---preview fragment appears to show recorded video, fragment piece of code below:

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int i, int i1) {
        SpeedControlCallback callback = new SpeedControlCallback();

        Surface surface = new Surface(surfaceTexture);
        MoviePlayer player = null;
        try {
            player = new MoviePlayer(
                    new File(getContext().getExternalFilesDir(null), mFilepathToShare),
                    surface, callback); // line 81
        } catch (IOException ioe) {
            surface.release();
            return;
        }
        adjustAspectRatio(player.getVideoWidth(), player.getVideoHeight());

        mPlayTask = new MoviePlayer.PlayTask(player, this);
        mPlayTask.setLoopMode(true);
        mPlayTask.execute();
    }

Everything is OK; the video plays successfully in an infinite loop.

  1. Press the 'back button' to leave the preview fragment (everything is still OK)
    ---after that I try to repeat step 1.

  2. Record another video (press the button to start video recording, and press it again after a few seconds)
    the path of the new video is

/storage/emulated/0/Android/data/com.test/files/video-2.mp4

Expected result: preview fragment appears to show second video (like it was in step 2)
Actual result: RuntimeException

The second time, I get a RuntimeException at the following line:
player = new MoviePlayer(new File(getActivity().getExternalFilesDir(null), mFilepathToShare), surface, callback);

Log message below:

  java.lang.RuntimeException: No video track found in /storage/emulated/0/Android/data/com.test/files/video-2.mp4
      at com.test.grafika.MoviePlayer.<init>(MoviePlayer.java:117)
      at com.test.preview.ui.PreviewFragment.onSurfaceTextureAvailable(PreviewFragment.java:81)

The first time, the result of extractor.getTrackCount() was 1, but the second time it is 0.
Both files are OK, and play successfully in the default video app.
Device: SGS Galaxy S5.
OS Version: Android 5.0.

Nexus 5 crash

When I try to open TextureFromCameraActivity, the app stops working, and this is the main reason:
java.lang.RuntimeException: Fail to connect to camera service
at this line in TextureFromCameraActivity:
mCamera = Camera.open(i);

Continuous Capture: 1 byte hole between chunks in CircularEncoderBuffer

/**
 * Computes the data buffer offset for the next place to store data.
 * <p>
 * Equal to the start of the previous packet's data plus the previous packet's length.
 */
private int getHeadStart() {
    if (mMetaHead == mMetaTail) {
        // list is empty
        return 0;
    }

    final int dataLen = mDataBuffer.length;
    final int metaLen = mPacketStart.length;

    int beforeHead = (mMetaHead + metaLen - 1) % metaLen;
    return (mPacketStart[beforeHead] + mPacketLength[beforeHead] + 1) % dataLen;
}

Why "+1" in the return statement? Is this intentional?
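For reference, the packing arithmetic described in the comment above ("start of the previous packet's data plus the previous packet's length"), without any extra offset, places each chunk immediately after the previous one modulo the buffer capacity; adding 1 starts the next chunk one byte past that. A minimal illustration (hypothetical names, not the Grafika class):

```java
// Ring-buffer packing arithmetic for the next chunk's start offset.
class RingOffsets {
    // Back-to-back packing: next chunk begins exactly where the previous ends.
    static int nextStart(int prevStart, int prevLen, int capacity) {
        return (prevStart + prevLen) % capacity;
    }

    // The "+1" variant from the snippet above: leaves a one-byte gap.
    static int nextStartWithGap(int prevStart, int prevLen, int capacity) {
        return (prevStart + prevLen + 1) % capacity;
    }
}
```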

Increasing the strength of the blur filter

Hello Fadden,

I asked a question about real-time-blur on Stack Overflow and was delighted when you responded with this Grafika tool.

My only issue is that I notice the blur is not as strong as I'd like it to be,
and I've been playing around with this:

    case CameraCaptureActivity.FILTER_BLUR:
        programType = Texture2dProgram.ProgramType.TEXTURE_EXT_FILT;
        kernel = new float[] {
                1f/16f, 2f/16f, 1f/16f,
                2f/16f, 4f/16f, 2f/16f,
                1f/16f, 2f/16f, 1f/16f };
        break;

but I can't quite figure out what to change in that 3x3 blur kernel matrix (or whatever those fractions represent) to make the blur stronger.
Any idea(s)?
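For context (illustrative Java, not part of Grafika; class and method names are hypothetical): the nine fractions are a normalized 3x3 convolution kernel, a standard 1-2-1 Gaussian approximation whose weights sum to 1 so overall brightness is preserved. Keeping the sum at 1 while flattening the weights (e.g. a box blur where every weight is 1/9) spreads each pixel's influence more evenly and blurs somewhat more; a substantially stronger blur needs a larger kernel or repeated blur passes.

```java
// Two normalized 3x3 blur kernels. Both sum to 1, so neither changes
// brightness; the box kernel weights neighbors more heavily and blurs more.
class BlurKernels {
    // The 1-2-1 Gaussian approximation from the snippet above.
    static float[] gaussian3x3() {
        return new float[] { 1f/16f, 2f/16f, 1f/16f,
                             2f/16f, 4f/16f, 2f/16f,
                             1f/16f, 2f/16f, 1f/16f };
    }

    // Box blur: every tap weighted equally.
    static float[] box3x3() {
        float[] k = new float[9];
        java.util.Arrays.fill(k, 1f / 9f);
        return k;
    }

    static float sum(float[] k) {
        float s = 0f;
        for (float v : k) s += v;
        return s;
    }
}
```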

Thanks again!

Steve

Deformed preview in show and capture activity?

Hello again,

This isn't really a bug per se; I just wanted to ask why the preview in the show + capture activity seems deformed on a Nexus 7. I'm asking here so that others can see this as well.
It's like the preview has been stretched a bit, maybe from a square to a rectangle?
Is this caused by the openCamera(1280, 720); line in the onResume part of the activity?
Is there a way to get the device's/screen's optimal resolution to avoid such distortions?

Thank you :)

How to switch front-back camera in CameraCaptureActivity ?

I want to switch between the front and back camera in CameraCaptureActivity (src/com/android/grafika/CameraCaptureActivity.java), as follows:

public boolean switchCamera() {
        releaseCamera();
        mGLView.onPause();
        if (mReqCameraId == Camera.CameraInfo.CAMERA_FACING_BACK) {
            mReqCameraId = Camera.CameraInfo.CAMERA_FACING_FRONT;
        } else {
            mReqCameraId = Camera.CameraInfo.CAMERA_FACING_BACK;
        }
        openCamera(mReqCameraId);

        mGLView.onResume();
        mGLView.queueEvent(new Runnable() {
            @Override
            public void run() {
                mRenderer.setCameraPreviewSize(mCameraPreviewWidth, mCameraPreviewHeight);
            }
        });
        return true;
    }

It works, but the FOV changes when I go back to the camera that was launched first. It seems the frame has been clipped.

So what did I miss when switching between the front and back cameras?

Thanks.

MoviePlayer doesn't work for an HTTP URL

The MoviePlayer class works fine for a video loaded from the SD card, but it fails with the following error when I try to load a video from a URL. One of the URLs I tried is this:
http://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4

I am running the code on Android M.

I/MediaHTTPConnection: proxyName: 0.0.0.0 0
13110-14961/com.android.grafika E/NuCachedSource2: source returned error -1, 0 retries left
13110-13110/com.android.grafika E/QComExtractorFactory: Sniff FAIL :: coundn't pull enough data for sniffing
13110-13110/com.android.grafika E/Grafika: Unable to play movie

        java.io.IOException: Failed to instantiate extractor.
        at android.media.MediaExtractor.nativeSetDataSource(Native Method)
        at android.media.MediaExtractor.setDataSource(MediaExtractor.java:182)
        at com.android.grafika.MoviePlayer.<init>(MoviePlayer.java:113)
        at com.android.grafika.PlayMovieActivity.clickPlayStop(PlayMovieActivity.java:189)
     at java.lang.reflect.Method.invoke(Native Method)
     at android.view.View$DeclaredOnClickListener.onClick(View.java:4450)
     at android.view.View.performClick(View.java:5201)
     at android.view.View$PerformClick.run(View.java:21163)
     at android.os.Handler.handleCallback(Handler.java:746)
     at android.os.Handler.dispatchMessage(Handler.java:95)
     at android.os.Looper.loop(Looper.java:148)
     at android.app.ActivityThread.main(ActivityThread.java:5443)
     at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:728)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:618)

"Continuous Capture" Activity mishandles unblank

In the continuous capture Activity, when you press the power button and then press it again, continuous capture no longer works. The reason seems to be that the SurfaceView has not been destroyed, so we need to do the init in the surfaceCreated callback again after onResume if the surface is still alive.

Crash in Show and Capture camera

If I comment out line 121, ContentManager.getInstance().createAll(this); (where the app crashes), the list menu appears correctly. If I then choose Show + capture camera, it shows the preview correctly. I then press the "Record" button, get the following exception, and the app crashes.

    03-28 14:10:16.726: I/OMXClient(1469): Using client-side OMX mux.
    03-28 14:10:16.726: I/SoftAVCEncoder(1469): Construct SoftAVCEncoder
    03-28 14:10:16.726: I/ACodec(1469): setupVideoEncoder succeeded
    03-28 14:10:16.726: E/OMXNodeInstance(1469): createInputSurface requires AndroidOpaque color format
    03-28 14:10:16.726: E/ACodec(1469): [OMX.google.h264.encoder] onCreateInputSurface returning error -38
    03-28 14:10:16.726: W/MediaCodec(1469): createInputSurface failed, err=-38
    03-28 14:10:16.726: W/dalvikvm(1469): threadid=12: thread exiting with uncaught exception (group=0xa4bfd648)
    03-28 14:10:16.730: E/AndroidRuntime(1469): FATAL EXCEPTION: TextureMovieEncoder
    03-28 14:10:16.730: E/AndroidRuntime(1469): java.lang.IllegalStateException
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at android.media.MediaCodec.createInputSurface(Native Method)
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at com.android.grafika.VideoEncoderCore.<init>(VideoEncoderCore.java:79)
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at com.android.grafika.TextureMovieEncoder.prepareEncoder(TextureMovieEncoder.java:381)
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at com.android.grafika.TextureMovieEncoder.handleStartRecording(TextureMovieEncoder.java:312)
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at com.android.grafika.TextureMovieEncoder.access$0(TextureMovieEncoder.java:309)
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at com.android.grafika.TextureMovieEncoder$EncoderHandler.handleMessage(TextureMovieEncoder.java:281)
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at android.os.Handler.dispatchMessage(Handler.java:99)
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at android.os.Looper.loop(Looper.java:137)
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at com.android.grafika.TextureMovieEncoder.run(TextureMovieEncoder.java:248)
    03-28 14:10:16.730: E/AndroidRuntime(1469):     at java.lang.Thread.run(Thread.java:841)
    03-28 14:10:16.734: D/dalvikvm(1469): GC_FOR_ALLOC freed 110K, 7% free 4238K/4548K, paused 3ms, total 3ms

Expected fps camera capture

I am having an issue getting to 30 fps (1280x720) on my device. The MediaRecorder API is capable of this frame rate, but the camera capture sample peaks around 19 fps. Smaller resolutions seem to be better. Is this expected? Where in the code would I look for optimizations (if you think there may be some)?

Add glClear in Texture2dProgram

Hello :)

Just a small suggestion.
In Texture2dProgram, the draw method should do the following before it starts drawing:

    GLES20.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);

This clears the buffer and avoids leaving residue from previous draws.

Loading bitmap texture on encoded video

Hi, thanks for the useful, awesome example!
I want to draw a bitmap image on my recorded video, instead of the box in TextureMovieEncoder (like a watermark in the video, not the camera preview).

private void drawBox() {
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    Bitmap waterMark = BitmapFactory.decodeResource(mContext.getResources(), R.drawable.ic_launcher);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
    GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, waterMark, 0);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    waterMark.recycle();
}

Why does this not work, and how can I make it work?

Camera2 API

With the Camera2 API being the way to go in the future, are there any plans to update this repo to use the Camera2 API? I would really look forward to using the ContinuousCapture feature in the future as well.

Crash on 4.4.2 and 4.3

Cool project first of all. :) I find it very useful.

Now, whenever I build and launch it on a Nexus 4 with 4.4.2 or a Moto X with 4.3, it crashes with the following.
Is this a bug, or am I doing something wrong?

    03-28 14:04:44.331: E/WindowManager(1241): android.view.WindowLeaked: Activity com.android.grafika.MainActivity has leaked window com.android.internal.policy.impl.PhoneWindow$DecorView{52837850 V.E..... R....... 0,0-729,276} that was originally added here
    03-28 14:04:44.331: E/WindowManager(1241):  at android.view.ViewRootImpl.<init>(ViewRootImpl.java:348)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.view.WindowManagerGlobal.addView(WindowManagerGlobal.java:248)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.view.WindowManagerImpl.addView(WindowManagerImpl.java:69)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.app.Dialog.show(Dialog.java:286)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.app.AlertDialog$Builder.show(AlertDialog.java:951)
    03-28 14:04:44.331: E/WindowManager(1241):  at com.android.grafika.ContentManager.prepareContent(ContentManager.java:126)
    03-28 14:04:44.331: E/WindowManager(1241):  at com.android.grafika.ContentManager.createAll(ContentManager.java:113)
    03-28 14:04:44.331: E/WindowManager(1241):  at com.android.grafika.MainActivity.onCreate(MainActivity.java:121)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.app.Activity.performCreate(Activity.java:5231)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1087)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2159)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2245)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.app.ActivityThread.access$800(ActivityThread.java:135)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1196)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.os.Handler.dispatchMessage(Handler.java:102)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.os.Looper.loop(Looper.java:136)
    03-28 14:04:44.331: E/WindowManager(1241):  at android.app.ActivityThread.main(ActivityThread.java:5017)
    03-28 14:04:44.331: E/WindowManager(1241):  at java.lang.reflect.Method.invokeNative(Native Method)
    03-28 14:04:44.331: E/WindowManager(1241):  at java.lang.reflect.Method.invoke(Method.java:515)
    03-28 14:04:44.331: E/WindowManager(1241):  at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:779)
    03-28 14:04:44.331: E/WindowManager(1241):  at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:595)
    03-28 14:04:44.331: E/WindowManager(1241):  at dalvik.system.NativeStart.main(Native Method)

How can I crop the preview and encode

I want to record a 640x640 video. I tried to set the video size in CircularEncoder and ContinuousCapture.
The video size is 640x640, but the video is stretched. What do I need to change to achieve that? Also, the video is rotated; how can I handle the rotation?

Thanks
Amlan.

How to play video in slow motion for selected intervals

I have run your project (https://github.com/google/grafika). In it, I played a video in slow motion by changing the Play video (TextureView) activity's fixed playback rate:
SpeedControlCallback callback = new SpeedControlCallback();
callback.setFixedPlaybackRate(1);
It played the whole video slowly. What I actually need is to play the video slowly only for selected intervals, not the entire video. Is it possible to set the regions (intervals)?

Change the video speed

I'm grateful to have found such good content.
I want to know whether it's possible to change the video speed and then produce a new video file (at a different speed).
Many thanks.

Failed to stop the muxer (timestamps problem?)

I have been at this for 2 days and can't run your app; it fails with an "Unable to generate content" dialog, and this shows in the logcat. My device is a Samsung tablet running CyanogenMod 12.1 (Lollipop 5.1).
Any ideas?

04-17 15:11:12.552    4577-4606/com.android.grafika E/MPEG4Writer﹕ timestampUs 125000 < lastTimestampUs 375000 for Video track
04-17 15:11:12.557    2184-2474/? W/GraphicBufferSource﹕ Dropped back down to Loaded without Executing
04-17 15:11:12.598    2184-2184/? E/BufferQueueProducer﹕ [GraphicBufferSource] cancelBuffer: BufferQueue has been abandoned
04-17 15:11:12.602    4577-4596/com.android.grafika D/MPEG4Writer﹕ Video track stopping
04-17 15:11:12.602    4577-4596/com.android.grafika D/MPEG4Writer﹕ Video track source stopping
04-17 15:11:12.602    4577-4596/com.android.grafika D/MPEG4Writer﹕ Video track source stopped
04-17 15:11:12.602    4577-4596/com.android.grafika D/MPEG4Writer﹕ Stopping writer thread
04-17 15:11:12.602    4577-4605/com.android.grafika D/MPEG4Writer﹕ 0 chunks are written in the last batch
04-17 15:11:12.603    4577-4596/com.android.grafika D/MPEG4Writer﹕ Writer thread stopped
04-17 15:11:12.605    4577-4596/com.android.grafika W/Grafika﹕ Failed while generating content
    java.lang.IllegalStateException: Failed to stop the muxer
            at android.media.MediaMuxer.nativeStop(Native Method)
            at android.media.MediaMuxer.stop(MediaMuxer.java:225)
            at com.android.grafika.GeneratedMovie.releaseEncoder(GeneratedMovie.java:145)
            at com.android.grafika.MovieEightRects.create(MovieEightRects.java:74)
            at com.android.grafika.ContentManager.prepare(ContentManager.java:154)
            at com.android.grafika.ContentManager.access$000(ContentManager.java:39)
            at com.android.grafika.ContentManager$GenerateTask.doInBackground(ContentManager.java:235)
            at com.android.grafika.ContentManager$GenerateTask.doInBackground(ContentManager.java:203)
            at android.os.AsyncTask$2.call(AsyncTask.java:292)
            at java.util.concurrent.FutureTask.run(FutureTask.java:237)
            at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
            at java.lang.Thread.run(Thread.java:818)
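A common mitigation (a sketch only; it does not fix the underlying timestamp problem, which is that the encoder on this device emits non-monotonic presentation times) is to guard the stop() call so a bad recording degrades gracefully instead of crashing. `mMuxer` and `TAG` stand in for the fields in GeneratedMovie:

```java
// Sketch: MediaMuxer.stop() throws IllegalStateException when the track
// received timestamps that go backwards (as in the MPEG4Writer log above).
// Catching it keeps the app alive, but the output file is likely unusable;
// the real fix is to feed strictly increasing presentationTimeUs values.
try {
    mMuxer.stop();
} catch (IllegalStateException ise) {
    Log.w(TAG, "muxer.stop() failed (bad timestamps?)", ise);
} finally {
    mMuxer.release();
}
```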

ContinuousCaptureActivity & fragment shader (filter) problem

Hello,
I am developing an app based on ContinuousCaptureActivity.

I added filter effects to ContinuousCaptureActivity (filter code taken from CameraCaptureActivity), and the following exception occurs when the filter is changed:

05-28 01:26:18.116: E/AndroidRuntime(1476): java.lang.RuntimeException: glUseProgram: glError 0x501

Is it because ContinuousCaptureActivity uses a SurfaceView? Do you have any suggestions for a solution?

Thank you.

CameraCaptureActivity bug report

When I click the "Start recording" button in CameraCaptureActivity, the crash always happens.

My phone is a HUAWEI P6-U06, SDK 19.

here is the log:
06-12 10:22:37.030: E/AndroidRuntime(6301): FATAL EXCEPTION: TextureMovieEncoder
06-12 10:22:37.030: E/AndroidRuntime(6301): Process: com.android.grafika, PID: 6301
06-12 10:22:37.030: E/AndroidRuntime(6301): java.lang.RuntimeException: eglCreateContext: EGL error: 0x3009
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.gles.EglCore.checkEglError(EglCore.java:370)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.gles.EglCore.<init>(EglCore.java:126)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder.prepareEncoder(TextureMovieEncoder.java:385)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder.handleStartRecording(TextureMovieEncoder.java:312)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder.access$0(TextureMovieEncoder.java:309)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder$EncoderHandler.handleMessage(TextureMovieEncoder.java:281)
06-12 10:22:37.030: E/AndroidRuntime(6301): at android.os.Handler.dispatchMessage(Handler.java:102)
06-12 10:22:37.030: E/AndroidRuntime(6301): at android.os.Looper.loop(Looper.java:136)
06-12 10:22:37.030: E/AndroidRuntime(6301): at com.android.grafika.TextureMovieEncoder.run(TextureMovieEncoder.java:248)
06-12 10:22:37.030: E/AndroidRuntime(6301): at java.lang.Thread.run(Thread.java:841)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 1 is not owned by the client (state=3)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 3 is not owned by the client (state=0)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 0 is not owned by the client (state=0)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 1 is not owned by the client (state=3)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 2 is not owned by the client (state=0)
06-12 10:22:37.050: E/BufferQueue(6301): [unnamed-6301-0] cancelBuffer: slot 3 is not owned by the client (state=0)

Does this phone not support the required OpenGL ES features? Is there any way to avoid this?
I appreciate your answer.

Recorded videos report a frame rate of 59.58 while they are actually 29.8

I record videos with the capture activity and then play them back, which works fine. I want to do some video manipulation based on the frame rate, but it won't work because most of the videos report the wrong frame rate. I am using a Nexus 5 with 4.4.2 and the latest Grafika code.

Show + capture activity

I am unable to apply the emboss, sharpen, blur, and edge-detection filters to the full preview.

If anyone has an idea about this, please let me know.

MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface failed on Samsung SM-N9100 | Galaxy Note 4

I tried using VideoEncoderCore.java to do hardware encoding on an SM-N9100 (Galaxy Note 4), but I got the following error:
09-10 17:28:39.809: D/VideoEncoderCore(15144): format: {height=480, width=640, bitrate=500000, mime=video/avc, frame-rate=20, i-frame-interval=1, color-format=2130708361}
09-10 17:28:39.809: I/ACodec(15144): [] Now uninitialized
09-10 17:28:39.809: I/OMXClient(15144): Using client-side OMX mux.
09-10 17:28:39.929: I/ACodec(15144): [OMX.qcom.video.encoder.avc] Now Loaded
09-10 17:28:39.929: E/ACodec(15144): onConfigureComponent mime.c_str() = video/avc
09-10 17:28:39.929: E/ACodec(15144): [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -2147483648
09-10 17:28:39.929: W/ACodec(15144): do not know color format 0x7fa30c04 = 2141391876
09-10 17:28:39.929: W/ACodec(15144): do not know color format 0x7f000789 = 2130708361
09-10 17:28:39.939: I/ACodec(15144): [OMX.qcom.video.encoder.avc] setupVideoEncoder succeeded
09-10 17:28:39.939: W/ACodec(15144): do not know color format 0x7f000789 = 2130708361
09-10 17:28:39.939: I/ACodec(15144): [OMX.qcom.video.encoder.avc] Now Loaded->Idle
09-10 17:28:39.949: I/ACodec(15144): [OMX.qcom.video.encoder.avc] Now Idle->Executing
09-10 17:28:39.949: I/ACodec(15144): [OMX.qcom.video.encoder.avc] Now Executing
09-10 17:28:39.959: D/EglCore(15144): EGLContext created, client version 3
09-10 17:28:39.979: D/Texture2dProgram(15144): Created program 6 (TEXTURE_EXT)
09-10 17:28:39.979: D/VideoEncoderCore(15144): drainEncoder(false)
09-10 17:28:39.999: D/VideoEncoderCore(15144): drainEncoder(false)
09-10 17:28:40.009: D/VideoEncoderCore(15144): drainEncoder(false)
09-10 17:28:40.029: D/VideoEncoderCore(15144): drainEncoder(false)
09-10 17:28:40.029: D/VideoEncoderCore(15144): encoder output format changed: {height=480, width=640, csd-1=java.nio.ByteArrayBuffer[position=0,limit=8,capacity=8], mime=video/avc, what=1869968451, csd-0=java.nio.ByteArrayBuffer[position=0,limit=17,capacity=17]}

An error dialog appears when Grafika launches

When I launched Grafika, a dialog appeared saying "Failed to generate content. Some features may be unavailable. Can't use input surface with software codec: OMX.google.h264.encoder".

Then I clicked the OK button and entered the continuous capture activity, and the app crashed.

My device information is as follows:
Model: Lenovo A560
Android version: 4.3

I believe this is the key log:
06-29 21:27:50.267 E/OMXMaster(17347): A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
06-29 21:27:50.267 I/SoftAVCEncoder(17347): Construct SoftAVCEncoder
06-29 21:27:50.267 D/ResourceManager(17347): findUseCaseAndSetParameter - mime=video/avc,componentName=OMX.google.h264.encoder,isDecoder=0
06-29 21:27:50.267 D/ResourceManager(17347): findUseCaseAndSetParameter-useCase =,useCaseFlag = 0, codecFlags = 0
06-29 21:27:50.267 D/ResourceManager(17347): mime = video/avc, componentName = OMX.google.h264.encoder, isDecoder = 0
06-29 21:27:50.267 D/ResourceManager(17347): software video useCase =
06-29 21:27:50.267 I/ACodec (17347): setupVideoEncoder succeeded
06-29 21:27:50.267 E/OMXNodeInstance(17347): createInputSurface requires AndroidOpaque color format
06-29 21:27:50.267 E/ACodec (17347): [OMX.google.h264.encoder] onCreateInputSurface returning error -38
06-29 21:27:50.267 W/MediaCodec(17347): createInputSurface failed, err=-38
06-29 21:27:50.277 D/KeyguardUpdateMonitor( 1249): sendKeyguardVisibilityChanged(true)
06-29 21:27:50.277 D/KeyguardUpdateMonitor( 1249): handleKeyguardVisibilityChanged(1)
06-29 21:27:50.277 D/AndroidRuntime(17347): Shutting down VM
06-29 21:27:50.277 W/dalvikvm(17347): threadid=1: thread exiting with uncaught exception (group=0x415ac8b0)
06-29 21:27:50.297 E/AndroidRuntime(17347): FATAL EXCEPTION: main
06-29 21:27:50.297 E/AndroidRuntime(17347): java.lang.IllegalStateException
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.media.MediaCodec.createInputSurface(Native Method)
06-29 21:27:50.297 E/AndroidRuntime(17347): at com.android.grafika.CircularEncoder.<init>(CircularEncoder.java:124)
06-29 21:27:50.297 E/AndroidRuntime(17347): at com.android.grafika.ContinuousCaptureActivity.surfaceCreated(ContinuousCaptureActivity.java:383)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.SurfaceView.updateWindow(SurfaceView.java:571)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.SurfaceView.access$000(SurfaceView.java:86)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.SurfaceView$3.onPreDraw(SurfaceView.java:175)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:833)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:1860)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1004)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:5481)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.Choreographer$CallbackRecord.run(Choreographer.java:749)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.Choreographer.doCallbacks(Choreographer.java:562)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.Choreographer.doFrame(Choreographer.java:532)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:735)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.os.Handler.handleCallback(Handler.java:730)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.os.Handler.dispatchMessage(Handler.java:92)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.os.Looper.loop(Looper.java:137)
06-29 21:27:50.297 E/AndroidRuntime(17347): at android.app.ActivityThread.main(ActivityThread.java:5136)
06-29 21:27:50.297 E/AndroidRuntime(17347): at java.lang.reflect.Method.invokeNative(Native Method)
06-29 21:27:50.297 E/AndroidRuntime(17347): at java.lang.reflect.Method.invoke(Method.java:525)
06-29 21:27:50.297 E/AndroidRuntime(17347): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:737)
06-29 21:27:50.297 E/AndroidRuntime(17347): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:553)
06-29 21:27:50.297 E/AndroidRuntime(17347): at dalvik.system.NativeStart.main(Native Method)
06-29 21:27:50.317 I/ActivityManager( 1249): Notify an ApplicationCrash

Dear fadden, how can I fix this issue?
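The log shows the framework picked the software encoder (OMX.google.h264.encoder), which rejects createInputSurface(). One possible workaround (a sketch using the API 16-era MediaCodecList API that Grafika's minimum SDK supports; the "OMX.google." prefix test is only a heuristic, not a guaranteed software/hardware distinction) is to select a hardware encoder explicitly instead of calling createEncoderByType():

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

// Sketch: find a hardware encoder for the given MIME type, skipping the
// software codecs that may reject input surfaces on devices like this one.
public static MediaCodecInfo selectHardwareEncoder(String mimeType) {
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder() || info.getName().startsWith("OMX.google.")) {
            continue;   // heuristic: "OMX.google." names are software codecs
        }
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase(mimeType)) {
                return info;   // use with MediaCodec.createByCodecName(info.getName())
            }
        }
    }
    return null;   // no hardware encoder: surface input likely won't work at all
}
```

If this returns null, the device simply has no surface-capable AVC encoder and the activity would need a buffer-input fallback or a friendly error.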

No audio in DoubleDecodeActivity

I am not able to play the sound in a video. I used DoubleDecodeActivity to show videos in TextureViews.
Is there any solution for this?

I want to play four copies of the same video at different angles.

Android MediaExtractor crash when decoding some mp4 files. Libc fatal signal 11

Hi, I am using the double decode example to load videos into a TextureView.

The code works most of the time, but for some mp4 files it crashes, giving only:

libc: Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 7998

The same code works for WebM files and most mp4 files. The place where I expect it to fail is:

extractor = new MediaExtractor();
extractor.setDataSource(sourceFile);

in MoviePlayer [L:113]

Any hints on how to work around this, or what the problem might be?
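One diagnostic step (a sketch; "Probe" is an arbitrary log tag) is to open each file with MediaExtractor alone and log its track formats before any decoding. Java cannot catch a native SIGSEGV, so this won't make the crash recoverable, but it isolates whether setDataSource() itself is at fault and may reveal what the failing mp4 files have in common:

```java
import java.io.File;
import java.io.IOException;

import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.util.Log;

// Sketch: probe the container's tracks without creating a decoder.
static void probeTracks(File sourceFile) {
    MediaExtractor extractor = new MediaExtractor();
    try {
        extractor.setDataSource(sourceFile.toString());
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            Log.d("Probe", "track " + i + ": " + format.getString(MediaFormat.KEY_MIME));
        }
    } catch (IOException ioe) {
        Log.w("Probe", "unable to parse " + sourceFile, ioe);
    } finally {
        extractor.release();
    }
}
```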

Feature request: latency test

Some sort of latency exerciser would be handy. Examples include "pointer trails" to see how far behind the touch input we get, and a big sliding box that tries to move with your finger. For the latter I'm thinking of something like Google Maps but with just a simple texture instead of network tiles.

The goal is to have a test app that shows the benefits of DispSync and eglPresentationTimeANDROID.

Background: http://stackoverflow.com/questions/26317132/minimize-android-glsurfaceview-lag/

Working with NDK

For the "GL Recorder" activity, can the rendering be done in native C++ while using the rest of the implementation as is?

E.g:

GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, javaFBO);
nativeRender(...);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
drawToScreen();

Great job on this sandbox, learning a lot from it!

Thanks,
Cristina

Recorded video output quality is poor

Please bear with my poor English.

I tried full-screen recording from CameraCaptureActivity, but I got poor-quality output.
How can I get better-quality output?
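Output quality in this style of recording is dominated by the encoder bitrate. A sketch of the MediaFormat a VideoEncoderCore-style encoder is configured with is shown below; the 8 Mbps value is illustrative, not Grafika's default, and should be tuned to the resolution (higher for 1080p):

```java
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

// Sketch: encoder configuration for surface-input recording.
// Raising KEY_BIT_RATE is usually the first thing to try for quality.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 8000000);   // ~8 Mbps, illustrative
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
```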

Playing multiple videos simultaneously

I notice there is a double-decode example in Grafika. A feature I need is to play multiple videos simultaneously, say 4 or 6, as if we had a single player; basically the players are glued together. Is it possible to achieve that? Can it be generalized to N players, or does it only work for 2?


Note that in my case, each of the four videos is in fact a tile of the original video, so you see a single video if the sync is good. Can you provide a sample for this scenario?

Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0

Hello.

We are developing an app using Grafika.

However, on some devices the crash occurs in the part shown below.
Is the problem a bug in the device or its OS?

Is there a way to fix it?

[crash log]

12-07 18:49:58.398 22288 24445 F libc : Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 24445 (TextureMovieEnc)
12-07 18:49:58.479 476 476 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
12-07 18:49:58.483 476 476 F DEBUG : Build fingerprint: 'google/bullhead/bullhead:6.0/MDA89E/2296692:user/release-keys'
12-07 18:49:58.483 476 476 F DEBUG : Revision: 'rev_1.0'
12-07 18:49:58.484 476 476 F DEBUG : ABI: 'arm'
12-07 18:49:58.485 476 476 F DEBUG : pid: 22288, tid: 24445, name: TextureMovieEnc
12-07 18:49:58.485 476 476 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
12-07 18:49:58.576 476 476 F DEBUG : r0 00000000 r1 00000000 r2 d60c6670 r3 00000001
12-07 18:49:58.576 476 476 F DEBUG : r4 d5fad5c0 r5 dc896000 r6 00000006 r7 00000000
12-07 18:49:58.577 476 476 F DEBUG : r8 00000002 r9 d60cb800 sl d2b70480 fp 00000000
12-07 18:49:58.577 476 476 F DEBUG : ip 00000000 sp d4a8f338 lr e9d9ca7d pc e9d9cad6 cpsr a00e0030
12-07 18:49:58.617 476 476 F DEBUG :
12-07 18:49:58.617 476 476 F DEBUG : backtrace:
12-07 18:49:58.618 476 476 F DEBUG : #00 pc 000c7ad6 /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlBindTexture(unsigned int, unsigned int)+361)
12-07 18:49:58.618 476 476 F DEBUG : #1 pc 7442301d /data/dalvik-cache/arm/system@framework@boot.oat (offset 0x1ec4000)

12-07 19:19:48.344 28977 32344 F libc : Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 32344 (TextureMovieEnc)
12-07 19:19:48.436 476 476 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
12-07 19:19:48.438 476 476 F DEBUG : Build fingerprint: 'google/bullhead/bullhead:6.0/MDA89E/2296692:user/release-keys'
12-07 19:19:48.438 476 476 F DEBUG : Revision: 'rev_1.0'
12-07 19:19:48.438 476 476 F DEBUG : ABI: 'arm'
12-07 19:19:48.438 476 476 F DEBUG : pid: 28977, tid: 32344, name: TextureMovieEnc
12-07 19:19:48.438 476 476 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
12-07 19:19:48.734 476 476 F DEBUG : r0 00000000 r1 00000000 r2 00000000 r3 00000006
12-07 19:19:48.734 476 476 F DEBUG : r4 00000001 r5 d768e090 r6 00000000 r7 d7743540
12-07 19:19:48.734 476 476 F DEBUG : r8 00000002 r9 d4520000 sl 00000001 fp d5ea00c0
12-07 19:19:48.734 476 476 F DEBUG : ip 00000005 sp d7335120 lr e9e3cdc1 pc e9e3cdc8 cpsr 800e0030
12-07 19:19:48.752 476 476 F DEBUG :
12-07 19:19:48.752 476 476 F DEBUG : backtrace:
12-07 19:19:48.753 476 476 F DEBUG : #00 pc 00167dc8 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::UpdateTextureSampler(EsxSamplerDesc const_, A4xTextureObject const_, A4xSamplerObject const_)+187)
12-07 19:19:48.753 476 476 F DEBUG : #1 pc 001680a3 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateTexSamplersCommon(A4xProgram_, int, EsxBitField96_)+302)
12-07 19:19:48.753 476 476 F DEBUG : #2 pc 001683cb /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateTexSamplers()+58)
12-07 19:19:48.754 476 476 F DEBUG : #3 pc 00166ab5 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateState(EsxDrawDescriptor const_)+1564)
12-07 19:19:48.754 476 476 F DEBUG : #4 pc 00166e7d /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::HwValidateGfxState(EsxDrawDescriptor const_)+4)
12-07 19:19:48.754 476 476 F DEBUG : #5 pc 000e6f0b /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::ValidateGfxState(EsxDrawDescriptor const_)+442)
12-07 19:19:48.754 476 476 F DEBUG : #6 pc 000dc509 /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::DrawArraysInstanced(EsxPrimType, int, unsigned int, unsigned int)+112)
12-07 19:19:48.755 476 476 F DEBUG : #7 pc 000c8fff /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlDrawArrays(unsigned int, int, int)+62)
12-07 19:19:48.755 476 476 F DEBUG : #8 pc 744237a5 /data/dalvik-cache/arm/system@framework@boot.oat (offset 0x1ec4000)

12-07 20:17:23.719 4545 21239 F libc : Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 21239 (TextureMovieEnc)
12-07 20:17:23.801 476 476 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
12-07 20:17:23.803 476 476 F DEBUG : Build fingerprint: 'google/bullhead/bullhead:6.0/MDA89E/2296692:user/release-keys'
12-07 20:17:23.804 476 476 F DEBUG : Revision: 'rev_1.0'
12-07 20:17:23.804 476 476 F DEBUG : ABI: 'arm'
12-07 20:17:23.804 476 476 F DEBUG : pid: 4545, tid: 21239, name: TextureMovieEnc
12-07 20:17:23.804 476 476 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
12-07 20:17:23.833 476 476 W debuggerd: type=1400 audit(0.0:3436): avc: denied { search } dev="dm-2" ino=384999 scontext=u:r:debuggerd:s0 tcontext=u:object_r:app_data_file:s0:c512,c768 tclass=dir permissive=0
12-07 20:17:23.870 476 476 F DEBUG : r0 00000000 r1 00000000 r2 00000000 r3 00000001
12-07 20:17:23.870 476 476 F DEBUG : r4 d6ac3200 r5 00000000 r6 00000001 r7 d7d7e25c
12-07 20:17:23.871 476 476 F DEBUG : r8 00000001 r9 d6517000 sl d7d2bd60 fp d65192a4
12-07 20:17:23.871 476 476 F DEBUG : ip 00000005 sp d3053d10 lr e9e31285 pc e9e31290 cpsr 800e0030
12-07 20:17:23.951 476 476 F DEBUG :
12-07 20:17:23.951 476 476 F DEBUG : backtrace:
12-07 20:17:23.952 476 476 F DEBUG : #00 pc 0015c290 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateYuvConversionConstants(A4xProgram const_, int)+307)
12-07 20:17:23.952 476 476 F DEBUG : #1 pc 0015c5df /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateNamedUniformConstants()+570)
12-07 20:17:23.952 476 476 F DEBUG : #2 pc 001667d7 /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateState(EsxDrawDescriptor const_)+830)
12-07 20:17:23.953 476 476 F DEBUG : #3 pc 00166e7d /vendor/lib/egl/libGLESv2_adreno.so (A4xContext::HwValidateGfxState(EsxDrawDescriptor const_)+4)
12-07 20:17:23.943 476 476 W debuggerd: type=1400 audit(0.0:3437): avc: denied { read } for name="kgsl-3d0" dev="tmpfs" ino=8778 scontext=u:r:debuggerd:s0 tcontext=u:object_r:gpu_device:s0 tclass=chr_file permissive=0
12-07 20:17:23.953 476 476 W debuggerd: type=1400 audit(0.0:3438): avc: denied { read } for name="kgsl-3d0" dev="tmpfs" ino=8778 scontext=u:r:debuggerd:s0 tcontext=u:object_r:gpu_device:s0 tclass=chr_file permissive=0
12-07 20:17:23.953 476 476 F DEBUG : #4 pc 000e6f0b /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::ValidateGfxState(EsxDrawDescriptor const_)+442)
12-07 20:17:23.953 476 476 F DEBUG : #5 pc 000dc509 /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::DrawArraysInstanced(EsxPrimType, int, unsigned int, unsigned int)+112)
12-07 20:17:23.953 476 476 F DEBUG : #6 pc 000c8fff /vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlDrawArrays(unsigned int, int, int)+62)
12-07 20:17:23.953 476 476 F DEBUG : #7 pc 744237a5 /data/dalvik-cache/arm/system@framework@boot.oat (offset 0x1ec4000)

F/libc ( 4117): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 4459 (TextureMovieEnc)
I/DEBUG ( 355): *** *** *** *** *** *** *** *** *** *** *** *** *** *** ** ***
I/DEBUG ( 355): Build fingerprint: 'google/shamu/shamu:5.1.1/LMY47Z/1860966:user/release-keys'
I/DEBUG ( 355): Revision: '33696'
I/DEBUG ( 355): ABI: 'arm'
I/DEBUG ( 355): pid: 4117, tid: 4459, name: TextureMovieEnc
I/DEBUG ( 355): signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
I/DEBUG ( 355): r0 9baa2600 r1 00000000 r2 00000000 r3 00000000
I/DEBUG ( 355): r4 00000001 r5 9bff3100 r6 9baa2600 r7 aecc1e5c
I/DEBUG ( 355): r8 9be42f28 r9 00000001 sl 9bff3100 fp 9be42f00
I/DEBUG ( 355): ip 9bb30e00 sp 93f7b6a0 lr aac04477 pc aac04240 cpsr 200b8c30
I/DEBUG ( 355):
I/DEBUG ( 355): backtrace:
I/DEBUG ( 355): #00 pc 00100240 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxResource::GfxMem(unsigned int) const+7)
I/DEBUG ( 355): #1 pc 00100473 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxResource::UpdateGfxMemReference(EsxCmdMgr_, unsigned int, EsxAccessType)+18)
I/DEBUG ( 355): #2 pc 00100537 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxResource::UpdatePackedGfxMemReference(EsxCmdMgr_, EsxSubResourceRange const_, EsxAccessType)+170)
I/DEBUG ( 355): #3 pc 0014b357 /system/vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateTexSamplers()+434)
I/DEBUG ( 355): #4 pc 00149e97 /system/vendor/lib/egl/libGLESv2_adreno.so (A4xContext::ValidateState(EsxDrawDescriptor const_)+1394)
I/DEBUG ( 355): #5 pc 0014a0a1 /system/vendor/lib/egl/libGLESv2_adreno.so (A4xContext::HwValidateGfxState(EsxDrawDescriptor const_)+8)
I/DEBUG ( 355): #6 pc 00114235 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxContext::ValidateGfxState(EsxDrawDescriptor const_)+420)
I/DEBUG ( 355): #7 pc 00117211 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxContext::DrawArraysInstanced(EsxPrimType, int, unsigned int, unsigned int)+92)
I/DEBUG ( 355): #8 pc 000b2e81 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlDrawArrays(unsigned int, int, int)+52)
I/DEBUG ( 355): #9 pc 000e54eb /system/vendor/lib/egl/libGLESv2_adreno.so (EsxGlApiParamValidate::GlDrawArrays(EsxDispatch_, unsigned int, int, int)+58)
I/DEBUG ( 355): #10 pc 000a9a61 /system/vendor/lib/egl/libGLESv2_adreno.so (glDrawArrays+44)
I/DEBUG ( 355): #11 pc 00b1260b /data/dalvik-cache/arm/system@framework@boot.oat
I/ThermalEngine( 367): ACTION: CPU - Setting CPU[0] to 1190400
I/ThermalEngine( 367): ACTION: CPU - Setting CPU[1] to 1190400
I/ThermalEngine( 367): ACTION: CPU - Setting CPU[2] to 1190400
I/ThermalEngine( 367): ACTION: CPU - Setting CPU[3] to 1190400

F/libc (13050): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 15568 (TextureMovieEnc)
I/DEBUG ( 651): *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
I/DEBUG ( 651): UUID: 212c74fe-e554-4ff9-b0e1-3f5f6ad4ecc6
I/DEBUG ( 651): Build fingerprint: 'docomo/SO-03H/SO-03H:5.1.1/32.0.B.0.426/2049702380:user/release-keys'
I/DEBUG ( 651): Revision: '0'
I/DEBUG ( 651): ABI: 'arm'
I/DEBUG ( 651): pid: 13050, tid: 15568, name: TextureMovieEnc
I/DEBUG ( 651): signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
I/DEBUG ( 651): r0 00000000 r1 00000001 r2 00000006 r3 00000000
I/DEBUG ( 651): r4 dee9b000 r5 dec1a600 r6 dece7b70 r7 ded85240
I/DEBUG ( 651): r8 00008d65 r9 00000006 sl dee97f00 fp 00000001
I/DEBUG ( 651): ip ef712f90 sp daf107d8 lr 00000000 pc ef57f2c6 cpsr a00e0030
I/DEBUG ( 651):
I/DEBUG ( 651): backtrace:
I/DEBUG ( 651): #00 pc 000a82c6 /system/vendor/lib/egl/libGLESv2_adreno.so (EsxContext::GlBindTexture(unsigned int, unsigned int)+341)
I/DEBUG ( 651): #1 pc 000a05bb /system/vendor/lib/egl/libGLESv2_adreno.so (glBindTexture+26)
I/DEBUG ( 651): #2 pc 00b2459d /system/framework/arm/boot.oat

About DecodeEditEncodeTest.java

Hi Fadden,
I noticed you released some sample code on bigflake.com/mediacodec; one sample is DecodeEditEncodeTest.java.
I am studying this file and have one question: I cannot find the definitions of InputSurface and OutputSurface.
Could you also share that part of the sample code? Thanks a lot.

MultiSurfaceActivity bouncing ball freeze

Hi,

I am running Grafika's MultiSurfaceActivity and Record GL on Android Lollipop 5.1 on a Freescale Sabre SD board.

All the tests play smoothly as long as the screen is not touched, but as soon as I start scrolling the notification bar from the top and keep doing it for a while, the frame rate drops to 35-40 fps.

I have confirmed the same test on KitKat 4.4.2 and JB 4.2.2, and they seem to work fine.

The same behaviour occurs when playing an MP4 from Gallery: the video gets stuck and lags a lot once we start playing with the notification bar.

Can you share your thoughts on this? I also had to change the BufferQueue implementation to async mode to achieve over 60 fps.

Regards,
Gurtaj

Race condition in "show + capture camera"

Some excellent research by Petros Douvantzis determined that there is a race in the texture handling in "show + capture camera". The source of the problem is the use of a shared EGL context -- the camera frames are converted to an "external" texture in one context, but rendered from another.

A thread on the WebGL mailing list covers the matter. Appendix C in the OpenGL ES spec, and section 2.4 in the EGL 1.5 spec, explain the expected behavior.

For correct behavior, the application code must ensure mutually exclusive access during the texture update, and needs to issue GL commands that effectively provide memory barriers. On the producer side, updateTexImage() must be followed by glFinish(), with the two wrapped with a synchronized block or the write op of a read/write lock. On the consumer side, the texture must be re-bound before being drawn to effect the memory barrier, and the rendering operation must be synchronized or within the read op of a read/write lock.

These operations will potentially stall the involved threads, reducing throughput. It's better, and simpler, to use a single EGLContext, and call updateTexImage() from the thread that does the rendering. It's possible to use a single context with GLSurfaceView by attaching the SurfaceTexture to the GLSurfaceView's context with the appropriate API calls. The "show + capture camera" demo should be updated to use this approach, and avoid shared contexts altogether.
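The shared-context locking described above might look like the following sketch, where `texLock`, `surfaceTexture`, and `textureId` are placeholder names for the app's own objects:

```java
// Producer thread (frame-available callback), on the EGL context that
// owns the SurfaceTexture:
synchronized (texLock) {
    surfaceTexture.updateTexImage();   // latch the new camera frame
    android.opengl.GLES20.glFinish();  // pipeline flush = memory barrier
}

// Consumer thread, on the sharing context:
synchronized (texLock) {
    // Re-binding the texture is what makes the updated contents visible here.
    android.opengl.GLES20.glBindTexture(
            android.opengl.GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
    android.opengl.GLES20.glDrawArrays(
            android.opengl.GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
```

As the surrounding text notes, the single-context approach (updateTexImage() called from the rendering thread) avoids all of this and is the better design.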

The other Activities in Grafika use SurfaceView and a single context, and are not affected by this issue.

video with multiple views

HI,

I want to play a video file and duplicate the rendering views, eventually with different filters. If I understand correctly, I can do it using one SurfaceView with several quads, but it may be difficult to have a flexible layout with other components in between, and to manage several shaders.

In other words, is there a way to have two SurfaceViews pointing to the same video texture or MediaPlayer, with different shaders for different filter effects?
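One way this can work (a sketch using Grafika's gles helper classes; `st`, `displayA`, `displayB`, `blitPlain`, and `blitFiltered` are placeholder names) is to decode once into a SurfaceTexture and, on a single render thread with one EGLContext, blit the same external texture into two WindowSurfaces using different FullFrameRect programs:

```java
// All on one render thread / one EGLContext that owns the external texture.
st.updateTexImage();                         // latch the decoded frame once
st.getTransformMatrix(stMatrix);

displayA.makeCurrent();                      // WindowSurface for the first view
blitPlain.drawFrame(textureId, stMatrix);    // FullFrameRect, plain TEXTURE_EXT program
displayA.swapBuffers();

displayB.makeCurrent();                      // WindowSurface for the second view
blitFiltered.drawFrame(textureId, stMatrix); // FullFrameRect with a filter shader
displayB.swapBuffers();
```

Both WindowSurfaces must be created against the same EglCore so the context that owns the external texture is current for both draws.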

Thanks for your help, Best, Thibaut

Movie player memory issue

Hi all, I am using the movie player to play a list of videos. If we play videos continuously for a day, we run into a memory issue.

I/ActivityManager( 439): Low on memory:
I/ActivityManager( 439): ntv N 630053 kB: mediaserver (109) native
I/ActivityManager( 439): ntv N 3631 kB: zygote (107) native
I/ActivityManager( 439): ntv N 1248 kB: surfaceflinger (106) native
I/ActivityManager( 439): ntv N 977 kB: drmserver (108) native
I/ActivityManager( 439): ntv N 531 kB: netd (102) native

This causes our application to go down.

I also see this trace repeating, as if in a loop, in the log:

E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)
E/MediaCodec( 4719): native_window_api_connect returned an error: Invalid argument (-22)
I/OMXClient( 4719): Using client-side OMX mux.
E/BufferQueue( 106): [SurfaceView] connect: already connected (cur=3, req=3)

Any idea?

All the best!
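The growing "mediaserver" native allocation together with the "already connected" BufferQueue errors suggests codec instances are being leaked between clips. A sketch of the defensive pattern (playClip() is a hypothetical per-clip playback loop; decoder and extractor are the MediaCodec and MediaExtractor for the current clip):

```java
// Sketch: release the codec and extractor after every clip in the playlist.
// Leaking either keeps buffers pinned in mediaserver and leaves the Surface
// "connected", matching the errors in the log above.
try {
    playClip(decoder, extractor);
} finally {
    decoder.stop();
    decoder.release();
    extractor.release();
}
```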

Nexus 6 camera gets stuck

Hey @fadden or @google-admin !

I am trying to run the ContinuousCaptureActivity app (Show + capture camera) on my Nexus 6 (and Nexus 5X), and after about 6-7 minutes the preview always gets stuck on the last frame and stays there. (Don't click "start recording"; just watch the preview.)

Narrowing it down: it's obviously not TextureMovieEncoder (since I'm not recording yet), and onDrawFrame is still getting called, as is onFrameAvailable, so either the frames from the camera are getting stuck or the OpenGL texture is.

I am running Android 6.0.1 (API 23) on the Nexus 6, and there are no significant logs that I can see.

This might also be relevant to
#36
