
Comments (15)

ChillingVan commented on May 25, 2024

Yes, it can be done. Please see the OffScreenCanvas example in https://github.com/ChillingVan/android-openGL-canvas.


xiaotian317 commented on May 25, 2024

Hello. What I want to achieve is adding images or text to every frame of a local video without actually playing it. After looking at your two projects, the only approach I've come up with so far is to iterate over every frame of the video file, use the OffScreenCanvas example to add text or images to that frame, save the result, and then, once all frames are processed, assemble the resulting images into a new video. That doesn't feel like a good solution. If it were you, how would you implement this? Any pointers would be appreciated, thanks.


xiaotian317 commented on May 25, 2024

It seems that what I need to do is give ProduceTextureView a video and let it play it, but setting it up like this doesn't seem to have any effect.
The code lives in ProduceTextureView:

@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
    Log.i("Surface", "onSurfaceTextureAvailable");
    super.onSurfaceTextureAvailable(surfaceTexture, width, height);
    surface = new Surface(surfaceTexture);
    new PlayerVideo().start(); // start a thread to play the video
}

@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
    super.onSurfaceTextureSizeChanged(surface, width, height);
}

@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
    super.onSurfaceTextureDestroyed(surfaceTexture);
    surface = null;
    mMediaPlayer.stop();
    mMediaPlayer.release();
    return true;
}

@Override
public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
    // System.out.println("onSurfaceTextureUpdated onSurfaceTextureUpdated");
    super.onSurfaceTextureUpdated(surfaceTexture);
}


ChillingVan commented on May 25, 2024

You can refer to how H264Encoder is used in this project. In its OnDrawListener you can draw anything you like onto each video frame. Text needs to be converted to a Bitmap first and then drawn.
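A minimal sketch of that idea, reusing the onGLDraw callback signature that appears later in this thread; the drawBitmap calls and the watermarkBitmap/textBitmap fields are assumptions made for illustration, not names confirmed by the library:

// Sketch only: overlay a watermark bitmap and some text (pre-rendered into a bitmap) on every frame.
// watermarkBitmap / textBitmap are hypothetical fields loaded elsewhere,
// e.g. via BitmapFactory.decodeResource(...) in onSurfaceCreated().
@Override
protected void onGLDraw(ICanvasGL canvas, SurfaceTexture producedSurfaceTexture, RawTexture producedRawTexture,
                        @Nullable SurfaceTexture sharedSurfaceTexture, @Nullable BasicTexture sharedTexture) {
    canvas.drawBitmap(watermarkBitmap, 20, 20);   // assumed ICanvasGL.drawBitmap(bitmap, left, top)
    canvas.drawBitmap(textBitmap, 20, 200);       // text rendered into a Bitmap beforehand (android.graphics.Canvas + Paint)
}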


xiaotian317 commented on May 25, 2024

In ProduceTextureView:

@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
    Log.i("Surface", "onSurfaceTextureAvailable");
    super.onSurfaceTextureAvailable(surfaceTexture, width, height);
    surface = new Surface(surfaceTexture);
    new PlayerVideo().start(); // start a thread to play the video
}

This is how I set the video on ProduceTextureView, but in

mMediaPlayer.setOnSeekCompleteListener(new MediaPlayer.OnSeekCompleteListener() {
    @Override
    public void onSeekComplete(MediaPlayer mediaPlayer) {
        Log.i("onCompletion", "onSeekComplete----" + mediaPlayer.getCurrentPosition());
    }
});

the playback position that gets printed is always 0, and ProduceTextureView never plays the video.


ChillingVan commented on May 25, 2024

Is your surface actually being used? Is ProduceTextureView drawing with the canvas? MediaPlayer should work, but if possible it's better to decode and play the video with MediaCodec.


xiaotian317 commented on May 25, 2024

First of all, thank you for your patient replies.
Below is my MediaPlayer code. The surface is being used, and ProduceTextureView does draw with the canvas. The current result is that the drawing operations show up in the generated H.264 output, but the background is black and the video itself does not appear; the position printed by Log.i("onCompletion","onSeekComplete----"+mediaPlayer.getCurrentPosition()); is also always 0.

private class PlayerVideo extends Thread {
    @Override
    public void run() {
        try {
            File file = new File(Environment.getExternalStorageDirectory() + "/pp.mp4");

            mMediaPlayer = new MediaPlayer();
            mMediaPlayer.setDataSource(file.getAbsolutePath());
            mMediaPlayer.setSurface(surface);
            mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
            mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mp) {
                    mMediaPlayer.start();
                }
            });
            mMediaPlayer.setOnSeekCompleteListener(new MediaPlayer.OnSeekCompleteListener() {
                @Override
                public void onSeekComplete(MediaPlayer mediaPlayer) {
                    Log.i("onCompletion", "onSeekComplete----" + mediaPlayer.getCurrentPosition());
                }
            });
            mMediaPlayer.prepare();

        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}


ChillingVan commented on May 25, 2024

I'll take another look when I have time.


xiaotian317 commented on May 25, 2024

OK, thanks.


xiaotian317 commented on May 25, 2024

Hello, I've now switched to playing the video with MediaCodec, but it throws an error at the line decoder.configure(format, surface, null, 0);
Error:
E/AndroidRuntime: FATAL EXCEPTION: Thread-4832
Process: com.chillingvan.instantvideo.sample, PID: 27306
android.media.MediaCodec$CodecException: Error 0xffffffea
at android.media.MediaCodec.native_configure(Native Method)
at android.media.MediaCodec.configure(MediaCodec.java:590)
at com.chillingvan.instantvideo.sample.test.video.ProduceTextureView$PlayerThread.run(ProduceTextureView.java:149)

Below is my modified ProduceTextureView code:

public class ProduceTextureView extends GLSurfaceTextureProducerView {

    private String TAG = "ProduceTextureView";
    private TextureFilter textureFilter = new BasicTextureFilter();
    private Bitmap bitmap;
    private static final String SAMPLE = Environment.getExternalStorageDirectory() + "/pp.mp4";
    private PlayerThread mPlayer = null;
    private Surface mSurface;

    public ProduceTextureView(Context context) {
        super(context);
    }

    public ProduceTextureView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public ProduceTextureView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        super.onSurfaceTextureAvailable(surface, width, height);
        if (mPlayer == null) {
            mSurface = new Surface(surface);
            mPlayer = new PlayerThread(mSurface);
            mPlayer.start();
        }
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
        if (mPlayer != null) {
            mPlayer.interrupt();
        }
        return false;
    }

    @Override
    public void onSurfaceCreated() {
        bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.lenna);
        setProducedTextureTarget(GLES20.GL_TEXTURE_2D);
        super.onSurfaceCreated();
    }

    public void setTextureFilter(TextureFilter textureFilter) {
        this.textureFilter = textureFilter;
    }

    private Bitmap bitmap1;
    private Bitmap bitmap2;
    private int index = 1;

    @Override
    protected void onGLDraw(ICanvasGL canvas, SurfaceTexture producedSurfaceTexture, RawTexture producedRawTexture, @Nullable SurfaceTexture sharedSurfaceTexture, @Nullable BasicTexture sharedTexture) {
        TestVideoEncoder.drawRect(canvas, drawCnt);
    }

    private class PlayerThread extends Thread {
        private MediaExtractor extractor;
        private MediaCodec decoder;
        private Surface surface;

        public PlayerThread(Surface surface) {
            this.surface = surface;
        }

        @Override
        public void run() {
            extractor = new MediaExtractor();
            try {
                extractor.setDataSource(SAMPLE);
            } catch (IOException e) {
                e.printStackTrace();
            }

            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("video/")) {
                    extractor.selectTrack(i);
                    try {
                        decoder = MediaCodec.createDecoderByType(mime);
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    Log.e(TAG, " decoder.configure(format, surface, null, 0);    " + surface);
                    decoder.configure(format, surface, null, 0);
                    break;
                }
            }

            if (decoder == null) {
                Log.e(TAG, "Can't find video info!");
                return;
            }

            decoder.start();

            ByteBuffer[] inputBuffers = decoder.getInputBuffers();
            ByteBuffer[] outputBuffers = decoder.getOutputBuffers();
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            boolean isEOS = false;
            long startMs = System.currentTimeMillis();

            while (!Thread.interrupted()) {
                if (!isEOS) {
                    int inIndex = decoder.dequeueInputBuffer(10000);
                    if (inIndex >= 0) {
                        ByteBuffer buffer = inputBuffers[inIndex];
                        int sampleSize = extractor.readSampleData(buffer, 0);
                        if (sampleSize < 0) {
                            // We shouldn't stop the playback at this point, just pass the EOS
                            // flag to decoder, we will get it again from the
                            // dequeueOutputBuffer
                            Log.d(TAG, "InputBuffer BUFFER_FLAG_END_OF_STREAM");
                            decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            isEOS = true;
                        } else {
                            decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
                            extractor.advance();
                        }
                    }
                }

                int outIndex = decoder.dequeueOutputBuffer(info, 10000);
                switch (outIndex) {
                    case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                        Log.d(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
                        outputBuffers = decoder.getOutputBuffers();
                        break;
                    case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                        Log.d(TAG, "New format " + decoder.getOutputFormat());
                        break;
                    case MediaCodec.INFO_TRY_AGAIN_LATER:
                        Log.d(TAG, "dequeueOutputBuffer timed out!");
                        break;
                    default:
                        ByteBuffer buffer = outputBuffers[outIndex];
                        Log.v(TAG, "We can't use this buffer but render it due to the API limit, " + buffer);

                        // We use a very simple clock to keep the video FPS, or the video
                        // playback will be too fast
                        while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                            try {
                                sleep(10);
                            } catch (InterruptedException e) {
                                e.printStackTrace();
                                break;
                            }
                        }
                        decoder.releaseOutputBuffer(outIndex, true);
                        break;
                }

                // All decoded frames have been rendered, we can stop playing now
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    Log.d(TAG, "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
                    break;
                }
            }

            decoder.stop();
            decoder.release();
            extractor.release();
        }
    }
}


nishibaiyang commented on May 25, 2024

These steps should be added to this project so that the pipeline is complete. Otherwise the video decoding part is missing, and starting to read the code from the streaming operations is confusing. (Ideally there would be a sample that parses a local video file and then operates on each frame.)


nishibaiyang commented on May 25, 2024

After decoding the video with MediaCodec, when writing the output to an H.264 file, testVideoEncoder.write(); throws android.media.MediaCodec$CodecException: Error 0xffffffde.


ChillingVan commented on May 25, 2024

@xiaotian317 @nishibaiyang
Writing a complete playback implementation will take some time. Your requirement is:
add images or text to every frame of a local video without playing it, and then output the result as a new video.
One example in this project (example A) is:
get data from the Camera --> draw --> encode --> send over RTMP.
For your requirement, the essence is:
decode to get the video frames --> draw --> encode --> write to a local file.

So for now you can start from example A and just replace the first and last steps.
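For the "encode --> write to a local file" end of that pipeline, a minimal sketch using Android's MediaMuxer follows; outputPath, encodedFormat, encodedBuffer and bufferInfo are illustrative stand-ins for whatever your encoder (e.g. MediaCodec) hands you, not names from this project:

// Sketch only: mux encoded H.264 output from the encoder into a local MP4 file.
// outputPath    - destination file path (assumed variable); the constructor throws IOException
// encodedFormat - the MediaFormat reported via INFO_OUTPUT_FORMAT_CHANGED (assumed variable)
// encodedBuffer - one encoded output ByteBuffer from the encoder (assumed variable)
// bufferInfo    - the matching MediaCodec.BufferInfo (assumed variable)
MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int videoTrack = muxer.addTrack(encodedFormat);
muxer.start();

// For every encoded frame the encoder produces:
muxer.writeSampleData(videoTrack, encodedBuffer, bufferInfo);

// Once the encoder signals BUFFER_FLAG_END_OF_STREAM:
muxer.stop();
muxer.release();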


ChillingVan commented on May 25, 2024

@xiaotian317 @nishibaiyang

I've finished the MediaPlayer-based playback version, which means you can now get the video frames from it. Audio data is a separate matter, of course.
Example:

https://github.com/ChillingVan/android-openGL-canvas/blob/master/canvasglsample/src/main/java/com/chillingvan/canvasglsample/video/MediaPlayerActivity.java


xiaotian317 commented on May 25, 2024

Great, thanks for the kind guidance.

