We use GStreamer to play media streams in Chromium. We implemented a Media Process, which is owned by the Browser Process and creates players on demand. Each video tag is backed by a GStreamer pipeline that lives in the Media Process.
In the case of MSE, the Render Process currently sends data to the Media Process directly within IPC messages. Android does it the same way (but sends to the Browser Process).
It would be more efficient to send this data through shared memory: we could wrap these shm buffers in a dedicated GstBufferPool and tell appsrc to use it. That should save two copies of the encoded data.
Since we always need the GStreamer core libraries, there is no need to filter them when running the sandbox. The usual way to handle this in Chromium is to load such shared libraries in the pre-sandbox step.
Filtering of the "open" system call then only needs to cover the paths of the GStreamer plugin shared libraries and their sub-dependencies.
For now the Media Process is guarded by USE_GSTREAMER.
But the Media Process has to be generic and should be able to host any kind of media backend.
Adding this new guard also means renaming some of the classes we added; for example, WebMediaPlayerGStreamer should be renamed to WebMediaPlayerProxy.
This class should also move to the "content" namespace instead of "media".
The built-in GStreamer plugins should also move into a new subdirectory (and gst_chromium_http_source should be renamed to gst_chromium_http_src).
We are working with the latest Chromium release, which we integrated with the GStreamer Samsung backend following the instructions from the official website.
We have tried to play videos without success (local videos, videos on YouTube).
It seems that the video never starts.
The GStreamer option is enabled in the Chromium flags. Using GStreamer as a standalone application, everything works fine.
OpenH264 is an open-source, free implementation of the H.264 codec backed by Cisco and Mozilla, and it performs very well.
Sorry if I'm saying something stupid, I'm just an ordinary user :p Also, thanks for your great work on Gallium3D!!
Currently it is done in MediaPlayerGStreamer for the first video tag, because initially we thought of having one GL thread per video tag. But this is actually overkill; it is better to have only one IPC channel with the GPU process.
So we can now move "::gles2::Initialize()" to MediaChildThread and properly call "::gles2::deinit()".
It should be possible to compile Chromium entirely without the FFmpeg backend (with only GStreamer), not just either with both (runtime-selectable) or with FFmpeg only. This is especially important for us in Fedora, because FFmpeg is not allowed in Fedora.
Currently, the build instructions recommend setting use_proprietary_codecs=1, which always reports those codecs as available. But on Fedora those codecs are not available out of the box, only if the plugins are installed from third-party repositories. So there needs to be a runtime check, at least for media::MimeUtil::allow_proprietary_codecs_; but ideally, availability should be checked for each individual codec.
Chromium is moving from GYP to GN to generate its "ninja" build files. We currently only support GYP. We should wait for the GN migration to be completed first.
Yes, there is Chromium, but people use Chrome instead of Chromium or Iron...
So is there a port planned to implement GStreamer in Firefox? :)
Sorry for the bad English.
The new Servo layout engine is the future; it outperforms Gecko and Blink by up to 25x! And it is more secure (Rust inside).
So, Samsung already works with Mozilla on Servo and plans to create a new, modern browser for Android, based on Blink at the beginning, but planned to switch to Servo once Servo is reasonably stable :) (Q2/Q3 2016 for Android and 2017/18 for desktop, because desktop websites are more complex in general).
So it is in the interest of Samsung and of humanity (the magic of free software).
Sorry for my poor English, and thanks for developing free software =).