streampack's Introduction

StreamPack: RTMP and SRT live streaming SDK for Android

StreamPack is a modular live streaming library for Android made for both demanding video broadcasters and new video enthusiasts.

It is designed to be used in live streaming and gaming apps.

Setup

Get the latest StreamPack artifacts from Maven Central:

dependencies {
    implementation 'io.github.thibaultbee:streampack:2.6.0'
    // For RTMP
    implementation 'io.github.thibaultbee:streampack-extension-rtmp:2.6.0'
    // For SRT
    implementation 'io.github.thibaultbee:streampack-extension-srt:2.6.0'
}

If you use both RTMP and SRT, you might get a conflict on libssl.so and libcrypto.so because both extensions include them as native dependencies. To solve this, you can add the following to your build.gradle:

android {
    packagingOptions {
        pickFirst '**/*.so'
    }
}
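
If you prefer not to apply pickFirst to every native library, a narrower variant scopes it to the two conflicting files only:

android {
    packagingOptions {
        pickFirst '**/libssl.so'
        pickFirst '**/libcrypto.so'
    }
}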

Features

  • Video:
    • Source: Cameras or Screen recorder
    • Orientation: portrait or landscape
    • Codec: HEVC/H.265, AVC/H.264, VP9 or AV1 (experimental, see #90)
    • HDR (experimental, see #91)
    • Configurable bitrate, resolution, framerate (tested up to 60), encoder level, encoder profile
    • Video only mode
    • Device video capabilities
  • Audio:
    • Codec: AAC (LC, HE, HEv2,...) or Opus
    • Configurable bitrate, sample rate, stereo/mono, data format
    • Processing: Noise suppressor or echo cancellation
    • Audio only mode
    • Device audio capabilities
  • File: TS, FLV or fragmented MP4
    • Write to a single file or multiple chunk files
  • Streaming: RTMP/RTMPS or SRT
    • Support for enhanced RTMP
    • Ultra low-latency based on SRT
    • Network adaptive bitrate mechanism for SRT

Samples

Camera and audio sample

For a source code example showing how to use the camera and audio streamers, check the sample app directory. On first launch, you will have to set the RTMP URL or SRT server IP in the settings menu.

Screen recorder

For a source code example showing how to use the screen recorder streamer, check the sample screen recorder directory. On first launch, you will have to set the RTMP URL or SRT server IP in the settings menu.

Tests with a FFmpeg server

FFmpeg was used as the SRT server, demuxer and decoder for the tests.

RTMP

Tell FFplay to listen on IP 0.0.0.0 and port 1935:

ffplay -listen 1 -i 'rtmp://0.0.0.0:1935/s/streamKey'

In the StreamPack sample app settings, set Endpoint -> Type to Stream to a remote RTMP device, then set the server URL to rtmp://serverip:1935/s/streamKey. At this point, the StreamPack sample app should successfully send audio and video frames. On the FFplay side, you should be able to watch the live stream.

SRT

Check how to build FFmpeg with libsrt in the SRT CookBook. Tell FFplay to listen on IP 0.0.0.0 and port 9998:

ffplay -fflags nobuffer 'srt://0.0.0.0:9998?mode=listener'

In the StreamPack sample app settings, set the server IP to your server IP and the server port to 9998. At this point, the StreamPack sample app should successfully send audio and video frames. On the FFplay side, you should be able to watch the live stream.
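
To record the test stream instead of playing it, a plain FFmpeg remux also works (assuming your FFmpeg build includes libsrt):

ffmpeg -i 'srt://0.0.0.0:9998?mode=listener' -c copy dump.ts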

Quick start

If you want to create a new application, you should use the template StreamPack boilerplate. In 5 minutes, you will be able to stream live video to your server.

  1. Add permissions to your AndroidManifest.xml and request them in your Activity/Fragment.
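
A minimal sketch of requesting the runtime permissions with the AndroidX Activity Result API (the class name and callback body here are illustrative, not StreamPack API):

import android.Manifest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class LiveActivity : AppCompatActivity() {
    // Ask for camera and microphone access before touching the streamer.
    private val requestPermissions =
        registerForActivityResult(ActivityResultContracts.RequestMultiplePermissions()) { results ->
            if (results.values.all { it }) {
                // All granted: safe to start the preview and the stream.
            }
        }

    override fun onStart() {
        super.onStart()
        requestPermissions.launch(
            arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO)
        )
    }
}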

  2. Create a SurfaceView to display the camera preview in your layout

As a camera preview, you can use a SurfaceView, a TextureView or any View that can provide a Surface.

To simplify the integration, StreamPack provides a PreviewView.

<layout>
    <io.github.thibaultbee.streampack.views.PreviewView android:id="@+id/preview"
        android:layout_width="match_parent" android:layout_height="match_parent"
        app:enableZoomOnPinch="true" />
</layout>

app:enableZoomOnPinch is a boolean to enable zoom on pinch gesture.

  3. Instantiate the streamer (the main live streaming class)
val streamer = CameraSrtLiveStreamer(context = requireContext())
  4. Configure audio and video settings
val audioConfig = AudioConfig(
    startBitrate = 128000,
    sampleRate = 44100,
    channelConfig = AudioFormat.CHANNEL_IN_STEREO
)

val videoConfig = VideoConfig(
    startBitrate = 2000000, // 2 Mb/s
    resolution = Size(1280, 720),
    fps = 30
)

streamer.configure(audioConfig, videoConfig)
  5. Inflate the camera preview with the streamer
/**
 * If the preview is in a PreviewView
 */
preview.streamer = streamer
/**
 * If the preview is in a SurfaceView, a TextureView, or any View that can provide a Surface
 */
streamer.startPreview(preview)
  6. Start the live streaming
streamer.startStream(ip, port)
  7. Stop and release the streamer
streamer.stopStream()
streamer.disconnect()
streamer.stopPreview() // The PreviewView will automatically stop the preview
streamer.release()
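
As a sketch, the calls above map naturally onto the Activity lifecycle (assuming the streamer field from the quick start; error handling omitted):

// In the Activity/Fragment hosting the preview.
override fun onStop() {
    super.onStop()
    streamer.stopStream()
    streamer.disconnect()
    streamer.stopPreview()
}

override fun onDestroy() {
    super.onDestroy()
    streamer.release()
}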

For more detailed explanation, check out the API documentation.

Permissions

You need to add the following permissions in your AndroidManifest.xml:

<manifest>
    <!-- Only for live streaming -->
    <uses-permission android:name="android.permission.INTERNET" />
    <!-- Only for recording -->
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
</manifest>

For recording, you also need to request the dangerous permission android.permission.WRITE_EXTERNAL_STORAGE at runtime.

To use the camera, you need to declare the following permissions:

<manifest>
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAMERA" />
</manifest>

Your application also has to request the following dangerous permissions at runtime: android.permission.RECORD_AUDIO and android.permission.CAMERA.

For the Play Store, your application should also declare the following in its AndroidManifest.xml:

<manifest>
    <uses-feature android:name="android.hardware.camera" android:required="true" />
    <uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />
</manifest>

To use the screen recorder, you need to declare the following permissions:

<manifest>
    <uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />
    <uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
    <uses-permission android:name="android.permission.POST_NOTIFICATIONS" />
    <!-- Only if you have to record audio -->
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
</manifest>

You will also have to declare the Service:

<application>
    <!-- YourScreenRecorderService extends ScreenRecorderRtmpLiveService or ScreenRecorderSrtLiveService -->
    <service android:name=".services.YourScreenRecorderService" android:exported="false"
        android:foregroundServiceType="mediaProjection" />
</application>
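
Before the service can capture anything, the app must obtain a MediaProjection grant from the user. This part is standard Android rather than StreamPack API; how the resulting Intent is handed to your service depends on the service, so that step is only hinted at in a comment:

import android.content.Context
import android.media.projection.MediaProjectionManager
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class ScreenCaptureActivity : AppCompatActivity() {
    // Receives the user's consent for screen capture.
    private val screenCaptureLauncher =
        registerForActivityResult(ActivityResultContracts.StartActivityForResult()) { result ->
            if (result.resultCode == RESULT_OK && result.data != null) {
                // Hand result.data to your screen recorder service here
                // (how the service consumes it depends on its API).
            }
        }

    fun requestScreenCapture() {
        val manager =
            getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
        screenCaptureLauncher.launch(manager.createScreenCaptureIntent())
    }
}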

Tips

RTMP or SRT

RTMP and SRT are both live streaming protocols. SRT is a modern UDP-based protocol: it is reliable and ultra low latency. RTMP is a TCP-based protocol: it is also reliable, but only low latency. There are already a lot of comparisons over the Internet, so here is a summary.

SRT:

  • Ultra low latency (< 1s)
  • HEVC support through MPEG-TS

RTMP:

  • Low latency (2-3s)
  • HEVC not officially supported (the specification has been abandoned by its creator)

So, the main question is: "which protocol should I use?" It is easy: if your server has SRT support, use SRT; otherwise, use RTMP.

Streamers

Let's start with some definitions! Streamers are classes that represent a live streaming pipeline: capture, encode, mux and send. They come in multiple flavours, with different audio and video sources, different endpoints and different functionalities. 3 types of base streamers are available:

  • CameraStreamers: for streaming from camera
  • ScreenRecorderStreamers: for streaming from screen
  • AudioOnlyStreamers: for streaming audio only

You can find specific streamers for files or for live streaming. Currently, there are 2 main endpoints:

  • FileStreamer: for streaming to file
  • LiveStreamer: for streaming to a RTMP or a SRT live streaming server

For example, you can use AudioOnlyFlvFileStreamer to stream from the microphone only to an FLV file. As another example, you can use CameraRtmpLiveStreamer to stream from the camera to an RTMP server.
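
As a sketch, constructing those two flavours could look like this (the CameraRtmpLiveStreamer parameters match the constructor quoted in the issues further down; the AudioOnlyFlvFileStreamer parameters are an assumption):

val fileStreamer = AudioOnlyFlvFileStreamer(context)   // microphone -> FLV file
val liveStreamer = CameraRtmpLiveStreamer(
    context = context,
    enableAudio = true
)                                                      // camera -> RTMP server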

If a streamer is missing, you can of course create your own. You should definitely submit it in a pull request.

Get device capabilities

Have you ever wondered: "What are the supported resolutions of my cameras?" or "What are the supported sample rates of my audio codecs?" Helper classes are made for this. Every Streamer comes with a specific Helper object (I am starting to have the feeling I repeat myself):

val helper = streamer.helper
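
For illustration, a query could look like the following; the property names on the Helper are assumptions, so check the API documentation for the real ones:

// Hypothetical property names, for illustration only.
val supportedResolutions = streamer.helper.video.supportedResolutions
val supportedSampleRates = streamer.helper.audio.supportedSampleRates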

Get extended settings

If you are looking for more settings on a streamer, like the exposure compensation of your camera, have a look at the Settings class. All together: "Every Streamer comes with a specific Settings object":

streamer.settings

For example, if you want to change the exposure compensation of your camera, on a CameraStreamer you can do it like this:

streamer.settings.camera.exposure.compensation = value

Moreover, you can check the exposure compensation range and step with:

streamer.settings.camera.exposure.availableCompensationRange
streamer.settings.camera.exposure.availableCompensationStep
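
For instance, a safe way to apply a compensation value is to clamp it to the advertised range first (a sketch built from the properties shown above, assuming the range exposes lower/upper bounds):

// Clamp the requested compensation index to what the camera supports.
val range = streamer.settings.camera.exposure.availableCompensationRange
streamer.settings.camera.exposure.compensation = value.coerceIn(range.lower, range.upper)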

Screen recorder Service

To record the screen, you have to use one of the ScreenRecorderStreamers inside an Android Service. To simplify this integration, StreamPack provides several ScreenRecorderService classes. Extend one of these classes and override onNotification to customize the notification.
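
A minimal sketch of such a subclass (the exact onNotification signature is an assumption; check the base class for the real one):

import android.app.Notification

// Hypothetical override: adapt the signature to the actual base class.
class YourScreenRecorderService : ScreenRecorderSrtLiveService() {
    override fun onNotification(): Notification? {
        // Build and return your own foreground notification here.
        return super.onNotification()
    }
}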

Android SDK version

Even though the StreamPack SDK supports minSdkVersion 21, I strongly recommend setting the minSdkVersion of your application to a higher version (the higher the better!) for better performance.

Licence

Copyright 2021 Thibault B.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

   http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

streampack's People

Contributors

dependabot[bot], moliyadi, thibaultbee, yhbsh


streampack's Issues

Stream preview dimensions

Hello!
Thanks for the library!
I'm using a TextureView to show the preview, and there is a transition that changes the TextureView's size (and aspect ratio).
I need the preview height and width to calculate the scale for the matrix so the preview keeps the correct aspect ratio.
Is there any way to get the stream preview dimensions?
Thanks

Missing sources in the Maven Central

Maven Central does not provide a sources artifact for StreamPack. Could you add sources to the published artifacts?

I am trying to implement my own ISurfaceCapture for live streaming GL-rendered content; however, it is very difficult to understand the hierarchy and what the library classes do without sources. Android Studio has a "Download sources" feature, but it doesn't work for your library due to the missing sources artifact.

There is a Stack Overflow answer that presents a simple solution to this issue: https://stackoverflow.com/questions/26874498/publish-an-android-library-to-maven-with-aar-and-sources-jar I can use it to publish sources to a local Maven repository, but I thought it was worth sharing to help other devs too.

Unsupported Operation Exception

First off, I'm super excited about this project! Good luck!

I was trying to test the demo app in the first beta release on my Pixel 4a, but I'm getting the following error:

startStream: java.lang.UnsupportedOperationException:
Tried to obtain display from a Context not associated with one. Only visual Contexts (such as Activity or one created with Context#createWindowContext) or ones created with Context#createDisplayContext are associated with displays. Other types of Contexts are typically related to background entities and may return an arbitrary display.

Thanks!

How to use SRT

Hello, when not connected to Wi-Fi, SRT cannot be used. Why? Looking forward to your reply, thank you!

Connection Error

Version

75df86a

Environment that reproduces the issue

Pixel 6 Pro Android 12

RTMP/SRT/... Server

SRT

Audio configuration

No response

Video configuration

No response

Is it reproducible in the demos application?

Yes

Reproduction steps

Using a local SRT stream server on a Linux system, try to run the app in an emulator.

Expected result

It should stream the video

Actual result

E/StandaloneCoroutine: startStream failed
java.net.ConnectException: Connection setup failure: connection timed out
at io.github.thibaultbee.srtdroid.models.Socket.connect(Socket.kt:241)
at io.github.thibaultbee.srtdroid.models.Socket.connect(Socket.kt:254)
at io.github.thibaultbee.streampack.ext.srt.internal.endpoints.SrtProducer$connect$4.invokeSuspend(SrtProducer.kt:119)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:3

Additional context

No response

Relevant logs output

No response

io.github.thibaultbee.streampack.error.StreamPackError: java.lang.IllegalArgumentException

Hi,
I am trying to run the screen recorder demo app on my OnePlus 9RT device (Android 12). I have selected the RTMP protocol and also set the server URL in settings; the settings I used are shown in the App_settings screenshot attached to the original issue.

But when pressing the record screen button, I am getting the following crash in my phone:-

An error occurred                                                                                                                                                           
io.github.thibaultbee.streampack.error.StreamPackError: java.lang.IllegalArgumentException                                                                                  
	at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:242)                                                                         
	at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:264)                                                                         
	at io.github.thibaultbee.streampack.screenrecorder.ScreenRecorderService.onStartCommand(ScreenRecorderService.kt:314)                                                   
	at android.app.ActivityThread.handleServiceArgs(ActivityThread.java:4807)                                                                                               
	at android.app.ActivityThread.access$2100(ActivityThread.java:254)                                                                                                      
	at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2222)                                                                                                 
	at android.os.Handler.dispatchMessage(Handler.java:106)                                                                                                                 
	at android.os.Looper.loopOnce(Looper.java:233)                                                                                                                          
	at android.os.Looper.loop(Looper.java:344)                                                                                                                              
	at android.app.ActivityThread.main(ActivityThread.java:8212)                                                                                                            
	at java.lang.reflect.Method.invoke(Native Method)                                                                                                                       
	at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:584)                                                                                    
	at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1034)                                                                                                        
Caused by: java.lang.IllegalArgumentException                                                                                                                               
	at android.media.MediaCodec.native_configure(Native Method)                                                                                                             
	at android.media.MediaCodec.configure(MediaCodec.java:2176)                                                                                                             
	at android.media.MediaCodec.configure(MediaCodec.java:2092)                                                                                                             
	at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder.configureCodec(MediaCodecEncoder.kt:180)                                                        
	at io.github.thibaultbee.streampack.internal.encoders.VideoMediaCodecEncoder.createVideoCodec(VideoMediaCodecEncoder.kt:102)                                            
	at io.github.thibaultbee.streampack.internal.encoders.VideoMediaCodecEncoder.configure(VideoMediaCodecEncoder.kt:59)                                                    
	at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:237)                                                                         
	at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:264)                                                                         
	at io.github.thibaultbee.streampack.screenrecorder.ScreenRecorderService.onStartCommand(ScreenRecorderService.kt:314)                                                   
	at android.app.ActivityThread.handleServiceArgs(ActivityThread.java:4807)                                                                                               
	at android.app.ActivityThread.access$2100(ActivityThread.java:254)                                                                                                      
	at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2222)                                                                                                 
	at android.os.Handler.dispatchMessage(Handler.java:106)                                                                                                                 
	at android.os.Looper.loopOnce(Looper.java:233)                                                                                                                          
	at android.os.Looper.loop(Looper.java:344)                                                                                                                              
	at android.app.ActivityThread.main(ActivityThread.java:8212) 
        at java.lang.reflect.Method.invoke(Native Method) 
	at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:584)                                                                                    
	at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1034)

[Bug]: Streaming the SRT video has a lot of lag

Version

2.5.2

Environment that reproduces the issue

samsung galaxy 21+
pixel 6 pro

RTMP/SRT/... Server

SRT

Audio configuration

AAC
sampleRate = 44100,
startBitrate = 128000,

Video configuration

default setting
resolution: 120x120

Is it reproducible in the demos application?

No

Reproduction steps

create stream
view stream on another android device

Expected result

low lag video stream

Actual result

very laggy video stream

Additional context

No response

Relevant logs output

No response

Socket Exception while connect with encrypted srt server

Hi,

I receive SocketException: Connection does not exist every time I stream to a remote SRT device with a passphrase, while it works fine when streaming to a remote SRT device without a passphrase. Both tests use the sample app with the latest release version 2.5.2.
I also tried streaming with ffmpeg using a passphrase, and it works fine, so I think it might not be a server problem. (Same docker image and environment; the only difference is with or without passphrase.)

Could you please give me some advice on how to solve this problem?
Thanks a lot.

2023-03-13 18:23:50.718 14063-14667/io.github.thibaultbee.streampack.sample E/: onError
    io.github.thibaultbee.streampack.error.StreamPackError: io.github.thibaultbee.streampack.error.StreamPackError: java.net.SocketException: Connection does not exist
        at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$videoEncoderListener$1.onOutputFrame(BaseStreamer.kt:123)
        at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder$encoderCallback$1.onOutputBufferAvailable(MediaCodecEncoder.kt:109)
        at android.media.MediaCodec$EventHandler.handleCallback(MediaCodec.java:1865)
        at android.media.MediaCodec$EventHandler.handleMessage(MediaCodec.java:1763)
        at android.os.Handler.dispatchMessage(Handler.java:106)
        at android.os.Looper.loopOnce(Looper.java:233)
        at android.os.Looper.loop(Looper.java:334)
        at android.os.HandlerThread.run(HandlerThread.java:67)
     Caused by: io.github.thibaultbee.streampack.error.StreamPackError: java.net.SocketException: Connection does not exist
        at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$muxListener$1.onOutputFrame(BaseStreamer.kt:135)
        at io.github.thibaultbee.streampack.internal.muxers.ts.utils.TSOutputCallback.writePacket(TSOutputCallback.kt:24)
        at io.github.thibaultbee.streampack.internal.muxers.ts.packets.TS.write(TS.kt:130)
        at io.github.thibaultbee.streampack.internal.muxers.ts.packets.Pes.write(Pes.kt:51)
        at io.github.thibaultbee.streampack.internal.muxers.ts.TSMuxer.generateStreams(TSMuxer.kt:154)
        at io.github.thibaultbee.streampack.internal.muxers.ts.TSMuxer.encode(TSMuxer.kt:143)
        at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$videoEncoderListener$1.onOutputFrame(BaseStreamer.kt:120)
        at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder$encoderCallback$1.onOutputBufferAvailable(MediaCodecEncoder.kt:109) 
        at android.media.MediaCodec$EventHandler.handleCallback(MediaCodec.java:1865) 
        at android.media.MediaCodec$EventHandler.handleMessage(MediaCodec.java:1763) 
        at android.os.Handler.dispatchMessage(Handler.java:106) 
        at android.os.Looper.loopOnce(Looper.java:233) 
        at android.os.Looper.loop(Looper.java:334) 
        at android.os.HandlerThread.run(HandlerThread.java:67) 
     Caused by: java.net.SocketException: Connection does not exist
        at io.github.thibaultbee.srtdroid.models.Socket.send(Socket.kt:647)
        at io.github.thibaultbee.streampack.ext.srt.internal.endpoints.SrtProducer.write(SrtProducer.kt:160)
        at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$muxListener$1.onOutputFrame(BaseStreamer.kt:132)
        at io.github.thibaultbee.streampack.internal.muxers.ts.utils.TSOutputCallback.writePacket(TSOutputCallback.kt:24) 
        at io.github.thibaultbee.streampack.internal.muxers.ts.packets.TS.write(TS.kt:130) 
        at io.github.thibaultbee.streampack.internal.muxers.ts.packets.Pes.write(Pes.kt:51) 
        at io.github.thibaultbee.streampack.internal.muxers.ts.TSMuxer.generateStreams(TSMuxer.kt:154) 
        at io.github.thibaultbee.streampack.internal.muxers.ts.TSMuxer.encode(TSMuxer.kt:143) 
        at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$videoEncoderListener$1.onOutputFrame(BaseStreamer.kt:120) 
        at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder$encoderCallback$1.onOutputBufferAvailable(MediaCodecEncoder.kt:109) 
        at android.media.MediaCodec$EventHandler.handleCallback(MediaCodec.java:1865) 
        at android.media.MediaCodec$EventHandler.handleMessage(MediaCodec.java:1763) 
        at android.os.Handler.dispatchMessage(Handler.java:106) 
        at android.os.Looper.loopOnce(Looper.java:233) 
        at android.os.Looper.loop(Looper.java:334) 
        at android.os.HandlerThread.run(HandlerThread.java:67) 

[Bug]: SrtProducer.connect(url) always throws "unknown host" when onConnectionListener is null

Version

2.5.2

Environment that reproduces the issue

N/A

RTMP/SRT/... Server

N/A

Audio configuration

No response

Video configuration

No response

Is it reproducible in the demos application?

Not tested

Reproduction steps

Call srtProducer.connect(url) where url is a valid SRT URL.

Expected result

The method returns successfully.

Actual result

InvalidParameterException: Failed to parse URL <xx>: unknown host is thrown (from SrtProducer.kt:89)

Additional context

Hi! I encountered this small corner case while extending the SRT streamer and playing with it in a toy setup.

It appears that the safe call at SrtProducer.kt:120 will return null when onConnectionListener has not been set.

This seems to cause SrtProducer.kt:88 to evaluate to null even when uri.host is valid and the connection has succeeded. So it throws the error even when everything has worked.

I believe a possible fix might be to return true or Unit at the end of the connect(ip, port) method so that it's not null when successful. (Though I'm not sure of the code design/style considerations here, so I thought it better to just note the issue rather than submit a PR.)

Cheers,
George
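
A hypothetical sketch of the fix George describes (internal names besides onConnectionListener are guesses, since only line numbers are quoted above):

// End connect() with an explicit non-null expression so the safe call on the
// listener cannot make the method's last expression null on success.
fun connect(ip: String, port: Int): Boolean {
    socket.connect(ip, port)           // throws on genuine failures
    onConnectionListener?.onSuccess()  // this safe call may evaluate to null
    return true                        // explicit success value
}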

Relevant logs output

No response

Dispatch network and heavy computation API

Dispatch network and heavy computation APIs off the main thread.
Concerned APIs:

  • connect/disconnect
  • configure
  • startStream/stopStream
  • startCapture/stopCapture
  • OnConnectionListener
  • OnErrorListener

Why?
To avoid an unresponsive UI and to simplify API usage.

Bug: Stream timeout...

In my test with Simple Realtime Server it gives a timeout.
I used the demo screen app; it seems that when the image is static, no data is sent to the server, so the server considers the connection idle and forces a disconnection.
I tested SRT and RTMP.

Attempt to invoke virtual method 'int android.media.audiofx.AcousticEchoCanceler.setEnabled(boolean)' on a null object reference

Hi,
I often get the error below crashing my app. What is the meaning of this exception?

io.github.thibaultbee.streampack.error.StreamPackError: java.lang.NullPointerException: Attempt to invoke virtual method 'int android.media.audiofx.AcousticEchoCanceler.setEnabled(boolean)' on a null object reference

The error occurs at this line:
cameraRtmpLiveStreamer?.configure(audioConfig, videoConfig)

SRT for webinars

Hi
I wanted to know if this protocol is good for making a webinar with 5 callers and 50 listeners?

[Feat]: Background RTMP Streaming

Version

2.5.2

Environment that reproduces the issue

Emulator

Use case description

It would be nice to have a background streaming service.

Proposed solution

No response

Alternative solutions

No response

Add support for camera with realtime source timestamp.

On devices with a realtime source timestamp, it is not possible to read the stream with ffmpeg.
On the ffmpeg side, the following errors happen:

12:41:18.339593/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=0 avail=0 ack.seq=1646609462 pkt.seq=1646609462 rcv-remain=8191 drift=1545
12:41:18.339743/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=1 avail=0 ack.seq=1646609462 pkt.seq=1646609463 rcv-remain=8191 drift=1545
12:41:19.324344/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=8 avail=0 ack.seq=1646609462 pkt.seq=1646609470 rcv-remain=8191 drift=1545
12:41:19.324490/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=9 avail=0 ack.seq=1646609462 pkt.seq=1646609471 rcv-remain=8191 drift=1545
12:41:19.340980/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=12 avail=0 ack.seq=1646609462 pkt.seq=1646609474 rcv-remain=8191 drift=1545
12:41:19.341026/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=13 avail=0 ack.seq=1646609462 pkt.seq=1646609475 rcv-remain=8191 drift=1545
12:41:19.372917/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=17 avail=0 ack.seq=1646609462 pkt.seq=1646609479 rcv-remain=8191 drift=1545

See #11 (comment)

Link aggregation

Add link aggregation support in order to transmit the stream over multiple links (Ethernet, Wi-Fi, cellular,...).

Why?
Improves stream quality by using the cellular and Wi-Fi modems.

Prerequisite
SRT support, then srtdroid support.

How to fill in Passphrase

Hi, I used the app with SRT and I want to encrypt the stream, but I do not know how to fill in the passphrase field.
Could you give me some advice?

OpenGL Filter support

Do you plan on exposing GL so you can use OpenGL to apply filters to the streaming video? Most similar libraries on Android (https://github.com/pedroSG94/rtmp-rtsp-stream-client-java) and HaishinKit on iOS provide this feature or some way to filter/modify the broadcast image. I specifically want this feature to be able to add an overlay image via a shader, similar to this https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/4b85bce8475deb97380dad3bbe51aca6d24087ea/encoder/src/main/java/com/pedro/encoder/input/gl/render/filters/object/ImageObjectFilterRender.java

As a side note, I think this library has a lot of potential and is heading in the right direction.

Mirroring in landscape mode cause the sink side to be stretched

screenrecorder 1:

If the initial state of the demo is portrait mode, the display on the sink side is normal. Then, if the mobile phone is switched between landscape and portrait, it keeps working well.

screenrecorder 2:

If the initial state of the phone is landscape, the displayed screen is stretched. Keep the phone in landscape mode before clicking the "record screen" button, then click "record screen".

The screen displayed on the sink side is stretched, and later, if the phone switches between landscape and portrait mode, the picture is also distorted.

RTMP data timestamp not correct

I've been using StreamPack on Android via api.video-reactnative-live-stream.
Running on a device or emulator, the broadcasting service livepeer.studio is not able to show the broadcast camera content on Android, even though I am able to see it if I broadcast to a local ffplay -listen 1 -i rtmp://0.0.0.0:1935/s/streamKey.

A colleague of mine at Livepeer debugged the received RTMP stream content. His feedback is:

The RTMP content is completely corrupt:
[124+0] 58 bytes of H264 video keyframe header
[124+0] 4020 bytes of H264 video keyframe NALU
[158+0] 1476 bytes of H264 video iframe NALU
[191+0] 1060 bytes of H264 video iframe NALU
[224+0] 1412 bytes of H264 video iframe NALU
[235+0] 1524 bytes of H264 video iframe NALU
[221+0] 1476 bytes of H264 video iframe NALU
[231+0] 1540 bytes of H264 video iframe NALU
[219+0] 1412 bytes of H264 video iframe NALU
[229+0] 1572 bytes of H264 video iframe NALU
[238+0] 1588 bytes of H264 video iframe NALU
[226+0] 2116 bytes of H264 video iframe NALU
[212+0] 2404 bytes of H264 video iframe NALU
[222+0] 1156 bytes of H264 video iframe NALU
[233+0] 1572 bytes of H264 video iframe NALU
[220+0] 1588 bytes of H264 video iframe NALU
[229+0] 1956 bytes of H264 video iframe NALU
[217+0] 1780 bytes of H264 video iframe NALU
[227+0] 2036 bytes of H264 video iframe NALU
[213+0] 2052 bytes of H264 video iframe NALU
[224+0] 2228 bytes of H264 video iframe NALU
[211+0] 2228 bytes of H264 video iframe NALU
[220+0] 1476 bytes of H264 video iframe NALU
[208+0] 1828 bytes of H264 video iframe NALU
[218+0] 1956 bytes of H264 video iframe NALU
[204+0] 2036 bytes of H264 video iframe NALU
[215+0] 1284 bytes of H264 video iframe NALU
[225+0] 1348 bytes of H264 video iframe NALU
[235+0] 1428 bytes of H264 video iframe NALU
[245+0] 964 bytes of H264 video iframe NALU
[209+0] 868 bytes of H264 video iframe NALU

Numbers in front are timestamps in milliseconds. They're all over the place!

Do you know why this is happening?
Thanks

[Feat]: Add SRTLA (SRT transport proxy with link aggregation for connection bonding)

Version

2.5.2

Environment that reproduces the issue

N/A

Use case description

Just an idea, as this is now a very popular system used by thousands for IRL streaming... but I understand this may be a lot of work?

Support for SRTLA, an SRT transport proxy with link aggregation for connection bonding, to make use of the mobile network and Wi-Fi to create a bonded connection. SRTLA is open source and already implemented in the 'IRL Pro' streaming app; however, that app does not provide screen recording, only camera support.

SRTLA is used as a part of the BELABOX streaming project, which allows bonding over multiple connections. More information regarding SRTLA here: https://github.com/BELABOX/srtla

Thank you, and thank you for the support you have already provided.

Proposed solution

Implement an option to use an SRTLA server as an endpoint, making use of the mobile cellular data and wifi.

Alternative solutions

No response

Preview stops after zooming

TLDR: A single device has experienced a malfunction while using the zooming functionality. If this turns out to be a bigger issue, it should be looked at further.

If you come across this issue and you have a Nokia device, it would be nice if you could test it so we can get some data on how widespread the issue is.

Issue description

When zooming, in this case using the StreamPack demo-camera app, setting the zoomRatio makes the preview stop displaying the camera view,
with the error message E/CameraDeviceCallback: Camera 0 is in error 4

At this point we don't know the scope of this issue.
We want to figure out its cause.
These are the candidates that we can come up with:

  • Android 11
  • Nokia's android distro
  • The model Nokia T20
  • StreamPack
  • The specific device that I have a hold of

Testing

First, confirm that the issue you are experiencing is the same as described here.
If it is the same issue, then please provide the following information.

Tests

Nokia T20

Nokia 7.1 - Android 10

Works without any similarities to the issue on Nokia T20, and works without any errors.

Comment

This issue has been discussed here:

As of now this issue has only been experienced on a single device and has not been reproduced on any other device, which would not be considered a critical fault with StreamPack. But if, for instance, the scope of this issue affected all Nokia devices on Android 11 and above, it might have to get looked at.

FPS drop and lags during streaming

Hi Thibault! First of all, thanks for the library!

While implementing streaming in my project I followed your example, but I ran into some difficulties. Do you mind if I ask a few questions?

When you start the application, it is noticeable that the camera frame rate is below 30, although the value is set to 30, and when you start the stream, the fps drops even more. As far as I understood, this is due to the call to the holder.setFixedSize() method every time the surface changes.

I tried to call holder.setFixedSize() only in surfaceCreated() and then there are no problems with fps, but in this case, when the stream starts, the information about the set fps does not reach the server, which leads to errors and the stream does not play.

Can you please tell me how to fix this? I would be grateful for any advice!

No Data being received for RTMP on Android

I ran the sample app on Android. Providing an RTMP URL with the streaming key starts a successful stream, but when I check the actual data received I get nothing. This library is also being used in api.video-reactnative-live-stream

Network adaptive bitrate

Adds an adaptive bitrate algorithm to adjust the bitrate to the available bandwidth.

Why?
Improves stream quality.
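
A naive sketch of the idea, as an additive-increase/multiplicative-decrease rule driven by packet loss (the stats source and thresholds are assumptions):

// Back off quickly on loss, probe upwards slowly when the link is clean.
fun nextBitrate(currentBitrate: Int, lossRate: Double): Int = when {
    lossRate > 0.05 -> (currentBitrate * 0.8).toInt()  // heavy loss: cut 20%
    lossRate < 0.01 -> currentBitrate + 100_000        // clean link: +100 kb/s
    else -> currentBitrate                             // hold steady
}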

HEVC

Adds support for the hardware HEVC video encoder when it is available on the device.

Why?
Improves video quality for the same bitrate.

TODO

  • In tsmux, add sps, pps, vps in front of a video I-frame
  • Add an API to get supported encoder list

Is there any release apk package?

Hi, thanks for your masterpiece.
I am not familiar with Kotlin and I tried to build it, but failed. I really want to try SRT on my phone; I was wondering, is there any release APK package?

Support for vp8/vp9 encoders for video

Many android devices do not have H264/H265 encoders. But they do have either vp8 or vp9 encoder. How about extending support to these encoders as well?

[Feat]: Lock resolution to starting orientation

Version

2.5.2

Environment that reproduces the issue

  • Google Pixel 6 Pro - Android 13
  • ASUS ROG - Android 13

Use case description

When recording the screen and selecting 1920x1080 as the resolution, in portrait mode the receiving side has black around the phone screen - as a full frame 1920x1080 video.

If the phone is in portrait, there should be an option to lock this to 1080x1920, for example, so that the video is resized correctly on the receiving end.

Proposed solution

An option in settings to lock the resolution to the starting orientation, and flip the resolution so that it is resized correctly.

Alternative solutions

No response

[Feat]: Get current bitrate of the stream

Version

2.5.2

Environment that reproduces the issue

Pixel 5a

Use case description

I would like to show the current bitrate to the user, but I am not able to get the bitrate

Proposed solution

A value in the setting object or something would be nice, or a callback of sorts on bitrate change

Alternative solutions

No response

OPUS

Adds support for the hardware OPUS audio encoder when it is available on the device.

Why?
Improves audio quality for the same bitrate.

Prerequisites
A phone with a hardware OPUS encoder.

Bitrate spikes drastically at the 5-minute mark when streaming to an RTMP server.

The bitrate spikes drastically right at minute 5:00 when streaming to an RTMP server.

To Reproduce
Steps to reproduce the behavior:

  1. Run the demo-camera app on an Android device.
  2. Go to settings and confirm that the video bitrate is set to 2000 kb/s, select endpoint type "Stream to a remote RTMP device" and provide an RTMP server
  3. Start streaming
  4. For 5 minutes the bitrate stays at ~2000 kb/s, but after that it spikes to higher values like 5000 kb/s or even more, causing the stream to look laggy.

Expected behavior
Bitrate stays at 2000 kb/s for as long as the stream lasts.

App crash when release RTMP stream

Hi,

Thanks for the library. It works, but on one of my devices the app just crashes as soon as I call stream.release(). The crash happens in native code so there's no way to try/catch it. Here is the crash log:

12-19 19:25:49.181 16919 16919 F libc : Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 16919 (com.my.example)
12-19 19:25:49.214 18091 18091 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
12-19 19:25:49.214 18091 18091 F DEBUG : Build fingerprint: 'samsung/nobleltelgt/nobleltelgt:7.0/NRD90M/N920LKLU2DVG1:user/release-keys'
12-19 19:25:49.215 18091 18091 F DEBUG : Revision: '9'
12-19 19:25:49.215 18091 18091 F DEBUG : ABI: 'arm64'
12-19 19:25:49.215 18091 18091 F DEBUG : pid: 16919, tid: 16919, name: com.my.example >>> com.my.example <<<
12-19 19:25:49.215 18091 18091 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
12-19 19:25:49.215 18091 18091 F DEBUG : x0 0000007ab7a8a100 x1 0000007fe5412810 x2 0000000000000000 x3 0000007ab794dde2
12-19 19:25:49.215 18091 18091 F DEBUG : x4 0000007fe5412710 x5 0000800000000000 x6 0000000000ffffff x7 ffffffffffffffff
12-19 19:25:49.215 18091 18091 F DEBUG : x8 0000000000000000 x9 0000000000000000 x10 0000000000430000 x11 0000000072287fc0
12-19 19:25:49.215 18091 18091 F DEBUG : x12 00000000722e6660 x13 ffffff0000000000 x14 0000007a83c6d800 x15 0000000000000000
12-19 19:25:49.215 18091 18091 F DEBUG : x16 0000007aaa93b578 x17 0000007ab885f36c x18 00000000ebad6082 x19 0000000000000000
12-19 19:25:49.215 18091 18091 F DEBUG : x20 0000007fe5412810 x21 0000007ab7a8a100 x22 0000007fe5412b2c x23 00000000729bbb63
12-19 19:25:49.215 18091 18091 F DEBUG : x24 0000000000000004 x25 47469ecc02d4fc0d x26 0000007ab7ac7a98 x27 47469ecc02d4fc0d
12-19 19:25:49.215 18091 18091 F DEBUG : x28 0000000000000001 x29 0000007fe54127f0 x30 0000007aaa8e8884
12-19 19:25:49.215 18091 18091 F DEBUG : sp 0000007fe54127d0 pc 0000007ab885f388 pstate 0000000020000000
12-19 19:25:49.583 18091 18091 F DEBUG :
12-19 19:25:49.583 18091 18091 F DEBUG : backtrace:
12-19 19:25:49.583 18091 18091 F DEBUG : #00 pc 000000000000e388 /system/lib64/libutils.so (_ZNK7android7RefBase9decStrongEPKv+28)
12-19 19:25:49.583 18091 18091 F DEBUG : #1 pc 0000000000033880 /system/lib64/libmedia_jni.so
12-19 19:25:49.583 18091 18091 F DEBUG : #2 pc 00000000026e41b0 /system/framework/arm64/boot-framework.oat (offset 0x1fe8000) (android.media.MediaCodec.native_release+124)
12-19 19:25:49.584 18091 18091 F DEBUG : #3 pc 00000000026e6950 /system/framework/arm64/boot-framework.oat (offset 0x1fe8000) (android.media.MediaCodec.release+60)
12-19 19:25:49.584 18091 18091 F DEBUG : #4 pc 00000000000d1eb4 /system/lib64/libart.so (art_quick_invoke_stub+580)
12-19 19:25:49.584 18091 18091 F DEBUG : #5 pc 00000000000deb88 /system/lib64/libart.so (_ZN3art9ArtMethod6InvokeEPNS_6ThreadEPjjPNS_6JValueEPKc+208)
12-19 19:25:49.584 18091 18091 F DEBUG : #6 pc 000000000028db00 /system/lib64/libart.so (_ZN3art11interpreter34ArtInterpreterToCompiledCodeBridgeEPNS_6ThreadEPNS_9ArtMethodEPKNS_7DexFile8CodeItemEPNS_11ShadowFrameEPNS_6JValueE+312)
12-19 19:25:49.584 18091 18091 F DEBUG : #7 pc 0000000000286adc /system/lib64/libart.so (_ZN3art11interpreter6DoCallILb0ELb0EEEbPNS_9ArtMethodEPNS_6ThreadERNS_11ShadowFrameEPKNS_11InstructionEtPNS_6JValueE+592)
12-19 19:25:49.584 18091 18091 F DEBUG : #8 pc 00000000005565c8 /system/lib64/libart.so (MterpInvokeVirtualQuick+452)
12-19 19:25:49.584 18091 18091 F DEBUG : #9 pc 00000000000c8614 /system/lib64/libart.so (ExecuteMterpImpl+29972)
12-19 19:25:51.089 18091 18091 E : ro.debug_level = 0x4f4c
12-19 19:25:51.090 18091 18091 E : sys.mobilecare.preload = false

Crash when stop RTMP publishing

Hi, thank you for making this wonderful project. The SRT part works very well, but there seem to be some problems in the RTMP part: when I stop RTMP publishing, the app crashes, with a log like this:


11/14 23:58:46: Launching 'demo-camera' on smartisan OS105.
Install successfully finished in 3 s 498 ms.
$ adb shell am start -n "io.github.thibaultbee.streampack.sample/io.github.thibaultbee.streampack.app.ui.main.MainActivity" -a android.intent.action.MAIN -c android.intent.category.LAUNCHER
Connected to process 5910 on device 'smartisan-os105-b558c244'.
Capturing and displaying logcat messages from application. This behavior can be disabled in the "Logcat output" section of the "Debugger" settings page.
I/art: Late-enabling -Xcheck:jni
I/System: Daemon delayGCRequest, sDelayGCRequest=false, delay=true, sPendingGCRequest=false
W/art: Before Android 4.1, method android.graphics.PorterDuffColorFilter androidx.vectordrawable.graphics.drawable.VectorDrawableCompat.updateTintFilter(android.graphics.PorterDuffColorFilter, android.content.res.ColorStateList, android.graphics.PorterDuff$Mode) would have incorrectly overridden the package-private method in android.graphics.drawable.Drawable
I/CameraManagerGlobal: Connecting to camera service
W/CameraManagerGlobal: [soar.cts] ignore the status update of camera: 2
W/CameraManagerGlobal: [soar.cts] ignore the status update of camera: 3
W/DataBinding: Setting the fragment as the LifecycleOwner might cause memory leaks because views lives shorter than the Fragment. Consider using Fragment's view lifecycle
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
W/Utils: could not parse long range '175-174'
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
W/VideoCapabilities: Unrecognized profile/level 0/3 for video/mpeg2
W/VideoCapabilities: Unrecognized profile/level 0/3 for video/mpeg2
W/VideoCapabilities: Unsupported mime video/x-ms-wmv
W/VideoCapabilities: Unsupported mime video/x-ms-wmv
W/VideoCapabilities: Unsupported mime video/x-ms-wmv
W/VideoCapabilities: Unsupported mime video/divx
W/VideoCapabilities: Unsupported mime video/divx311
W/VideoCapabilities: Unsupported mime video/divx4
W/VideoCapabilities: Unsupported mime video/mp4v-esdp
I/VideoCapabilities: Unsupported profile 4 for video/mp4v-es
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
I/VideoMediaCodecEncoder: Selected encoder OMX.qcom.video.encoder.avc
I/OMXClient: MuxOMX ctor
E/ACodec: found 1 codecs
E/ACodec: codec OMX.qcom.video.encoder.avc selected
I/MediaCodec: MediaCodec will operate in async mode
E/ACodec: [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -1010
I/ExtendedACodec: setupVideoEncoder()
W/ACodec: do not know color format 0x7fa30c04 = 2141391876
W/ACodec: do not know color format 0x7fa30c00 = 2141391872
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/ACodec: setupAVCEncoderParameters with [profile: High] [level: Level51]
I/ACodec: [OMX.qcom.video.encoder.avc] cannot encode HDR static metadata. Ignoring.
I/ACodec: setupVideoEncoder succeeded
I/ExtendedACodec: [OMX.qcom.video.encoder.avc] configure, AMessage : AMessage(what = 'conf', target = 1) = {
      string mime = "video/avc"
      int32_t frame-rate = 30
      int32_t color-format = 2130708361
      int32_t profile = 8
      int32_t height = 1280
      int32_t width = 720
      int32_t bitrate = 2000000
      float i-frame-interval = 1.000000
      int32_t level = 32768
      int32_t encoder = 1
    }
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/Adreno: QUALCOMM build                   : 54dba37, I1b6e53de78
    Build Date                       : 02/21/18
    OpenGL ES Shader Compiler Version: XE031.14.00.04
    Local Branch                     : 
    Remote Branch                    : quic/gfx-adreno.lnx.1.0.r9-rel
    Remote Branch                    : NONE
    Reconstruct Branch               : NOTHING
I/Adreno: PFP: 0x005ff087, ME: 0x005ff063
D/: SurfaceMonitor closed!
I/AudioMediaCodecEncoder: Selected encoder OMX.google.aac.encoder
I/OMXClient: MuxOMX ctor
E/ACodec: found 1 codecs
E/ACodec: codec OMX.google.aac.encoder selected
I/MediaCodec: MediaCodec will operate in async mode
D/StandaloneCoroutine: Streamer is created
D/OpenGLRenderer: RenderMonitor init!
D/OpenGLRenderer: RenderMonitor closed!
I/OpenGLRenderer: Initialized EGL, version 1.4
D/OpenGLRenderer: Swap behavior 1
I/PreviewView: Starting on camera: 0
D/PreviewView: View finder size: 1080 x 2010
D/PreviewView: Selected preview size: 1920x1080
I/art: Do partial code cache collection, code=17KB, data=29KB
I/art: After code cache collection, code=17KB, data=29KB
I/art: Increasing code cache capacity to 128KB
D/AutoFitSurfaceView: Measured dimensions set: 1131 x 2010
W/art: Before Android 4.1, method double java.util.concurrent.ThreadLocalRandom.internalNextDouble(double, double) would have incorrectly overridden the package-private method in java.util.Random
W/art: Before Android 4.1, method int java.util.concurrent.ThreadLocalRandom.internalNextInt(int, int) would have incorrectly overridden the package-private method in java.util.Random
W/art: Before Android 4.1, method long java.util.concurrent.ThreadLocalRandom.internalNextLong(long, long) would have incorrectly overridden the package-private method in java.util.Random
I/CameraController: Supported FPS range list: [[15, 15], [20, 20], [24, 24], [7, 30], [30, 30], [10, 30]]
D/CameraController: Selected Fps range [30, 30]
I/System: Daemon delayGCRequest, sDelayGCRequest=true, delay=false, sPendingGCRequest=false
I/: Connection succeeded
D/ACodec: dataspace changed to 0x10c10000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:1(BT709), T:3(SMPTE_170M))
I/: Format changed : {csd-1=java.nio.HeapByteBuffer[pos=0 lim=9 cap=9], mime=video/avc, frame-rate=30, width=720, height=1280, color-standard=1, color-range=2, bitrate=2000000, csd-0=java.nio.HeapByteBuffer[pos=0 lim=22 cap=22], color-transfer=3, max-bitrate=2000000}
I/: Format changed : {bitrate=128000, mime=audio/mp4a-latm, csd-0=java.nio.HeapByteBuffer[pos=0 lim=2 cap=2], channel-count=2, sample-rate=44100, max-bitrate=128000}
D/VideoMediaCodecEncoder: Not running
D/AudioMediaCodecEncoder: Not running
I/AudioMediaCodecEncoder: Selected encoder OMX.google.aac.encoder
I/OMXClient: MuxOMX ctor
E/ACodec: found 1 codecs
E/ACodec: codec OMX.google.aac.encoder selected
I/MediaCodec: MediaCodec will operate in async mode
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
I/VideoMediaCodecEncoder: Selected encoder OMX.qcom.video.encoder.avc
I/OMXClient: MuxOMX ctor
E/ACodec: found 1 codecs
E/ACodec: codec OMX.qcom.video.encoder.avc selected
I/MediaCodec: MediaCodec will operate in async mode
E/ACodec: [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -1010
I/ExtendedACodec: setupVideoEncoder()
W/ACodec: do not know color format 0x7fa30c04 = 2141391876
W/ACodec: do not know color format 0x7fa30c00 = 2141391872
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/ACodec: setupAVCEncoderParameters with [profile: High] [level: Level51]
I/ACodec: [OMX.qcom.video.encoder.avc] cannot encode HDR static metadata. Ignoring.
I/ACodec: setupVideoEncoder succeeded
I/ExtendedACodec: [OMX.qcom.video.encoder.avc] configure, AMessage : AMessage(what = 'conf', target = 10) = {
      string mime = "video/avc"
      int32_t frame-rate = 30
      int32_t color-format = 2130708361
      int32_t profile = 8
      int32_t height = 1280
      int32_t width = 720
      int32_t bitrate = 2000000
      float i-frame-interval = 1.000000
      int32_t level = 32768
      int32_t encoder = 1
    }
W/ACodec: do not know color format 0x7f000789 = 2130708361
A/libc: Fatal signal 7 (SIGBUS), code 1, fault addr 0x7f617f72b1 in tid 5910 (reampack.sample)

I found that if I comment out socket.close() in RtmpProducer.kt, it no longer causes the A/libc: Fatal signal 7 (SIGBUS), code 1, fault addr 0x7f617f72b1 in tid 5910 (reampack.sample) crash, but then the socket is not closed, and when publishing starts again, the app crashes again.

I hope you can provide some help. Thanks.

Video Capture Mirroring

Version

N/A

Environment that reproduces the issue

N/A

Use case description

Recently, I drafted a PR for the api.video ios library to support the ability to mirror the video image apivideo/api.video-swift-live-stream#11

The hope was to come here and do the same, but I think it may not be possible. I was curious if you had any thoughts on a solution.

My initial attempt was to setMatrix on the previewSurface as well as the encoderSurface. But, this requires locking the canvas, which I was unable to accomplish while the preview is active. Here is the Mirror class I was messing around with. The new param cameraCapture is passed in when the CameraSettings class gets instantiated.

class Mirror(private val context: Context, private val cameraController: CameraController, private val cameraCapture: CameraCapture) {
    private val notMirrored: Matrix = Matrix()
    private val mirrored: Matrix = Matrix()
    private var _isMirrored = false;

    init {
        notMirrored.setScale(0F, 0F)
        mirrored.setScale(-1F, 1F)
    }

    var isMirrored: Boolean
        get() {
            return _isMirrored
        }

        set(newValue) {
            if (cameraCapture.previewSurface == null) return

            val canvas: Canvas = cameraCapture.previewSurface!!.lockCanvas(Rect())

            if (newValue) {
                canvas.setMatrix(mirrored)
            } else {
                canvas.setMatrix(notMirrored)
            }

            cameraCapture.previewSurface!!.unlockCanvasAndPost(canvas)
        }
}

Most solutions seem to talk about modifying the actual SurfaceView or TextureView, but this library seems to rely on Camera2 API to handle the surfaces (targets) directly.

Curious if you have any additional ideas on how to implement the ability to mirror (flip) the image.

Proposed solution

No response

Alternative solutions

No response

Configure CameraRtmpLiveStreamer to write to a file in parallel to the streaming to RTMP server?

Is there a way to configure the CameraRtmpLiveStreamer to write the stream to a file? Maybe expose the muxer property, so we can pass in writeToFile = true?

class CameraRtmpLiveStreamer(
    context: Context,
    enableAudio: Boolean = true,
    initialOnErrorListener: OnErrorListener? = null,
    initialOnConnectionListener: OnConnectionListener? = null
) : BaseCameraLiveStreamer(
    context = context,
    enableAudio = enableAudio,
    muxer = FlvMuxer(context = context, writeToFile = false),
    endpoint = RtmpProducer(hasAudio = enableAudio, hasVideo = true),
    initialOnErrorListener = initialOnErrorListener,
    initialOnConnectionListener = initialOnConnectionListener
)

[Feat]: Support for devices whose MountAngle is not general

Version

2.5.2 (or commit: 3a9af52)

Environment that reproduces the issue

Pixel4 - AOSP 10 (Rooted, Custom OS)
Root is required because the MountAngle in camera_config.xml, a read-only configuration file of the OS, needs to be rewritten.
This can be reproduced by changing the angle to 0 or 180.

Use case description

Thanks for this project, it has been very helpful as I was trying to implement SRT. Thank you very much.

After using the demo and libraries of this project, I found that devices where the camera MountAngle is 0 or 180 are not oriented correctly.
MountAngle in camera_config.xml is the value of SENSOR_ORIENTATION in CameraCharacteristics.
e.g. https://github.com/LineageOS/android_device_bq_sdm660-common/blob/lineage-16.0/configs/camera/camera_config.xml

For example, if you request Size(1280, 720), a camera with MountAngle=90 will preview and deliver a 1280 x 720 image with the correct proportions.
However, with a camera with MountAngle=0, the captured video (1280 horizontal by 720 vertical) will be distorted, as if it were scaled to 1280 vertical and 720 horizontal.

The following captures (attached to the original issue) show the distorted image (the Android preview screen and the image distributed over SRT), and the corrected result after applying my patch (please check alternative solutions).

Proposed solution

No response

Alternative solutions

I have created a patch to support the less common MountAngle devices.
Here is the dirty code, sorry.

diff --git a/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/DeviceOrientationProvider.kt b/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/DeviceOrientationProvider.kt
index 2dc06e1..cc60558 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/DeviceOrientationProvider.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/DeviceOrientationProvider.kt
@@ -16,14 +16,22 @@
 package io.github.thibaultbee.streampack.internal.data.orientation
 
 import android.content.Context
+import android.hardware.camera2.CameraCharacteristics
 import android.util.Size
 import io.github.thibaultbee.streampack.internal.interfaces.IOrientationProvider
 import io.github.thibaultbee.streampack.internal.utils.extensions.deviceOrientation
 import io.github.thibaultbee.streampack.internal.utils.extensions.isDevicePortrait
 import io.github.thibaultbee.streampack.internal.utils.extensions.landscapize
 import io.github.thibaultbee.streampack.internal.utils.extensions.portraitize
+import io.github.thibaultbee.streampack.utils.getCameraCharacteristics
+import io.github.thibaultbee.streampack.internal.sources.camera.CameraCapture
 
 class DeviceOrientationProvider(private val context: Context) : IOrientationProvider {
+    private var cameraCapture: CameraCapture? = null
+    fun setCameraCapture(capture: CameraCapture) {
+        this.cameraCapture = capture
+    }
+
     override val orientation: Int
         get() {
             //TODO: this might not be working on all devices
@@ -31,6 +39,13 @@ class DeviceOrientationProvider(private val context: Context) : IOrientationProv
             return if (deviceOrientation == 0) 270 else deviceOrientation - 90
         }
 
+    override val cameraAngle: Int
+        get() {
+            val cameraId = this.cameraCapture?.cameraId ?: return 90
+            val characteristics = context.getCameraCharacteristics(cameraId)
+            return characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION) as Int
+        }
+
     override fun orientedSize(size: Size): Size {
         return if (context.isDevicePortrait) {
             size.portraitize()
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/FixedOrientationProvider.kt b/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/FixedOrientationProvider.kt
index 8f5ee4f..e14c15f 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/FixedOrientationProvider.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/FixedOrientationProvider.kt
@@ -21,7 +21,7 @@ import io.github.thibaultbee.streampack.internal.utils.extensions.landscapize
 import io.github.thibaultbee.streampack.internal.utils.extensions.portraitize
 import io.github.thibaultbee.streampack.utils.OrientationUtils
 
-class FixedOrientationProvider(override val orientation: Int) : IOrientationProvider {
+class FixedOrientationProvider(override val orientation: Int, override val cameraAngle: Int = 90) : IOrientationProvider {
     override fun orientedSize(size: Size): Size {
         return if (OrientationUtils.isPortrait(orientation)) {
             size.portraitize()
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/internal/encoders/VideoMediaCodecEncoder.kt b/core/src/main/java/io/github/thibaultbee/streampack/internal/encoders/VideoMediaCodecEncoder.kt
index 1ed1709..1a0f2d5 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/internal/encoders/VideoMediaCodecEncoder.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/internal/encoders/VideoMediaCodecEncoder.kt
@@ -88,8 +88,15 @@ class VideoMediaCodecEncoder(
         val videoConfig = config as VideoConfig
         orientationProvider.orientedSize(videoConfig.resolution).apply {
             // Override previous format
-            format.setInteger(MediaFormat.KEY_WIDTH, width)
-            format.setInteger(MediaFormat.KEY_HEIGHT, height)
+            val cameraAngle = orientationProvider.cameraAngle
+            
+            if (cameraAngle == 90 || cameraAngle == 270) {
+                format.setInteger(MediaFormat.KEY_WIDTH, width)
+                format.setInteger(MediaFormat.KEY_HEIGHT, height)
+            } else {
+                format.setInteger(MediaFormat.KEY_WIDTH, height)
+                format.setInteger(MediaFormat.KEY_HEIGHT, width)
+            }
         }
     }
 
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/internal/interfaces/IOrientationProvider.kt b/core/src/main/java/io/github/thibaultbee/streampack/internal/interfaces/IOrientationProvider.kt
index 599d8ed..215bd72 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/internal/interfaces/IOrientationProvider.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/internal/interfaces/IOrientationProvider.kt
@@ -27,6 +27,12 @@ interface IOrientationProvider {
      */
     val orientation: Int
 
+    /**
+     * Camera sensor angle. Generally 90 or 270.
+     * Expected values: 0, 90, 180, 270.
+     */
+    val cameraAngle: Int
+
     /**
      * Return the size with the correct orientation.
      */
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseCameraStreamer.kt b/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseCameraStreamer.kt
index d685267..f2016d1 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseCameraStreamer.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseCameraStreamer.kt
@@ -61,8 +61,13 @@ open class BaseCameraStreamer(
     initialOnErrorListener = initialOnErrorListener
 ), ICameraStreamer {
     private val cameraCapture = videoCapture as CameraCapture
+    private val deviceOrientationProvider = orientationProvider as DeviceOrientationProvider
     override val helper = CameraStreamerConfigurationHelper(muxer.helper)
 
+    init {
+        deviceOrientationProvider.setCameraCapture(cameraCapture)
+    }
+
     /**
      * Get/Set current camera id.
      */
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseStreamer.kt b/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseStreamer.kt
index cff625d..44df9ef 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseStreamer.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseStreamer.kt
@@ -59,7 +59,7 @@ abstract class BaseStreamer(
     private val context: Context,
     protected val audioCapture: IAudioCapture?,
     protected val videoCapture: IVideoCapture?,
-    orientationProvider: IOrientationProvider,
+    protected val orientationProvider: IOrientationProvider,
     private val muxer: IMuxer,
     protected val endpoint: IEndpoint,
     initialOnErrorListener: OnErrorListener? = null
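
With the patch above applied, the fixed provider can also be constructed with an explicit sensor angle, e.g.:

// Example under the proposed patch: a fixed provider for a 0-degree-mounted sensor.
val provider = FixedOrientationProvider(orientation = 0, cameraAngle = 0)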

Flashlight support

Would you mind enhancing the library to support a torch/flashlight parameter?
Thanks.
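
For reference, a minimal sketch of how a torch toggle could be implemented on top of Camera2 (these are the underlying platform calls, not StreamPack's API; the function assumes the caller owns the repeating request's builder and capture session):

import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest

// Illustrative only: toggles the torch on the currently running capture session.
fun setTorch(
    session: CameraCaptureSession,
    requestBuilder: CaptureRequest.Builder,
    enabled: Boolean
) {
    requestBuilder.set(
        CaptureRequest.FLASH_MODE,
        if (enabled) CameraMetadata.FLASH_MODE_TORCH else CameraMetadata.FLASH_MODE_OFF
    )
    // Re-submit the repeating request so the change takes effect immediately.
    session.setRepeatingRequest(requestBuilder.build(), null, null)
}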
