opentok / opentok-ios-sdk-samples
Example applications that use the OpenTok iOS SDK
Home Page: https://tokbox.com/developer/sdks/ios
License: MIT License
Hi, the broadcast extension keeps crashing and closing on iPad mini.
The same code works on iPhone X and 8 Plus, so I think the system may be killing it for exceeding the 50 MB memory limit.
How can I shrink the resolution? Can you please point me in the right direction so I can check whether the extension is being killed?
Thanks.
In our project, we need to display an image in the publisher stream, so I took the TBExampleVideoCapture from this repo to customize the video capture.
The trick is to convert the video and image buffers to CIImages and render them onto a CVPixelBufferRef using a CIContext; finally, the CVPixelBufferRef is sent to the CaptureConsumer.
The following are my main changes to add an image to the publisher stream:
First, I set the capture format to ARGB:
- (id)init {
self = [super init];
if (self) {
// ....
_videoFrame = [[OTVideoFrame alloc] initWithFormat:
[OTVideoFormat videoFormatARGBWithWidth:_captureWidth
height:_captureHeight]];
// ....
}
return self;
}
- (int32_t)captureSettings:(OTVideoFormat*)videoFormat {
videoFormat.pixelFormat = OTPixelFormatARGB;
videoFormat.imageWidth = _captureWidth;
videoFormat.imageHeight = _captureHeight;
return 0;
}
Then I render the image into a CVPixelBufferRef:
- (void)fillImagePixelBufferFromCGImage:(CGImageRef)image videoWidth:(CGFloat)videoWidth videoHeight:(CGFloat)videoHeight
{
if (imagePixelBuffer == nil) {
NSDictionary *options = @{
(NSString *)kCVPixelBufferCGImageCompatibilityKey: @NO,
(NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey: @NO
};
CVPixelBufferCreate(kCFAllocatorDefault,
videoWidth,
videoHeight,
kCVPixelFormatType_32ARGB,
(__bridge CFDictionaryRef)(options),
&(imagePixelBuffer));
}
CVPixelBufferLockBaseAddress(imagePixelBuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(imagePixelBuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context =
CGBitmapContextCreate(pxdata,
videoWidth,
videoHeight,
8,
CVPixelBufferGetBytesPerRow(imagePixelBuffer),
rgbColorSpace,
kCGImageAlphaPremultipliedFirst |
kCGBitmapByteOrder32Big);
if ([self.delegate respondsToSelector:@selector(frameForImageInVideo)]) {
CGRect frame = [self.delegate frameForImageInVideo];
CGContextDrawImage(context, CGRectMake(frame.origin.y, frame.origin.x, frame.size.width, frame.size.height), image);
}
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(imagePixelBuffer, 0);
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection {
// ....
CVImageBufferRef videoBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
size_t videoWidth = CVPixelBufferGetWidth(videoBuffer);
size_t videoHeight = CVPixelBufferGetHeight(videoBuffer);
// create the finalPixelBuffer, which will ultimately be consumed by the CaptureConsumer
CVPixelBufferRef finalPixelBuffer;
NSDictionary *options = @{
(NSString *)kCVPixelBufferCGImageCompatibilityKey: @NO,
(NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey: @NO
};
CVPixelBufferCreate(kCFAllocatorDefault,
videoWidth,
videoHeight,
kCVPixelFormatType_32ARGB,
(__bridge CFDictionaryRef)(options),
&(finalPixelBuffer));
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
// render video to finalPixelBuffer
CIImage *videoImage = [CIImage imageWithCVImageBuffer:videoBuffer];
[context
render:videoImage
toCVPixelBuffer:finalPixelBuffer
bounds:CGRectMake(0, 0, videoWidth, videoHeight)
colorSpace:rgbColorSpace];
// render image to finalPixelBuffer
if ([self.delegate respondsToSelector:@selector(imageToMergeIntoVideo)] &&
[self.delegate respondsToSelector:@selector(frameForImageInVideo)] &&
[self.delegate imageToMergeIntoVideo]) {
UIImage *image = [self.delegate imageToMergeIntoVideo];
CGRect frame = [self.delegate frameForImageInVideo];
[self fillImagePixelBufferFromCGImage:[image CGImage] videoWidth:videoWidth videoHeight:videoHeight];
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imagePixelBuffer];
[context
render:ciImage
toCVPixelBuffer:finalPixelBuffer
bounds:CGRectMake(frame.origin.y, frame.origin.x, frame.size.width, frame.size.height)
colorSpace:rgbColorSpace];
}
// send finalPixelBuffer to CaptureConsumer
CVPixelBufferLockBaseAddress(finalPixelBuffer, 0);
uint8_t *planes[1];
planes[0] = CVPixelBufferGetBaseAddress(finalPixelBuffer);
CMTime time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
_videoFrame.timestamp = time;
_videoFrame.orientation = [self currentDeviceOrientation];
[_videoFrame setPlanesWithPointers:planes numPlanes:1];
[_videoCaptureConsumer consumeFrame:_videoFrame];
CVPixelBufferUnlockBaseAddress(finalPixelBuffer, 0);
CVPixelBufferRelease(finalPixelBuffer);
CGColorSpaceRelease(rgbColorSpace);
}
When I test the changes, the publisher stream looks like this:
It looks as if a blue filter has been applied to the stream.
At first I thought something went wrong during the merge process. But when I convert the finalPixelBuffer to a UIImage using the following code:
CGImageRef testImageRef;
VTCreateCGImageFromCVPixelBuffer(finalPixelBuffer, nil, &testImageRef);
UIImage *testImage = [UIImage imageWithCGImage:testImageRef];
the result told me the merge process works well. This is the resulting image:
Do you have any ideas to remove the blue filter? Thanks!
Hello everyone,
As far as we can see, with iOS 16 it is possible to have camera access while the app is not in the foreground.
I played a bit with the iOS SDK and PiP mode and could add it.
However, when I put the app in the background, the camera stream drops and I can no longer see my UIViews.
That is normal behavior, and I could work around it on Android, where it works quite well.
Will it be possible to have this AVFoundation extension in the iOS SDK as well?
Is it somehow possible to receive the raw stream data without the subscriber view, so that I can add it to PiP mode and the user can still see the other participant when the app is in the background or sharing the screen?
Kind regards
Describe the bug
When I start connecting to a session via token, the OTSession instance crashes after the init flow.
The error started appearing after updating OpenTok from 2.21.3 to 2.22.0, 2.23.0, or 2.24.0.
To Reproduce
Related to sdk version 2.22.0
OTError *error = nil;
if (session.sessionConnectionStatus == OTSessionConnectionStatusConnected) { return; }
[session connectWithToken:token error:&error];
Application(11135,0x170373000) malloc: *** error for object 0x2822a1b48: pointer being freed was not allocated
Application(11135,0x170373000) malloc: *** set a breakpoint in malloc_error_break to debug
Device (please complete the following information):
On version 2.21.3 everything works fine.
We have implemented screen sharing on iOS and Android.
On iOS it is not working; only the time is displayed. Can you provide more detail or documentation on how screen sharing should be implemented and how it works if we want to share the device screen?
On Android, if we want to share the whole device screen rather than individual views, how do we achieve that? Currently it shares views only.
In Android one can switch between handset and speakerphone by calling AudioDeviceManager.getAudioDevice().setOutputMode. However, in iOS this requires using a custom audio driver. Why isn't this implemented in the iOS SDK similar to the Android SDK?
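For reference, the AVFoundation call that an iOS equivalent of setOutputMode would presumably wrap is AVAudioSession's output override. A minimal sketch (the function name is illustrative):

```swift
import AVFoundation

// Toggle between the loudspeaker and the default receiver route.
// This is the system-level call an iOS "setOutputMode" helper would wrap.
func setSpeakerphone(_ enabled: Bool) throws {
    try AVAudioSession.sharedInstance()
        .overrideOutputAudioPort(enabled ? .speaker : .none)
}
```

Note that when using a custom audio driver, the driver owns the AVAudioSession configuration, so this override must be applied in a way that doesn't conflict with the driver's own category and mode settings.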
Initialising an OTPublisher freezes the main thread for a short time and prints the following warning:
Thread Performance Checker: -[AVCaptureSession startRunning] should be called from background thread. Calling it on the main thread can lead to UI unresponsiveness.
Initialising the OTPublisher creates and starts AVCaptureSession.
[AVCaptureSession startRunning] method is synchronous and blocks until the session starts running. It should be called from background thread.
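For custom capturers (such as TBExampleVideoCapture), the pattern Apple recommends is to start the session from a serial background queue. A minimal sketch (the queue label is illustrative):

```swift
import AVFoundation

let captureSession = AVCaptureSession()
let sessionQueue = DispatchQueue(label: "com.example.capture-session")

sessionQueue.async {
    // -startRunning blocks until the session is running (or fails),
    // so it must not be called on the main thread.
    captureSession.startRunning()
}
```

This only helps where the app controls the capturer; the AVCaptureSession that OTPublisher creates internally is managed by the SDK.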
Hello, I'm getting an error similar to the one in this post when trying to profile my app against a simulator using Xcode 13: https://developer.apple.com/forums/thread/674716.
It appears that the solution proposed by Apple is to have the OpenTok.xcframework contain an arm64 slice for both simulators and devices. Is this something that you can work on?
Thank you!
When we activate video after starting the conversation with only the audio stream enabled, the subscriber view doesn't appear in the iOS app. But after backgrounding and foregrounding the app just once, the video appears in the view.
When the subscriber's video turns on, we want to see it in the view immediately.
In the latest version of the iOS SDK (v2.24.0) when initializing the publisher, the screen freezes and becomes unresponsive for a few seconds.
In the previous versions, there was a thread warning when initializing the publisher that has been fixed with the latest one.
Related issue: #267
Since the fix, the thread warning no longer appears; however, the screen freeze remains and is now more noticeable than before. It now takes about 10 s for the publisher stream to be initialized, and during that time the screen is not responsive. In previous versions, despite the thread warning, the freeze and init time was about 2-3 s.
My app's requirement is to test audio and video quality first, then start the video call. The network test completes properly.
When I start the actual call I get the error below and am unable to publish my video and audio to the subscriber:
publisher(_:didFailWithError:)
1541 - Timed out while attempting to publish.
https://github.com/opentok/opentok-network-test/tree/master/iOS-Sample
I've tried OpenTok versions 2.19.1 and 2.20.0.
Hi, at least one other person and I are unable to run the sample projects in this repository.
To reproduce: run one of the sample projects; it crashes in ViewController.m with EXC_BAD_ACCESS.
Xcode logs up until the crash:
2020-01-16 09:01:19.487426-0500 Basic-Video-Chat[21204:6399032]
2020-01-16 09:01:19.487494-0500 Basic-Video-Chat[21204:6399032] ------------------------------------------------
2020-01-16 09:01:19.487521-0500 Basic-Video-Chat[21204:6399032] OpenTok iOS Library, Rev.2
2020-01-16 09:01:19.487546-0500 Basic-Video-Chat[21204:6399032] This build was born on May 8 2019 at 16:57:23
2020-01-16 09:01:19.612931-0500 Basic-Video-Chat[21204:6399032] Version: 2.16.1.7383-ios
2020-01-16 09:01:19.612988-0500 Basic-Video-Chat[21204:6399032] libOpenTokObjC:b4f4e66c2a07ac12693b38cc2722a3061a9165ea
2020-01-16 09:01:19.613016-0500 Basic-Video-Chat[21204:6399032] Copyright 2019 TokBox, Inc.
2020-01-16 09:01:19.613039-0500 Basic-Video-Chat[21204:6399032] Licensed under the Apache License, Version 2.0
2020-01-16 09:01:19.613061-0500 Basic-Video-Chat[21204:6399032] ------------------------------------------------
2020-01-16 09:01:21.490235-0500 Basic-Video-Chat[21204:6399032] sessionDidConnect (sessionid-redacted)
2020-01-16 09:01:21.533336-0500 Basic-Video-Chat[21204:6399032] Metal GPU Frame Capture Enabled
2020-01-16 09:01:21.533878-0500 Basic-Video-Chat[21204:6399032] Metal API Validation Enabled
2020-01-16 09:01:23.643116-0500 Basic-Video-Chat[21204:6399032] Publishing
This is happening with the two sample projects mentioned above, but it is likely an issue for more or all of the projects in this repo. Can someone please take a look? Thanks.
Hi,
I have been trying to set the OTSubscriber or OTSession volume level, like subscriber.setAudioVolume in the JS SDK.
Issue: as soon as the subscriber is connected, the hardware volume keys control the session volume, but I need to control the volume natively in iOS using a UISlider.
Can you please tell me how I can control the session audio volume using a UISlider or MPVolumeView?
I need this urgently.
Thanks
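For anyone with the same question: since the default OpenTok audio device plays through the system output, embedding an MPVolumeView gives a native slider tied to the system volume, which effectively controls the session audio level. A minimal sketch (the parent view and frame are assumptions):

```swift
import MediaPlayer
import UIKit

// MPVolumeView renders the system volume slider; audio played through
// the system output (including OpenTok's default audio device) follows
// the system volume, so this slider effectively controls session audio.
func addVolumeSlider(to parent: UIView) {
    let volumeView = MPVolumeView(frame: CGRect(x: 20, y: 100, width: 280, height: 40))
    parent.addSubview(volumeView)
}
```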
The static-analyzer warning reads: Value stored to 'time' is never read.
I got this error on iOS: didFailWithError: (Error Domain=OTSessionErrorDomain Code=1004 "This SessionId can not be used with OpenTok 2.0 clients" UserInfo=0x17531c60 {NSLocalizedDescription=This SessionId can not be used with OpenTok 2.0 clients})
How do you suppress the following warning displayed in the console? I'm using Swift.
*******************************************************
NOTICE: OPENTOK CONSOLE LOGGER HAS NOT BEEN SET.
PLEASE USE otk_console_set_logger(otk_console_logger)
TO SET YOUR LOGGER.
*******************************************************
Hello.
I have a problem when using the OpenTok iOS (and Android) SDK.
On LTE/3G, I can't see the subscribers (the other clients' video).
But on Wi-Fi, it works fine.
What is the cause? I suspect it is because LTE/3G bandwidth is lower than Wi-Fi's.
Please let me know how to solve this as soon as possible.
Regards.
Hello,
We are having some issues with poor audio quality when playing instruments over OpenTok. The suggestion in the React Native issue I opened was to implement the custom audio driver and use kAudioUnitSubType_RemoteIO only.
We tested the Custom Audio Driver in this repo with the following results. Attached are OpenTok Playground archive samples to illustrate the issues: archive-tests 2.zip
1. kAudioUnitSubType_VoiceProcessingIO. Result: bad overall quality.
2. kAudioUnitSubType_RemoteIO. Result: better quality, but being filtered (more on this below).
3. kAudioUnitSubType_RemoteIO with the AVAudioSession set to setMode:AVAudioSessionModeMeasurement in order to bypass the high-pass filter that is apparently still applied with RemoteIO (per https://stackoverflow.com/q/32227585/193210). No difference in quality from #2 that I can tell.
If you listen to the kAudioUnitSubType_RemoteIO recordings, there is some sort of distortion and/or filter that makes higher ranges, e.g. the higher notes on the guitar, sound poor. I'm wondering if someone could at least point me in the right direction on what to try next.
I've tried messing with some of the stream_format settings, but things just get crazy sounding :). Thanks for any insight.
Hi @robjperez ,
The screen-sharing module contains an old code base, and there is no Swift support for it. Can you please provide a Swift example of the module with the latest OpenTok updates?
Best,
I am trying the Signaling app provided in this repo. I have added a valid token, session ID, and API key, but I am getting this error:
(Error Domain=OTSessionErrorDomain Code=1004 "The provided API key does not match this token" UserInfo={NSLocalizedDescription=The provided API key does not match this token})
Today I tried to stream screen sharing, but had a problem: the stream is always broken and the audio is distorted. Does anyone have a solution for scaling the image? Thank you so much.
Hi, we are experiencing some crashes with OTAudioDevice.
The call stack is the following:
Crashed: WebRTCWorkerThread
We are using pod 'OpenTok', '~> 2.22.3'
In particular, I am unable to get "subscriberDidDisconnectFromStream" and "subscriberDidReconnectToStream" to run when a device loses network stability. Those specific methods only trigger on the device that is experiencing the network problem, suggesting every other subscriber is having trouble connecting to the session. This is not what the documentation implies.
I have implemented OpenTok with CallKit. Whenever I start an outgoing call, my stream is never published. The inspector dashboard says:
Attempting to publish with stream 54B02465-0E26-4AF0-8715-9D333BF6E9FC (has not started publishing to session yet)"
It never succeeds in publishing.
Moreover, I get the error in the log:
ERROR[OpenTok]:Audio device error: startCapture.AudioOutputUnitStart returned error: -66637
Is the error in publishing because of this error?
I have added OTDefaultAudioDevice file from the demo provided on GitHub.
Below is my code:
func provider(_ provider: CXProvider, perform action: CXStartCallAction) {
// Create & configure an instance of SpeakerboxCall, the app's model class representing the new outgoing call.
let call = SpeakerboxCall(uuid: action.callUUID, isOutgoing: true)
call.handle = action.handle.value
/*
Configure the audio session, but do not start call audio here, since it must be done once
the audio session has been activated by the system after having its priority elevated.
*/
// https://forums.developer.apple.com/thread/64544
// we can't configure the audio session here for the case of launching it from locked screen
// instead, we have to pre-heat the AVAudioSession by configuring as early as possible, didActivate do not get called otherwise
// please look for * pre-heat the AVAudioSession *
configureAudioSession()
/*
Set callback blocks for significant events in the call's lifecycle, so that the CXProvider may be updated
to reflect the updated state.
*/
call.hasStartedConnectingDidChange = {
provider.reportOutgoingCall(with: call.uuid, startedConnectingAt: call.connectingDate)
}
call.hasConnectedDidChange = {
provider.reportOutgoingCall(with: call.uuid, connectedAt: call.connectDate)
}
self.outgoingCall = call
// Signal to the system that the action has been successfully performed.
action.fulfill()
}
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
print("Received \(#function)")
// If we are returning from a hold state
if answerCall?.hasConnected ?? false {
//configureAudioSession()
// See more details on how this works in the OTDefaultAudioDevice.m method handleInterruptionEvent
sendFakeAudioInterruptionNotificationToStartAudioResources();
return
}
if outgoingCall?.hasConnected ?? false {
//configureAudioSession()
// See more details on how this works in the OTDefaultAudioDevice.m method handleInterruptionEvent
sendFakeAudioInterruptionNotificationToStartAudioResources()
return
}
if outgoingCall != nil{
startCall(withAudioSession: audioSession) { success in
if success {
self.outgoingCall?.hasConnected = true
self.addCall(self.outgoingCall!)
self.startAudio()
}
}
}
if answerCall != nil{
answerCall(withAudioSession: audioSession) { success in
if success {
self.answerCall?.hasConnected = true
self.startAudio()
}
}
}
}
func sendFakeAudioInterruptionNotificationToStartAudioResources() {
var userInfo = Dictionary<AnyHashable, Any>()
let interruptionEndedRaw = AVAudioSession.InterruptionType.ended.rawValue
userInfo[AVAudioSessionInterruptionTypeKey] = interruptionEndedRaw
NotificationCenter.default.post(name: AVAudioSession.interruptionNotification, object: self, userInfo: userInfo)
}
func configureAudioSession() {
// See https://forums.developer.apple.com/thread/64544
let session = AVAudioSession.sharedInstance()
do {
try session.setCategory(AVAudioSession.Category.playAndRecord, mode: .default)
try session.setActive(true)
try session.setMode(AVAudioSession.Mode.voiceChat)
try session.setPreferredSampleRate(44100.0)
try session.setPreferredIOBufferDuration(0.005)
} catch {
print(#file)
print(#function)
print(error)
}
}
Any help is urgently needed.
TIA.
Hi guys,
Is it possible to set the title of the double height status bar that appears when backgrounding your app while in a call? I'm having trouble finding any mention of it in the Apple documentation.
If this is possible, I'd appreciate some guidance on how to do it.
Thanks!
I'm trying to use the TBExampleVideoCapture in a native Xamarin.iOS project via Native Bindings (https://docs.microsoft.com/en-us/xamarin/ios/platform/binding-objective-c/walkthrough?tabs=macos), and it results in a runaway memory leak and an app crash.
I've tried both the following scenarios and they both crash:
Binding to the TBExampleVideoCapture.m directly and doing all other OpenTok work in Xamarin.iOS including setting the Publisher.VideoCapture property to the bound TBExampleVideoCapture object.
Changing the ViewController.m class from a UIViewController to UIView, then binding to that. I moved all OpenTok logic from Xamarin.iOS into native Objective-C UIView.
I should note that the TBExampleVideoCapture works as a Native iOS application. But there is something wrong when it's exposed as a native binding to Xamarin that causes the memory to run away. I thought perhaps there was something wrong with binding directly to the TBExampleVideoCapture via the Publisher.VideoCapture property, so I tried the UIView route thinking that all the rendering would occur on the native side and that would fix things, but it doesn't.
I'm really stuck and this is a pivotal thing for our application. If there's no way to resolve this we'll have to investigate other platforms unfortunately. We need the ability to draw on the Publisher's video before it's sent across the network. Thank you!
Sometimes when publishing a stream, I am getting the following error:
[AVCaptureSession startRunning] startRunning may not be called between calls to beginConfiguration and commitConfiguration
I have not found specific steps to reproduce it. It is not often, but it is happening on my production app as well. In the live app I am using Opentok 2.24.1, but I upgraded to 2.24.2 locally and the issue remains.
Describe the bug
Hello, we're getting quite a few crash reports on iOS 14, using the OTXCFramework (2.24.1).
Stack trace:
Fatal Exception: NSInvalidArgumentException
0 CoreFoundation 0x129dc0 __exceptionPreprocess
1 libobjc.A.dylib 0x287a8 objc_exception_throw
2 CoreFoundation 0x19c5a0 -[__NSCFString characterAtIndex:].cold.1
3 CoreFoundation 0x1a85f8 -[__NSPlaceholderDictionary initWithCapacity:].cold.1
4 CoreFoundation 0x16b90 -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]
5 CoreFoundation 0x94b0 +[NSDictionary dictionaryWithObjects:forKeys:count:]
6 MeetingRoom 0x251e6c +[VGRTCDefaultVideoEncoderFactory supportedCodecs]
7 MeetingRoom 0x25224c -[VGRTCDefaultVideoEncoderFactory supportedCodecs]
8 MeetingRoom 0x739680 webrtc::ObjCVideoEncoderFactory::GetSupportedFormats() const
9 MeetingRoom 0x3959f4 cricket::WebRtcVideoEngine::send_codecs() const
10 MeetingRoom 0x1a920c webrtc::PeerConnectionFactory::GetVideoEncoderSupportedCodecs()
11 MeetingRoom 0x1a9170 webrtc::PeerConnectionFactory::PeerConnectionFactory(rtc::scoped_refptr<webrtc::ConnectionContext>, webrtc::PeerConnectionFactoryDependencies*)
12 MeetingRoom 0x1ad2f4 rtc::RefCountedObject<webrtc::PeerConnectionFactory>::RefCountedObject<rtc::scoped_refptr<webrtc::ConnectionContext>&, webrtc::PeerConnectionFactoryDependencies*>(rtc::scoped_refptr<webrtc::ConnectionContext>&, webrtc::PeerConnectionFactoryDependencies*&&)
13 MeetingRoom 0x1a9078 rtc::scoped_refptr<webrtc::PeerConnectionFactory> rtc::make_ref_counted<webrtc::PeerConnectionFactory, rtc::scoped_refptr<webrtc::ConnectionContext>&, webrtc::PeerConnectionFactoryDependencies*, (webrtc::PeerConnectionFactory*)0>(rtc::scoped_refptr<webrtc::ConnectionContext>&, webrtc::PeerConnectionFactoryDependencies*&&)
14 MeetingRoom 0x1a9000 webrtc::PeerConnectionFactory::Create(webrtc::PeerConnectionFactoryDependencies)
15 MeetingRoom 0x1a8ee8 webrtc::CreateModularPeerConnectionFactory(webrtc::PeerConnectionFactoryDependencies)
16 MeetingRoom 0x1aa530 rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> rtc::FunctionView<rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> ()>::CallVoidPtr<webrtc::CreateModularPeerConnectionFactory(webrtc::PeerConnectionFactoryDependencies)::$_2>(rtc::FunctionView<rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> ()>::VoidUnion)
17 MeetingRoom 0x1aa4bc rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> rtc::Thread::Invoke<rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface>, void>(rtc::Location const&, rtc::FunctionView<rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> ()>)::'lambda'()::operator()() const
18 MeetingRoom 0x2880fc webrtc::webrtc_new_closure_impl::ClosureTask<rtc::Thread::Send(rtc::Location const&, rtc::MessageHandler*, unsigned int, rtc::MessageData*)::$_2>::Run()
19 MeetingRoom 0x2871d0 rtc::Thread::QueuedTaskHandler::OnMessage(rtc::Message*)
20 MeetingRoom 0x28693c rtc::Thread::Dispatch(rtc::Message*)
21 MeetingRoom 0x2859c8 rtc::Thread::ProcessMessages(int)
22 MeetingRoom 0x286dcc rtc::Thread::PreRun(void*)
23 libsystem_pthread.dylib 0x1bfc (Missing UUID 496dc4232dd43031bccf93e889035a34)
24 libsystem_pthread.dylib 0xa758 (Missing UUID 496dc4232dd43031bccf93e889035a34)
Expected behavior
Shouldn't crash
Device (please complete the following information):
1_MX40NDc3MzQxMn5-MTY3NzY5MjI1MjM0NH5MdVBKNUUyTVFMaVA0YVdoNkcxWENhSnl-UH5-
Hi,
I am facing a random issue here. If I lock my iPhone or put the application in the background during a video call, sometimes the publisher camera freezes or blacks out. I went through the documentation but could not find a solution.
Can you please help me?
Any near future PRs for Picture in Picture support?
In the latest version (2.16) of OpenTok we get a random image-stretch problem for both publisher and subscriber, but OpenTok 2.15.3 works fine with no image stretch. Any suggestions?
Hello, I would like to know when you will be giving support for iOS Swift Package Manager. It would make it much simpler to update than just throwing the updated version of "OpenTok.framework" into the project.
Obj-C FrameMetadata doesn't actually send metadata to remote subscribers. It all appears to be a local loop-back in the example project. When I connect the Windows SDK to the same session, the frame.metadata is null. Or maybe this is just the same problem as below:
Swift FrameMetadata doesn't work either, but sometimes when I set a breakpoint in the Windows client, inspect the frame.metadata property (it's null), and then step over, suddenly there's metadata. But the timestamps start repeating: the same two timestamps over and over. Then, with the FrameMetadata app still running on the phone, I disconnect the Windows client from the OT session and reconnect, and there's no metadata again.
I created a new session and token; video works between the Windows and iOS clients but there is no metadata. Then the metadata suddenly shows up, but it's more than 30 seconds old and it keeps repeating. Everything has been running for 5 minutes now and I keep getting this in the Windows client over and over: 2020-05-11T03:26:28-04:00
Now I'm getting this over and over:
Subscriber video frame metadata: 2020-05-11T03:30:13-04:00
Subscriber video frame metadata: 2020-05-11T03:26:28-04:00
Subscriber video frame metadata: 2020-05-11T03:28:58-04:00
Subscriber video frame metadata: 2020-05-11T03:30:13-04:00
Is this a server issue? The only changes I've made to all the example code is the Key, SessionId, and Token.
OTSubscriberKitNetworkStatsDelegate and OTSubscriberKitRtcStatsReportDelegate methods are not called unless the subscriber's delegate (OTSubscriberKitDelegate) conforms to the protocol and implements the method. This is fine if the subscriber's networkStatsDelegate and rtcStatsReportDelegate are the same object as the delegate, but it can fail if they are different objects.
A crash can also occur if the delegate does conform to the stats delegate protocols and implements the methods, but the actual networkStatsDelegate and rtcStatsReportDelegate do not implement the methods.
To reproduce: set the subscriber's delegate to an OTSubscriberKitDelegate that doesn't implement the OTSubscriberKitNetworkStatsDelegate and OTSubscriberKitRtcStatsReportDelegate methods, and set the stats delegates to a separate object that does implement those methods. Implementing the methods in the OTSubscriberKitDelegate will cause the methods in the stats delegate to get executed; implementing them in the OTSubscriberKitDelegate but not in the stats delegate will cause the app to crash.
class SubscriberDelegate: NSObject,
OTSubscriberKitDelegate,
OTSubscriberKitNetworkStatsDelegate,
OTSubscriberKitRtcStatsReportDelegate
{
...
// Uncomment these lines to get the StatsDelegate to work.
// func subscriber(_ subscriber: OTSubscriberKit, videoNetworkStatsUpdated stats: OTSubscriberKitVideoNetworkStats) {}
// func subscriber(_ subscriber: OTSubscriberKit, audioNetworkStatsUpdated stats: OTSubscriberKitAudioNetworkStats) {}
// func subscriber(_ subscriber: OTSubscriberKit, rtcStatsReport jsonArrayOfReports: String) {}
...
}
class StatsDelegate: NSObject,
OTSubscriberKitNetworkStatsDelegate,
OTSubscriberKitRtcStatsReportDelegate
{
// Comment out these methods and uncomment the methods above to observe a crash.
func subscriber(_ subscriber: OTSubscriberKit, videoNetworkStatsUpdated stats: OTSubscriberKitVideoNetworkStats) {
print("subscriber video stats")
}
func subscriber(_ subscriber: OTSubscriberKit, audioNetworkStatsUpdated stats: OTSubscriberKitAudioNetworkStats) {
print("subscriber audio stats")
}
func subscriber(_ subscriber: OTSubscriberKit, rtcStatsReport jsonArrayOfReports: String) {
print("subscriber rtc stats")
}
}
class SessionDelegate: NSObject,
OTSessionDelegate
{
...
let subscriberDelegate = SubscriberDelegate()
let statsDelegate = StatsDelegate()
func session(_ session: OTSession, streamCreated stream: OTStream) {
guard let subscriber = OTSubscriber(stream: stream) else { return }
subscriber.delegate = self.subscriberDelegate
subscriber.networkStatsDelegate = self.statsDelegate
subscriber.rtcStatsReportDelegate = self.statsDelegate
DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(1), execute: {
subscriber.getRtcStatsReport()
})
}
...
}
I don't have access to the source code, but I am guessing this happens because, when deciding whether or not to call the stats delegate methods, a guard incorrectly checks whether the subscriber's delegate conforms to the protocol and implements the method. It should instead check the networkStatsDelegate/rtcStatsReportDelegate as appropriate.
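A minimal Swift sketch of the dispatch logic being suggested (hypothetical: the SDK internals are not public, and all names below are illustrative stand-ins for the real types):

```swift
import Foundation

// Hypothetical sketch of the suggested fix: gate the optional callback on
// the object that will actually receive it (networkStatsDelegate), not on
// `delegate`.
@objc protocol NetworkStatsDelegate {
    @objc optional func videoNetworkStatsUpdated()
}

class Subscriber: NSObject {
    weak var delegate: NSObjectProtocol?                  // OTSubscriberKitDelegate stand-in
    weak var networkStatsDelegate: NetworkStatsDelegate?  // may be a different object

    func reportStats() {
        // Optional-protocol dispatch performs the respondsToSelector check
        // against the correct object, so nothing crashes if unimplemented.
        networkStatsDelegate?.videoNetworkStatsUpdated?()
    }
}
```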
In OpenTok 2.12.0, released in September 2017, fixes to AirPlay support were introduced. However, when using the OTDefaultAudioDevice.h and OTDefaultAudioDevice.m files as an audio driver, AirPlay support is broken.
I am unable to choose any AirPlay devices for mirroring when using this driver and streaming from TokBox. However, if I make some changes (mainly changing the session category to AVAudioSessionCategoryPlayback, as we do not use microphone input from our end users), the mirroring does work, but only for about 20-30 seconds, and then the connection is lost.
Would it be possible to update the audio driver examples to match OpenTok 2.12.0, i.e. to have an example where AirPlay streaming does work?
Describe the bug
We are working on a lone-worker safety app in which we want to start a call by clicking a Bluetooth button. It works fine when the app is open, but when the app is in the background the call starts and no sound is heard on the web end.
To Reproduce
Start a call via the Bluetooth button while the app is in the background; the call starts, but no sound is heard on the web end.
Expected behavior
Sound should be heard on the web end. Currently there is none, even though I enabled background services for audio.
We can detect that the subscriber video was disabled via a protocol callback that includes the reason it was disabled.
One of the reasons is "quality changed".
Per the documentation, this is triggered whenever network/CPU quality degrades on either the publisher or subscriber device. How do we know the exact cause, i.e. which device has the problem, so we can tell the user whether their own or their partner's device has low bandwidth? That would help users understand why the video feed was suspended.
Is there any API that gives more detail about this quality-change trigger?
Version:
OpenTok iOS version 2.23.1
Steps to reproduce
What is the current bug behaviour?
Even though the session is connected successfully and the publisher stream is created and publishing, a stream destroyed event is received and it terminates the video call.
The bug was recreated on the Basic-Video-Chat sample.
What is the expected correct behaviour?
If the session is connected and already publishing, a stream destroyed event should not be received for no reason.
Possible related issue: opentok/opentok-react-native#201
How can I add a watermark or sticker to a video call?
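Building on the custom-capturer approach described earlier (rendering CIImages into the CVPixelBufferRef before handing it to the capture consumer), a watermark can be composited the same way. A minimal Core Image sketch, assuming the watermark CIImage and the CIContext are created once and reused:

```swift
import CoreImage

// Composite a watermark over a captured frame, in place, before the
// pixel buffer is passed to the OpenTok video capture consumer.
func apply(watermark: CIImage, to frame: CVPixelBuffer, using context: CIContext) {
    let base = CIImage(cvPixelBuffer: frame)
    // composited(over:) draws the watermark on top of the camera frame;
    // apply a transform to the watermark first to position and scale it.
    let overlaid = watermark.composited(over: base)
    // Render the result back into the same pixel buffer.
    context.render(overlaid, to: frame)
}
```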
With iOS 14, Apple introduces local network privacy (see this video).
Beginning in iOS 14, the operating system will prompt the user for permission when an application attempts to subscribe to clients on the same local network in a relayed session.
If your application uses a relayed session, it is encouraged to add a descriptive custom usage string to inform the user why the application needs access to their local area network. The Vonage Video API uses the local network to discover and connect to video participants on your same network where possible. The Apple video (above) shows how you can edit the string displayed in this message.
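Concretely, the usage string is set in Info.plist via the NSLocalNetworkUsageDescription key (the wording below is only an example):

```xml
<key>NSLocalNetworkUsageDescription</key>
<string>This app uses the local network to connect you directly to other video participants on the same network.</string>
```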
If the user does not accept the permission, the attempt to subscribe will fail. After the permission is rejected, any future attempts to subscribe to clients on the same network will also fail unless the user changes the permission in Settings. Unfortunately, iOS does not provide an API for an application to determine if the user has accepted or rejected this permission.
It is important to note that this does not apply to video sessions that use the OpenTok Media Router, as media is sent over the internet rather than the local network.
For applications that cannot use routed sessions and do not want the user ever to be prompted for local network access, it is recommended to force relayed (TURN) traffic using the following API:
OTSessionICEConfig *myICEServerConfiguration = [[OTSessionICEConfig alloc] init];
myICEServerConfiguration.transportPolicy = OTSessionICETransportRelay;
OTSessionSettings *settings = [[OTSessionSettings alloc] init];
settings.iceConfig = myICEServerConfiguration;
_session = [[OTSession alloc] initWithApiKey:kApiKey
                                   sessionId:kSessionId
                                    delegate:self
                                    settings:settings];
I am trying to convert the SampleHandler to Swift 3 and I'm running into the following challenge: how do you define the videoCaptureConsumer from SampleHandler.h?
opentok-ios-sdk-samples/Broadcast-Ext/OpenTok Live/SampleHandler.m
Lines 92 to 95 in 754aedc
Swift suggests the following template, but what should be returned?
var videoCaptureConsumer: OTVideoCaptureConsumer? {
    get {
        <#code#>
    }
    set(videoCaptureConsumer) {
        <#code#>
    }
}
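Since videoCaptureConsumer is a plain read/write property on the OTVideoCapture protocol, nothing needs to be computed or "returned"; a stored property satisfies the requirement. A sketch (the class name is hypothetical, and the protocol's other requirements are omitted for brevity):

```swift
class ScreenCapturer: NSObject, OTVideoCapture {
    // A stored property satisfies the protocol's get/set requirement.
    // The OpenTok SDK assigns this when the capturer is attached to a
    // publisher; the capture loop then calls videoCaptureConsumer?.consumeFrame(_:).
    var videoCaptureConsumer: OTVideoCaptureConsumer?

    // initCapture, releaseCapture, startCapture, stopCapture,
    // isCaptureStarted, and captureSettings(_:) are omitted here.
}
```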
I downloaded the example and it runs fine. But when I receive a VoIP push via didReceiveIncomingPushWithPayload and open the video view, doSubscribe (the OTSubscriber delegate callbacks) is not invoked asynchronously.
Thanks, all.
I have implemented the same code as the sample project, but there is still no sound when making a video call.
Energy impact is very high when there are more than 3 or 4 streamers, which may cause the app to be suspended in some cases.
Hi,
I recently had to add a Cocoa Touch framework to my iOS project (Swift 3.0) that uses OpenTok as an internal dependency. Everything works great; the only problem I'm having is my build on CircleCI. We use fastlane and CircleCI to distribute our app to various environments, and since OpenTok.framework (a static library) is 130 MB and GitHub has a 100 MB file limit, I'm having trouble getting my builds to work on CircleCI. Could you please suggest a workaround? I'd appreciate any help.
With thanks in advance,
Tejas
Even after setting publisher.publishVideo = false, the camera indicator stays on all the time.