kickflip-ios-sdk's People

Contributors

chrisballinger

kickflip-ios-sdk's Issues

Audio-only 941- or 940-sample buffers on iPhone 6s cause the video to get out of sync

Hi,

I'm using the iPhone 6s, and the audio from the internal mic behaves differently: it sends out smaller buffers, which causes the video to get out of sync.

What should I do to fix it? I believe the relevant code is here:

    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    CFRetain(blockBuffer);
    OSStatus status = CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &_pcmBufferSize, &_pcmBuffer);
    NSError *error = nil;
    if (status != kCMBlockBufferNoErr) {
        error = [NSError errorWithDomain:NSOSStatusErrorDomain code:status userInfo:nil];
    }
    NSLog(@"PCM Buffer Size: %zu", _pcmBufferSize);

    memset(_aacBuffer, 0, _aacBufferSize);
    AudioBufferList outAudioBufferList = {0};
    outAudioBufferList.mNumberBuffers = 1;
    outAudioBufferList.mBuffers[0].mNumberChannels = 1;
    outAudioBufferList.mBuffers[0].mDataByteSize = _aacBufferSize;
    outAudioBufferList.mBuffers[0].mData = _aacBuffer;
    AudioStreamPacketDescription *outPacketDescription = NULL;
    UInt32 ioOutputDataPacketSize = 1;
    status = AudioConverterFillComplexBuffer(_audioConverter, inInputDataProc, (__bridge void *)(self), &ioOutputDataPacketSize, &outAudioBufferList, outPacketDescription);
    NSLog(@"ioOutputDataPacketSize: %d", (unsigned int)ioOutputDataPacketSize);
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
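Since one AAC frame consumes a fixed 1024 PCM samples while the 6s mic delivers 940/941-sample buffers, one common approach is to accumulate incoming PCM and feed the converter only whole frames, carrying the remainder over to the next callback. A language-neutral sketch in Python (the class and the 1024-sample frame size are illustrative assumptions, not SDK code):

```python
class PCMAccumulator:
    """Collect variable-size PCM chunks and emit fixed-size frames.

    The iPhone 6s mic delivers 940/941-sample buffers, while one AAC
    frame consumes 1024 samples; buffering the remainder keeps the
    audio timeline contiguous so video PTS stays in sync.
    """

    def __init__(self, frame_size=1024):
        self.frame_size = frame_size
        self.buffer = []  # pending samples that don't yet fill a frame

    def append(self, samples):
        """Add a chunk; return the complete frames now available."""
        self.buffer.extend(samples)
        frames = []
        while len(self.buffer) >= self.frame_size:
            frames.append(self.buffer[:self.frame_size])
            self.buffer = self.buffer[self.frame_size:]
        return frames
```

Each returned frame would then be handed to the AAC converter; leftover samples wait for the next capture callback instead of being padded or dropped.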

Problems getting this to build

Hello! I'm trying to get your iOS SDK example to build, and I'm having issues.

The first is that, after cloning the repo and running git submodule update --init in it, I come across two errors:

#import "KFSecrets.h" <- not found

#import "KFHLSUploader.h" <- not found

Could you please add templates for KFSecrets.h and KFHLSUploader.h as well? Really looking forward to trying this out!

UPDATE:

After commenting out the KFSecrets.h line and finding KFHLSUploader.h by downloading the repo rather than cloning it (weird issue), I realized that I needed to run a few more submodule updates to get the files I needed. Now I'm only down one file. I can't find the following:

In FFFile.h, #import "libavformat/avformat.h" <- not found
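The round of extra submodule updates described above can usually be collapsed into one pass with git's recursive flag, which also initializes nested submodules such as the FFmpeg sources inside a wrapper submodule. This is generic git workflow, not SDK-specific, and the repository URL is an assumption:

```shell
# Clone and initialize all submodules, including nested ones, in one step
git clone https://github.com/Kickflip/kickflip-ios-sdk.git
cd kickflip-ios-sdk
git submodule update --init --recursive
```

On an already-cloned checkout, running only the last command has the same effect.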

Error compiling project when using the Kickflip pod

I followed the instructions in the kickflip-ios-sdk README:

  • Installed dependencies via CocoaPods;
  • Imported "Kickflip.h";
  • Added the demo code.

When trying to compile and build the app, I get the following error:

Undefined symbols for architecture i386:
  "std::terminate()", referenced from:
      ___clang_call_terminate in libPods-KickflipApp.a(KFH264Encoder.o)
  "vtable for __cxxabiv1::__class_type_info", referenced from:
      typeinfo for NALUnit in libPods-KickflipApp.a(AVEncoder.o)
      typeinfo for NALUnit in libPods-KickflipApp.a(NALUnit.o)
  NOTE: a missing vtable usually means the first non-inline virtual member function has no definition.
  "operator delete(void*)", referenced from:
      NALUnit::~NALUnit() in libPods-KickflipApp.a(AVEncoder.o)
      NALUnit::~NALUnit() in libPods-KickflipApp.a(NALUnit.o)
  "___cxa_begin_catch", referenced from:
      ___clang_call_terminate in libPods-KickflipApp.a(KFH264Encoder.o)
  "___gxx_personality_v0", referenced from:
      Dwarf Exception Unwind Info (__eh_frame) in libPods-KickflipApp.a(AVEncoder.o)
      Dwarf Exception Unwind Info (__eh_frame) in libPods-KickflipApp.a(KFH264Encoder.o)
      Dwarf Exception Unwind Info (__eh_frame) in libPods-KickflipApp.a(NALUnit.o)
      Dwarf Exception Unwind Info (__eh_frame) in libPods-KickflipApp.a(AVEncoder.o)
      Dwarf Exception Unwind Info (__eh_frame) in libPods-KickflipApp.a(KFH264Encoder.o)
      Dwarf Exception Unwind Info (__eh_frame) in libPods-KickflipApp.a(NALUnit.o)
      Dwarf Exception Unwind Info (__eh_frame) in libPods-KickflipApp.a(AVEncoder.o)
      Dwarf Exception Unwind Info (__eh_frame) in libPods-KickflipApp.a(KFH264Encoder.o)
      Dwarf Exception Unwind Info (__eh_frame) in libPods-KickflipApp.a(NALUnit.o)
ld: symbol(s) not found for architecture i386
clang: error: linker command failed with exit code 1 (use -v to see invocation)

Any ideas?

FFmpeg legal issue

First of all, Kickflip is terrific work; hats off to the developers. But one question is bothering me: Kickflip uses an FFmpeg wrapper, which means the FFmpeg library is included. FFmpeg is licensed under the LGPL, which I understand can be a problem for App Store distribution. So if my iOS app uses Kickflip, could it face future problems due to FFmpeg's LGPL license?

An answer would be appreciated.

AFNetworking 2.0 support

Hi,

In kickflip.io, missing files and classes are reported because they were removed in AFNetworking 2.0. Can you update the SDK to work with the latest AFNetworking version, or should I?

e.g. "AFJSONRequestOperation.h" is not found,
"AFHTTPClient" not found, etc.

libavcodec.a static library clashes with GStreamer's libavcodec.a

Issue:
After installing the Kickflip SDK into a GStreamer-based project, GStreamer links against the Kickflip version of libavcodec through Pods, which causes conflicts and random crashes.

Suggested change:
Don't use a precompiled static library, or at least build the precompiled FFmpeg lib included in the Kickflip dependency with a prefix like KFlibavcodec.
This post illustrates the issue pretty well: http://landonf.bikemonkey.org/code/ios/Radar_15800975_iOS_Frameworks.20140112.html

#import "librtmp/log.h" file not found

I am trying to use the Kickflip iOS SDK in my project, but after setting up the Podfile it shows me the error:

<Bolts/BFTask.h> file not found

After searching the internet, I found a suggestion to replace

pod 'Kickflip'

with

pod 'Kickflip', :git => 'https://github.com/zummenix/kickflip-ios-sdk'

but after doing that, it gives me a new error:

#import "librtmp/log.h" file not found

So please tell me, what should I do to solve this problem?

Stream URL taking too long to respond

After I present the broadcast controller with:

    Kickflip.presentBroadcasterFromViewController(self,
            ready: { stream in
                if stream.streamURL != nil {
                    print(stream.streamURL)
                }
            }, completion: { success, error in
                if !success {
                    print("Error setting up stream: \(error)")
                } else {
                    print("Done broadcasting")
                }
        })

First I'm getting this:

codec not found: h264
[mpegts @ 0x17c38600] muxrate VBR, pcr every 3 pkts, sdt every 200, pat/pmt every 40 pkts

and then it takes up to 14 seconds to get the streaming URL. Am I missing something, or should I do something differently? 14 seconds is too long to wait for the streaming URL. (I'm using an iPhone 4S and an iPad to test it.)

Thank you in advance.

kickflip.io is down

I don't know if there is another way to get in touch with the administrator of the servers, but the service is down.

Live Streaming

Good morning. I'm having a problem using your SDK: there is about 30+ seconds of delay with a buffering label. Can I make this process faster? And one more question: when I play the video from the URL provided by your SDK while it is still live, will I see the newest recorded frames, or will playback start from the beginning of the video?

Signature does not match

Message = "The request signature we calculated does not match the signature you provided. Check your key and signing method.";

I am getting this issue when calling [self.s3 putObject:uploadRequest].

I don't understand why the signature doesn't match.

Please help me with this.

Please reply to support emails :(

Guys, I have been sending many emails to support. We are working on an app and kickflip.io is the best fit for us, but unfortunately we can't start working on it until you respond to our emails. Getting no response also hints at an uncertain future for support once we go live; we can't spend millions of dollars and stay in the dark.

Experiencing Problems

We are experimenting with Kickflip, but every time we load the stream it seems to start from the beginning, so it's not really "live". What are we doing wrong, or did we miss the concept of Kickflip?
We are also having problems starting a stream from time to time. Maybe lowering the packet size for the uploads would be a solution, but even though there are docs, we could not figure out what to do.
And one final thing: we read all the issues here, and many people have asked about this, but we could not find a solution for:
"codec not found: h264
[mpegts @ 0x13006be00] muxrate VBR, pcr every 3 pkts, sdt every 200, pat/pmt every 40 pkts"

We are working on an iOS app, testing on an iPhone 6 Plus. We did all the pod updates and cleaned Xcode's DerivedData, but it's still the same.

Video recorded with Kickflip example doesn't work

Hi. I am running the example project on iOS and I have run into two problems:

  1. The block method to get the stream URL is never called.

Code:

[Kickflip presentBroadcasterFromViewController:self ready:^(KFStream *stream) {
    if (stream.streamURL) {
        DDLogInfo(@"Stream is ready at URL: %@", stream.streamURL);
    }
} completion:^(BOOL success, NSError *error){
    if (!success) {
        DDLogError(@"Error setting up stream: %@", error);
    } else {
        DDLogInfo(@"Done broadcasting");
    }
}];

  2. When I try to play the videos recorded with the app on the web page, a "Buffering" message appears, but the video never plays.

Thanks.

AVAssetWriter appendSampleBuffer failure

Hi, I use Kickflip to record video, but sometimes when I start a recording the console says: "writer error, the operation couldn't be completed". I think it is an AVAssetWriter problem: in VideoEncoder.m, the _writerInput appendSampleBuffer: call sometimes fails, and the problem reproduces easily, in about ten percent of all recordings.
The worst thing is that when this happens, I cannot fix it by starting another recording; the following recordings always fail too. I searched the net and couldn't find an answer. Please help; thanks in advance.
The _writer.error message is "Error Domain=AVFoundationErrorDomain Code=-11800, this operation couldn't be completed. OSStatus error -12983, NSLocalizedFailureReason=Unknown Error".

Cannot play video after creating it with the iOS SDK

I use [Kickflip presentBroadcasterFromViewController:ready:completion:] to create a stream. After recording finishes, the completion block is called with success = YES and error = nil.

But I cannot play that video; other videos uploaded with the Android SDK play fine.

The CPU usage is huge

Hi:

I am still testing this SDK, and I found that CPU usage is almost 80% on my iPhone 5.

Any plans to profile that?

Can't install using CocoaPods

Getting the following issue: The 'Pods' target has transitive dependencies that include static binaries.
This happens when use_frameworks! is defined in the Podfile.

Duplicate Symbol Issue

Hi Kickflip team, I like your product and we plan to buy it in the future. I am testing Kickflip in my project and I got this error: paste.ubuntu.com/12637254/
Please check the error log. I am using the Facebook Framework in my project, and it conflicts with BFUrl.h. Please check this error and reply to us ASAP.

Memory leak in KFH264Encoder

When setting _recorder to nil, the KFH264Encoder member is not released, since there is still a reference to it.
When instantiating the KFH264Encoder object (in KFH264Encoder's initWithBitrate:), you pass a completion handler to _encoder.

My (simple) solution was to add a "shutdown" method to KFH264Encoder that removes the references to the completion handlers:

- (void) shutdown
{
    [_encoder encodeWithBlock:nil onParams:nil];
}

Call it before setting _h264Encoder = nil;

This solution frees the encoder object.
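The retain cycle described above (recorder holds the encoder, the encoder holds a completion block, the block captures the recorder) can be sketched in Python, where clearing the stored handler in a shutdown method likewise lets the object be freed immediately. The class names here are illustrative, not the SDK's:

```python
class Encoder:
    def __init__(self):
        self.on_frame = None  # completion handler installed by the owner


class Recorder:
    def __init__(self):
        self.encoder = Encoder()
        # The handler captures self, creating a reference cycle:
        # recorder -> encoder -> handler -> recorder
        self.encoder.on_frame = lambda frame: self.handle(frame)

    def handle(self, frame):
        pass

    def shutdown(self):
        # Clearing the handler breaks the cycle, mirroring the fix of
        # passing nil blocks to the encoder before releasing it.
        self.encoder.on_frame = None
```

Without shutdown() the cycle keeps the recorder alive after its last external reference is dropped; after shutdown() it is released as soon as the owner lets go.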

Error starting stream, fetch an active user first - kKFAPIClientErrorDomain

Hi,

We're a paying customer and have launched an app in the App Store with Kickflip in it. Although we had issues with the web service last month, it seems the problem has returned.

When we try to start a livestream, we get this error: "Error starting stream: Fetch an active user first."
And after that, "kKFAPIClientErrorDomain error 105". We haven't changed our code, but it's not working anymore.
This afternoon I also tried the new v1.3 Kickflip SDK, but we're experiencing the same issue. Can you please look into this?

BFTask error

After installing the new version via Pods, we are getting the attached error. Please help us.
(screenshot attached)

Sometimes streaming doesn't work

We are considering using your service in our project, but sometimes streaming doesn't work, even on your site in the dashboard section. Is this due to the Free Trial Plan?

AVAssetWriterInput Code=-11800 "The operation could not be completed"

I'm trying to substitute the CMSampleBuffer that comes out of the native camera in the method

- (void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

with my own generated CMSampleBufferRef that I build from a GPUImage output. But I get an error when the Kickflip AVAssetWriterInput tries to append the buffer:

writer error Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSUnderlyingError=0x133a94f60 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}, NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed}

This is my code:

    broadcastOutputTarget = [[GPUImageRawDataOutput alloc] initWithImageSize:CGSizeMake(1920, 1080) resultsInBGRAFormat:YES];
    [filterChain.output addTarget:broadcastOutputTarget];
    self.streamImage = [[UIImage alloc] init];
    __block ToyViewController *safeSelf = self;

    __weak GPUImageRawDataOutput *weakRawOutput = broadcastOutputTarget;
    [broadcastOutputTarget setNewFrameAvailableBlock:^{
        GLubyte *outputBytes = [weakRawOutput rawBytesForImage];
        NSInteger bytesPerRow = [weakRawOutput bytesPerRowInOutput];

        CVPixelBufferRef pixelBuffer = NULL;
        OSStatus result = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                       1920,
                                                       1080,
                                                       kCVPixelFormatType_32BGRA,
                                                       outputBytes,
                                                       bytesPerRow, NULL, NULL, NULL,
                                                       &pixelBuffer);

        CMVideoFormatDescriptionRef videoInfo = NULL;
        result = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);

        CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
        timingInfo.duration = frameDuration;
        timingInfo.presentationTimeStamp = nextPTS;

        //NSLog(@"TIME: %f", CMTimeGetSeconds(timingInfo.presentationTimeStamp));

        CMSampleBufferRef sampleBuffer = NULL;
        result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                                    pixelBuffer,
                                                    true, NULL, NULL,
                                                    videoInfo,
                                                    &timingInfo,
                                                    &sampleBuffer);
        NSLog(@"BUFFER : %@", sampleBuffer);
        // This is where I send the buffer to the encoder
        [safeSelf.broadCastRecorder captureGenerateVideoOutput:sampleBuffer];
        // Increment presentation time
        nextPTS = CMTimeAdd(frameDuration, nextPTS);
        // Release the objects created above
        CFRelease(sampleBuffer);
        CFRelease(videoInfo);
        CVPixelBufferRelease(pixelBuffer);
    }];

And here is my custom captureGenerateVideoOutput: method, integrated into my Recorder class, which is pretty much a clone of KFRecorder.h/m:

-(void) captureGenerateVideoOutput:(CMSampleBufferRef)sampleBuffer{
    if (!_hasScreenshot) {
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        NSString *path = [self.hlsWriter.directoryPath stringByAppendingPathComponent:@"thumb.jpg"];
        NSData *imageData = UIImageJPEGRepresentation(image, 0.7);
        [imageData writeToFile:path atomically:NO];
        _hasScreenshot = YES;
    }
    if(_h264Encoder)
        [_h264Encoder encodeSampleBuffer:sampleBuffer];

}

Attempt to use RTMP instead of HLS

I'm trying to replace HLS with RTMP in the SDK. Just to test, I change the output path in the setupOutputFile method in KFHLSWriter to my RTMP endpoint, change the output format to flv, and add the bitstream filter aac_adtstoasc:

- (void) setupOutputFile {
//    NSString *outputPath = [_directoryPath stringByAppendingPathComponent:@"index.m3u8"];
    NSString *outputPath = @"rtmp://<rtmp-endpoint>/live/test";

    _outputFile = [[FFOutputFile alloc] initWithPath:outputPath options:@{kFFmpegOutputFormatKey: @"flv"}];

    FFBitstreamFilter *bitstreamFilter = [[FFBitstreamFilter alloc] initWithFilterName:@"h264_mp4toannexb"];
    FFBitstreamFilter *bitstreamFilterAAC = [[FFBitstreamFilter alloc] initWithFilterName:@"aac_adtstoasc"];
    [_outputFile addBitstreamFilter:bitstreamFilter];
    [_outputFile addBitstreamFilterAudio:bitstreamFilterAAC];
}

(I added the addBitstreamFilterAudio: method to FFmpegWrapper so that the aac_adtstoasc filter is added only when codec_id is AV_CODEC_ID_AAC.)

That's all I've done for now, and the app runs without problems. But if I check the output file with ffmpeg:

ffmpeg -i test.flv

then here's the output:

[flv @ 0x7fe764001000] Could not find codec parameters for stream 1 (Audio: none, 0 channels): unknown codec
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Input #0, flv, from 'test.flv':
  Duration: 00:00:14.38, start: 0.402000, bitrate: 1640 kb/s
    Stream #0:0: Video: h264 (Baseline), yuv420p(tv, smpte170m/bt709/bt709), 854x480, 29.97 tbr, 1k tbn, 2k tbc
    Stream #0:1: Audio: none, 0 channels

Something is going wrong when encoding the audio stream. Any ideas how to fix this?

Recorded video does not start and end at the right points

Sometimes the recorded video starts more than 60 seconds in (both on the web and on the device), and sometimes it also ends early (missing the last 5 to 20 seconds) when recording via the iOS SDK. Other times it works well and only misses the first second (not a big issue). Tested with some recordings of one or two minutes.

Is this an SDK issue, or something in the encoder settings? And can anything be done to fix it?

Handle 'applicationDidEnterBackground' gracefully

Users get broken streams due to this issue. Steps to reproduce:

  1. Start streaming, wait some time;
  2. Tap the Home button of the device (the app becomes inactive), wait some time;
  3. Tap the app's icon (the app becomes active);
  4. The user sees that streaming continues, but there is an error in the console: writer error Operation Interrupted.

Kickflip SDK as an iOS Framework?

Is there any way you could provide the SDK as an iOS framework? I would like to add the SDK to my own project without having to use CocoaPods and the example application.

Codec not found: h264

At the start of broadcasting in the Example app, this string appears in the log:
Codec not found: h264
Also, the resulting chunks (indexX.ts) are not always playable in players, due to a missing 'moov' atom.

Reduce latency of H.264 encoder

I use codecs to encode a stream video , but i found some frame output are very slow. It should be more then 80ms even 200ms . It's not for all frame .
I think may be the result of file locking .
Any good Suggestions

Crash in AVEncoder.mm

Kickflip (1.3)
Device: iPod Touch 5 (iOS 8.3)
(screenshot attached)

Stack trace:

(lldb) thread backtrace all
  thread #1: tid = 0xd64c4, 0x3852c49c libsystem_kernel.dylib`mach_msg_trap + 20, queue = 'com.apple.main-thread'
    frame #0: 0x3852c49c libsystem_kernel.dylib`mach_msg_trap + 20
    frame #1: 0x3852c294 libsystem_kernel.dylib`mach_msg + 40
    frame #2: 0x29f3a7f2 CoreFoundation`__CFRunLoopServiceMachPort + 146
    frame #3: 0x29f38db8 CoreFoundation`__CFRunLoopRun + 1016
    frame #4: 0x29e849a0 CoreFoundation`CFRunLoopRunSpecific + 476
    frame #5: 0x29e847b2 CoreFoundation`CFRunLoopRunInMode + 106
    frame #6: 0x316361a8 GraphicsServices`GSEventRunModal + 136
    frame #7: 0x2d60f694 UIKit`UIApplicationMain + 1440
    frame #8: 0x0015fc64 proto`main + 196 at AppDelegate.swift:12

  thread #2: tid = 0xd657f, 0x3852c24c libsystem_kernel.dylib`kevent64 + 24, queue = 'com.apple.libdispatch-manager'
    frame #0: 0x3852c24c libsystem_kernel.dylib`kevent64 + 24
    frame #1: 0x00f08bf8 libdispatch.dylib`_dispatch_mgr_invoke + 280
    frame #2: 0x00efcc92 libdispatch.dylib`_dispatch_mgr_thread + 38

  thread #3: tid = 0xd6580, 0x385409c0 libsystem_kernel.dylib`__workq_kernreturn + 8
    frame #0: 0x385409c0 libsystem_kernel.dylib`__workq_kernreturn + 8
    frame #1: 0x385bde24 libsystem_pthread.dylib`_pthread_wqthread + 792
    frame #2: 0x385bdafc libsystem_pthread.dylib`start_wqthread + 8

  thread #7: tid = 0xd65ab, 0x3852c49c libsystem_kernel.dylib`mach_msg_trap + 20, name = 'AFNetworking'
    frame #0: 0x3852c49c libsystem_kernel.dylib`mach_msg_trap + 20
    frame #1: 0x3852c294 libsystem_kernel.dylib`mach_msg + 40
    frame #2: 0x29f3a7f2 CoreFoundation`__CFRunLoopServiceMachPort + 146
    frame #3: 0x29f38db8 CoreFoundation`__CFRunLoopRun + 1016
    frame #4: 0x29e849a0 CoreFoundation`CFRunLoopRunSpecific + 476
    frame #5: 0x29e847b2 CoreFoundation`CFRunLoopRunInMode + 106
    frame #6: 0x2abeddc0 Foundation`-[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 264
    frame #7: 0x2ac3be7c Foundation`-[NSRunLoop(NSRunLoop) run] + 80
    frame #8: 0x001705d2 proto`+[AFURLConnectionOperation networkRequestThreadEntryPoint:](self=0x005b8af4, _cmd=0x0050ca11, object=0x00000000) + 354 at AFURLConnectionOperation.m:168
    frame #9: 0x2acb318a Foundation`__NSThread__main__ + 1118
    frame #10: 0x385bfdea libsystem_pthread.dylib`_pthread_body + 138
    frame #11: 0x385bfd5e libsystem_pthread.dylib`_pthread_start + 118
    frame #12: 0x385bdb08 libsystem_pthread.dylib`thread_start + 8

  thread #8: tid = 0xd65ae, 0x3852c49c libsystem_kernel.dylib`mach_msg_trap + 20, name = 'com.apple.NSURLConnectionLoader'
    frame #0: 0x3852c49c libsystem_kernel.dylib`mach_msg_trap + 20
    frame #1: 0x3852c294 libsystem_kernel.dylib`mach_msg + 40
    frame #2: 0x29f3a7f2 CoreFoundation`__CFRunLoopServiceMachPort + 146
    frame #3: 0x29f38db8 CoreFoundation`__CFRunLoopRun + 1016
    frame #4: 0x29e849a0 CoreFoundation`CFRunLoopRunSpecific + 476
    frame #5: 0x29e847b2 CoreFoundation`CFRunLoopRunInMode + 106
    frame #6: 0x29a22646 CFNetwork`+[NSURLConnection(Loader) _resourceLoadLoop:] + 486
    frame #7: 0x2acb318a Foundation`__NSThread__main__ + 1118
    frame #8: 0x385bfdea libsystem_pthread.dylib`_pthread_body + 138
    frame #9: 0x385bfd5e libsystem_pthread.dylib`_pthread_start + 118
    frame #10: 0x385bdb08 libsystem_pthread.dylib`thread_start + 8

  thread #10: tid = 0xd65b5, 0x38540080 libsystem_kernel.dylib`__select + 20, name = 'com.apple.CFSocket.private'
    frame #0: 0x38540080 libsystem_kernel.dylib`__select + 20
    frame #1: 0x29f3efa4 CoreFoundation`__CFSocketManager + 532
    frame #2: 0x385bfdea libsystem_pthread.dylib`_pthread_body + 138
    frame #3: 0x385bfd5e libsystem_pthread.dylib`_pthread_start + 118
    frame #4: 0x385bdb08 libsystem_pthread.dylib`thread_start + 8

  thread #12: tid = 0xd65bf, 0x3852c4ec libsystem_kernel.dylib`semaphore_wait_trap + 8, name = 'com.apple.coremedia.player.async'
    frame #0: 0x3852c4ec libsystem_kernel.dylib`semaphore_wait_trap + 8
    frame #1: 0x00f0703e libdispatch.dylib`_dispatch_semaphore_wait_slow + 190
    frame #2: 0x2b9bb696 MediaToolbox`fpa_AsyncMovieControlThread + 1966
    frame #3: 0x2a6e4d2e CoreMedia`figThreadMain + 186
    frame #4: 0x385bfdea libsystem_pthread.dylib`_pthread_body + 138
    frame #5: 0x385bfd5e libsystem_pthread.dylib`_pthread_start + 118
    frame #6: 0x385bdb08 libsystem_pthread.dylib`thread_start + 8

  thread #13: tid = 0xd65c0, 0x3852c49c libsystem_kernel.dylib`mach_msg_trap + 20, name = 'AVAudioSession Notify Thread'
    frame #0: 0x3852c49c libsystem_kernel.dylib`mach_msg_trap + 20
    frame #1: 0x3852c294 libsystem_kernel.dylib`mach_msg + 40
    frame #2: 0x29f3a7f2 CoreFoundation`__CFRunLoopServiceMachPort + 146
    frame #3: 0x29f38db8 CoreFoundation`__CFRunLoopRun + 1016
    frame #4: 0x29e849a0 CoreFoundation`CFRunLoopRunSpecific + 476
    frame #5: 0x29e847b2 CoreFoundation`CFRunLoopRunInMode + 106
    frame #6: 0x28b8bf2c libAVFAudio.dylib`GenericRunLoopThread::Entry(void*) + 132
    frame #7: 0x28b7e426 libAVFAudio.dylib`CAPThread::Entry(CAPThread*) + 194
    frame #8: 0x385bfdea libsystem_pthread.dylib`_pthread_body + 138
    frame #9: 0x385bfd5e libsystem_pthread.dylib`_pthread_start + 118
    frame #10: 0x385bdb08 libsystem_pthread.dylib`thread_start + 8

  thread #21: tid = 0xd6787, 0x385409c0 libsystem_kernel.dylib`__workq_kernreturn + 8
    frame #0: 0x385409c0 libsystem_kernel.dylib`__workq_kernreturn + 8
    frame #1: 0x385bde24 libsystem_pthread.dylib`_pthread_wqthread + 792
    frame #2: 0x385bdafc libsystem_pthread.dylib`start_wqthread + 8

  thread #23: tid = 0xd67b5, 0x385409c0 libsystem_kernel.dylib`__workq_kernreturn + 8
    frame #0: 0x385409c0 libsystem_kernel.dylib`__workq_kernreturn + 8
    frame #1: 0x385bde24 libsystem_pthread.dylib`_pthread_wqthread + 792
    frame #2: 0x385bdafc libsystem_pthread.dylib`start_wqthread + 8

  thread #24: tid = 0xd67b6, 0x385409c0 libsystem_kernel.dylib`__workq_kernreturn + 8
    frame #0: 0x385409c0 libsystem_kernel.dylib`__workq_kernreturn + 8
    frame #1: 0x385bde24 libsystem_pthread.dylib`_pthread_wqthread + 792
    frame #2: 0x385bdafc libsystem_pthread.dylib`start_wqthread + 8

* thread #25: tid = 0xd67d5, 0x0029d716 proto`to_host(p=0x00000000) + 6 at AVEncoder.mm:16, queue = 'RemoteClientNotifyQueue', stop reason = EXC_BAD_ACCESS (code=1, address=0x0)
  * frame #0: 0x0029d716 proto`to_host(p=0x00000000) + 6 at AVEncoder.mm:16
    frame #1: 0x0029d3be proto`-[AVEncoder swapFiles:](self=0x1564fce0, _cmd=0x0051dc82, oldPath=0x16974d90) + 330 at AVEncoder.mm:299
    frame #2: 0x0029d126 proto`__25-[AVEncoder encodeFrame:]_block_invoke_2 + 158 at AVEncoder.mm:281
    frame #3: 0x28a4d938 AVFoundation`-[AVAssetWriterFigAssetWriterNotificationHandler _handleCompletedWritingNotification] + 28
    frame #4: 0x28a4d562 AVFoundation`AVAssetWriterFigAssetWriterHandleCompletedNotification + 70
    frame #5: 0x28ad923c AVFoundation`AVCMNotificationDispatcherCallback + 192
    frame #6: 0x29e8d68a CoreFoundation`__CFNotificationCenterAddObserver_block_invoke + 126
    frame #7: 0x29f2d0c4 CoreFoundation`__CFNOTIFICATIONCENTER_IS_CALLING_OUT_TO_AN_OBSERVER__ + 12
    frame #8: 0x29e86cbc CoreFoundation`_CFXNotificationPost + 1800
    frame #9: 0x29ea80b4 CoreFoundation`CFNotificationCenterPostNotification + 108
    frame #10: 0x2a6c90ee CoreMedia`CMNotificationCenterPostNotification + 86
    frame #11: 0x2bb03cb0 MediaToolbox`FigRemakerFamilyCallbacksServer_NotificationIsPending + 220
    frame #12: 0x2bb09b86 MediaToolbox`figremakerfamilycallbacks_server + 70
    frame #13: 0x00effa62 libdispatch.dylib`dispatch_mig_server + 362
    frame #14: 0x00f0e072 libdispatch.dylib`_dispatch_source_latch_and_call + 1810
    frame #15: 0x00efcd6c libdispatch.dylib`_dispatch_source_invoke + 212
    frame #16: 0x00f0391e libdispatch.dylib`_dispatch_queue_drain + 582
    frame #17: 0x00efdac4 libdispatch.dylib`_dispatch_queue_invoke + 88
    frame #18: 0x00f05b1a libdispatch.dylib`_dispatch_root_queue_drain + 1294
    frame #19: 0x00f06e20 libdispatch.dylib`_dispatch_worker_thread3 + 108
    frame #20: 0x385bdda8 libsystem_pthread.dylib`_pthread_wqthread + 668
    frame #21: 0x385bdafc libsystem_pthread.dylib`start_wqthread + 8

  thread #26: tid = 0xd680a, 0x385bdaf4 libsystem_pthread.dylib`start_wqthread
    frame #0: 0x385bdaf4 libsystem_pthread.dylib`start_wqthread

  thread #27: tid = 0xd681b, 0x29e6ed3c CoreFoundation`CFBasicHashFindBucket + 12012, queue = 'com.apple.avfoundation.videodataoutput.bufferqueue'
    frame #0: 0x29e6ed3c CoreFoundation`CFBasicHashFindBucket + 12012
    frame #1: 0x29ea6134 CoreFoundation`CFDictionaryGetValueIfPresent + 112
    frame #2: 0x2a7257f6 CoreMedia`sbufAtom_appendKeyValuePairAtom + 90
    frame #3: 0x2a7252d8 CoreMedia`sbufAtom_appendDictionaryAtom + 152
    frame #4: 0x2a722958 CoreMedia`sbufAtom_createSerializedDataAndSurfaceForSampleBuffer + 444
    frame #5: 0x2a7226a2 CoreMedia`FigRemote_CreateSerializedAtomDataAndSurfaceForSampleBuffer + 318
    frame #6: 0x2a72252e CoreMedia`FigRemote_CreateSerializedAtomDataForSampleBuffer + 66
    frame #7: 0x2bb05eca MediaToolbox`remoteWriter_AddSampleBuffer + 110
    frame #8: 0x28a5d2c2 AVFoundation`-[AVFigAssetWriterTrack addSampleBuffer:error:] + 78
    frame #9: 0x28a59a38 AVFoundation`-[AVAssetWriterInputWritingHelper appendSampleBuffer:] + 184
    frame #10: 0x28a578fc AVFoundation`-[AVAssetWriterInput appendSampleBuffer:] + 120
    frame #11: 0x002c18f4 proto`-[VideoEncoder encodeFrame:](self=0x156ba0f0, _cmd=0x0051ec7a, sampleBuffer=0x156f5810) + 492 at VideoEncoder.m:75
    frame #12: 0x0029cdd2 proto`-[AVEncoder encodeFrame:](self=0x1564fce0, _cmd=0x0051ec7a, sampleBuffer=0x156f5810) + 1282 at AVEncoder.mm:286
    frame #13: 0x002ab704 proto`-[KFH264Encoder encodeSampleBuffer:](self=0x155dec30, _cmd=0x0051deb4, sampleBuffer=0x156f5810) + 116 at KFH264Encoder.mm:69
    frame #14: 0x002b7a30 proto`-[KFRecorder captureOutput:didOutputSampleBuffer:fromConnection:](self=0x1566a700, _cmd=0x28b4ee52, captureOutput=0x155cb3d0, sampleBuffer=0x156f5810, connection=0x15555a90) + 524 at KFRecorder.m:152
    frame #15: 0x28ab3db0 AVFoundation`-[AVCaptureVideoDataOutput _handleRemoteQueueOperation:] + 308
    frame #16: 0x28ab3c58 AVFoundation`__47-[AVCaptureVideoDataOutput _updateRemoteQueue:]_block_invoke + 164
    frame #17: 0x2a6e32be CoreMedia`__FigRemoteOperationReceiverCreateMessageReceiver_block_invoke + 210
    frame #18: 0x2a6f5df4 CoreMedia`__FigRemoteQueueReceiverSetHandler_block_invoke2 + 172
    frame #19: 0x00f0e072 libdispatch.dylib`_dispatch_source_latch_and_call + 1810
    frame #20: 0x00efcd6c libdispatch.dylib`_dispatch_source_invoke + 212
    frame #21: 0x00f0391e libdispatch.dylib`_dispatch_queue_drain + 582
    frame #22: 0x00efdac4 libdispatch.dylib`_dispatch_queue_invoke + 88
    frame #23: 0x00f0391e libdispatch.dylib`_dispatch_queue_drain + 582
    frame #24: 0x00efdac4 libdispatch.dylib`_dispatch_queue_invoke + 88
    frame #25: 0x00f05b1a libdispatch.dylib`_dispatch_root_queue_drain + 1294
    frame #26: 0x00f06e20 libdispatch.dylib`_dispatch_worker_thread3 + 108
    frame #27: 0x385bdda8 libsystem_pthread.dylib`_pthread_wqthread + 668
    frame #28: 0x385bdafc libsystem_pthread.dylib`start_wqthread + 8
(lldb)

<Bolts/BFTask.h> file not found

I installed dependencies via CocoaPods, but when I compile the project there is an error in the files KFAWSCredentialsProvider.h and KFAWSCredentialsProvider.m.

In KFAWSCredentialsProvider.m the error is on the line:

#import <Bolts/BFTask.h>

Bolts/BFTask.h file not found

In KFAWSCredentialsProvider.h the error is on the line:

- (BFTask *)refresh;
    Expected a type

It worked before, but I updated CocoaPods because Kickflip did not work on my phone with version 1.2; now it does not compile with version 1.3.

Feature question

I sent an email to [email protected] but can't get any reply!

  1. Can Kickflip provide the same functionality as Periscope or Meerkat? (I mean: we mainly need to use the iOS SDK to stream the camera video up to your platform, and use your SDK to play the stream down to one or more clients. This is the first thing we need to try now.)
  2. How many people can upload streams at the same time? Are there any restrictions?
  3. When a user registers on our platform, we have to create an account for them and allow them to stream their own videos up to your platform. Can we manage users from our own backend server?
  4. Do you charge by app count or broadcast count?
  5. Where will the data be stored? Do we need to use AWS, or upload to your cloud? Is there an extra charge for data transfer?
  6. Can you provide a record feature, so people can watch the video later?
  7. What is the average delay from the broadcaster to the viewers?

Is this project dead?

The trial didn't work, there are many problems with the demo app, and the code hasn't been touched for a while.
