LaiFengiOS / LFLiveKit
LaiFeng iOS Live Kit: H.264 and AAC hardware encoding, GPUImage beauty filter support, RTMP transmission, frame dropping on weak networks, and dynamic bitrate switching.
License: MIT License
How can I add custom filters?
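One possible route, sketched below: subclass GPUImageFilter (LFLiveKit vendors GPUImage) with your own fragment shader. The caveat is the wiring: the released API exposes no public filter hook, so the assumption here is that you splice the filter into LFVideoCapture's internal filter chain yourself. The class and shader names are illustrative, not shipped API.

```objc
#import "GPUImageFilter.h"

// Illustrative sepia-style fragment shader; any GLES 2.0 shader works here.
NSString *const kMyCustomFragmentShader = SHADER_STRING
(
 varying highp vec2 textureCoordinate;
 uniform sampler2D inputImageTexture;

 void main()
 {
     lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);
     lowp float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
     gl_FragColor = vec4(gray * vec3(1.2, 1.0, 0.8), color.a);
 }
);

// Hypothetical class name; attach it wherever LFVideoCapture builds its
// camera -> beauty-filter -> output chain (an assumption, not public API).
@interface MyCustomFilter : GPUImageFilter
@end

@implementation MyCustomFilter
- (instancetype)init {
    self = [super initWithFragmentShaderFromString:kMyCustomFragmentShader];
    return self;
}
@end
```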
The streamed video shows a white screen.
Just wanted to say, keep up the good work. This is the best live-stream framework I have found so far; it makes things easier than ever. Thank you 👍
Filters can be written to suit your own needs, but I'd still like to see face detection added; that would make the kit more convenient to use.
Hi, during testing we found that when the network is poor, memory keeps climbing, up to 100+ MB. The problem doesn't occur when the network is good. What bug could this be?
Thanks for any explanation.
Hi,
I'm using iOS 7.1 on an iPad mini 2.
Assert - (kernResult == kIOReturnSuccess) - f: /SourceCache/AppleVXE380/AppleVXE380-403/Library/AppleVXE380UserLandLibrary.cpp l: 494
AppleVXE380VA ERROR: IOServiceOpen failed
Assert - (false) - f: /SourceCache/AppleVXE380/AppleVXE380-403/Library/AppleVXE380FIGwrapper.cpp l: 3106
Any suggestions?
Thanks.
I'm really failing at this...
ld: library not found for -lPods
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Has anyone else hit the same error? Please let me know.
I'm on CocoaPods 1.0.1.
After downloading the source, I only ran pod install in the demo directory, and then building the project failed with this error.
There is currently no way to detect errors such as unexpected disconnects.
Failed to compile fragment shader
2016-08-20 14:57:37.193 SystemTeq[589:132757] Program link log: (null)
2016-08-20 14:57:37.193 SystemTeq[589:132757] Fragment shader compile log: ERROR: 0:1: 'mainScreen' : syntax error: syntax error
2016-08-20 14:57:37.194 SystemTeq[589:132757] Vertex shader compile log: (null)
2016-08-20 14:57:37.194 SystemTeq[589:132757] *** Assertion failure in -[LFGPUImageBeautyFilter initWithVertexShaderFromString:fragmentShaderFromString:], /Users/macbookpro/Downloads/work/XimalayaSDK_iOS_2.12/SystemTeq/SystemTeq/VideoLibrary/LFLiveKit/Vendor/GPUImage/GPUImageFilter.m:94
If you background the app during streaming to take a phone call and then return, the playback side ends up with video only and no audio.
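A hedged workaround sketch (plain AVFoundation, not LFLiveKit API): observe the audio-session interruption that the phone call causes, and reactivate the session when the interruption ends so the mic feeds the encoder again. Whether the capture pipeline also needs a stop/start round trip is untested here.

```objc
#import <AVFoundation/AVFoundation.h>

// Reactivate the shared audio session after a phone-call interruption ends.
[[NSNotificationCenter defaultCenter]
    addObserverForName:AVAudioSessionInterruptionNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                NSUInteger type =
                    [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
                if (type == AVAudioSessionInterruptionTypeEnded) {
                    [[AVAudioSession sharedInstance] setActive:YES error:nil];
                }
            }];
```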
A memory error occurs when testing on a physical iPhone 5 attached to Xcode.
Running with the cable disconnected is unaffected.
The exception details are below. Many thanks for open-sourcing this; hoping it keeps getting better.
Xcode Version 7.3.1 (7D1014)
iPhone 5, iOS 8.3
pod 'LFLiveKit', '~> 1.6'
Code location: GPUImageView.m, line 168:
```objc
[[[GPUImageContext sharedImageProcessingContext] context] renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer*)self.layer];
```
Log info:
device iPhone5,1
model iPhone5
num 5
Thread info:
```
EAGLContext_renderbufferStorageFromDrawable(EAGLContext*, objc_selector*, unsigned int, id<EAGLDrawable>) + 204, queue = 'com.sunsetlakesoftware.GPUImage.openGLESContextQueue', stop reason = EXC_BAD_ACCESS (code=EXC_ARM_DA_ALIGN, address=0x103b)
frame #0: 0x0145a3bc libglInterpose.dylib`EAGLContext_renderbufferStorageFromDrawable(EAGLContext*, objc_selector*, unsigned int, id<EAGLDrawable>) + 204
frame #1: 0x00364402 -[GPUImageView createDisplayFramebuffer](self=0x073eed60, _cmd="createDisplayFramebuffer") + 342 at GPUImageView.m:168
frame #2: 0x00363efc __26-[GPUImageView commonInit]_block_invoke(.block_descriptor=0x0128a4f8) + 1608 at GPUImageView.m:129
frame #3: 0x014fdea4 libdispatch.dylib`_dispatch_barrier_sync_f_invoke + 96
frame #4: 0x0033f70c runSynchronouslyOnVideoProcessingQueue(block=0x0128a4f8) + 102 at GPUImageOutput.m:44
frame #5: 0x003638a0 -[GPUImageView commonInit](self=0x073eed60, _cmd="commonInit") + 810 at GPUImageView.m:97
frame #6: 0x00363492 -[GPUImageView initWithFrame:](self=0x073eed60, _cmd="initWithFrame:", frame=(origin = (x = 0, y = 0), size = (width = 320, height = 568))) + 168 at GPUImageView.m:63
frame #7: 0x00306ce6 -[LFVideoCapture initWithVideoConfiguration:](self=0x073d39c0, _cmd="initWithVideoConfiguration:", configuration=0x073cf280) + 874 at LFVideoCapture.m:37
frame #8: 0x002fa960 -[LFLiveSession videoCaptureSource](self=0x073cf6b0, _cmd="videoCaptureSource") + 132 at LFLiveSession.m:219
frame #9: 0x002fa40a -[LFLiveSession setRunning:](self=0x073cf6b0, _cmd="setRunning:", running=YES) + 204 at LFLiveSession.m:173
```
Also, if the server isn't running, tapping start streaming and then tapping it again crashes in LFStreamRtmpSocket.m:
```objc
- (void)_stop {
    if (self.delegate && [self.delegate respondsToSelector:@selector(socketStatus:status:)]) {
        [self.delegate socketStatus:self status:LFLiveStop];
    }
    if (_rtmp != NULL) {
        PILI_RTMP_Close(_rtmp, &_error);
        PILI_RTMP_Free(_rtmp);
        _rtmp = NULL;
    }
}
```
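A hedged hardening sketch for that crash, assuming it is a double teardown from two rapid taps racing each other: serialize _stop on one queue so the close/free/NULL sequence can only run once. self.rtmpQueue is a hypothetical serial queue, not part of the file as shipped.

```objc
- (void)_stop {
    dispatch_async(self.rtmpQueue, ^{   // hypothetical serial teardown queue
        if ([self.delegate respondsToSelector:@selector(socketStatus:status:)]) {
            [self.delegate socketStatus:self status:LFLiveStop];
        }
        if (_rtmp != NULL) {
            PILI_RTMP_Close(_rtmp, &_error);
            PILI_RTMP_Free(_rtmp);
            _rtmp = NULL;   // a second entry now skips the close/free
        }
    });
}
```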
When capturing and streaming over 4G, the console prints:
handleRouteChange reason is The category of the session object changed.
handleRouteChange reason is The output route was overridden by the app.
The data doesn't seem to be getting pushed out!
What causes the connection failure?
As the title says: when my project's Podfile pulls in both ReactiveCocoa and LFLiveKit, LFLiveKit fails to compile.
My Podfile looks like this:
```ruby
source 'https://github.com/CocoaPods/Specs.git'
platform :ios, "8.0"

use_frameworks!
target 'test' do
  pod 'LFLiveKit'
  pod 'ReactiveCocoa'
end
```
Tested on iOS 8.3 and hit a crash in the GPU code.
NALUnit.h reports that `class` is unavailable.
After renaming NALUnit.h to NALUnit.hpp, it then reports that #include <openssl/bn.h> cannot be found.
I'm studying RTMP streaming with your library as a reference, and hit the errors below while pushing the stream:
2016-07-18 09:32:47.802 LiveStudy[923:159136] data packet
2016-07-18 09:32:47.802 LiveStudy[923:159136] timestamp 52582565
ERROR: WriteN, PILI_RTMP send error 32, Broken pipe, (144 bytes)
ERROR: WriteN, PILI_RTMP send error 32, Broken pipe, (39 bytes)
ERROR: WriteN, PILI_RTMP send error 32, Broken pipe, (42 bytes)
The repo I'm learning with: https://github.com/mengxiangyue/LiveStudy
If you have time, please take a look; if not, even a hint about where to look would be much appreciated.
My logic is to capture data with the camera (video only for now), hardware-encode it, then send it via RTMP. The RTMP sending code is essentially the same as your library's, and all of my operations run on the main thread.
Many thanks.
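For what it's worth: "Broken pipe" (errno 32) means the server closed the socket mid-send, and the timestamp of 52582565 in the log looks absolute rather than stream-relative. RTMP servers generally expect timestamps that start near 0 for each stream, so stamping frames relative to the first frame is the usual pattern. A minimal sketch of that idea, not taken from either codebase:

```objc
#import <QuartzCore/QuartzCore.h>

// Returns a per-session timestamp in ms, relative to the first call.
// _startMs is a hypothetical uint64_t ivar, zero-initialized.
- (uint64_t)relativeTimestamp {
    uint64_t nowMs = (uint64_t)(CACurrentMediaTime() * 1000);
    if (_startMs == 0) {
        _startMs = nowMs;        // remember when the stream started
    }
    return nowMs - _startMs;     // the first frame stamps out near 0
}
```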
When a UILabel is set as the waterMarkView, its text is noticeably blurry.
Great library! I wonder if you could make it more flexible by letting users supply the video input: for example, making `LFVideoCapture` a protocol and having `LFLiveSession` accept any `LFVideoCapture` implementation. The implementation could call `- (void)captureOutput:(nullable LFVideoCapture*)capture pixelBuffer:(nullable CVImageBufferRef)pixelBuffer;` to supply frames.
Thanks!
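For reference, a sketch of the shape being proposed. These protocol names mirror the suggestion above and are not shipped LFLiveKit API:

```objc
#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>

@protocol LFVideoCaptureInterfaceDelegate;

// Any capture implementation (camera, screen recorder, file reader, ...)
// would conform to this and push frames through the delegate callback.
@protocol LFVideoCaptureInterface <NSObject>
@property (nonatomic, weak, nullable) id<LFVideoCaptureInterfaceDelegate> delegate;
@property (nonatomic, assign) BOOL running;   // start/stop capturing
@end

@protocol LFVideoCaptureInterfaceDelegate <NSObject>
- (void)captureOutput:(nullable id<LFVideoCaptureInterface>)capture
          pixelBuffer:(nullable CVImageBufferRef)pixelBuffer;
@end
```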
On an iPhone 6 Plus, streams sometimes come out with audio and video out of sync: the audio lags the picture by roughly 1-3 s.
It may not be a problem on the pushing side, but I'm having a hard time pinning it down. Could you suggest where to look?
Hey @LaiFengiOS, your library is really interesting.
The only problem I found was the README.md, which lacks information.
I created this iOS Open Source Readme Template so you can take a look at how to organize it better.
If you want, I can help you do it.
What are your thoughts?
arondeMacBook-Pro:MiaowShow aron$ pod search LFLive
[!] Unable to find a pod with name matching `LFLiveKit'
Resolving dependencies of Podfile
[!] Unable to find a specification for LFLiveKit
The library can't be found through pod search.
How do I get rtmp authentication working?
```objc
LFLiveStreamInfo *stream = [LFLiveStreamInfo new];
stream.url = @"rtmp://test:test@myhost/test";
```
It does not work. It prints:
2016-08-04 16:37:58.737 LFLiveKitDemo[1447:493017] MicrophoneSource: startRunning
2016-08-04 16:37:58.936 LFLiveKitDemo[1447:493066] handleRouteChange reason is The category of the session object changed.
2016-08-04 16:38:22.112 LFLiveKitDemo[1447:492940] liveStateDidChange: 1
ERROR: Problem accessing the DNS. (addr: test)
2016-08-04 16:38:22.130 LFLiveKitDemo[1447:492940] errorCode: 203
2016-08-04 16:38:22.130 LFLiveKitDemo[1447:492940] liveStateDidChange: 4
Thanks
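A hedged reading of that log: "(addr: test)" suggests the RTMP URL parser took the userinfo ("test") as the hostname, i.e. user:pass@ credentials in the URL are not supported. Many RTMP servers accept credentials in the app/stream portion instead; the query form below is server-specific and only an assumption, not documented LFLiveKit behavior:

```objc
LFLiveStreamInfo *stream = [LFLiveStreamInfo new];
stream.url = @"rtmp://myhost/test?user=test&pass=test"; // hypothetical query-style auth
```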
Hello
Could you please tell me how can I switch to back camera?
The library uses the front camera by default.
Thanks
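A minimal sketch, assuming the captureDevicePosition property declared in LFLiveSession.h is present in the version you use:

```objc
#import <LFLiveKit/LFLiveKit.h>

// self.session is your configured LFLiveSession instance.
- (void)switchToBackCamera {
    self.session.captureDevicePosition = AVCaptureDevicePositionBack;
}
```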
pod install
Unable to find a specification for `LFLiveKit
pod search LFLiveKit
[!] Unable to find a pod with name, author, summary, or description matching `LFLiveKit`
I'm using Xcode 7.2.
Otherwise, `#import <LFLiveKit/GPUImage.h>` reports that the file cannot be found. My project is written in Swift, with frameworks enabled in CocoaPods.
CocoaPods and Carthage are awesome tools that make our lives much easier, but some devs still don't know how to use them.
It would be cool to add a manual installation guide to your README.md. You can take a look at my iOS Readme Template to see how to do it.
The front camera image is mirrored.
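A hedged pointer: recent LFLiveSession headers declare a BOOL mirror property; if your version has it, toggling it may address this.

```objc
// Assumption: the mirror property exists on your LFLiveKit version.
self.session.mirror = NO;   // self.session: your LFLiveSession instance
```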
```objc
- (void)setWaterMark {
    GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
    blendFilter.mix = 1.0;

    NSDate *startTime = [NSDate date];
    UILabel *timeLabel = [[UILabel alloc] initWithFrame:CGRectMake(0.0, 0.0, 240.0f, 320.0f)];
    timeLabel.font = [UIFont systemFontOfSize:17.0f];
    timeLabel.text = @"Time: 0.0 s";
    timeLabel.textAlignment = NSTextAlignmentCenter;
    timeLabel.backgroundColor = [UIColor clearColor];
    timeLabel.textColor = [UIColor whiteColor];

    GPUImageUIElement *uiElementInput = [[GPUImageUIElement alloc] initWithView:timeLabel];
    [_filter addTarget:blendFilter];
    [uiElementInput addTarget:blendFilter];
    [blendFilter addTarget:_gpuImageView];

    __unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
    __weak typeof(self) _self = self;
    [_filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
        timeLabel.text = [NSString stringWithFormat:@"Time: %f s", -[startTime timeIntervalSinceNow]];
        [weakUIElementInput update];
        [_self processVideo:filter];
    }];

    [_videoCamera addTarget:_gpuImageView];
}
```
This is my watermark code, but it crashes at [weakUIElementInput update].
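A likely cause, hedged: uiElementInput is a local object captured only through an __unsafe_unretained pointer, so nothing retains it after setWaterMark returns, and the frame-processing block later calls -update on a dangling pointer. Holding it strongly avoids that. A sketch (the property is hypothetical, added by you):

```objc
// In the class extension: a strong reference keeps the element alive.
@property (nonatomic, strong) GPUImageUIElement *uiElementInput;

// In setWaterMark, replace the __unsafe_unretained capture with:
self.uiElementInput = [[GPUImageUIElement alloc] initWithView:timeLabel];
[self.uiElementInput addTarget:blendFilter];

__weak typeof(self) _self = self;
[_filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
    timeLabel.text = [NSString stringWithFormat:@"Time: %f s", -[startTime timeIntervalSinceNow]];
    [_self.uiElementInput update];   // safe: self holds a strong reference
    [_self processVideo:filter];
}];
```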
Hello
Thank you for your awesome library, it's so much better than VideoCore!
But it's hard to find a reliable library for actually viewing the stream. Could you please advise me on the best one for iOS, in your opinion?
Is there something wrong with the pushed stream? The picture on the viewing side often looks blurry. No matter how I configure things, it's clearly less sharp than the broadcaster's preview, and whenever the broadcaster moves, the viewer's picture turns to mush. Any idea what causes this?
Also, the filter's glare feels really severe now; it seems worse than the earliest version.
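One hedged starting point for the blur: viewer-side sharpness usually tracks the encoder's bitrate and resolution rather than the local preview, so it's worth trying a higher quality preset (these calls are in LFLiveKit's public headers) before digging deeper:

```objc
// Build the session with a higher encode preset than the default.
LFLiveVideoConfiguration *video =
    [LFLiveVideoConfiguration defaultConfigurationForQuality:LFLiveVideoQuality_High3];
LFLiveAudioConfiguration *audio = [LFLiveAudioConfiguration defaultConfiguration];
LFLiveSession *session =
    [[LFLiveSession alloc] initWithAudioConfiguration:audio videoConfiguration:video];
```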
As the title says.
Hello,
I'm developing an app based on React Native, so I made a React Native module customized from your project.
Thanks.