Comments (37)
Hi @nihtin9mk,
There are certain types of view/layer that can't be automatically recorded - AVCaptureVideoPreviewLayer is one such case. To record these views you need to manually write the pixel data into the CGContext using the writeBackgroundFrameInContext:(CGContextRef *)contextRef delegate method.
As you described your use case as a 'video chat' application, I assume this is probably the problem you are encountering. Is one of the video views a live preview?
Al
from asscreenrecorder.
@alskipp
Hi, yes exactly this is the issue in my app. The view is a live preview.
Please give me a solution for this.
I would be so thankful if you could provide some sample code.
In my view controller I wrote the code below -
-(void)viewDidLoad
{
    [super viewDidLoad];
    [ASScreenRecorder sharedInstance].delegate = self;
}

// myVideoView is the image view with the video preview
-(void)writeBackgroundFrameInContext:(CGContextRef *)contextRef
{
    CGSize imgSize = [myVideoView.image size];
    UIGraphicsBeginImageContext(imgSize);
    contextRef = UIGraphicsGetCurrentContext();
}
I believe this is not the proper way - please guide me.
@alskipp Please help me to do this.
Hi, I'll do my best to point you in the right direction either later today or tomorrow. As you can probably appreciate, this open source library doesn't pay the bills and I'm not independently wealthy; consequently, I'm currently working for 'the man'.
As a quick pointer - you need to get direct access to the pixel data in your live video input. If memory serves correctly you'll need to implement a method from AVCaptureVideoDataOutputSampleBufferDelegate. Probably the easiest thing to do is to create a CGImage ivar in your controller that you continuously update in -captureOutput:didOutputSampleBuffer:fromConnection:.
You then implement the delegate method like y
Whoops. Typing this on phone and accidentally tapped close and comment.
Anyway. You need to implement the delegate method, but then you draw the CGImage (which you created in captureOutput:didOutputSampleBuffer:…) into the context.
Hi @alskipp - thank you for your help with this; of course, all your open source work is really helpful and great for developers like me.
I am not so familiar with AVFoundation and the captureOutput:didOutputSampleBuffer: machinery. I hope you can give a better picture of how to implement the delegate method.
Please take your time.
And once again, many thanks for your selfless work.
Hi, I'm on my way back home now. I'll try and post a few code examples this evening or tomorrow. If you get a chance, take a look at the documentation for AVCaptureVideoDataOutputSampleBufferDelegate, as I think your view controller will need to implement this delegate to receive the live video data. You'll then use this pixel data to write into the video context.
Al
I'll give a code example of how to turn the pixel data from captureOutput:didOutputSampleBuffer:fromConnection: into an image you can use.
OK. Here we go:
First your view controller will need to implement AVCaptureVideoDataOutputSampleBufferDelegate - declared something like:

@interface CameraViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
As you're already previewing the video on screen, I assume there's already an AVCaptureSession set up, which is then used to init an AVCaptureVideoPreviewLayer.
In viewDidLoad you'll want to register to receive a notification for when your captureSession starts - something like:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(captureBegan)
                                             name:AVCaptureSessionDidStartRunningNotification
                                           object:self.captureSession];
Then you'll need to implement the method named in the selector:
- (void)captureBegan
{
    [ASScreenRecorder sharedInstance].delegate = self;
}
Don't forget to remove the delegate when the captureSession ends: [ASScreenRecorder sharedInstance].delegate = nil.
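One way to wire that up (a sketch mirroring the start notification above; captureEnded is a made-up selector name, and AVCaptureSessionDidStopRunningNotification is the matching stop notification):

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(captureEnded)
                                             name:AVCaptureSessionDidStopRunningNotification
                                           object:self.captureSession];

- (void)captureEnded
{
    // stop feeding frames once the session is no longer running
    [ASScreenRecorder sharedInstance].delegate = nil;
}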
The next comment will show the basics of AVCaptureVideoDataOutputSampleBufferDelegate - it won't be the full implementation you need just yet. But let's just try to get an image created from the sampleBuffer and then we can continue from there…
This doesn't do anything too useful yet - but let's see if the CGImageRef is successfully created; if so, most of the hard work has been achieved and there are just a few more pieces to put in place.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CGImageRef image = [self createCGImageFromSampleBuffer:sampleBuffer];
    // check if image creation is successful
    CGImageRelease(image);
}
Here is how to get a CGImage from the sampleBuffer:
- (CGImageRef)createCGImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef imageContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(imageContext);
    CGContextRelease(imageContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return newImage;
}
Let me know if all this works so far. Then I'll continue with the final bits that write the captured image into the video buffer.
(For now don't set [ASScreenRecorder sharedInstance].delegate = self; as we've not implemented the delegate method yet, so the compiler will complain. We'll get to that bit next.)
Good luck.
Just realised I missed out a vital bit. Sorry! We need to declare ourselves as delegate to AVCaptureVideoDataOutput, otherwise we won't receive any calls to captureOutput:didOutputSampleBuffer:fromConnection:. I'll post some code examples in a moment.
First we need a dispatch queue to receive calls to captureOutput:didOutputSampleBuffer:fromConnection: - we don't want to block the main thread with video processing.
@property (strong, nonatomic) dispatch_queue_t videoQueue;
We then initialize the queue and use it when we declare ourselves as delegate to the captureSession output.
Here's some code that shows how the capture session is setup:
- (void)setupCamera
{
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![captureDevice hasMediaType:AVMediaTypeVideo]) {
        return;
    }
    _videoQueue = dispatch_queue_create("CameraViewController.videoQueue", DISPATCH_QUEUE_SERIAL);
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
    self.captureSession = [[AVCaptureSession alloc] init];
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self queue:_videoQueue];
    [self.captureSession addOutput:output];
    [self.captureSession addInput:input];
    [self.captureSession setSessionPreset:AVCaptureSessionPreset640x480];
    _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    [self.cameraView setVideoPreviewLayer:_previewLayer];
}
The vital bit that you need to receive the sampleBuffers is:
[output setSampleBufferDelegate:self queue:_videoQueue];
Once that is declared you should start receiving calls to captureOutput:didOutputSampleBuffer:fromConnection:.
Yes, this is a jigsaw puzzle with many pieces : )
By the way, if you're using a third-party library to do the video capture then most of this boilerplate code should already be set up for you.
In that case you would set the third-party library as the delegate of ASScreenRecorder, locate where captureOutput:didOutputSampleBuffer:fromConnection: is declared, and add the extra code there.
Hi @alskipp - everything works fine and [self.delegate writeBackgroundFrameInContext:&bitmapContext]; is getting called.
But then the app crashed with this message -
[VideoViewController writeBackgroundFrameInContext:]: unrecognized selector sent to instance
How can I implement this method?
Hi, I can post some example code, but not until after 19:00 GMT today. If I get a spare moment during the day, I'll try and point you in the right direction to complete the task.
Al
Thank you so much @alskipp.
I have done the below but don't know what to do next -

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CGImageRef image = [self createCGImageFromSampleBuffer:sampleBuffer];
    UIImage *uiimage = [[UIImage alloc] initWithCGImage:image];
    NSLog(@"uiimage---%@", uiimage);
    CGImageRelease(image);
}

-(void)writeBackgroundFrameInContext:(CGContextRef *)contextRef
{
}
Hi,
The idea is reasonably simple, but due to threading the implementation needs to be very careful.
writeBackgroundFrameInContext: will be called regularly - you just need to draw an image into the context using CGContextDrawImage. We need to have a CGImage instance variable ready to draw into the context - however the details are a bit complicated.
writeBackgroundFrameInContext: will be called on a background queue, and the CGImage instance variable is mutable state which we will be updating elsewhere - we need to be very careful to prevent threading issues.
What we'll need:

2 instance vars:

{
    CGImageRef _capturedImage;
    BOOL _needsNewImage; // set to YES before recording starts
}

1 dispatch_queue_t declared as a property - it will be used to guard access to _capturedImage:

@property (strong, nonatomic) dispatch_queue_t imageQueue;

- (void)viewDidLoad
{
    [super viewDidLoad];
    _imageQueue = dispatch_queue_create("CameraViewController.imageQueue", DISPATCH_QUEUE_SERIAL);
}
We need to update the CGImageRef:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    dispatch_sync(_imageQueue, ^{
        if (_needsNewImage) {
            _needsNewImage = NO;
            CGImageRelease(_capturedImage); // safe to call on NULL
            _capturedImage = [self createCGImageFromSampleBuffer:sampleBuffer];
        }
    });
}
Then write the image into the context. You might need to adjust the context and position of the drawing for your own needs - the following is an example:

- (void)writeBackgroundFrameInContext:(CGContextRef *)contextRef
{
    dispatch_sync(_imageQueue, ^{
        if (_capturedImage) {
            CGContextSaveGState(*contextRef);
            CGAffineTransform flipRotate = CGAffineTransformMake(0.0, 1.0, 1.0, 0.0, 0.0, 0.0);
            CGContextConcatCTM(*contextRef, flipRotate);
            CGContextDrawImage(*contextRef, CGRectMake(0, 0, CGRectGetHeight(_cameraView.bounds), CGRectGetWidth(_cameraView.bounds)), _capturedImage);
            CGContextRestoreGState(*contextRef);
            _needsNewImage = YES;
        }
    });
}
The final thing to remember is to release _capturedImage when you have finished recording, otherwise the memory will leak.
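A minimal sketch of that cleanup (recordingDidFinish is a made-up hook - do this wherever your recording actually completes; note that ARC does not manage CGImageRef, so you must call CGImageRelease rather than assigning nil):

- (void)recordingDidFinish
{
    // release on the same serial queue that guards _capturedImage
    dispatch_sync(self.imageQueue, ^{
        CGImageRelease(_capturedImage); // ARC won't do this for you
        _capturedImage = NULL;
        _needsNewImage = YES; // ready for the next recording
    });
}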
I've just edited the above post (I forgot to call CGContextRestoreGState(*contextRef); - that's really important!).
Depending on how your view is positioned you might need to adjust the position you draw at in CGContextDrawImage. To make this easier to experiment with, it might be a good idea to change the following:
In ASScreenRecorder.m locate this code:
if (self.delegate) {
    [self.delegate writeBackgroundFrameInContext:&bitmapContext];
}

// draw each window into the context (other windows include UIKeyboard, UIAlert)
// FIX: UIKeyboard is currently only rendered correctly in portrait orientation
dispatch_sync(dispatch_get_main_queue(), ^{
    UIGraphicsPushContext(bitmapContext); {
        for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
            [window drawViewHierarchyInRect:CGRectMake(0, 0, _viewSize.width, _viewSize.height) afterScreenUpdates:NO];
        }
    } UIGraphicsPopContext();
});
Move the delegate code to appear after the main drawing code listed above. Your video preview will then be drawn on top of everything else, making it easier to see if you are drawing it in the correct position.
Ohh @alskipp - thank you very much, it worked at last.
Only two minor problems - maybe you can point out the mistakes on my side.
The video view's alignment is wrong - it is rotated 90 degrees to the right and sits on the left side.
Also, after I end the recording, the app crashes.
"_cameraView" in my example is the view that contains the live video preview. It's only used to calculate the size and position to draw into the context.
Could you confirm that a CGImage is being created in - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection?
If the image is created successfully then it's just a matter of getting the positioning right when drawing it into the context.
Sorry for my first comment - I had just forgotten to move the delegate code to appear after the main drawing code listed above. That's why that view didn't appear in the video; now it works fine.
Only the above (orientation) issue remains.
Is everything working now? To get the correct positioning you'll have to experiment with the CGRect in CGContextDrawImage:

CGContextDrawImage(*contextRef, CGRectMake(50, 50, 100, 100), _capturedImage);
Yes, now everything works fine. Thank you for your tremendous support.
Now I am experimenting with the CGRect in CGContextDrawImage.
But the orientation is also different in the video.
If the orientation is incorrect you just have to transform the context. For my use, I needed the following:
CGAffineTransform flipRotate = CGAffineTransformMake(0.0, 1.0, 1.0, 0.0, 0.0, 0.0);
CGContextConcatCTM(*contextRef, flipRotate);
You could try commenting out that bit, or try a different transformation to see if it works.
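For example, if the preview needs a plain 90-degree rotation rather than the flip above, something along these lines might work (a sketch only - the translation amount and the draw rect both depend on your view's dimensions, so expect to experiment):

CGFloat w = CGRectGetWidth(_cameraView.bounds);
// shift right by the view width, then rotate the context by 90 degrees,
// so the rotated drawing still lands inside the visible area
CGContextTranslateCTM(*contextRef, w, 0);
CGContextRotateCTM(*contextRef, M_PI_2);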
Yes, I am trying to get the proper orientation.
At times it crashes on -
BOOL success = [_avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
with ERROR - Thread 3: EXC_BAD_ACCESS (code=1, address=0x4bc000)
Hmmm, the dreaded EXC_BAD_ACCESS. Has this only happened since adding the new code? I've not encountered the EXC_BAD_ACCESS crash, but threaded crashes are notoriously unpredictable. I'll have to think about what the cause could be.
Yes, after adding this new code.
It's not happening continuously, but it surely happens once in every 3-4 uses.
Hi @alskipp - any solutions?
Just wondering whether you took my advice when I stated the following?:
"The final thing to remember is to release _capturedImage when you have finished recording, otherwise the memory will leak."
This is just a guess, but it could potentially cause the issue (if it crashes consistently on the 2nd recording). Perhaps what occurs is that for the first recording the CGImageRef in _capturedImage is ready for the first frame, but in the 2nd recording the CGImageRef isn't ready, so the crash occurs? This is just a guess - I might be wrong.
If you are releasing _capturedImage when the first recording finishes, could you test what happens if you don't release it and see if the crash still occurs on the 2nd recording?
(Letting a potential memory leak happen certainly isn't a fix, but it might help identify why the crash occurs.)
I am using ARC, so after finishing the recording I am setting _capturedImage to nil:
_capturedImage = Nil;
Hope this is fine.
That could well be causing the issue. To release it correctly you need to use CGImageRelease(_capturedImage);
See if that works; if it doesn't, then try not releasing it at all to see if the crash still happens. (We'd then have to figure out how best to deal with the memory release.)
@alskipp - I tried this. I am now calling CGImageRelease(_capturedImage); after the recording finishes.
But the second record option has been removed from my app.
And even the first recording still crashes at times, at:
BOOL success = [_avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
Even the example project you have provided crashes at the same point. I am using an iPod touch 5th generation and an iPhone 5 for testing.
@alskipp Thanks so much for this. It helped me a lot. Just found one problem (I think) with your code.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    dispatch_sync(_imageQueue, ^{
        if (_needsNewImage) {
            _needsNewImage = NO;
            CGImageRelease(_capturedImage); // safe to call on NULL
            _capturedImage = [self createCGImageFromSampleBuffer:sampleBuffer];
        }
    });
}

It won't execute the code inside dispatch_sync, because in my setup I had already assigned that same queue as the sample buffer delegate queue:

[output setSampleBufferDelegate:self queue:_imageQueue];

After removing this

dispatch_sync(_imageQueue, ^{

from your sample, it works fine for me.
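To make that concrete, a sketch of the adjusted callback (only valid when the sample buffer delegate queue and the queue guarding _capturedImage are the same serial queue - the delegate already runs on _imageQueue, and dispatch_sync onto the queue you are currently on would deadlock):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // already running on _imageQueue - no dispatch_sync needed (it would deadlock)
    if (_needsNewImage) {
        _needsNewImage = NO;
        CGImageRelease(_capturedImage); // safe to call on NULL
        _capturedImage = [self createCGImageFromSampleBuffer:sampleBuffer];
    }
}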
Hi all,
I have read through the entire discussion and am trying to implement the same in my code, but I'm still confused about which methods have to be declared in which class. Can you please help?
Also, I'm unable to spot the datatype of the variable "cameraView", and where would the method "setupCamera" be called?
Thanks!
If you use the pause mode then add:

- (void)stopRecordingWithCompletion:(VideoCompletionBlock)completionBlock
{
    if (_isRecording) {
        _isRecording = NO;
        _displayLink.paused = NO;
        [_displayLink removeFromRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
        [self completeRecordingSession:completionBlock];
        self.pauseResumeTimeRanges = nil;
    }
}