Comments (6)
Found a relevant PR, but it's closed: #1546
from react-native-vision-camera.
Hey all!
I just spent a few days thinking about a bulletproof timestamp-synchronization solution, and I came up with a great idea.
I built a TrackTimeline helper class that represents a video or audio track - it can be started & stopped, paused & resumed, and even supports nested pauses without issues.
- The total duration of the video is computed from the difference between the first and the last actually written timestamps, minus the total duration of all pauses within the video. No more incorrect video.duration! 🥳
- Whereas before I just had a flat 4-second timeout if no frames arrived, I now only wait twice the frame latency (a few milliseconds) to ensure no frames are left out!
- A video can be stopped while it is paused without any issues, as a pause call is taken into consideration before stopping 💪
- A video file's session now starts exactly at the start() timestamp and ends at the exact timestamp of the last video frame - this ensures there can never be any blank frames in the video, even if the audio track is longer 🤩
This was really complex to build, as I had to synchronize timestamps between capture sessions, and the entire thing is a producer model - a video buffer can arrive a second or so later than the audio buffer, but I need to make sure the video track starts before the audio track starts and ends after the audio track ends - that's a huge brainf*ck! 🤯
There are also no helper APIs for this on iOS, and it looks like no other camera framework (not even native Swift/ObjC iOS camera libraries) supports this - they all break when timestamps have a delay (e.g. with video stabilization enabled), or don't support delays at all; so I had to build the thing myself.
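The timeline bookkeeping described above (start/stop, nested pauses, and duration as the written-timestamp span minus total pause time) can be sketched roughly like this. This is a minimal model using plain Double timestamps; the real helper works on CMTime values from the capture session's clock, and all names here are hypothetical, not the actual TrackTimeline API.

```swift
// A pause-aware track timeline, modeled with Double timestamps for
// illustration. Assumed semantics: nested pauses are ref-counted, and
// frames that arrive while paused are dropped.
struct TrackTimeline {
    private(set) var firstTimestamp: Double?
    private(set) var lastTimestamp: Double?
    private var pauseStart: Double?
    private var pauseDepth = 0
    private var totalPaused = 0.0

    var isPaused: Bool { pauseDepth > 0 }

    // Nested pauses: only the outermost pause/resume pair counts.
    mutating func pause(at time: Double) {
        pauseDepth += 1
        if pauseDepth == 1 { pauseStart = time }
    }

    mutating func resume(at time: Double) {
        guard pauseDepth > 0 else { return }
        pauseDepth -= 1
        if pauseDepth == 0, let start = pauseStart {
            totalPaused += time - start
            pauseStart = nil
        }
    }

    // Record an actually written buffer timestamp.
    mutating func append(timestamp: Double) {
        guard !isPaused else { return } // drop frames captured mid-pause
        if firstTimestamp == nil { firstTimestamp = timestamp }
        lastTimestamp = timestamp
    }

    // Duration = span between first and last written timestamps,
    // minus the total duration of all pauses.
    var duration: Double {
        guard let first = firstTimestamp, let last = lastTimestamp else { return 0 }
        return (last - first) - totalPaused
    }
}
```

For example, frames written at t = 0 and t = 4 with one pause from t = 1 to t = 3 yield a duration of 2 seconds, not 4.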
Check out this PR and see if it fixes the issue for you: #2948
Thanks! ❤️
Any update on this? I'm facing a similar issue:
"react": "18.2.0",
"react-native": "0.74.1",
"react-native-vision-camera": "^4.0.3"
I think it must be some timing issue with the asset writer, but I don't know enough Swift to find the actual issue.
Same here.
This is my naive workaround patch for 3.9.2, only tested on an iPhone 12 Pro. I'm not an expert on iOS or Swift.
Even though we pause recording, the captureSession's clock keeps running (we cannot stop the captureSession, because we still need to show the camera preview to the user). It seems that AVAssetWriter only considers the timestamps recorded in the CMSampleBuffers, so the idea is to adjust the timestamps in the buffers.
Here is a demo video, same as the one the author posted.
4ABB1FFF-8570-468C-874D-166EEB1FE5B1.MP4
diff --git a/ios/Core/CameraSession+Video.swift b/ios/Core/CameraSession+Video.swift
index 00ff941b1d4cee15323f1f960a19a14613acab01..69e57e4092d99104793b994e9273a37dd301c18f 100644
--- a/ios/Core/CameraSession+Video.swift
+++ b/ios/Core/CameraSession+Video.swift
@@ -157,11 +157,12 @@ extension CameraSession {
func pauseRecording(promise: Promise) {
CameraQueues.cameraQueue.async {
withPromise(promise) {
- guard self.recordingSession != nil else {
+ guard let recordingSession = self.recordingSession else {
// there's no active recording!
throw CameraError.capture(.noRecordingInProgress)
}
self.isRecording = false
+ try recordingSession.pause(clock: self.captureSession.clock)
return nil
}
}
@@ -173,11 +174,12 @@ extension CameraSession {
func resumeRecording(promise: Promise) {
CameraQueues.cameraQueue.async {
withPromise(promise) {
- guard self.recordingSession != nil else {
+ guard let recordingSession = self.recordingSession else {
// there's no active recording!
throw CameraError.capture(.noRecordingInProgress)
}
self.isRecording = true
+ try recordingSession.resume(clock: self.captureSession.clock)
return nil
}
}
diff --git a/ios/Core/RecordingSession.swift b/ios/Core/RecordingSession.swift
index 85e9c622573143bd38f0b0ab6f81ad2f40e03cc3..8c4836c97b562bbda362c14f314a0ce96f113d2a 100644
--- a/ios/Core/RecordingSession.swift
+++ b/ios/Core/RecordingSession.swift
@@ -33,6 +33,8 @@ class RecordingSession {
private var startTimestamp: CMTime?
private var stopTimestamp: CMTime?
+ private var pauseTimestamp: CMTime?
+ private var pauseTimestampOffset: CMTime?
private var lastWrittenTimestamp: CMTime?
@@ -67,7 +69,12 @@ class RecordingSession {
let startTimestamp = startTimestamp else {
return 0.0
}
- return (lastWrittenTimestamp - startTimestamp).seconds
+
+ if let pauseTimestampOffset = pauseTimestampOffset {
+ return (lastWrittenTimestamp - startTimestamp - pauseTimestampOffset).seconds
+ } else {
+ return (lastWrittenTimestamp - startTimestamp).seconds
+ }
}
init(url: URL,
@@ -158,6 +165,8 @@ class RecordingSession {
// Start the sesssion at the given time. Frames with earlier timestamps (e.g. late frames) will be dropped.
assetWriter.startSession(atSourceTime: currentTime)
startTimestamp = currentTime
+ pauseTimestamp = nil
+ pauseTimestampOffset = nil
ReactLogger.log(level: .info, message: "Started RecordingSession at time: \(currentTime.seconds)")
if audioWriter == nil {
@@ -195,6 +204,56 @@ class RecordingSession {
}
}
+ /**
+ Record pause timestamp to calculate timestamp offset using the current time of the provided synchronization clock.
+ The clock must be the same one that was passed to start() method.
+ */
+ func pause(clock: CMClock) throws {
+ lock.wait()
+ defer {
+ lock.signal()
+ }
+
+ let currentTime = CMClockGetTime(clock)
+ ReactLogger.log(level: .info, message: "Pausing Asset Writer(s)...")
+
+ guard pauseTimestamp == nil else {
+ ReactLogger.log(level: .error, message: "pauseTimestamp is already non-nil")
+ return
+ }
+
+ pauseTimestamp = currentTime
+ }
+
+ /**
+ Update pause timestamp offset using the current time of the provided synchronization clock.
+ The clock must be the same one that was passed to start() method.
+ */
+ func resume(clock: CMClock) throws {
+ lock.wait()
+ defer {
+ lock.signal()
+ }
+
+ let currentTime = CMClockGetTime(clock)
+ ReactLogger.log(level: .info, message: "Resuming Asset Writer(s)...")
+
+ guard let pauseTimestamp = pauseTimestamp else {
+ ReactLogger.log(level: .error, message: "Tried resume but recording has not been paused")
+ return
+ }
+
+ let pauseOffset = currentTime - pauseTimestamp
+ self.pauseTimestamp = nil
+ if let currentPauseTimestampOffset = pauseTimestampOffset {
+ pauseTimestampOffset = currentPauseTimestampOffset + pauseOffset
+ ReactLogger.log(level: .info, message: "Current pause offset is \(pauseTimestampOffset!.seconds)")
+ } else {
+ pauseTimestampOffset = pauseOffset
+ ReactLogger.log(level: .info, message: "Current pause offset is \(pauseTimestampOffset!.seconds)")
+ }
+ }
+
/**
Appends a new CMSampleBuffer to the Asset Writer.
- Use clock to specify the CMClock instance this CMSampleBuffer uses for relative time
@@ -238,12 +297,32 @@ class RecordingSession {
}
// 3. Actually write the Buffer to the AssetWriter
+ let buf: CMSampleBuffer
+ if let pauseTimestampOffset = pauseTimestampOffset {
+ // let newTime = timestamp - pauseTimestampOffset
+ var count: CMItemCount = 0
+ CMSampleBufferGetSampleTimingInfoArray(buffer, entryCount: 0, arrayToFill: nil, entriesNeededOut: &count)
+ var info = [CMSampleTimingInfo](repeating: CMSampleTimingInfo(duration: CMTimeMake(value: 0, timescale: 0), presentationTimeStamp: CMTimeMake(value: 0, timescale: 0), decodeTimeStamp: CMTimeMake(value: 0, timescale: 0)), count: count)
+ CMSampleBufferGetSampleTimingInfoArray(buffer, entryCount: count, arrayToFill: &info, entriesNeededOut: &count)
+
+ for i in 0..<count {
+ info[i].decodeTimeStamp = info[i].decodeTimeStamp - pauseTimestampOffset
+ info[i].presentationTimeStamp = info[i].presentationTimeStamp - pauseTimestampOffset
+ }
+
+ var out: CMSampleBuffer?
+ CMSampleBufferCreateCopyWithNewTiming(allocator: nil, sampleBuffer: buffer, sampleTimingEntryCount: count, sampleTimingArray: &info, sampleBufferOut: &out)
+ buf = out!
+ } else {
+ buf = buffer
+ }
let writer = getAssetWriter(forType: bufferType)
guard writer.isReadyForMoreMediaData else {
ReactLogger.log(level: .warning, message: "\(bufferType) AssetWriter is not ready for more data, dropping this Frame...")
return
}
- writer.append(buffer)
+ writer.append(buf)
+ ReactLogger.log(level: .info, message: "append \(bufferType) Buffer (at \(timestamp.seconds) seconds)...")
lastWrittenTimestamp = timestamp
// 4. If we failed to write the frames, stop the Recording
My concerns about this workaround are:
- Because only the latest pause and resume timestamps are considered, there can be a race condition due to out-of-order buffer processing (I guess it is rare).
- The only way I found to change the timestamps of a buffer is to copy it, and I am not sure how much that affects performance.
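The offset arithmetic the patch relies on can be modeled in isolation. This is a sketch with plain Double timestamps under assumed semantics; the names are illustrative, not the actual RecordingSession API. Each pause/resume pair accumulates an offset, and every buffer appended after a resume is shifted back by the accumulated offset, so the written track has no gap even though the capture session's clock kept running.

```swift
// Model of the pause-offset retiming: pause() remembers the clock time,
// resume() adds the elapsed pause to a running offset, and retimed(_:)
// stands in for rewriting a CMSampleBuffer's presentation timestamp.
struct PauseOffsetClock {
    private var pauseTimestamp: Double?
    private var pauseTimestampOffset = 0.0

    mutating func pause(at clockTime: Double) {
        guard pauseTimestamp == nil else { return } // already paused
        pauseTimestamp = clockTime
    }

    mutating func resume(at clockTime: Double) {
        guard let pausedAt = pauseTimestamp else { return }
        pauseTimestampOffset += clockTime - pausedAt // repeated pauses add up
        pauseTimestamp = nil
    }

    // Shift a buffer's presentation timestamp back by the total pause time.
    func retimed(_ presentationTime: Double) -> Double {
        return presentationTime - pauseTimestampOffset
    }
}
```

With a 2-second pause starting at t = 5, a buffer captured at t = 8 is rewritten to t = 6, exactly closing the gap the pause left in the track; a second, later pause simply increases the offset further.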