dji-sdk / DJIWidget
DJIWidget is a DJI library that includes VideoPreviewer for video decoding.
License: Other
We have migrated all of our frameworks to Swift Package Manager for Xcode 12, but DJI libraries currently do NOT support Swift Package Manager.
When can we expect SPM support?
Thank you
After installing the framework via CocoaPods, I try to build the .xcworkspace, and the DJIWidget framework generates the following error at the line below:

#include "libavutil/avconfig.h" // 'libavutil/avconfig.h' file not found

Here's my Podfile:
target 'MyApp' do
  use_frameworks!

  # Pods for MyApp
  pod 'DJIWidget', '~> 1.0'
  pod 'DJI-SDK-iOS', '~> 4.7.1'

  target 'MyAppTests' do
    inherit! :search_paths
    # Pods for testing
  end
end
DJIWidget ~1.0
macOS Mojave 10.14
Xcode 10.0.0
EDIT: I was able to resolve the issue with the following steps. I'm leaving this issue open because this solution is only temporary: each user has to repeat it every time the repo is cloned:

1. Edit the Podfile to include use_frameworks!
2. Run pod install
3. Edit the Podfile to exclude use_frameworks!
4. Run pod install again
On pod install :
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
When I pull the latest UXSDK-iOS and try to build in Xcode 10, DJIWidget "blows up" with 99 warnings and a link error: "Undefined symbols for architecture x86_64".
Any help in getting past this would be greatly appreciated. Thanks.
Where is the documentation for this feature? I need to use the video feed from DJIVideoFeedListener's delegate method func videoFeed(_ videoFeed: DJIVideoFeed, didUpdateVideoData rawData: Data) and convert it into something I can use with VNImageRequestHandler, but I don't see any docs on how to use the decoder to turn the raw data into something meaningful.
EDIT: Stack Overflow question I created for more support
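For later readers: a minimal, unverified sketch of one way to bridge DJIWidget's decoder to Vision, assuming DJIWidget's VideoFrameProcessor protocol and the cv_pixelbuffer_fastupload field discussed in a separate issue in this repo. The class name DroneController and the rectangle-detection request are placeholders:

```swift
import DJIWidget
import Vision

// Sketch only: assumes enableHardwareDecode is on so that decoded frames carry
// a CVPixelBuffer in cv_pixelbuffer_fastupload. Not verified on every SDK version.
extension DroneController: VideoFrameProcessor {
    func videoProcessorEnabled() -> Bool { true }

    func videoProcessFrame(_ frame: UnsafeMutablePointer<VideoFrameYUV>!) {
        guard let raw = frame?.pointee.cv_pixelbuffer_fastupload else { return }
        let pixelBuffer = unsafeBitCast(raw, to: CVPixelBuffer.self)
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([VNDetectRectanglesRequest()]) // placeholder Vision request
    }

    func videoProcessFailedFrame() {}
}

// Registration, e.g. next to videoPreviewer.start():
// DJIVideoPreviewer.instance()?.enableHardwareDecode = true
// DJIVideoPreviewer.instance()?.registFrameProcessor(self)
```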
Hi, I have a pretty simple setup.
UPD:
platform :ios, '15.1'
source 'https://github.com/CocoaPods/Specs.git'
inhibit_all_warnings!
target 'DummyDronie' do
pod 'DJI-SDK-iOS', '~> 4.16'
pod 'DJIWidget', '~> 1.6.6'
pod 'DJIFlySafeDatabaseResource', '~> 01.00.01.18'
pod 'SwiftyBeaver'
pod 'SwiftLint'
end
Mavic Air2
V01.01.0720
My view:
import SwiftUI
struct FPVView: UIViewRepresentable {
    @ObservedObject var droneController: DroneController

    func makeUIView(context: Context) -> UIView {
        let fpvPreview = UIView()
        fpvPreview.backgroundColor = UIColor(.gray)
        return fpvPreview
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        droneController.setupVideoPreviewer(fpvPreview: uiView)
    }

    // Note: UIViewRepresentable's dismantleUIView is a static requirement,
    // so this instance method is never actually called by SwiftUI as written.
    func dismantleUIView(_ uiView: UIView, coordinator: ()) {
        droneController.resetVideoPreviewer()
    }
}
My controller:
func setupVideoPreviewer(fpvPreview: UIView) {
    guard let videoPreviewer = DJIVideoPreviewer.instance() else {
        log.error("Could not fetch DJIVideoPreviewer instance")
        return
    }
    guard let videoFeeder = DJISDKManager.videoFeeder() else {
        log.error("Could not fetch the video feeder instance")
        return
    }
    videoPreviewer.setView(fpvPreview)
    videoFeeder.primaryVideoFeed.add(self, with: nil)
    videoPreviewer.start()
    log.info("Setup video previewer")
}

func resetVideoPreviewer() {
    guard let videoPreviewer = DJIVideoPreviewer.instance() else {
        log.error("Could not fetch DJIVideoPreviewer instance")
        return
    }
    guard let videoFeeder = DJISDKManager.videoFeeder() else {
        log.error("Could not fetch the video feeder instance")
        return
    }
    videoPreviewer.unSetView()
    videoFeeder.primaryVideoFeed.remove(self)
    log.info("Reset video previewer")
}

func videoFeed(_ videoFeed: DJIVideoFeed, didUpdateVideoData videoData: Data) {
    guard let videoPreviewer = DJIVideoPreviewer.instance() else {
        log.error("Could not fetch DJIVideoPreviewer instance")
        return
    }
    let nsVideoData = videoData as NSData
    let videoBuffer = UnsafeMutablePointer<UInt8>.allocate(capacity: nsVideoData.length)
    nsVideoData.getBytes(videoBuffer, length: nsVideoData.length)
    videoPreviewer.push(videoBuffer, length: Int32(nsVideoData.length))
}
It works fine while the drone hasn't taken off, but right after takeoff I just keep receiving:
[h264 @ 0x10880e610] non-existing PPS 0 referenced
[h264 @ 0x10880e610] non-existing PPS 0 referenced
[h264 @ 0x10880e610] non-existing PPS 0 referenced
[h264 @ 0x10880e610] non-existing PPS 0 referenced
[h264 @ 0x10880e610] non-existing PPS 0 referenced
[h264 @ 0x10880e610] non-existing PPS 0 referenced
[h264 @ 0x10880e610] non-existing PPS 0 referenced
The dependency does not contain bitcode, which causes Xcode to throw an error:
DJIWidget/libDJIWidget.a(DJIAudioSampleBuffer.o)' does not contain bitcode. You must rebuild it with bitcode enabled (Xcode setting ENABLE_BITCODE), *obtain an updated library from the vendor*, or disable bitcode for this target. file '/Users/.../Library/Developer/Xcode/DerivedData/.../Build/Products/Debug-iphoneos/DJIWidget/libDJIWidget.a' for architecture arm64
Workaround: to bypass this (as a temporary solution), disable bitcode for the target.
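One common way to apply this across all pod targets is a post_install hook in the Podfile. This is a standard CocoaPods pattern sketched here, not something DJIWidget itself documents:

```ruby
# Podfile (excerpt): force-disable bitcode on every pod target,
# including DJIWidget, after each `pod install`.
post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['ENABLE_BITCODE'] = 'NO'
    end
  end
end
```

Note that you still need to disable bitcode on the app target itself in Xcode's build settings.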
DJIWidget ~1.0
macOS Mojave 10.14
Xcode 10.0.0
Hi, I've been trying to activate the over-exposure filter on the video previewer without any success.
I have seen there is a setOverExposedWarningThreshold method in the VideoPreviewer file, but setting it to a value different from 0 does not seem to do the trick. I could not find any documentation on this; could you provide some guidance on how to achieve it?
After setting enableHardwareDecode to true/YES (Swift/Objective-C), cv_pixelbuffer_fastupload always has a value of nil/null.
Edit: Although this bug is still open for addressing enableHardwareDecode, it's important to note that part of the issue is a lack of documentation. If you are seeing a value of nil for that field, make sure you add DJIVideoPreviewer.instance()?.registFrameProcessor(self) to your code (see the comments below for full context).
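A minimal sketch of that registration, assuming a type conforming to DJIWidget's VideoFrameProcessor protocol; the ordering relative to start() reflects what the comments in this thread suggest, not official documentation:

```swift
import DJIWidget

// Sketch: enable hardware decode and register the frame processor so that
// cv_pixelbuffer_fastupload is populated on decoded frames.
func enableFastUpload(with processor: VideoFrameProcessor) {
    guard let previewer = DJIVideoPreviewer.instance() else { return }
    previewer.enableHardwareDecode = true
    previewer.registFrameProcessor(processor) // without this, the field stays nil
    previewer.start()
}
```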
DJIWidget 1.5
$(inherited) $(PROJECT_DIR)/Framewroks $(PROJECT_DIR)/FFmpeg
The LICENSE.txt file is ignored in favour of the text in the podspec. CocoaPods automatically generates plist and markdown files for acknowledgements, which can be used to provide details of all licenses used in an app.
The LICENSE.txt correctly attributes FFmpeg, but the license text in the podspec does not, so the FFmpeg attribution is missing from the CocoaPods auto-generated acknowledgement files.
Installing CocoaPods with DJIWidget does not work in the DJISDK demo project.
I tried running pod install to create a workspace and I get the following error in the terminal:

[!] Unable to find a specification for DJIWidget (~> 1.0)

Podfile contents:

# platform :ios, '9.0'
target 'DJISdkDemo' do
  pod 'DJI-SDK-iOS', '~> 4.7.1'
  pod 'DJIWidget', '~> 1.0'
end
Thanks,
Dylan
In the class DJIH264VTDecode, why do you call loadDummyIframe in decodeInit:Size?
I would like to stream aircraft's video feed over RTMP, preferably using something like LFLiveKit. Would appreciate an example of how to implement this, if it is at all possible. Thanks.
Maybe jumping the gun, but it doesn't look like there's a podspec in the CocoaPods repo.
I cannot upload an updated version of my app, which contains the DJI-SDK and DJIWidget (which contains FFmpeg), to TestFlight after updating to Xcode 13. When I try, I get:
Invalid CFBundleSupportedPlatforms value. The CFBundleSupportedPlatforms key in the Info.plist file in “Payload/app.app/Frameworks/FFmpeg.framework” bundle contains an invalid value, [iPhoneSimulator]. Consider removing the CFBundleSupportedPlatforms key from the Info.plist file. If this bundle is part of a third-party framework, consider contacting the developer of the framework for an update to address this issue. With error code STATE_ERROR.VALIDATION_ERROR.90542 for id f2327257-ad48-4b60-8fc8-a518da5fbac8
# relevant pods
pod 'DJI-SDK-iOS', '~> 4.16'
pod 'DJIWidget', '~> 1.6.6'
However, FFmpeg seems to be a binary framework bundled with DJIWidget, so I can't edit the Info.plist directly. Is there any way to work around this, or is a change required in DJIWidget?
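One possible workaround (a sketch, macOS-only, not an official fix) is a Run Script build phase placed after "Embed Frameworks" that strips the offending key from the embedded framework's Info.plist with PlistBuddy before code signing:

```shell
# Hypothetical Run Script phase (after "Embed Frameworks"):
# removes CFBundleSupportedPlatforms from the embedded FFmpeg.framework,
# the key that App Store validation rejects with error 90542.
PLIST="${TARGET_BUILD_DIR}/${FRAMEWORKS_FOLDER_PATH}/FFmpeg.framework/Info.plist"
if /usr/libexec/PlistBuddy -c "Print :CFBundleSupportedPlatforms" "$PLIST" >/dev/null 2>&1; then
  /usr/libexec/PlistBuddy -c "Delete :CFBundleSupportedPlatforms" "$PLIST"
fi
```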
I created a demo project here : https://github.com/valentary/DJISDKTest/tree/DJIWidgetBuildError
The project will not build as it cannot find the header files from FFmpeg in DJICustomVideoFrameExtractor
It's a default single view project, all I did was to install the cocoapods and try to build.
We use use_frameworks!, which seems to be the cause.
I've tried various workarounds, but either the headers can't be found, or we get errors with the FFmpeg static binary.
We are having issues with:
Does DJIWidget officially support the DJI Zenmuse P1 camera with the DJI M300 drone?
This project cannot compile since you removed Info.plist and DJIWidgetPrefix.pch.
We are seeing these warnings when building for device.
DJIRtmpMuxer.m:964:58: warning: values of type 'OSStatus' should not be used as format arguments; add an explicit cast to 'int' instead [-Wformat]
NSLog(@"start audio queue failed %d, disable audio", status);
~~ ^~~~~~
%d (int)
1 warning generated.
DJIImageCalibrateHelper.o
DJIImageCalibrateHelper.m:112:95: warning: values of type 'NSUInteger' should not be used as format arguments; add an explicit cast to 'unsigned long' instead [-Wformat]
NSString* workHash = [NSString stringWithFormat:@"image.calibrate(%ld).working.queue",self.hash];
~~~ ^~~~~~~~~
%lu (unsigned long)
DJIImageCalibrateHelper.m:122:94: warning: values of type 'NSUInteger' should not be used as format arguments; add an explicit cast to 'unsigned long' instead [-Wformat]
NSString* rendHash = [NSString stringWithFormat:@"image.calibrate(%ld).render.queue",self.hash];
~~~ ^~~~~~~~~
%lu (unsigned long)
2 warnings generated.
When using the standard FPV template at https://github.com/DJI-Mobile-SDK-Tutorials/iOS-FPVDemo, I receive the following decoding errors after registering the app and connecting to the bridge app:
[h264 @ 0x7f9817887e20] SEI type 1 size 128 truncated at 8
[h264 @ 0x7f9817887e20] SEI type 1 size 128 truncated at 8
[h264 @ 0x7f9817887e20] missing picture in access unit with size 9148
[h264 @ 0x7f9817887e20] missing picture in access unit with size 3025
Looking online, this seems to be an FFmpeg issue, but I'm looking for further information.
My setup is:
iOS 15.4 simulator, with an iOS 15.4 phone running the bridge app connected by Lightning cable to my controller.
Output on the demo app and on custom projects is a black screen.
Thanks!