
android-videostreamdecodingsample's People

Contributors

danlongchen, dji-dev, dji-william, hoker1, michael-dji, oliverou


android-videostreamdecodingsample's Issues

Failed resolution of: Ldji/sdk/codec/DJICodecManager$YuvDataCallback;

Good afternoon. I'm trying to combine your application with mine, and I think I did everything right.

I configured the packages in Gradle and moved the rest of the classes over as needed, but when I start ConnectionActivity the following error occurs. What could be wrong?
Help...

Caused by: java.lang.ClassNotFoundException: Didn't find class "dji.sdk.codec.DJICodecManager$YuvDataCallback" on path: DexPathList[[zip file "/data/app/r*************-1/base.apk", zip file "/data/app/*************-1/split_lib_dependencies_apk.apk", zip file "/data/app/*************-1/split_lib_slice_0_apk.apk", zip file "/data/app/*************-1/split_lib_slice_1_apk.apk", zip file "/data/app/*************-1/split_lib_slice_2_apk.apk", zip file "/data/app/*************-1/split_lib_slice_3_apk.apk", zip file "/data/app/*************-1/split_lib_slice_4_apk.apk", zip file "/data/app/*************-1/split_lib_slice_5_apk.apk", zip file "/data/app/*************-1/split_lib_slice_6_apk.apk", zip file "/data/app/*************-1/split_lib_slice_7_apk.apk", zip file "/data/app/*************-1/split_lib_slice_8_apk.apk", zip file "/data/app/*************-1/split_lib_slice_9_apk.apk"],nativeLibraryDirectories=[/data/app/*************-1/lib/arm64, /data/app/*************-1/base.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_dependencies_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_0_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_1_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_2_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_3_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_4_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_5_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_6_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_7_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_8_apk.apk!/lib/arm64-v8a, /data/app/*************-1/split_lib_slice_9_apk.apk!/lib/arm64-v8a, /system/lib64, /vendor/lib64]]
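For reference, a ClassNotFoundException like this usually means the DJI SDK classes never made it into the APK (a missing or mismatched SDK dependency), or the packaged SDK version predates the DJICodecManager.YuvDataCallback interface. A minimal diagnostic sketch, assuming it is called once at startup (the class and log tag below are illustrative, not part of the sample):

import android.util.Log;

// Checks only whether the callback interface was packaged into the APK.
public final class CodecClassCheck {
    private static final String TAG = "CodecClassCheck"; // illustrative tag

    public static void run() {
        try {
            Class.forName("dji.sdk.codec.DJICodecManager$YuvDataCallback");
            Log.i(TAG, "DJICodecManager.YuvDataCallback is on the classpath");
        } catch (ClassNotFoundException e) {
            // The DJI SDK dependency is missing or too old in this build.
            Log.e(TAG, "YuvDataCallback not packaged; check the DJI SDK dependency version", e);
        }
    }
}

If the check fails, the fix belongs in the dependency setup rather than in the code that uses the callback.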

ndk-build for DJIVIDEOSTREAMDECODE: APP_ABI problems.

Hi everybody,
I've just re-compiled the C/C++ source using the latest Android NDK, after downloading the original libffmpeg.so and replacing the "placeholder" libffmpeg.so located in app/jni (thanks to this post: #5).

Now I can recompile the source, but only for APP_ABI := armeabi-v7a. This works fine with my configuration (Samsung Galaxy Grand Prime, Android 5.0.2), but the app crashes on a Nexus 7 tablet. So I tried to recompile the other ABIs as well, using the following makefile setting:

APP_ABI := all

but it didn't compile correctly, with the following error message:

C:\Users\Gianpaolo\AppData\Local\Android\Sdk\ndk-bundle\ndk-build
[arm64-v8a] SharedLibrary  : libdjivideojni.so
./obj/local/arm64-v8a/libffmpeg.so: error adding symbols: File in wrong format
clang++.exe: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [obj/local/arm64-v8a/libdjivideojni.so] Error 1
Instead, this is the error message if I try to compile for the x86 ABI:

[x86] Compile        : djivideojni <= dji_video_jni.c
jni/dji_video_jni.c:63:51: warning: passing 'uint8_t *' (aka 'unsigned char *') to parameter of type 'const jbyte *' (aka 'const signed char *') converts between pointers to integer
      types with different sign [-Wpointer-sign]
        (*env)->SetByteArrayRegion(env, jarray, 0, size, buf);
                                                         ^~~
1 warning generated.
[x86] Prebuilt       : libffmpeg.so <= jni/
[x86] SharedLibrary  : libdjivideojni.so
C:/Users/Gianpaolo/AppData/Local/Android/sdk/ndk-bundle/build//../toolchains/x86-4.9/prebuilt/windows-x86_64/lib/gcc/i686-linux-android/4.9.x/../../../../i686-linux-android/bin\ld: error: ./obj/local/x86/libffmpeg.so: incompatible target
jni/dji_video_jni.c:41: error: undefined reference to 'av_register_all'
jni/dji_video_jni.c:43: error: undefined reference to 'av_codec_next'
jni/dji_video_jni.c:77: error: undefined reference to 'avcodec_register_all'
jni/dji_video_jni.c:78: error: undefined reference to 'av_register_all'
jni/dji_video_jni.c:81: error: undefined reference to 'avcodec_find_decoder'
jni/dji_video_jni.c:82: error: undefined reference to 'avcodec_alloc_context3'
jni/dji_video_jni.c:83: error: undefined reference to 'av_parser_init'
jni/dji_video_jni.c:96: error: undefined reference to 'avcodec_open2'
jni/dji_video_jni.c:102: error: undefined reference to 'av_frame_alloc'
jni/dji_video_jni.c:126: error: undefined reference to 'av_init_packet'
jni/dji_video_jni.c:132: error: undefined reference to 'av_parser_parse2'
jni/dji_video_jni.c:165: error: undefined reference to 'av_free_packet'
jni/dji_video_jni.c:225: error: undefined reference to 'avcodec_close'
jni/dji_video_jni.c:229: error: undefined reference to 'av_free'
jni/dji_video_jni.c:230: error: undefined reference to 'av_free'
jni/dji_video_jni.c:231: error: undefined reference to 'av_parser_close'
clang++.exe: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [obj/local/x86/libdjivideojni.so] Error 1

How can I compile for the other ABIs as well, in order to try this demo on the Nexus 7 tablet?
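For context: both "File in wrong format" (arm64-v8a) and "incompatible target" (x86) are the linker rejecting the prebuilt libffmpeg.so, which was compiled only for armeabi-v7a. Unless FFmpeg itself is rebuilt once per ABI, the only linkable configuration is a sketch like the following in Application.mk (assuming the sample's standard ndk-build layout):

# Only the ABI for which a matching prebuilt libffmpeg.so exists can link;
# APP_ABI := all will keep failing until per-ABI FFmpeg builds are provided.
APP_ABI := armeabi-v7a

Since the Nexus 7 is an ARM device, an armeabi-v7a build should still install and run on it; the crash there may have a different cause than the ABI.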

How can I check the JNI to connect to the Android platform

First, regarding this instruction: void sendDataToDecoder(byte[] videoBuffer, int size)
It succeeds in decoding the data and displaying it in a TextureView. Can I get the image data from videoBuffer and use it for image processing?

Second, how do I use the YUV data callback to get YUV data? (A sketch follows the device list below.)
Can anyone help me and make a suggestion?

  • Android SDK API 19
  • Phantom 3 standard
  • Android 6.0.1
  • OPPO A57
  • Android Studio 3.0.1
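Here is a minimal sketch of wiring up the YUV callback, modeled on how recent versions of the sample's MainActivity do it. The callback signature shown is the SDK 4.8-style one; older 4.x releases omit the MediaFormat parameter, so check it against the DJICodecManager in your SDK version:

import java.nio.ByteBuffer;
import android.media.MediaFormat;
import dji.sdk.codec.DJICodecManager;

// Sketch: switch an existing DJICodecManager to YUV output and receive frames.
class YuvReceiver implements DJICodecManager.YuvDataCallback {

    void attach(DJICodecManager codecManager) {
        codecManager.enabledYuvData(true);      // route decoded frames out as YUV
        codecManager.setYuvDataCallback(this);  // register this receiver
    }

    @Override
    public void onYuvDataReceived(MediaFormat format, ByteBuffer yuvFrame,
                                  int dataSize, int width, int height) {
        byte[] bytes = new byte[dataSize];
        yuvFrame.get(bytes);  // copy out before the buffer is reused
        // hand `bytes` (YUV 420) to the image-processing pipeline here
    }
}

Note that sendDataToDecoder feeds encoded H.264 into the decoder, so videoBuffer there is not image data; decoded pixels only become available through a callback like this one.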

x86 Android

Hi,

Is there a way to compile this for an x86 Android platform?

Thanks
Nitay

Tello supported?

I'm wondering if the sample is relevant to the Tello drone. Many different DJI drones are supported in DJIVideoStreamDecoder, but I don't see the Tello specifically. The getIframeRawId method in DJIVideoStreamDecoder does a switch over drone names, but it doesn't include the Tello. Does anyone know what the I-frames for the Tello are, and whether they can be added?

Streaming in RTMP

Hey,

Would it be possible for you guys to share any information or a sample about a custom RTMP implementation for this app?

I want to send an RTMP stream in a push configuration to the server while keeping the video preview displayed on the Android Surface.

It's basically the same thing as the "Custom RTMP" feature in the DJI GO app (but as I can't find an open-sourced version of it, I'm looking for some help).

Thanks

H264 profile.

Is the H.264 profile Main?

Is this issue about bugs or crash issues of the Sample Code?

Yep, then please delete this template and provide the following info to help us investigate the issue:

  • Description of the issue.
  • Steps to reproduce the bug or crash (it would be great if you can provide this)
  • Crash logs (if you can find them, they are very helpful)
  • DJI Android SDK version you are using (like Android SDK 3.4, etc.)
  • DJI product you are using (like Phantom 4, Mavic Pro, etc.)
  • Android system version you are using (like Android 6.0.1, Android 5.1.1, etc.)
  • Android device you are using (like Samsung Galaxy Note 5, Nexus 5, etc.)
  • Android Studio version you are using (like Android Studio 2.2, etc.)

Why does the .so file only support Android 5.x?

I can only run this example on Android 5.x or later, but there are lots of Android 4.x devices. How can I support those?

When loading the libraries on an Android 4.x device, the error below is shown:

AndroidRuntime: FATAL EXCEPTION: main
java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "atof" referenced by "libffmpeg.so"...
at java.lang.Runtime.loadLibrary(Runtime.java:365)
at java.lang.System.loadLibrary(System.java:526)
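A plausible explanation (inferred from the symptom, not confirmed by DJI): atof only became an exported libc symbol in the android-21 platform; on older platforms the NDK headers provided it as an inline function. A libffmpeg.so built against android-21 therefore references a symbol that the Android 4.x libc cannot supply, and supporting 4.x would require rebuilding FFmpeg itself against an older platform (e.g. APP_PLATFORM := android-19). The NDK's nm can confirm what the binary expects:

# If atof shows up as an undefined (U) symbol, the library was built
# against android-21 or newer and cannot load on Android 4.x.
arm-linux-androideabi-nm -D --undefined-only libffmpeg.so | grep atof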

Repo fails to clone

When cloning this repo I get an error during the git lfs fetch phase.

$ git clone git@github.com:DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample.git

Cloning into 'Android-VideoStreamDecodingSample'...
remote: Counting objects: 480, done.
remote: Compressing objects: 100% (323/323), done.
remote: Total 480 (delta 112), reused 480 (delta 112), pack-reused 0
Receiving objects: 100% (480/480), 54.56 MiB | 6.95 MiB/s, done.
Resolving deltas: 100% (112/112), done.
Checking connectivity... done.
Downloading android-videostreamdecodingsample/jni/lib/libavcodec.a (184.02 MB)

Error downloading object: android-videostreamdecodingsample/jni/lib/libavcodec.a
This repository is over its data quota. Purchase more data packs to restore access.
Git LFS: (0 of 3 files) 0 B / 346.37 MB
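The quota error is on the repository owner's side (the LFS bandwidth allowance is exhausted), so there is nothing to repair locally. As a workaround sketch, the clone itself can be completed by skipping the LFS download (GIT_LFS_SKIP_SMUDGE is standard git-lfs behaviour), leaving pointer files in place of the large binaries until they can be obtained elsewhere:

# Clone without downloading LFS objects; large files remain as pointers.
GIT_LFS_SKIP_SMUDGE=1 git clone git@github.com:DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample.git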

Problem with changing camera mode.

I'm having problems solving an issue that exists in my project and can be reproduced here. After changing the camera mode with DJICameraSettingsDef.CameraPhotoAspectRatio.AspectRatio_16_9 or DJICameraSettingsDef.CameraPhotoAspectRatio.AspectRatio_4_3, the video preview just stops most of the time. Does anybody have a suggestion how to solve this?

EDIT:
In case someone else tries to do this: the problem can be fixed by changing the condition for reinitializing the decoder in the onFrameQueueIn method of DJIVideoStreamDecoder. In my case only one dimension of the video frame changed, and the condition expected both of them to change in order to trigger reinitialization (sketched below).
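A sketch of the described fix; the surrounding names follow the sample's DJIVideoStreamDecoder, but treat the exact code as an assumption:

// In DJIVideoStreamDecoder.onFrameQueueIn(...): the original condition
// only reinitialized when BOTH dimensions changed,
//     if (frame.width != width && frame.height != height) { ... }
// whereas a single changed dimension must also trigger it:
if (frame.width != width || frame.height != height) {
    width = frame.width;
    height = frame.height;
    // recreate the MediaCodec here so it matches the new frame size
}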

Demo custom decoder

I ran the sample code and clicked the "Demo custom decoder" button. If I connected the remote controller through Wi-Fi, it worked well. However, if I connected the remote controller through the USB cable and clicked the "Demo custom decoder" button, the screen was black. It didn't receive the video data, because I didn't find Log.d(TAG, "camera recv video data size: " + size) in logcat. If I tried the other buttons, like "Demo SurfaceView" and "Demo TextureView", with the remote controller connected through the USB cable, they worked well. I also tried another phone, and the result was the same.

Save the buffered data into a JPG image file will delay

Hi, all:
When I press the "Screen Shot" button, it saves the buffered data into JPG image files steadily.
The frequency is about one JPG file per second.
Does the MainActivity.screenShot() method have code that limits the frequency?
Or is my Android device causing this delay?

DJI Android SDK version is 3.5.1.
My DJI product is Phantom 3 Professional.
My Android system version is Android 6.0.
My Android device is Nexus 7.
My Android Studio version is 2.2.3.
(3 screenshots attached)

My Opinion About Decoding Problem

Hi,
I have been testing the decoding sample with several of my mobile devices:
- Nexus 5X
- Samsung Galaxy S7
- Moto G5 Plus
- Nvidia Shield K1

all with an Osmo camera. With this setup there is a problem with decoding: the image is broken and has noise errors.

At other times I have tested with a Phantom 4, and decoding is OK.

What do you think? I need decoding to work with my Osmo Mobile. Is it necessary to replace my Osmo Mobile with a newer device?

Is it possible that the Osmo Mobile doesn't fully support DJI SDK v4+?

Thanks

Crystal Sky - speed and video decoding - need faster device

Could we request a faster Crystal Sky device to handle video decoding? Any chance we can get a Crystal Sky device built around the Snapdragon 845 chip?

I put an app on Crystal Sky that uses the decoding sample to pass frames to OpenCV, does some object recognition, and then displays the frame.

The program works on the 7.87" Crystal Sky device. However, even doing a small amount of processing in OpenCV, the frames turn green because the device is too slow to complete all the processing quickly.

I'm trying to figure out how to get a clean display. We want to transition to using only Crystal Sky devices for our apps, for a number of reasons. As a comparison, the nVidia K1 Shield works without slowing down the app.

Could the sample decoding app be expanded to include some OpenCV processing that also works quickly enough on a Crystal Sky device?

Thanks

Unable to log in and register product over a USB connection. Can't get TranscodedVideoFeed via USB

Hi
I downloaded the project and it built successfully. However, when I run the project on the phone, the aircraft cannot be discovered and it toasts "Register sdk fails, check network is available", even though I checked the network connection and the phone was connected to Wi-Fi.

I am in China. My phone is a Xiaomi, but I also tried a Smartisan and it has the same problem. I can log in with my user name in the DJI GO app, but the same setup doesn't work for this version 4.8 project.

PS:

I also tried to use an existing logged-in project to try to get a video feed via TranscodedVideoFeed. My connection with the pad is also USB, but the feeder callback is never called, and from time to time there is a log telling me: LocalConnector: Cannot connect the wmserver-data-reciever. I don't know if this log is related to the USB connection.

Waiting for your reply.

  • Product: Mavic Pro & Mavic Pro Zoom
  • Android Studio: latest
  • SDK version: 4.8.1

After the Phantom 3 Professional connects successfully, tapping "OPEN" crashes the app immediately

I don't know why: I cloned the repo directly with git clone and only changed the KEY part, just to see the sample project in action. But after the Phantom 3 Professional connects successfully, tapping "OPEN" crashes the app immediately. The logcat captured over ADB after pressing "OPEN" is shown below:
(logcat screenshot)
The aircraft is a Phantom 3 Professional; the phone is an Honor 6 Plus running Android 6.0 with an armeabi-v7a CPU, which meets the sample project's requirements. The project's compileSdkVersion is 25.

OSMO+ and the video stream decoding sample

Can the video stream decoding sample work with the OSMO+?

With the OSMO+ and using a texture view the video displays fine.

However, following the sample and displaying the video on a SurfaceView, the video does not display correctly. Then, when parsing the individual frames, the program displays a blank frame.

The same code works correctly with a phantom 4 pro.

OSMO+
Android 7
Nvidia K1 Shield
Android Studio 2.3.3

Error on startup

When the application starts, it displays the "Register SDK Success" dialog box and then the application closes.

It crashes on these three lines in the ConnectionActivity class:

1. UserAccountManager.getInstance().logIntoDJIUserAccount(...)
2. the crash occurs inside that class itself
3. loginDJIUserAccount()

(screenshot: dji_error)

What's the function of variables "nu" and "nv" in method saveYuvDataToJPEG?

byte[] y = new byte[width * height];      // full-resolution luma (Y) plane
byte[] u = new byte[width * height / 4];  // quarter-size chroma planes
byte[] v = new byte[width * height / 4];
byte[] nu = new byte[width * height / 4]; // "new" U plane, rebuilt below
byte[] nv = new byte[width * height / 4]; // "new" V plane, rebuilt below

System.arraycopy(yuvFrame, 0, y, 0, y.length);
// After the Y plane, chroma arrives interleaved V-first (NV21 order);
// de-interleave it into the separate u and v arrays.
for (int i = 0; i < u.length; i++) {
      v[i] = yuvFrame[y.length + 2 * i];
      u[i] = yuvFrame[y.length + 2 * i + 1];
}

The code above suggests that the YUV data received from the UAV is already NV21-encoded, with the u and v arrays holding the U and V data respectively. So I'm confused about the function of the following code, because I don't know what the variables "nu" and "nv" are for.

int uvWidth = width / 2;
int uvHeight = height / 2;
// Rebuild chroma planes in nu/nv: each sample read from u/v is written
// twice horizontally, with uSample1/vSample1 on one row and
// uSample2/vSample2 on the row below, i.e. a crude chroma upsampling and
// reordering before the NV21 re-interleave that follows.
for (int j = 0; j < uvWidth / 2; j++) {
	for (int i = 0; i < uvHeight / 2; i++) {
		byte uSample1 = u[i * uvWidth + j];
		byte uSample2 = u[i * uvWidth + j + uvWidth / 2];
		byte vSample1 = v[(i + uvHeight / 2) * uvWidth + j];
		byte vSample2 = v[(i + uvHeight / 2) * uvWidth + j + uvWidth / 2];
		nu[2 * (i * uvWidth + j)] = uSample1;
		nu[2 * (i * uvWidth + j) + 1] = uSample1;
		nu[2 * (i * uvWidth + j) + uvWidth] = uSample2;
		nu[2 * (i * uvWidth + j) + 1 + uvWidth] = uSample2;
		nv[2 * (i * uvWidth + j)] = vSample1;
		nv[2 * (i * uvWidth + j) + 1] = vSample1;
		nv[2 * (i * uvWidth + j) + uvWidth] = vSample2;
		nv[2 * (i * uvWidth + j) + 1 + uvWidth] = vSample2;
	}
}

// nv21test: re-interleave the Y plane with the rebuilt chroma planes into
// NV21 order (full Y plane first, then alternating V and U bytes).
byte[] bytes = new byte[yuvFrame.length];
System.arraycopy(y, 0, bytes, 0, y.length);
for (int i = 0; i < u.length; i++) {
	bytes[y.length + (i * 2)] = nv[i];     // V first ...
	bytes[y.length + (i * 2) + 1] = nu[i]; // ... then U (NV21)
}

x86 /;'>.

Sorry, my cat posted that issue. Closing.

Unable to get video data from Mavic 2 via DJICodecManager in SDK 4.8

I encountered another problem with SDK 4.8.
After failing to get the data from VideoFeeder, I tried to use DJICodecManager instead: I implemented the onYuvDataReceived interface and registered it with the DJICodecManager instance.

First, it works very well with the Mavic Pro over USB: I can get the data from onYuvDataReceived and render the YUV data to a texture.

But when I switch the USB to the Mavic 2, the onYuvDataReceived interface is never called. Any idea which direction I should look in?

Images broken.

Hi, I've got a problem here.
My picture is broken like this:
(screenshot attached)

I run this project with a P3S and an Android 4.4.4 phone.

After the Phantom 3 Professional connects successfully, tapping "OPEN" crashes the app immediately

I don't know why: I cloned the repo directly with git clone and only changed the KEY part, just to see the sample project in action. But after the Phantom 3 Professional connects successfully, tapping "OPEN" crashes the app immediately. The logcat captured over ADB after pressing "OPEN" follows:
(screenshot "1.jpg" failed to upload)

The aircraft is a Phantom 3 Professional; the phone is an Honor 6 Plus with an armeabi-v7a CPU, which meets the sample project's requirements. From the logcat alone I couldn't find the cause. Please help.

NativeHelper methods do not work.

When I integrated it with another application, I got:
No implementation found for boolean com.dji.sdk.sample.media.NativeHelper.parse(byte[], int) (tried Java_com_dji_sdk_sample_media_NativeHelper_parse and Java_com_dji_sdk_sample_media_NativeHelper_parse___3BI)
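For context: JNI resolves a native method from the fully qualified name of the Java class that declares it, so moving NativeHelper into a new package (here com.dji.sdk.sample.media) breaks the lookup against a library whose exports still carry the original package name. A sketch of the fix when rebuilding the JNI library yourself; the parameter list is an assumption reconstructed from the error message:

#include <jni.h>

/* The export name must mirror the Java package:
 * com.dji.sdk.sample.media.NativeHelper.parse(byte[], int)
 * -> Java_com_dji_sdk_sample_media_NativeHelper_parse */
JNIEXPORT jboolean JNICALL
Java_com_dji_sdk_sample_media_NativeHelper_parse(JNIEnv *env, jobject thiz,
                                                 jbyteArray data, jint size)
{
    /* body unchanged from the sample's original parse implementation */
    return JNI_TRUE; /* placeholder */
}

The simpler alternative is to keep NativeHelper in its original package, so the existing library's export names still match.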

Demo is not running on Android Nougat

I try to run the demo on a Nexus 6P running Android 7.1.2 and I get the following error on startup:

Video Stream Decoding Sample
Detected problems with app native libraries (please consult log for detail):
libffmpeg.so: text relocations

Can you check this out please?

Thank you!

DJICodecManager.YuvDataCallback makes the preview stuck

Bug

As soon as I click the "YUV Screen Shot" button, the preview gets stuck, and from then on the screenshots are all the same.

My devices' information:

  • DJI Android SDK version: 4.5.1
  • DJI Product: Phantom 4 Pro
  • Android system version: 7.1.1
  • Android Studio version: 3.1.1

Decoding failure

1. After decoding for a few seconds, dequeueOutputBuffer returns -1.
2. The screenShot method in MainActivity saves the YuvImage with wrong colors.

  • DJI Android SDK version: 4.3.2
  • DJI product: Phantom 3 Professional
  • Android system version: Android 7.0
  • Android device: Huawei Honor 9
  • Android Studio version: 2.2.2

Video Stream Decoding: how to retrieve NV21 data from MediaCodec output

I'm trying to start from the "Video Stream Decoding Sample" demo in order to obtain raw video data from a DJI Phantom 3 Professional drone and pass it to my Augmented Reality framework (Wikitude SDK). In particular, I need to pass YUV 420 data, arranged to be compliant with the NV21 standard, to my framework, so I'm trying to retrieve this data from the MediaCodec output.

On this point, I tried to retrieve ByteBuffers from the MediaCodec output (this is possible by setting the Surface parameter to null in the configure() method, which has the effect of invoking a callback that passes the buffers out to an external listener), but I'm having some issues with colours in the visualization, because the decoded video colour is not right (blue and red seem to be reversed, and there is too much noise when the camera moves). Please note that when I pass a non-null Surface, after the instruction codec.releaseOutputBuffer(outIndex, true), MediaCodec renders frames on it and shows the video stream properly; but I need to pass the video stream to the Wikitude plugin, so I must set the Surface to null.

I tried setting different MediaFormat.KEY_COLOR_FORMAT values, but none of them works properly. How can I retrieve NV21 data from the MediaCodec output?
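One hedged possibility, based on how MediaCodec behaves in general rather than on this sample: when decoding to ByteBuffers, the output color format is chosen by the device's codec, not by the caller, and is often NV12-style COLOR_FormatYUV420SemiPlanar (U before V). Treating that as NV21 (V before U) swaps blue and red exactly as described. A minimal conversion sketch, assuming tightly packed semi-planar output with no stride padding (real devices may pad rows, which must also be handled):

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.nio.ByteBuffer;

// Reads the decoder's actual output format and, for NV12-style output,
// swaps each UV pair in place to produce NV21 (VU order).
final class Nv21Converter {

    static byte[] toNv21(MediaCodec codec, ByteBuffer outBuf, int width, int height) {
        int colorFormat = codec.getOutputFormat().getInteger(MediaFormat.KEY_COLOR_FORMAT);

        byte[] data = new byte[width * height * 3 / 2];
        outBuf.get(data);

        if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
            for (int i = width * height; i + 1 < data.length; i += 2) {
                byte u = data[i];
                data[i] = data[i + 1];  // V first ...
                data[i + 1] = u;        // ... then U: NV21
            }
        }
        // Planar (I420) or vendor-specific formats need their own conversion.
        return data;
    }
}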

Building for Android API <21

Hello!

I successfully built this sample, and it works fine on API >= 21. Then this solution was embedded in my project: I replaced all necessary methods in dji_video_jni.c with my packages, and my project works fine on API >= 21. When I try to run it on my phone with Android API level 19, I get the following error:

FATAL EXCEPTION: main
Process: uiip.dji.pcapi.com, PID: 8922
java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "atof" referenced by "libffmpeg.so"...
at java.lang.Runtime.loadLibrary(Runtime.java:364)
at java.lang.System.loadLibrary(System.java:526)
at uiip.dji.pcapi.com.media.NativeHelper.<clinit>(NativeHelper.java:65)
at uiip.dji.pcapi.com.MainActivity.onCreate(MainActivity.java:21)
at android.app.Activity.performCreate(Activity.java:5275)
at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1087)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2166)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2252)
at android.app.ActivityThread.access$800(ActivityThread.java:139)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1200)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:136)
at android.app.ActivityThread.main(ActivityThread.java:5103)
at java.lang.reflect.Method.invokeNative(Native Method)

First of all, I tried to set targetSdkVersion to 19 in build.gradle (Module: app), add the line APP_PLATFORM := android-19 to libs/Application.mk, and rebuild the JNI libs. This didn't help.

Then I found this post and decided to try changing compileSdkVersion to 19. I deleted all API-specific things such as the appCompat styles, and I also made the decodeFrame() method in DJIVideoStreamDecoder empty (just as a mock), because it calls mediaCodec.reset(), which is a level-21 API. That didn't help either.

Then I tried to compile with different older NDK versions (9b, 10e, 11c) and a different Android plugin version (2.1.3). It didn't help.

Is the problem in libffmpeg.so? How should I deal with it?
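Quite possibly, and here is a hedged reading of why the attempts above didn't help: APP_PLATFORM and targetSdkVersion only affect code you compile, while libffmpeg.so is a prebuilt that already carries the unresolved atof reference (atof became an exported bionic symbol only at API 21). To support API 19, FFmpeg itself has to be rebuilt against an android-19 (or lower) sysroot; a hypothetical fragment of such a configure call, with paths as assumptions:

# Pin the libc to android-19 so no post-19 symbols leak into libffmpeg.so.
./configure --target-os=android --arch=arm --enable-cross-compile \
    --cross-prefix=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/bin/arm-linux-androideabi- \
    --sysroot=$NDK/platforms/android-19/arch-arm \
    --enable-shared --disable-static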

Which products can be supported by "Android-VideoStreamDecodingSample" project?

Hi, all:
I run the project on my Android device and try to connect to a DJI Phantom 3 Professional, but the app shows "No Product Connected".

The message is as follows:
https://drive.google.com/open?id=0B726Quay7lRDUGVOc3hqb1NOY0U

My questions are as follows:

  1. Which products can be supported by "Android-VideoStreamDecodingSample" project?

  2. How can I modify this project to make it support DJI Phantom 3 Professional and DJI M100?

Please help answer them, thank you.

Input Video Buffer

Hi,

I was able to successfully execute this project, however, I'm left with certain questions.

  1. What is the input VideoBuffer format so that it decodes to YUV data?
  2. What are the changes required to be able to stream to RTMP/ RTSP?
  3. If the input videoBuffer from the drone is h.264 encoded, why can't we play it when we dump those frames to a file?

Please shed some light here. I'm stuck.

Regards,
Abhijit Nathwani

How to test "Android-VideoStreamDecodingSample" project?

Hi, all:
I have finished building the "Android-VideoStreamDecodingSample" project, but I don't know how to test it.
My questions are as follows:

  1. Is "Android-VideoStreamDecodingSample" project a Android app?

  2. I use Android Test to test this project, but it shows a error message and then the app is immediately closed. The error message is as follows:
    https://drive.google.com/file/d/0B726Quay7lRDSVE0TjFBNGgyTkk/view?usp=sharing

  3. How can I test this project to watch if it is correct?

Please help answer them, thank you.

How to obtain PTS

I need to push the video stream to an RTSP server. The callback function is onDataRecv(byte[] data, int size, int frameNum, boolean isKeyFrame, int width, int height), but there is no timestamp field. I see that the function Java_com_dji_videostreamdecodingsample_media_NativeHelper_parse has a PTS, but it is zero. How should I get it?
Thank you for your precious time!

Cannot add widgets for take off etc.

I tried to add widgets for takeoff and return-home, but it fails, saying that it cannot inflate the class dji.ux.widget. I tried adding com.dji:dji-uxsdk:4.6 as a library dependency, but that failed, saying that more than one file was found with OS independent path....

What changes do I need to make to the code from this repo to add a takeoff button widget?
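"More than one file was found with OS independent path" is a standard Gradle packaging conflict: two dependencies (here likely dji-sdk and dji-uxsdk) ship a file at the same path inside the APK. The usual remedy is a packagingOptions rule in app/build.gradle; the path below is only a placeholder, since the real one is printed in your own build error:

android {
    packagingOptions {
        // Replace with the exact path reported by the build failure.
        pickFirst 'lib/armeabi-v7a/libdjivideojni.so'
    }
}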

Re-Compile JNI DJIVIDEOSTREAMDECODE

Hello to all,
I'm trying to re-compile the C/C++ source using the latest Android NDK.
The code is posted here:
https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample/tree/master/android-videostreamdecodingsample/jni

I ran into this problem; here is the log:

[armeabi-v7a] Compile thumb : djivideojni <= dji_video_jni.c
jni/dji_video_jni.c:65:51: warning: passing 'uint8_t *' (aka 'unsigned char *') to parameter of type 'const jbyte *' (aka 'const signed char *') converts between pointers to integer types with different sign [-Wpointer-sign]
(*env)->SetByteArrayRegion(env, jarray, 0, size, buf);
^~~
1 warning generated.
[armeabi-v7a] Prebuilt : libffmpeg.so <= jni/
[armeabi-v7a] SharedLibrary : libdjivideojni.so
D:/Users/Lorenzo/AppData/Local/Android/sdk/ndk-bundle/build//../toolchains/arm-linux-androideabi-4.9/prebuilt/windows-x86_64/lib/gcc/arm-linux-androideabi/4.9.x/../../../../arm-linux-androideabi/bin\ld: error: ./obj/local/armeabi-v7a/libffmpeg.so:1:9: syntax error, unexpected STRING
D:/Users/Lorenzo/AppData/Local/Android/sdk/ndk-bundle/build//../toolchains/arm-linux-androideabi-4.9/prebuilt/windows-x86_64/lib/gcc/arm-linux-androideabi/4.9.x/../../../../arm-linux-androideabi/bin\ld: error: ./obj/local/armeabi-v7a/libffmpeg.so: not an object or archive
jni/dji_video_jni.c:43: error: undefined reference to 'av_register_all'
jni/dji_video_jni.c:45: error: undefined reference to 'av_codec_next'
jni/dji_video_jni.c:78: error: undefined reference to 'avcodec_register_all'
jni/dji_video_jni.c:79: error: undefined reference to 'av_register_all'
jni/dji_video_jni.c:82: error: undefined reference to 'avcodec_find_decoder'
jni/dji_video_jni.c:83: error: undefined reference to 'avcodec_alloc_context3'
jni/dji_video_jni.c:84: error: undefined reference to 'av_parser_init'
jni/dji_video_jni.c:97: error: undefined reference to 'avcodec_open2'
jni/dji_video_jni.c:103: error: undefined reference to 'av_frame_alloc'
jni/dji_video_jni.c:127: error: undefined reference to 'av_init_packet'
jni/dji_video_jni.c:132: error: undefined reference to 'av_parser_parse2'
jni/dji_video_jni.c:165: error: undefined reference to 'av_free_packet'
jni/dji_video_jni.c:223: error: undefined reference to 'avcodec_close'
jni/dji_video_jni.c:227: error: undefined reference to 'av_free'
jni/dji_video_jni.c:228: error: undefined reference to 'av_free'
jni/dji_video_jni.c:229: error: undefined reference to 'av_parser_close'
clang++.exe: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [obj/local/armeabi-v7a/libdjivideojni.so] Error 1
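A hedged observation about this log: "syntax error, unexpected STRING" at libffmpeg.so:1:9 means the linker is parsing an ASCII text file where it expected an ELF binary, which is exactly what the repository's placeholder (or an undownloaded Git LFS pointer) for libffmpeg.so looks like; compare the #5 discussion referenced in the APP_ABI issue above. A quick check (adjust the path to wherever your libffmpeg.so lives):

# A real shared library starts with the ELF magic bytes; an LFS pointer or
# placeholder prints readable text such as "version https://git-lfs...".
head -c 64 jni/libffmpeg.so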

Send video to the server via RTMP

I implemented sending video to the server using this library ( https://github.com/pedroSG94/rtmp-rtsp-stream-client-java ).
I wrote the constructor for sending, and everything works ( https://pastebin.com/haWHc3T2 ).
In MainActivity, I start the stream as follows ( https://pastebin.com/5aYV16Gg ).

After a couple of seconds, the video sent to the server begins to deteriorate: quality drops and huge pixels appear. The video is received as an RTMP stream and broadcast in a video player ( https://www.youtube.com/watch?v=-TuDgkB7OsY ).

P.S.

What could be the problem? And where should I start digging?

showToast "disconnected"

Running this sample on a Huawei tablet connected to the remote controller of a Phantom 4, the app shows "disconnected".
Running DJI GO on the same setup works well.

VideoFeeder.VideoDataCallback is never called on dji spark when connected via wifi

Description:
It seems that VideoFeeder.VideoDataCallback is never called when there is no instance of DJICodecManager.
It is called, but with garbage values, if there is a fake instance of DJICodecManager.
It is only called with data that can be decoded by Android MediaCodec if there is a valid instance of DJICodecManager; valid, in this context, means configured with a real output surface.

Especially the first one sounds like a bug. If I understand correctly, the purpose of VideoFeeder.VideoDataCallback is to make H.264 NALUs available to the application so that they can be decoded with a custom decoder (and not the DJICodecManager).
This has various use-cases; e.g. my application is tuned for ultra-low latency and already has a proven MediaCodec decoder, but I cannot find a way to get rid of the DJICodecManager because of the issues described above.
This might also relate to issue #41, because for the described use-case there is no need to transcode the video into a different bitrate. If that is the purpose of VideoStreamDecodingSample, please clarify it, and add the option to register a callback for raw H.264 NALUs as they come from the connected drone.

  • DJI Android SDK version & Android Studio: latest versions
  • DJI Product: Spark
  • Android system: Nougat
  • Android device: ZTE axon 7

Image Broken Problem HIGH

Hello,
I tried to implement your decoding sample, and I think I have implemented it correctly.

My devices are a Nexus 5X (Android 8.0) with an Osmo camera, and also a Samsung Galaxy S7 (Android 7.0).

I used the latest SDK, 4.3.2.

After decoding in FFmpeg, the resulting image is broken or not OK (see attached).

Thanks in advance
(screenshot attached)

Record video

I use FFmpegFrameRecorder to record video from byte[] yuvFrame, but it seems delayed.

byte[] bytes = new byte[yuvFrame.length];
System.arraycopy(y, 0, bytes, 0, y.length);
// y, u, nu, nv come from the surrounding saveYuvDataToJPEG-style code:
// re-interleave the planes into an NV21 buffer.
for (int i = 0; i < u.length; i++) {
    bytes[y.length + (i * 2)] = nv[i];
    bytes[y.length + (i * 2) + 1] = nu[i];
}
// NV21 -> BGR IplImage -> Bitmap -> Frame: three full-frame conversions
// per recorded frame, which is the likely source of the delay.
opencv_core.IplImage yuvImage1 = opencv_core.IplImage.create(width, height * 3 / 2, IPL_DEPTH_8U, 1);
yuvImage1.getByteBuffer().put(bytes);
opencv_core.IplImage bgrImage = opencv_core.IplImage.create(width, height, IPL_DEPTH_8U, 3);
cvCvtColor(yuvImage1, bgrImage, CV_YUV2BGR_NV21);
Bitmap modifiedBitmap = IplImageToBitmap(bgrImage);
org.bytedeco.javacv.AndroidFrameConverter converter = new AndroidFrameConverter();
Frame modifiedFrame = converter.convert(modifiedBitmap);
recorder.record(modifiedFrame);

What is the solution for recording?
