
immersive-video-sample's People

Contributors

chenxiaomin0306, dahanhan, daijh, dependabot[bot], gaofengzzz, hzhan80, inteltiger, jhou5, jsunintel, luoying1234, stephanie-qi, u1x6wk, wenquan-mao, yanyings, yzhou51, zhang-shizhao

immersive-video-sample's Issues

thrift ver.0.12.0 package has been removed from the current url

This project requires the "thrift 0.12.0" package from the URL "http://apache.osuosl.org/thrift/0.12.0/".
However, after version 0.16.0 was released in February 2022, the 0.12.0 package was removed from that URL on 16 March 2022.
The package has been moved to the archive at "http://archive.apache.org/dist/thrift/0.12.0/".

Shifting to thrift 0.16.0 caused some unknown issue (maybe related to the FFmpeg module) in the OMAF server in my case.
So modifying the URL at line 28 of "/Immersive-Video-Sample-master/src/external/install_thrift.sh" to point to the new archive URL may help; a sketch of the change follows.
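A minimal sketch of that edit, assuming the old mirror URL still appears verbatim in the script:

# point install_thrift.sh at the Apache archive instead of the removed mirror
sed -i 's|http://apache.osuosl.org/thrift/0.12.0/|http://archive.apache.org/dist/thrift/0.12.0/|g' \
    Immersive-Video-Sample-master/src/external/install_thrift.sh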

Unable to run Player for WebRTC Sample.

Hello!

I'm trying to run the WebRTC sample and am experiencing an issue with the player. I'm running the server, can go to localhost:3001, and see a debug player playing a video. I was able to compile both the server and the player successfully. However, when I run the player pointing to localhost:3001, it receives the session token and then crashes with an unclear error:

[error] handle_read_http_response error: websocketpp.transport:7 (End of File)

My guess is that the player is unable to create a WebSocket connection, but I'm not sure why. Is there any hidden, undocumented requirement? Perhaps the WebSocket connection requires SSL to be up and running.

OMAF-Sample crash in server side

I have built the OMAF-Sample server in a virtual machine environment as below:
CentOS Linux release 7.9.2009 (Core)
CPU: Intel Xeon E5-2670 v3

It built successfully, but when I run it, it crashes with the log below:
[screenshot]

Please help me clarify this problem!

How to enable in_parallel option?

[screenshot]

When I use the in_parallel parameter, the above error occurs during encoding.

The encoding command is as follows:

numactl -N 0,1,2,3 ./ffmpeg -stream_loop -1 \
        -i $1 -input_type 1 -rc 1 \
        -c:v:0 distributed_encoder \
        -in_parallel 1 \
        -s:0 7680x3840 \
        -tile_row:0 6 -tile_column:0 12 \
        -config_file:0 config_high.xml \
        -la_depth:0 0 -r:0 25 -g:0 25 \
        -b:0 50M -map 0:v \
        -c:v:1 distributed_encoder \
        -in_parallel 1 \
        -s:1 1280x1280 -sws_flags neighbor \
        -tile_row:1 2 -tile_column:1 2 \
        -config_file:1 config_low.xml \
        -la_depth:1 0 -r:1 25 -g:1 25 \
        -b:1 2M -map 0:v -vframes 5400 \
        -f omaf_packing \
        -is_live 0 -split_tile 1 -seg_duration 1 \
        -base_url ${URLBASE}/VOD8K/ \
        -out_name Test /usr/local/nginx/html/VOD8K/

join_on_failure_callback: Invalid token

I have built and started the owt-server on one machine, and I have installed and built the owt-linux-player on another machine. When I run the ./render script, I receive the error:
E0803 01:01:26.115042 3000 WebRTCMediaSource.cpp:232] join_on_failure_callback: Invalid token.

How can I resolve this issue?

How to push stream using RTMP?

Hi! I've been working with the WebRTC sample recently. However, I haven't found RTMP support in the server. Does anybody know how to push a stream using RTMP? Thanks!

License of video files

I was wondering whether the video files themselves are under a free license. The License section of the readme mentions OMAF sample and WebRTC sample code, but not the video files in the Sample-Videos directory.

High-quality immersive video files are already difficult to find, and if these ones were freely licensed, they could be used as test material by various open-source projects.

Distribute Encoder Plugin

Hi:
We plan to build a VR FOV system, and I have some questions about the Distributed Encoder plugin. Please help, thanks.

  1. Does it use SVT-HEVC? The range of tile_column and tile_row is given as 1–256 in one place but as 1–16 in another. Which ranges are correct?
  2. How should I set HDR information through the Distributed Encoder plugin? I found only one parameter, hdr; however, many parameters (e.g. MaxCLL) need to be set, and SVT-HEVC offers these HDR parameters as well.
  3. Could I use SVT-HEVC directly instead of going through the Distributed Encoder plugin?
  4. Will the source of the Distributed Encoder plugin be released in the future?

Fail to compile ffmpeg when --enable-libOmafDashAccess

Hi:
When I compile FFmpeg with --enable-libOmafDashAccess, the build fails. The following are the error messages:
libavformat/tiled_dash_dec.c:163:15: error: ‘HeadSetInfo’ {aka ‘struct HEADSETINFO’} has no member named ‘input_geoType’
163 | clientInfo->input_geoType = E_SVIDEO_EQUIRECT;
| ^~
libavformat/tiled_dash_dec.c:164:15: error: ‘HeadSetInfo’ {aka ‘struct HEADSETINFO’} has no member named ‘output_geoType’
164 | clientInfo->output_geoType = E_SVIDEO_VIEWPORT;
| ^~

I checked data_type.h and found that input_geoType and output_geoType are not defined there. What should I do? Thanks!

Failed to compile immersive of omaf-sample

Build: 2021072001
HW: SKL,ICX
OS: CentOS8.3,Ubuntu 21.04,Ubuntu 20.04,Ubuntu 18.04,RHEL 8.3
Case:CIR_BMRA_Basic_IMVideo_OMAF
Command:ansible-playbook -i immersive_inventory.ini playbooks/cir.yml --extra-vars profile=full_nfv bmra_version=basic

Error_log:
2021-07-21 02:50:18,177 p=6755 u=root n=ansible | TASK [immersive_install : compile immersive of omaf-sample] ********************
2021-07-21 02:50:18,180 p=6755 u=root n=ansible | fatal: [av09-03-wp]: FAILED! =>

{ "changed": true, "cmd": "cmake .. && make build -j 5", "delta": "0:09:21.925136", "end": "2021-07-21 02:50:17.835787", "rc": 2, "start": "2021-07-21 02:40:55.910651" }
......

---> Running in 23af08f29710
Removing intermediate container 23af08f29710
---> 455dd6d9de72
Step 14/42 : ARG YASM_REPO=https://www.tortall.net/projects/yasm/releases/yasm-${YASM_VER}.tar.gz
---> Running in 2703594cdcd5
Removing intermediate container 2703594cdcd5
---> 2b78bf7a0920
Step 15/42 : RUN wget -O - ${YASM_REPO} | tar xz && cd yasm${YASM_VER} && sed -i "s/) ytasm./)/" Makefile.in && source /opt/rh/devtoolset-7/enable && ./configure --prefix="/usr" --libdir=/usr/lib/x86_64-linux-gnu && make -j$(nproc) && make install && cd ${WORKDIR} && rm -rf ./
---> Running in 38796abfa57a
--2021-07-21 09:50:16-- https://www.tortall.net/projects/yasm/releases/yasm-1.3.0.tar.gz
Resolving proxy-chain.intel.com (proxy-chain.intel.com)... 10.22.230.62
Connecting to proxy-chain.intel.com (proxy-chain.intel.com)|10.22.230.62|:912... connected.
ERROR: cannot verify www.tortall.net's certificate, issued by '/C=US/O=Let's Encrypt/CN=R3':
Issued certificate has expired.
To connect to www.tortall.net insecurely, use `--no-check-certificate'.

gzip: stdin: unexpected end of file
tar: Child returned status 1
tar: Error is not recoverable: exiting now

STDERR:

CMake Warning (dev) in CMakeLists.txt:
No project() command is present. The top-level CMakeLists.txt file must
contain a literal, direct call to the project() command. Add a line of
code such as

project(ProjectName)

near the top of the file, but after cmake_minimum_required().

CMake is pretending there is a "project(Project)" command on the first
line.
This warning is for project developers. Use -Wno-dev to suppress it.

++ echo /usr/src/immersive/OMAF-Sample/server
++ awk -F OMAF-Sample '

{print $1}
'

+ REPOPATH=/usr/src/immersive/
+ SRCPATH=/usr/src/immersive/src/
+ DSTPATH=/usr/src/immersive/OMAF-Sample/server/src/
+ mkdir -p /usr/src/immersive/OMAF-Sample/server/src/
+ cd /usr/src/immersive/OMAF-Sample/server/src/..
+ cp -r /usr/src/immersive/src/360SCVP /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/external /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/ffmpeg /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/player /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/utils /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/isolib /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/trace /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/plugins /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/VROmafPacking /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/OmafDashAccess /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/src/CMakeLists.txt /usr/src/immersive/OMAF-Sample/server/src/
+ cp -r /usr/src/immersive/Sample-Videos /usr/src/immersive/OMAF-Sample/server/src/
+ '[' 1 = 1 ']'
+ docker build -t immersive_server:v1.4 .
The command '/bin/sh -c wget -O - ${YASM_REPO} | tar xz && cd yasm${YASM_VER} && sed -i "s/) ytasm./)/" Makefile.in && source /opt/rh/devtoolset-7/enable && ./configure --prefix="/usr" --libdir=/usr/lib/x86_64-linux-gnu && make -j$(nproc) && make install && cd ${WORKDIR} && rm -rf ./' returned a non-zero code: 2
make[3]: *** [CMakeFiles/build.dir/build.make:76: CMakeFiles/build] Error 2
make[2]: *** [CMakeFiles/Makefile2:153: CMakeFiles/build.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:160: CMakeFiles/build.dir/rule] Error 2
make: *** [Makefile:163: build] Error 2

MSG:

non-zero return code

OMAF-Sample server gets size=N/A and bitrate=N/A

I have successfully built OMAF-Sample server on CentOS 7.

However, when I try to run the server, it always shows "size=N/A" and "bitrate=N/A".
It always shows something like the line below:
frame= 15 fps= 11 q=-0.0 q=-0.0 size=N/A time=00:00:00.13 bitrate=N/A speed=0.0993x

If I run the client while the server is running, the client shows that it timed out without receiving any data.

Also, I get a warning each time I try to run the server. The warning says:
SVT [WARNING] Elevated privileges required to run with real-time policies! Check Linux Best Known Configuration in User Guide to run application in real-time without elevated privileges!
It is not clear whether this warning is related to the N/A problem.

Any help on what may be causing the problem would be appreciated.

OMAF Ubuntu player "decoder find error"

I was running the OMAF-sample Ubuntu player on a PC (CPU: i9-9900, VGA: RTX 2070S).
Everything was going well until something went wrong with the VGA (maybe a bug fell into it and caused a short circuit).
After the VGA failed, I removed the graphics card, switched to the integrated graphics of the i9-9900, and re-installed Ubuntu 18.04 and the OMAF-sample Ubuntu player.
However, after the re-installation, I was no longer able to run the Ubuntu player.
The following error message is shown on the terminal when trying to run the player (./render).
The player window GUI shows nothing and crashes within a few seconds.

E0615 22:06:53.933619 3213 VideoDecoder.cpp:104] decoder find error!
E0615 22:06:53.933637 3213 DecoderManager.cpp:158] Video 0 : Failed to create a decoder for it
E0615 22:06:53.933619 3213 VideoDecoder.cpp:104] decoder find error!
E0615 22:06:53.933637 3213 DecoderManager.cpp:158] Video 0 : Failed to create a decoder for it
XIO: fatal IO error 11 (Resource temporarily unavailable) on X server ":0"
after 356 requests (356 known processed) with 11 events remaining.

I tried re-installing the driver for the i9-9900's integrated graphics and rebooting, but it didn't help.
I also tried installing Ubuntu 18.04 through both UEFI and Legacy mode, but it made no difference.

Any clue about what may have caused the problem would be appreciated.

How to enable the "trace" module

I have successfully run the OMAF-Sample and streamed 4K tiled immersive video over the network.

However, I am having trouble activating the "trace" module and outputting trace data.
I changed the "USE_TRACE" option, which is set to "OFF" by default in "/Immersive-Video-Sample-master/src/CMakeLists.txt", to "ON" before running make, but trace log files are still not produced.
No trace log file exists in the docker container after video streaming.

Any help on resolving the problem would be appreciated; a build sketch is included below.
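For reference, a minimal way to pass the option at configure time instead of editing the file, assuming a standard out-of-tree CMake build directory (rebuilding from a clean directory so a cached OFF value is not reused):

# configure with the trace module enabled and rebuild from scratch
rm -rf build && mkdir build && cd build
cmake -DUSE_TRACE=ON ..
make -j$(nproc)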

Compile Error while Building Server Components

Hi, I am trying to compile the server components, but I have got some errors.

I ran './build.sh server y' in the 'src/external' directory as you mentioned. The errors are as follows:

[ 93%] Building CXX object VROmafPacking/CMakeFiles/VROmafPacking.dir/MpdGenerator.cpp.o
/home/wangyu/ovc/Immersive-Video-Sample/src/VROmafPacking/MpdGenerator.cpp: In member function 'int32_t VCD::VRVideo::MpdGenerator::WriteMpd(uint64_t)':
/home/wangyu/ovc/Immersive-Video-Sample/src/VROmafPacking/MpdGenerator.cpp:452:26: error: 'int ftime(timeb*)' is deprecated [-Werror=deprecated-declarations]
  452 |         ftime(&timeBuffer);
      |                          ^
In file included from /home/wangyu/ovc/Immersive-Video-Sample/src/VROmafPacking/MpdGenerator.cpp:36:
/usr/include/x86_64-linux-gnu/sys/timeb.h:39:12: note: declared here
   39 | extern int ftime (struct timeb *__timebuf)
      |            ^~~~~
/home/wangyu/ovc/Immersive-Video-Sample/src/VROmafPacking/MpdGenerator.cpp:452:26: error: 'int ftime(timeb*)' is deprecated [-Werror=deprecated-declarations]
  452 |         ftime(&timeBuffer);
      |                          ^
In file included from /home/wangyu/ovc/Immersive-Video-Sample/src/VROmafPacking/MpdGenerator.cpp:36:
/usr/include/x86_64-linux-gnu/sys/timeb.h:39:12: note: declared here
   39 | extern int ftime (struct timeb *__timebuf)
      |            ^~~~~
cc1plus: all warnings being treated as errors
make[2]: *** [VROmafPacking/CMakeFiles/VROmafPacking.dir/build.make:141: VROmafPacking/CMakeFiles/VROmafPacking.dir/MpdGenerator.cpp.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:435: VROmafPacking/CMakeFiles/VROmafPacking.dir/all] Error 2
make: *** [Makefile:130: all] Error 2

Thank you!
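The failure is the deprecated ftime() call being promoted to an error by -Werror. As a stopgap (a sketch, not the project's official fix), that particular deprecation diagnostic can be demoted back to a warning at configure time; the longer-term fix would be replacing ftime() with clock_gettime() in MpdGenerator.cpp:

# demote the deprecation diagnostic so -Werror no longer aborts the build
cmake -DCMAKE_CXX_FLAGS="-Wno-error=deprecated-declarations" ..
make -j$(nproc)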

How to enable CMAF option

I have enabled the CMAF option, but when I run the ffmpeg command it cannot find libCMAFSegmentWriter.so. How can I enable the build of libCMAFSegmentWriter?

[screenshot]

libDistributedEncoder.so

Hi,
I am new to Immersive-Video-Sample.
I notice there are two libraries in "src\ffmpeg\dependency": libDistributedEncoder.so and libEncoder.so.
What are they used for, and are they open source?
Thanks.

Any GStreamer plugin plan for 360SCVP

The 360SCVP library is really a good library for the viewport-dependent streaming case. As there's a GStreamer plugin for SVT-HEVC available, is there any plan to make a GStreamer plugin for 360SCVP? Or has one already been done, or is one being planned, by any forked projects?

Yu

WebRTC sample crashes when a second client joins the sample room

Dear team,

I run the WebRTC server on the first machine and a client on the second machine. It connects and plays the stream successfully.

But when I run another client on the second machine, it seems to connect, but no stream is displayed. I checked that at that time the server's CPU and RAM usage drop; it seems to have restarted.

Here is my system configuration:

  • Server:
    [screenshot]

  • Client:
    [screenshot]

Please check my case and guide me on which logs to check to find the problem.

Thank you so much!

Congestion control supported in WebRTC-Sample?

@U1X6WK @luoying1234 @hzhan80 Hi professionals, I am sorry to bother you; I just wonder whether any congestion control algorithm (CCA) is supported in WebRTC-Sample.

I have checked the whole procedure from VideoFramePacketizer::onFrame to Pipeline::write, but I didn't find any CCA being called.

It is said that GoogCC is supported in the WebRTC framework. Does the WebRTC framework adopted in this sample support GoogCC?

Thanks for your attention; looking forward to your kind reply.

FFMPEG Plugin Issues

Hi, I'm trying to build and run the source code and encountered the following problem.
When I start ffmpeg on the server side, it reports that some parameters in the ffmpeg plugin can't be found, such as input_type in the distributed encoder.
[screenshot]
I have checked the dynamic dependencies of ffmpeg (ldd ffmpeg) and found libDistributedEncoder.so in /usr/local/lib.
[screenshot]
I wonder if you have any idea about this issue. Thank you!

Trying to install the Android player from https://github.com/OpenVisualCloud/Immersive-Video-Sample/releases/tag/V1.6.0

I am trying to install the Android player from release 1.6.0 (https://github.com/OpenVisualCloud/Immersive-Video-Sample/releases/tag/V1.6.0).
I have downloaded the tar file of this release.
The Linux distribution used is Ubuntu 20.04.1 LTS.

- When I launch ./prebuild_android.sh, I receive the following error: "FIPS_SIG does not specify incore module. Please edit this script." The script expects a patch file at this point, but I have skipped this step.

- After that, launching ./make_android.sh, I receive the following error:

warning and 1 error generated.
make[2]: *** [CMakeFiles/MediaPlayer.dir/build.make:115: CMakeFiles/MediaPlayer.dir/MediaSource/DashMediaSource.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
1 warning generated.
27 warnings generated.
1 warning generated.
1 warning generated.
1 warning generated.
make[1]: *** [CMakeFiles/Makefile2:76: CMakeFiles/MediaPlayer.dir/all] Error 2
make: *** [Makefile:130: all] Error 2

+ cd ../../../../
+ mkdir -p ./player/app/android/app/src/main/jniLibs/arm64-v8a/
+ sudo cp /usr/local/lib/libcurl.so ./player/app/android/app/src/main/jniLibs/arm64-v8a/
+ sudo cp /usr/local/lib/libsafestring_shared.so ./player/app/android/app/src/main/jniLibs/arm64-v8a/
+ sudo cp ./build/external/android/openssl-output/lib/libssl.so ./player/app/android/app/src/main/jniLibs/arm64-v8a/
cp: cannot stat './build/external/android/openssl-output/lib/libssl.so': No such file or directory
+ sudo cp /usr/local/lib/libglog.so ./player/app/android/app/src/main/jniLibs/arm64-v8a/
+ sudo cp ./build/external/android/openssl-output/lib/libcrypto.so ./player/app/android/app/src/main/jniLibs/arm64-v8a/
cp: cannot stat './build/external/android/openssl-output/lib/libcrypto.so': No such file or directory
+ sudo cp /usr/local/lib/lib360SCVP.so ./player/app/android/app/src/main/jniLibs/arm64-v8a/
+ sudo cp /usr/local/lib/libOmafDashAccess.so ./player/app/android/app/src/main/jniLibs/arm64-v8a/
cp: cannot stat '/usr/local/lib/libOmafDashAccess.so': No such file or directory
+ sudo cp /usr/local/lib/libdashparser.a ./player/app/android/app/src/main/jniLibs/arm64-v8a/
+ sudo cp /usr/local/lib/libMediaPlayer.so ./player/app/android/app/src/main/jniLibs/arm64-v8a/
cp: cannot stat '/usr/local/lib/libMediaPlayer.so': No such file or directory

Do you have any clue about how to proceed?

Thanks

Encoder.cpp:81 Failed to get cpu core number from /proc/cpuinfo! !

Hi, I have built the OMAF-Sample in Docker, and I am running ffmpeg via
[root@cd2f2e35c575 Sample-Videos]# ./run.sh 4K VOD

I got the following errors. It seems that the encoder cannot read /proc/cpuinfo, but I am already running as root, and running cat /proc/cpuinfo gives the correct output.

E20210305 02:24:07.059995   176 Log.cpp:558] Encoder.cpp:81  Failed to get cpu core number from /proc/cpuinfo! !
E20210305 02:24:07.061525   176 Log.cpp:558] TaskDispatcher.cpp:241  No resource ! 
E20210305 02:24:07.061581   176 Log.cpp:558] WorkerManager.cpp:304  Dispatch failed!! !
Error initializing output stream 0:1 -- Error while opening encoder for output stream #0:1 - maybe incorrect parameters such as bit_rate, rate, width or height

Any help on this issue would be appreciated.
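A few quick checks from inside the container may help narrow down whether /proc is readable the way the encoder expects (a sketch; nothing here is specific to this project):

# verify the container can enumerate CPU cores through the usual interfaces
nproc
grep -c ^processor /proc/cpuinfo
# check how /proc is mounted inside the container (read-only or masked mounts can break parsers)
mount | grep ' /proc '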

[360SCVP] sample code for single viewport stream output

Hi,

The doc says "when given the viewport information and one frame bitstream, the library can calculate the viewport area in the frame and output the corresponding bitstream". Is there any example code available to showcase the single-stream output when usedType = E_MERGE_AND_VIEWPORT? Or some hints on the function call sequence to implement that logic? I saw "param_oneStream_info" but am still not sure how to use it.

Thanks.

High CPU usage when running the OMAF sample

I am running the OMAF sample with the 4K, LIVE options. I don't understand why the server side takes almost all of the CPU load. I can see video segments at /usr/local/nginx/html/LIVE4K/, which means the step that consumes the CPU has completed.
How can I improve this by modifying the config file?

[screenshot]

OmafDashAccess/DownloadManager.cpp error while compiling Omaf client code

I am trying to execute ./deploy.sh in OMAF-Sample/client, but there is an error during compilation:

[ 86%] Building CXX object OmafDashAccess/CMakeFiles/OmafDashAccess.dir/OmafExtractorTracksSelector.cpp.o
/home/osboxes/Immersive-Video-Sample/src/OmafDashAccess/DownloadManager.cpp: In member function ‘int VCD::OMAF::DownloadManager::enum_directory(const char*, bool, enum_dir_item, void*, const char*)’:
/home/osboxes/Immersive-Video-Sample/src/OmafDashAccess/DownloadManager.cpp:218:54: error: argument to ‘sizeof’ in ‘char* strncat(char*, const char*, size_t)’ call is the same expression as the source; did you mean to use the size of the destination? [-Werror=sizeof-pointer-memaccess]

Please help

where is libstitch?

I would like to enable the HEVC tile stitch library in order to use the tile encoder, but I could not find the libstitch library. Please share it so that I have a chance to test it. Thanks for the great work.

Transcoder crash

[screenshot]
When I transcode more than 5,000 frames in one run, a crash occurs, and it seems to crash at the same location each time. SubEncoderSVT.cpp does not seem to exist anywhere in the project. How should I solve the crash?

Fail to link libDistributedEncoder.so

Hi:
I am trying to build FFmpeg for the server. I use the following configure command:
./configure --prefix=/home/felix/prog_work/intel_omaf/ --libdir=/home/felix/prog_work/intel_omaf/target/lib --enable-static --disable-shared --enable-gpl --enable-nonfree --disable-optimizations --disable-vaapi --enable-libDistributedEncoder
--extra-cflags="-I/home/felix/prog_work/intel_omaf/target/include/DistributedEncoder -I/home/felix/prog_work/intel_omaf/package/Immersive-Video-Sample-master/src/360SCVP"
--extra-ldflags="-L/home/felix/prog_work/intel_omaf/target/lib -lDistributedEncoder"

and then type:
make -j nproc

The following errors are generated:
LD ffmpeg_g
/usr/bin/ld: libavcodec/libavcodec.a(distributed_encoder.o): in function `de_init':
/home/felix/prog_work/intel_omaf/src/FFmpeg/libavcodec/distributed_encoder.c:245: undefined reference to `DistributedEncoder_Init'
/usr/bin/ld: libavcodec/libavcodec.a(distributed_encoder.o): in function `de_send_frame':
/home/felix/prog_work/intel_omaf/src/FFmpeg/libavcodec/distributed_encoder.c:299: undefined reference to `DistributedEncoder_Process'
/usr/bin/ld: /home/felix/prog_work/intel_omaf/src/FFmpeg/libavcodec/distributed_encoder.c:340: undefined reference to `DistributedEncoder_Process'
/usr/bin/ld: libavcodec/libavcodec.a(distributed_encoder.o): in function `de_receive_packet':
/home/felix/prog_work/intel_omaf/src/FFmpeg/libavcodec/distributed_encoder.c:362: undefined reference to `DistributedEncoder_GetPacket'
/usr/bin/ld: /home/felix/prog_work/intel_omaf/src/FFmpeg/libavcodec/distributed_encoder.c:374: undefined reference to `DistributedEncoder_GetPacket'
/usr/bin/ld: /home/felix/prog_work/intel_omaf/src/FFmpeg/libavcodec/distributed_encoder.c:411: undefined reference to `DistributedEncoder_GetParam'
/usr/bin/ld: libavcodec/libavcodec.a(distributed_encoder.o): in function `de_close':
/home/felix/prog_work/intel_omaf/src/FFmpeg/libavcodec/distributed_encoder.c:491: undefined reference to `DistributedEncoder_Destroy'
collect2: error: ld returned 1 exit status
make: *** [Makefile:108: ffmpeg_g] Error 1

Please help me. Thanks!
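Two checks that usually narrow down this kind of undefined-reference failure (a sketch; the library path is taken from the configure line above):

# confirm the shared library actually exports the symbols the plugin needs
nm -D /home/felix/prog_work/intel_omaf/target/lib/libDistributedEncoder.so | grep DistributedEncoder_
# confirm no stale copy of the library is being picked up from the default linker paths
ldconfig -p | grep -i distributedencoder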

How to run the OMAF_Sample?

Command: ./run.sh 4K LIVE HTTPS
The following is what I get when running the default run.sh:
[screenshot]
Then I modified this line to enable config_high.xml, and I get this:
[screenshot]
It still has errors. Maybe the arguments are wrong?

ERROR: DistributedEncoder not found using pkg-config

Hi, I've been trying to compile the server component for the OMAF sample, but I get the following error:

ERROR: DistributedEncoder not found using pkg-config

I've tried building the docker image and compiling from source code.
I've tried the latest git head as well as the release 1.8.0 source code.

My server is CentOS 7, and I've installed devtoolset-7 for the required compilers.

Any idea what the issue is?
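Since FFmpeg's configure reports "not found using pkg-config", a first sanity check is whether the DistributedEncoder .pc file is on the pkg-config search path (a sketch; the /usr/local prefix is an assumption and may differ on your system):

# can pkg-config resolve the module at all?
pkg-config --exists DistributedEncoder && echo found || echo missing
# locate the .pc file and add its directory to the search path before re-running configure
find /usr/local -name 'DistributedEncoder.pc' 2>/dev/null
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:$PKG_CONFIG_PATH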

OpenGL at the client side

I am trying to run ./render on my Ubuntu 22.04 machine, but when I run it, I get the following error:

root@ekpk:/home/khin/Immersive-Video-Sample/src/build/client/player/app# ./render
X Error of failed request: BadAccess (attempt to access private resource denied)
Major opcode of failed request: 152 (GLX)
Minor opcode of failed request: 5 (X_GLXMakeCurrent)
Serial number of failed request: 188
Current serial number in output stream: 188

I also checked the GLX version as follows:

root@ekpk:/home/khin/Immersive-Video-Sample/src/build/client/player/app# glxinfo |grep -i "version"
server glx version string: 1.4
client glx version string: 1.4
GLX version: 1.4
Version: 23.3.0
Max core profile version: 4.6
Max compat profile version: 4.6
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.2
OpenGL core profile version string: 4.6 (Core Profile) Mesa 23.3.0-devel
OpenGL core profile shading language version string: 4.60
OpenGL version string: 4.6 (Compatibility Profile) Mesa 23.3.0-devel
OpenGL shading language version string: 4.60
OpenGL ES profile version string: OpenGL ES 3.2 Mesa 23.3.0-devel
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
GL_EXT_shader_group_vote, GL_EXT_shader_implicit_conversions,

I am not sure whether my X window setup is supported by the client. Do you have any idea how to solve this?

Thank You.
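Two things worth checking here (a sketch; note that the prompt above shows ./render being run as root against display :0, which is typically owned by the regular desktop user):

# confirm direct rendering is available for the user that owns the X session
glxinfo | grep "direct rendering"
# if the player must run as root, allow root to use the local display
# (run this as the logged-in desktop user; a workaround, not a recommendation)
xhost +si:localuser:root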
