
Comments (9)

buptqq commented on May 14, 2024

Did you clone the repo with 'git clone --recursive'?
If yes, you can find the cmake/config.cmake file under this directory: 'TPAT/3rdparty/blazerml-tvm'.
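
For reference, a minimal check, assuming the repo address given later in this thread:

# Clone TPAT together with its submodules; --recursive pulls in 3rdparty/blazerml-tvm
git clone --recursive https://github.com/Tencent/TPAT.git
# The TVM build config should then be visible here:
ls TPAT/3rdparty/blazerml-tvm/cmake/config.cmake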


heluocs commented on May 14, 2024

It seems the document clones the TensorRT code and renames the directory to TPAT, rather than cloning the actual TPAT repository?

git clone -b master https://github.com/nvidia/TensorRT TPAT && cd TPAT && git submodule update --init --recursive


buptqq commented on May 14, 2024

Sorry, our mistake. It should be the TPAT repository:
git clone --recursive https://github.com/Tencent/TPAT.git
We have updated the repo address in the document. Thanks.
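
If a clone already exists without the submodule, it should be recoverable in place; a sketch based on the commands from the Dockerfile shown later in this thread:

# Inside an existing TPAT checkout that was cloned without --recursive:
git submodule update --init --recursive
# Then prepare the TVM build directory the same way the Dockerfile does:
cd 3rdparty/blazerml-tvm && mkdir build && cp cmake/config.cmake build && cd build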


LegendSun0 commented on May 14, 2024

Excuse me, did you get it running? I ran into some problems during installation. Would you mind leaving contact information so I can ask you about it?


buptqq commented on May 14, 2024

Excuse me, did you get it running? I ran into some problems during installation. Would you mind leaving contact information so I can ask you about it?
[email protected]


GeneralJing commented on May 14, 2024

I can't build the Docker image.
Sending build context to Docker daemon 401.9kB
Step 1/9 : FROM nvcr.io/nvidia/tensorflow:20.06-tf1-py3
---> 61568efc3e0e
Step 2/9 : RUN wget -O "llvm-9.0.1.src.tar.xz" https://github.com/llvm/llvm-project/releases/download/llvmorg-9.0.1/llvm-9.0.1.src.tar.xz && tar -xvf llvm-9.0.1.src.tar.xz && mkdir llvm-9.0.1.src/build && cd llvm-9.0.1.src/build && cmake -G "Unix Makefiles" -DLLVM_TARGETS_TO_BUILD=X86 -DCMAKE_BUILD_TYPE="Release" -DCMAKE_INSTALL_PREFIX="/usr/local/llvm" .. && make -j8 && make install PREFIX="/usr/local/llvm"
---> Using cache
---> 81790a52d1ca
Step 3/9 : RUN pip install pycuda onnx nvidia-pyindex && pip install onnx-graphsurgeon onnxruntime tf2onnx xgboost
---> Using cache
---> 7b262fc70ae1
Step 4/9 : RUN git clone --recursive https://github.com/Tencent/TPAT.git /workspace/TPAT && cd /workspace/TPAT/3rdparty/blazerml-tvm && mkdir build && cp cmake/config.cmake build && cd build
---> Running in 6cbc94a98aad
Cloning into '/workspace/TPAT'...
Submodule '3rdparty/blazerml-tvm' (https://github.com/Tencent/BlazerML-tvm.git) registered for path '3rdparty/blazerml-tvm'
Cloning into '/workspace/TPAT/3rdparty/blazerml-tvm'...
fatal: unable to access 'https://github.com/Tencent/BlazerML-tvm.git/': Failed to connect to github.com port 443: Connection timed out
fatal: clone of 'https://github.com/Tencent/BlazerML-tvm.git' into submodule path '/workspace/TPAT/3rdparty/blazerml-tvm' failed
Failed to clone '3rdparty/blazerml-tvm'. Retry scheduled
Cloning into '/workspace/TPAT/3rdparty/blazerml-tvm'...
fatal: unable to access 'https://github.com/Tencent/BlazerML-tvm.git/': Failed to connect to github.com port 443: Connection timed out
fatal: clone of 'https://github.com/Tencent/BlazerML-tvm.git' into submodule path '/workspace/TPAT/3rdparty/blazerml-tvm' failed
Failed to clone '3rdparty/blazerml-tvm' a second time, aborting
The command '/bin/sh -c git clone --recursive https://github.com/Tencent/TPAT.git /workspace/TPAT && cd /workspace/TPAT/3rdparty/blazerml-tvm && mkdir build && cp cmake/config.cmake build && cd build' returned a non-zero code: 1


buptqq commented on May 14, 2024

can't build docker image. (full Docker build log quoted above)

Can you check your Git environment? The Docker image build clones TPAT's submodule from 'https://github.com/Tencent/BlazerML-tvm.git'.
Tip: try unsetting git's proxy settings.
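
A rough checklist for that suggestion (the image tag used below is just a placeholder, not from the project):

# See whether git has a proxy configured, and drop it if it points at an unreachable host:
git config --global --get http.proxy
git config --global --get https.proxy
git config --global --unset http.proxy
git config --global --unset https.proxy
# If the host can reach github.com but the build container cannot,
# sharing the host network during the build sometimes helps:
docker build --network=host -t tpat .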


GeneralJing commented on May 14, 2024

It may be an issue with settings inside the Docker image. I cloned the repository outside the container, then ran the remaining Dockerfile commands inside the running container, and that worked. Regarding the 'Plugin Compiler Env' section of the documentation:
And export TensorRT/include to Environment Variables : CPLUS_INCLUDE_PATH and C_INCLUDE_PATH:
How should this be set? Running dpkg -l | grep TensorRT inside the container shows the following output:
ii graphsurgeon-tf 7.1.2-1+cuda11.0 amd64 GraphSurgeon for TensorRT package
ii libnvinfer-bin 7.1.2-1+cuda11.0 amd64 TensorRT binaries
ii libnvinfer-dev 7.1.2-1+cuda11.0 amd64 TensorRT development libraries and headers
ii libnvinfer-plugin-dev 7.1.2-1+cuda11.0 amd64 TensorRT plugin libraries
ii libnvinfer-plugin7 7.1.2-1+cuda11.0 amd64 TensorRT plugin libraries
ii libnvinfer7 7.1.2-1+cuda11.0 amd64 TensorRT runtime libraries
ii libnvonnxparsers-dev 7.1.2-1+cuda11.0 amd64 TensorRT ONNX libraries
ii libnvonnxparsers7 7.1.2-1+cuda11.0 amd64 TensorRT ONNX libraries
ii libnvparsers-dev 7.1.2-1+cuda11.0 amd64 TensorRT parsers libraries
ii libnvparsers7 7.1.2-1+cuda11.0 amd64 TensorRT parsers libraries
ii python3-libnvinfer 7.1.2-1+cuda11.0 amd64 Python 3 bindings for TensorRT
ii python3-libnvinfer-dev 7.1.2-1+cuda11.0 amd64 Python 3 development package for TensorRT
ii uff-converter-tf 7.1.2-1+cuda11.0 amd64 UFF converter for TensorRT package
TensorRT is already installed inside the container, right? So how should these two variables be set?
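
Regarding the workaround described above (cloning on the host and finishing the build inside the container), a minimal sketch assuming the base image from Step 1 of the Dockerfile:

# On the host, where github.com is reachable:
git clone --recursive https://github.com/Tencent/TPAT.git
# Start the base image with the clone mounted in, then run the remaining
# Dockerfile commands manually inside the container:
docker run --gpus all -it -v $(pwd)/TPAT:/workspace/TPAT nvcr.io/nvidia/tensorflow:20.06-tf1-py3 bash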


buptqq commented on May 14, 2024

TensorRT is already installed inside the container, right? So how should these two variables be set? (full question quoted above)

If you use the Dockerfile, the NVIDIA base image already has this. If you set up your own environment, the variables are needed to compile the plugin .cu and .h files. Of course, if you modify trt_path in python/trt_plugin/Makefile, you don't have to add the headers to the C and C++ include paths.
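
For a self-built environment, a hedged sketch of setting those two variables; the include directory below is where the deb packages listed above usually put NvInfer.h, so verify it first (editing trt_path in python/trt_plugin/Makefile is the alternative mentioned above):

# Find where the TensorRT dev package installed its headers:
dpkg -L libnvinfer-dev | grep NvInfer.h
# Then export that directory (commonly /usr/include/x86_64-linux-gnu) for C and C++:
export C_INCLUDE_PATH=/usr/include/x86_64-linux-gnu:$C_INCLUDE_PATH
export CPLUS_INCLUDE_PATH=/usr/include/x86_64-linux-gnu:$CPLUS_INCLUDE_PATH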

