godaji / tensorrt-inference-server
This project is forked from triton-inference-server/server.
The TensorRT Inference Server provides a cloud inferencing solution optimized for NVIDIA GPUs.
License: BSD 3-Clause "New" or "Revised" License