Comments (4)
If we exposed direct creation of GPU tensors we'd need a data copy somewhere. I'm currently not sure where the best place for that copy is, or whether GPU tensors should be distinguished somehow in the type system to make it clear that they don't allow direct access to their memory.
Is the use case in a pipeline of models all resident on the GPU?
from onnxruntime.
Yup, the models are on the GPU. In my case it's not direct creation of a GPU tensor, but rather wrapping a raw resource: a D3D11 texture converted into a CUDA resource with CUDA's Direct3D 11 Interoperability. CreateTensorWithDataAsOrtValue is used for this. I don't think the C# API makes any distinction between an OrtValue backed by a GPU buffer and one backed by a CPU buffer. In fact, calling GetTensorMutableRawData on a GPU-backed OrtValue throws a System.AccessViolationException. I personally think having a distinction in the type system would be nice.
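A type-level split like the one suggested above could look something like this hypothetical sketch (the names `HostTensor` and `DeviceTensor` are invented for illustration; the real onnxruntime APIs expose a single `OrtValue`/`OnnxTensor` type). The host-backed tensor exposes its buffer, while the device-backed one deliberately has no data accessor, so reading device memory from the host becomes a compile-time error instead of an `AccessViolationException`:

```java
// Hypothetical sketch: distinguishing host- and device-backed tensors in the
// type system. HostTensor/DeviceTensor are invented names, not onnxruntime API.
public class TensorKinds {
    /** Metadata common to both kinds of tensor. */
    public interface Tensor {
        long[] shape();
    }

    /** Host-backed: direct access to the underlying data is safe. */
    public static final class HostTensor implements Tensor {
        private final float[] data;
        private final long[] shape;
        public HostTensor(float[] data, long[] shape) { this.data = data; this.shape = shape; }
        public long[] shape() { return shape; }
        public float[] data() { return data; }   // only HostTensor has this accessor
    }

    /** Device-backed: holds an opaque handle and no host-side data accessor. */
    public static final class DeviceTensor implements Tensor {
        private final long deviceHandle;         // e.g. a CUDA device pointer
        private final long[] shape;
        public DeviceTensor(long deviceHandle, long[] shape) { this.deviceHandle = deviceHandle; this.shape = shape; }
        public long[] shape() { return shape; }
        /** Reading device memory requires an explicit device-to-host copy. */
        public HostTensor copyToHost() {
            // Stand-in for a real device->host transfer (cudaMemcpy etc.).
            float[] staged = new float[(int) shape()[0]];
            return new HostTensor(staged, shape());
        }
    }

    public static void main(String[] args) {
        DeviceTensor d = new DeviceTensor(0xDEADBEEFL, new long[]{4});
        // d.data() would not compile -- the accessor simply does not exist,
        // so the crash-at-runtime case is ruled out statically.
        HostTensor h = d.copyToHost();
        System.out.println(h.data().length);
    }
}
```

This is the same design choice either binding could make: keep one value type and fail at runtime, or pay for a second type and make the required copy explicit in the API.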
Ok, so if you've made the tensor elsewhere via JNI, what kind of type would it be, to allow wrapping in an OrtValue? Is it a bare pointer/long?
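With CUDA interop the external resource typically does surface on the Java side as a bare address, i.e. a `long` carried across JNI, plus the metadata needed to describe the buffer. A minimal hypothetical wrapper for such a handle (invented for illustration; not part of the onnxruntime Java API) might look like:

```java
// Hypothetical sketch of the kind of type a JNI-produced device buffer could
// have on the Java side: a raw address (long) plus enough metadata to later
// wrap it in an OrtValue, mirroring CreateTensorWithDataAsOrtValue in the C API.
public class DeviceBuffer {
    private final long address;      // raw device pointer returned by native code
    private final long byteLength;   // size of the allocation in bytes

    public DeviceBuffer(long address, long byteLength) {
        if (address == 0) throw new IllegalArgumentException("null device pointer");
        this.address = address;
        this.byteLength = byteLength;
    }

    public long address() { return address; }
    public long byteLength() { return byteLength; }

    /** Byte length of a float tensor with the given shape (overflow-checked). */
    public static long floatBytes(long... shape) {
        long n = Float.BYTES;
        for (long d : shape) n = Math.multiplyExact(n, d);
        return n;
    }
}
```

The producing side would then be a native method declared roughly as `native long mapResource(...)`; validating the address and byte length at the wrapping boundary is what keeps a bad handle from turning into a crash deep inside a Run call.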
Do not forget about disposing native resources.
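The disposal point applies in both bindings: the C# OrtValue is IDisposable and the Java tensor types are AutoCloseable, so the wrapper should be released deterministically rather than left to the garbage collector. A stand-in sketch of the pattern (`NativeTensor` here is a hypothetical stand-in for a native-memory wrapper, not an onnxruntime class):

```java
// Sketch of deterministic disposal with try-with-resources. NativeTensor is a
// hypothetical stand-in for a wrapper around native memory; the real
// onnxruntime classes implement AutoCloseable (Java) / IDisposable (C#).
public class NativeTensor implements AutoCloseable {
    private long handle;             // native allocation handle, 0 once released

    public NativeTensor(long handle) { this.handle = handle; }

    public boolean isClosed() { return handle == 0; }

    @Override
    public void close() {
        if (handle != 0) {
            // Free the native allocation here (e.g. the native release call).
            handle = 0;              // idempotent: safe to call close() twice
        }
    }

    /** Demonstrates that close() runs when the try block exits. */
    public static boolean demo() {
        NativeTensor observed;
        try (NativeTensor t = new NativeTensor(42L)) {
            observed = t;            // use the tensor inside the block
        }                            // close() runs here, even on exception
        return observed.isClosed();
    }
}
```

Wrapping externally owned memory adds a wrinkle: the OrtValue releases only its own bookkeeping, so the underlying device resource (the mapped D3D11/CUDA buffer in the case above) still needs its own release call, in the right order.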