Comments (23)

fs-eire avatar fs-eire commented on July 18, 2024 1

Thank you for reporting this issue. I will try to figure out how to fix this problem.

from onnxruntime.

fs-eire avatar fs-eire commented on July 18, 2024 1

#20991 changes the default ESM import to avoid dynamic import; hopefully this change fixes the problem. The PR is still in progress.

fs-eire avatar fs-eire commented on July 18, 2024 1

Hey, I also need this. I am struggling with importing this version. So far I have been importing ONNX using import * as ort from "https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/esm/ort.webgpu.min.js". However, when I change to import * as ort from "https://cdn.jsdelivr.net/npm/[email protected]/dist/esm/ort.webgpu.min.js" it seems not to have an .../esm/ folder. Do you know why that is and how to import it then?

Just replacing .../esm/ort.webgpu.min.js with .../ort.webgpu.min.mjs should work. If you are also using a service worker, use ort.webgpu.bundle.min.mjs instead of ort.webgpu.min.mjs.
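For reference, the two import forms side by side (a sketch; the exact dist-folder layout is inferred from the comment above and may differ between builds):

```javascript
// Pre-1.19 dev builds shipped the ESM bundle under dist/esm/:
// import * as ort from "https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/esm/ort.webgpu.min.js";

// Newer dev builds ship ESM as .mjs files instead:
import * as ort from "https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/ort.webgpu.min.mjs";

// In a service worker, use the bundled variant:
// import * as ort from "https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/ort.webgpu.bundle.min.mjs";
```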

fs-eire avatar fs-eire commented on July 18, 2024

So it turns out that dynamic import (i.e. import()) and top-level await are not supported in current service workers. I was not expecting import() to be banned in service workers.

Currently, the WebAssembly factory (wasm-factory.ts) uses dynamic import to load the JS glue. This does not work in a service worker, and a few potential workarounds are also unavailable:

  • Converting it to a static import statement: won't work, because the JS glue uses top-level await.
  • Using importScripts: won't work, because the JS glue is ESM.
  • Using eval: won't work, for the same reason as importScripts.

I am now trying to build a JS bundle that does not use dynamic import, specifically for service worker usage. Still working on it.
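One way to see the limitation concretely: a small feature check for dynamic import support (a hypothetical helper, not part of ORT Web). In a classic service worker the import() call below throws, while it succeeds in a normal page or module worker:

```javascript
// Hypothetical probe: does this JS context support dynamic import()?
async function supportsDynamicImport() {
  try {
    // Import an empty module from a data: URL to keep the probe self-contained.
    await import(/* webpackIgnore: true */ "data:text/javascript,");
    return true;
  } catch {
    return false; // e.g. thrown inside a classic service worker
  }
}
```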

ggaabe avatar ggaabe commented on July 18, 2024

Thanks, I appreciate your efforts around this. It does seem like some special-case bundle will need to be built after all; you might need iife or umd for the bundler output format

fs-eire avatar fs-eire commented on July 18, 2024

Thanks, I appreciate your efforts around this. It does seem like some special-case bundle will need to be built after all; you might need iife or umd for the bundler output format

I have considered this option. However, Emscripten does not offer an option to output both UMD (IIFE+CJS) and ESM for the JS glue (emscripten-core/emscripten#21899), so I had to choose one. I chose the ES6 format for the JS glue, because importing UMD from ESM causes a couple of problems, while import() is the standard way to import ESM from both ESM and UMD (until this issue revealed it does not work in service workers).

I found a way to make ORT Web work; yes, this needs the build script to do some special handling. It will only work for ESM, because the JS glue is ESM and there seems to be no way to import ESM from UMD in a service worker.

fs-eire avatar fs-eire commented on July 18, 2024

@ggaabe Could you please try import * as ort from "./ort.webgpu.bundle.min.js" with version 1.19.0-dev.20240604-3dd6fcc089?

ggaabe avatar ggaabe commented on July 18, 2024

@fs-eire my project depends on transformers.js, which imports the onnxruntime-web WebGPU backend like this:

https://github.com/xenova/transformers.js/blob/v3/src/backends/onnx.js#L24

Is this the right usage? In my project I've added this to my package.json to resolve onnxruntime-web to this new version, though the issue is still occurring:

  "overrides": {
    "onnxruntime-web": "1.19.0-dev.20240604-3dd6fcc089"
  }

ggaabe avatar ggaabe commented on July 18, 2024

Maybe also important: the same error is still occurring in the same spot, in InferenceSession in the onnx package, not in transformers.js. Do I need to add a resolver for onnxruntime-common as well?

ggaabe avatar ggaabe commented on July 18, 2024

Hi @fs-eire, is the newly-merged fix in a released build I can try?

fs-eire avatar fs-eire commented on July 18, 2024

Please try 1.19.0-dev.20240612-94aa21c3dd

ggaabe avatar ggaabe commented on July 18, 2024

@fs-eire EDIT: Never mind the comment I just deleted; that error was because I didn't set the webpack target to webworker.

However, I'm getting a new error now (progress!):

Error: no available backend found. ERR: [webgpu] RuntimeError: null function or function signature mismatch

ggaabe avatar ggaabe commented on July 18, 2024

Update: I found the error is happening here:

if (!isInitializing) {
  backendInfo.initPromise = backendInfo.backend.init(backendName);
}
await backendInfo.initPromise;

For some reason the webgpu backend.init promise is rejecting due to the null function or function signature mismatch error. This is much further along than we were before though.
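For context, a simplified, hypothetical reconstruction of the backend-resolution pattern the snippet above comes from (not the real onnxruntime-common source) shows why a failed init() surfaces at the await: the init promise is created once and cached, so every later resolution sees the same rejection.

```javascript
// Simplified, hypothetical sketch of backend registration/resolution;
// names mirror the quoted snippet but this is not the real implementation.
const backends = new Map();

function registerBackend(name, backend) {
  backends.set(name, { backend, initPromise: undefined });
}

async function resolveBackend(backendName) {
  const backendInfo = backends.get(backendName);
  if (!backendInfo) {
    throw new Error(`no available backend found. ERR: [${backendName}]`);
  }
  if (!backendInfo.initPromise) {
    // init() runs once; the promise is cached for every later caller.
    backendInfo.initPromise = backendInfo.backend.init(backendName);
  }
  // A rejected init() (e.g. "null function or function signature mismatch"
  // from the wasm runtime) rejects here on every resolution attempt.
  await backendInfo.initPromise;
  return backendInfo.backend;
}
```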

fs-eire avatar fs-eire commented on July 18, 2024

Update: I found the error is happening here:

if (!isInitializing) {
  backendInfo.initPromise = backendInfo.backend.init(backendName);
}
await backendInfo.initPromise;

For some reason the webgpu backend.init promise is rejecting due to the null function or function signature mismatch error. This is much further along than we were before though.

Could you share the steps to reproduce?

ggaabe avatar ggaabe commented on July 18, 2024

@fs-eire You'll need to run the WebGPU setup in a Chrome extension.

  1. Use the code I just published here: https://github.com/ggaabe/extension
  2. Run npm install
  3. Run npm run build
  4. Open Chrome's extension management page
  5. Click "Load unpacked"
  6. Select the build folder from the repo
  7. Open the "AI WebGPU Extension" extension
  8. Type some text in the text input; it will load Phi-3 mini, and after it finishes loading, this error will occur
  9. If you view the extension in the extension manager and click the "Inspect views: service worker" link before opening the extension, it will bring up an inspection window where you can view the errors as they occur. A little "Errors" bubble link also shows up there after they occur
  10. You will need to click the "Refresh" button on the extension in the extension manager to rerun the error, because it does not attempt to reload the model after the first attempt until another refresh

fs-eire avatar fs-eire commented on July 18, 2024

@ggaabe I did some debugging on my box and made some fixes -

  1. Changes to ONNX Runtime Web:

    #21073 is created to make sure the WebAssembly file can be loaded correctly when env.wasm.wasmPaths is not specified.

  2. Changes to https://github.com/ggaabe/extension:

    ggaabe/extension#1 needs to be applied to the extension example to make it load the model correctly. Please note:

    • The onnxruntime-web version needs to be updated to consume the changes from (1) (after they get merged and published to the dev channel).
    • There are still errors in background.js, which look like incorrect params being passed to tokenizer.apply_chat_template(). However, the WebAssembly is initialized and the model loads successfully.

  3. Other issues:

    • Transformers.js overrides env.wasm.wasmPaths to a CDN URL internally. At least for this example, we don't want this behavior, so we need to reset it to undefined to keep the default behavior.
    • The multi-threaded CPU EP is not supported because Worker is not accessible in a service worker. Issue tracking: whatwg/html#8362
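To make the wasmPaths reset concrete, a sketch of what the extension code would do. The property path env.backends.onnx.wasm.wasmPaths and the @xenova/transformers import are assumptions; a stand-in object replaces the real env here so the snippet is self-contained:

```javascript
// Stand-in for transformers.js's env object (real code would start with
//   import { env } from "@xenova/transformers";  // assumed package name
// ). Transformers.js sets wasmPaths to a CDN URL internally:
const env = {
  backends: { onnx: { wasm: { wasmPaths: "https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/" } } },
};

// Reset to undefined so onnxruntime-web keeps its default loading behavior.
env.backends.onnx.wasm.wasmPaths = undefined;
```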

ggaabe avatar ggaabe commented on July 18, 2024

Awesome, thank you for your thoroughness in explaining this and tackling this head on. Is there a dev channel version I can test out?

fs-eire avatar fs-eire commented on July 18, 2024

Not yet. Will update here once it is ready.

ggaabe avatar ggaabe commented on July 18, 2024

Sorry to bug you; is there a dev build number yet? I wasn't sure how often releases run.

fs-eire avatar fs-eire commented on July 18, 2024

Sorry to bug you; is there a dev build number yet? I wasn't sure how often releases run.

Please try 1.19.0-dev.20240621-69d522f4e9

ggaabe avatar ggaabe commented on July 18, 2024

@fs-eire I'm getting one new error:

ort.webgpu.bundle.min.mjs:6 Uncaught (in promise) Error: The data is not on CPU. Use `getData()` to download GPU data to CPU, or use `texture` or `gpuBuffer` property to access the GPU data directly.
    at get data (ort.webgpu.bundle.min.mjs:6:13062)
    at get data (tensor.js:62:1)

I pushed the code changes to my repo and fixed the call to the tokenizer. To reproduce, just type 1 letter in the chrome extension’s text input and wait

nickl1234567 avatar nickl1234567 commented on July 18, 2024

Hey, I also need this. I am struggling with importing this version. So far I have been importing ONNX using
import * as ort from "https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/esm/ort.webgpu.min.js".
However, when I change to import * as ort from "https://cdn.jsdelivr.net/npm/[email protected]/dist/esm/ort.webgpu.min.js" it seems not to have an .../esm/ folder. Do you know why that is and how to import it then?

fs-eire avatar fs-eire commented on July 18, 2024

@fs-eire I'm getting one new error:

ort.webgpu.bundle.min.mjs:6 Uncaught (in promise) Error: The data is not on CPU. Use `getData()` to download GPU data to CPU, or use `texture` or `gpuBuffer` property to access the GPU data directly.
    at get data (ort.webgpu.bundle.min.mjs:6:13062)
    at get data (tensor.js:62:1)

I pushed the code changes to my repo and fixed the call to the tokenizer. To reproduce, just type 1 letter in the chrome extension’s text input and wait

This may be a problem with transformers.js. Could you check whether this problem happens in a normal page? If so, please report the issue to transformers.js. If it only happens in a service worker, I can take a closer look.
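For illustration, a minimal mock of the tensor-location check behind this error (a hypothetical simplification, not the actual onnxruntime-common Tensor): the synchronous .data getter throws for GPU-resident tensors, while getData() performs the download first.

```javascript
// Hypothetical, simplified Tensor mock illustrating the CPU/GPU location check.
class MockTensor {
  constructor(location, cpuData) {
    this.location = location; // "cpu" or "gpu-buffer"
    this.cpuData = cpuData;   // undefined while the data lives on the GPU
  }

  get data() {
    if (this.location !== "cpu") {
      throw new Error(
        "The data is not on CPU. Use `getData()` to download GPU data to CPU, " +
        "or use `texture` or `gpuBuffer` property to access the GPU data directly.");
    }
    return this.cpuData;
  }

  async getData() {
    if (this.location !== "cpu") {
      // Real ORT Web issues an async GPU readback here; we just simulate it.
      this.cpuData = new Float32Array([1, 2, 3]);
      this.location = "cpu";
    }
    return this.cpuData;
  }
}
```

If the pattern holds, the consuming code (possibly inside transformers.js here) should await getData() on possibly-GPU-resident outputs rather than reading .data synchronously.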
