Description
Describe the issue
We are using onnxruntime-web and want to know if there is a configuration option to customize where the runtime loads its WASM files from. Currently they are fetched from the root of the app build, but we want to use a custom path.
Does anyone have an idea how to change this so the .wasm files can live in our own custom location?
We are trying to find a way to set custom WASM paths. ONNX Runtime Web does expose a setting for this: `env.wasm.wasmPaths` can be set to a URL prefix (or a per-file map) before any session is created. Without it, the files must be served from the default location the runtime expects.
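A sketch of pointing the runtime at a custom location via `env.wasm.wasmPaths` (available in recent onnxruntime-web versions); the `/assets/js/id-models/` prefix and `model.onnx` name below are assumed examples, not taken from this report:

```javascript
// Configure the WASM location BEFORE creating any InferenceSession.
// The trailing slash is required when using the prefix form.
import { env, InferenceSession } from "onnxruntime-web";

env.wasm.wasmPaths = "/assets/js/id-models/"; // assumed example URL

// Any session created afterwards fetches the .wasm binaries from that prefix.
const session = await InferenceSession.create("/assets/js/id-models/model.onnx");
```

The assignment must run before the first session is created, because the runtime resolves and fetches its WASM binaries during session initialization.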
To reproduce
We build the WASM files, but when we call inference the runtime only looks in the root location, even though we placed the .wasm files elsewhere.
```javascript
import cv from "@techstark/opencv-js";
import { Tensor, InferenceSession } from "onnxruntime-web";
import CryptoJS from "crypto-js";

window.liveIdLibraries = {
  Tensor,
  InferenceSession,
  CryptoJS,
  cv,
};
```
And then in our webpack config, we do this:
```javascript
new CopyPlugin({
  patterns: [
    {
      from: path.resolve(__dirname, "node_modules", "onnxruntime-web", "dist", "*.wasm"),
      to: "../html/assets/js/id-models/[name][ext]",
    },
  ],
}),
```
Urgency
No response
Platform
Windows
OS Version
web
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1
ONNX Runtime API
JavaScript
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response