
[Javascript] InferenceSession on WebGL #20224

Open
@shimaamorsy

Description

Describe the issue

When I tried to run an InferenceSession on the WebGL execution provider, I encountered this error:

[Screenshot: WebGL error]

To reproduce

  1. Download the YOLOv8n ONNX model here: MODEL
  2. Run this HTML page from a web server (e.g. Live Server in Visual Studio Code):

```html
<script src="https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/ort.webgl.min.js"></script>
<script>
  (async () => {
    const modelInputShape = [1, 3, 640, 640]; // default YOLOv8n input shape
    let model = await ort.InferenceSession.create("yolov8n.onnx", { executionProviders: ["webgl"] });
    const tensor = new ort.Tensor("float32", new Float32Array(modelInputShape.reduce((a, b) => a * b)), modelInputShape);
    await model.run({ images: tensor });
  })();
</script>
```
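For comparison, here is a minimal sketch of the same run on the wasm execution provider, which can help show whether the failure is specific to the WebGL backend (this is an assumed diagnostic, not part of the original run; ort.min.js is the full onnxruntime-web bundle, and [1, 3, 640, 640] is the usual YOLOv8n input shape):

```html
<script src="https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/ort.min.js"></script>
<script>
  (async () => {
    // Assumed YOLOv8n input shape: batch 1, 3 channels, 640x640.
    const modelInputShape = [1, 3, 640, 640];
    try {
      // Same model, but with the wasm (CPU) execution provider for comparison.
      const session = await ort.InferenceSession.create("yolov8n.onnx", {
        executionProviders: ["wasm"],
      });
      const data = new Float32Array(modelInputShape.reduce((a, b) => a * b));
      const output = await session.run({ images: new ort.Tensor("float32", data, modelInputShape) });
      console.log("wasm run succeeded:", Object.keys(output));
    } catch (err) {
      // If this also fails, the problem is not limited to the WebGL backend.
      console.error("wasm run failed:", err);
    }
  })();
</script>
```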

Urgency

Yes, I need to resolve this error immediately.

Platform

Windows

OS Version

10

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.17.1

ONNX Runtime API

Python

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

WebGL

Model File

No response

Is this a quantized model?

No

Labels

api:Javascript (issues related to the Javascript API), platform:web (issues related to ONNX Runtime web; typically submitted using template), platform:windows (issues related to the Windows platform), stale (issues that have not been addressed in a while; categorized by a bot)
