
[Bug]: Compiler Crash "f16 is not representable as pointer to f32" during compile_model #33425

@zhihaoxu1325

Description


OpenVINO Version

2025.4.0

Operating System

Ubuntu 20.04 (LTS)

Device used for inference

CPU

Framework

None

Model used

No response

Issue description

I encountered a hard crash (C++ exception) during core.compile_model when processing an ONNX model that contains an If operator and float16 constants.

The error message "Tensor data with element type f16, is not representable as pointer to f32" suggests that an internal optimization pass (likely constant folding or subgraph handling within the If control flow) is attempting to access float16 tensor data through a float* (f32) pointer, tripping the type-safety check in make_tensor.cpp.

read_model: Succeeds.

compile_model: Fails immediately with the exception.
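
One experiment that may help narrow this down is forcing the CPU plugin's inference precision to f32, in case the failing pass is tied to the f16 execution path. This is only a sketch assuming the standard openvino.properties.hint API; I have not verified whether it avoids the crash.

import openvino as ov
import openvino.properties.hint as hints

core = ov.Core()
ov_model = core.read_model("openvino_if_f16_bug_repro.onnx")

# Experiment: ask the CPU plugin to run in f32, in case the crash is tied to
# f16 handling during compilation. Whether this sidesteps the failing pass
# is unverified.
compiled = core.compile_model(ov_model, "CPU",
                              {hints.inference_precision: ov.Type.f32})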

Step-by-step reproduction

Download the attached minimal ONNX reproduction model (openvino_if_f16_bug_repro.onnx).

Context: The model has been reduced to ~40 nodes. It involves Mul/Div operations using an FP16 initializer (v67_0) that likely interacts with an If block (an inspection sketch follows the attachment below).

Run the following script:

import openvino as ov

# Reproduction script for OpenVINO If + f16 constant bug
# Bug: compile_model fails with "f16 is not representable as pointer to f32"

model_path = "openvino_if_f16_bug_repro.onnx"

core = ov.Core()

# read_model succeeds
print("Testing read_model...")
ov_model = core.read_model(model_path)
print(f"  Success: {len(ov_model.inputs)} inputs, {len(ov_model.outputs)} outputs")

# compile_model fails
print("\nTesting compile_model...")
try:
    compiled = core.compile_model(ov_model, "CPU")
    print("  Success")
except Exception as e:
    print(f"  Failed with error:")
    print(f"  {e}")

openvino_if_f16_bug_repro.onnx.zip
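
For reference, the FP16 initializer and the If node in the attached model can be double-checked with a short inspection script. This is only an illustrative sketch using the standard onnx Python package; the initializer name v67_0 comes from the description above.

import onnx
from onnx import TensorProto

# Illustrative inspection of the attached reproduction model (not the repro itself).
model = onnx.load("openvino_if_f16_bug_repro.onnx")

# List all FP16 initializers; the description above mentions v67_0.
fp16_inits = [init.name for init in model.graph.initializer
              if init.data_type == TensorProto.FLOAT16]
print("FP16 initializers:", fp16_inits)

# Confirm that the top-level graph contains an If node (with then/else subgraphs).
if_nodes = [node.name for node in model.graph.node if node.op_type == "If"]
print("If nodes:", if_nodes)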

Relevant log output

Exception from src/inference/src/cpp/core.cpp:109:
Exception from src/inference/src/dev/plugin.cpp:53:
Exception from src/core/src/runtime/tensor.cpp:121:
Check 'is_pointer_representable(element_type)' failed at src/inference/src/dev/make_tensor.cpp:79:
Tensor data with element type f16, is not representable as pointer to f32

Issue submission checklist

  • I'm reporting an issue. It's not a question.
  • I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • There is reproducer code and related data files such as images, videos, models, etc.
