Bump protobuf from 4.25.8 to 5.29.6 in /onnxruntime/python/tools/transformers/models/llama #10219

Status: Success
Total duration: 3h 4m 48s
Artifacts: 1
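
The PR itself is a one-line dependency bump in the requirements file named in the title. As a quick way to confirm a test environment actually resolved the new release, a minimal Python check like the sketch below can be used; the exact pin style (==) is an assumption, and only the target version 5.29.6 comes from the PR title.

    # Sanity check that the environment picked up the bumped protobuf release.
    # Assumption: protobuf is installed from the updated requirements.txt and
    # pinned exactly; only the version number 5.29.6 comes from the PR title.
    import importlib.metadata

    installed = importlib.metadata.version("protobuf")
    print(f"protobuf installed: {installed}")
    assert installed == "5.29.6", f"expected protobuf 5.29.6, found {installed}"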

windows_tensorrt.yml

on: pull_request
Windows GPU TensorRT CI Pipeline (38m 0s)
Windows GPU TensorRT CI Pipeline Test Job (37m 9s)

Annotations

6 warnings
Windows GPU TensorRT CI Pipeline: onnxruntime/core/mlas/lib/amd64/QgemmU8X8KernelAvx2.asm
epilog offset from end of function exceeds 4095 (reported at L1199, L1206, L1213, L1220, L1227, L1234)

Artifacts

Produced during runtime
Name: build-artifacts (Expired)
Size: 1.89 GB
Digest: sha256:6fc2451ecdfaf21fd76e3a93a97d3e19de88159a061189921b9c0921064ac73e
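
For completeness, the listed digest can be checked against a downloaded copy of the artifact. This particular artifact has expired, so the snippet below is only a sketch: the local filename build-artifacts.zip is an assumption, and whether the listed digest covers the zip as downloaded or the archive as uploaded depends on how the artifact was produced.

    # Compare a downloaded artifact against the sha256 digest listed above.
    # Assumption: the artifact was saved locally as build-artifacts.zip; the
    # expected digest is copied verbatim from the run page.
    import hashlib

    EXPECTED = "6fc2451ecdfaf21fd76e3a93a97d3e19de88159a061189921b9c0921064ac73e"

    def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    digest = sha256_of("build-artifacts.zip")
    assert digest == EXPECTED, f"digest mismatch: {digest}"
    print("build-artifacts digest verified")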