
Llama model with Core ML backend doesn't work on iPhone 13 Pro #3712

Closed
@seayoung1112

Description

First, the same test worked fine on iPhone 15/14, so this is very likely an issue specific to the iPhone 13.

Test setup:

Llama ExecuTorch iOS app, built locally
iPhone 13 Pro, iOS 17.4

The Llama model was exported using the following command:
python -m examples.models.llama2.export_llama -c <model.ckpt> -p <model_params.json> --coreml --use_kv_cache
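
For reference, a concrete invocation might look like the sketch below; the llama-2-7b/... paths are hypothetical placeholders for a locally downloaded checkpoint and params file, while the flags are exactly those from the command above:

python -m examples.models.llama2.export_llama \
    -c llama-2-7b/consolidated.00.pth \  # hypothetical path to the model checkpoint
    -p llama-2-7b/params.json \          # hypothetical path to the model params file
    --coreml \
    --use_kv_cache

The export should produce a .pte file containing the Core ML-delegated program, which the iOS app then loads at runtime.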

The app crashed with the following error log:

Compiler failed with XPC_ERROR_CONNECTION_INTERRUPTED
Compiler failed with XPC_ERROR_CONNECTION_INTERRUPTED
Compiler failed with XPC_ERROR_CONNECTION_INTERRUPTED
MTLCompiler: Compilation failed with XPC_ERROR_CONNECTION_INTERRUPTED on 3 try
/Library/Caches/com.apple.xbs/Sources/MetalPerformanceShaders/MPSCore/Utility/MPSLibrary.mm, line 491: error 'MPSLibrary::MPSKey_Create internal error: Unable to get MPS kernel NDArrayVectorMatrixMultiply. Error: Compiler encountered an internal error
'
/Library/Caches/com.apple.xbs/Sources/MetalPerformanceShaders/MPSCore/Utility/MPSLibrary.mm:491: failed assertion `MPSLibrary::MPSKey_Create internal error: Unable to get MPS kernel NDArrayVectorMatrixMultiply. Error: Compiler encountered an internal error

Metadata

Labels

module: coreml (issues related to Apple's Core ML delegation and code under backends/apple/coreml/)
