Description
Describe the documentation issue
It's not clear what the execution provider strings should be in the Python API.
E.g. I want to enable DirectML or CoreML. I can see them listed here: https://onnxruntime.ai/docs/execution-providers/
But I don't see any mention of the exact string value that should be passed, e.g. CUDAExecutionProvider.
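For reference, this is roughly what I'm trying to write. `get_available_providers()` and the `providers` argument to `InferenceSession` are real parts of the Python API, and `CUDAExecutionProvider` / `CPUExecutionProvider` do appear elsewhere; the DirectML/CoreML names mentioned in the comments are only my guesses based on the CUDA naming pattern, which is exactly what this page should spell out.

```python
import onnxruntime as ort

# Lists the provider strings compiled into this build of onnxruntime,
# e.g. ['CUDAExecutionProvider', 'CPUExecutionProvider'].
print(ort.get_available_providers())

# Providers are passed by name, in priority order, when creating a session.
# "CUDAExecutionProvider" shows up in examples; for DirectML/CoreML I would
# guess something like "DmlExecutionProvider" / "CoreMLExecutionProvider",
# but the exact strings are not documented on this page.
session = ort.InferenceSession(
    "model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
```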
Page / URL
https://onnxruntime.ai/docs/api/python/api_summary.html#load-and-run-a-model