Description
I’m looking for guidance on how to use Pipenv to install llama-cpp-python with CUDA support. The installation command using pip is:
CMAKE_ARGS="-DGGML_CUDA=on" pip install llama-cpp-python
Could someone show me how to achieve the same thing with Pipenv?
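For what it's worth, the naive approach I would try (untested, and just assuming Pipenv forwards the current shell environment to pip during the build) is to prefix the install command the same way:

CMAKE_ARGS="-DGGML_CUDA=on" pipenv install llama-cpp-python

But I'm not sure whether the variable actually reaches the pip build step this way, or whether there is a cleaner way to record the flag in the Pipfile.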
Thanks in advance!