CUDA out of memory / crashing on NVIDIA GeForce RTX 5090 #199

@icombs2017

Description

Good morning,

I have been trying to troubleshoot an issue our system has been having with TagLab for a few weeks now. I found some forums saying that the NVIDIA 5000-series GPUs have compatibility issues with PyTorch, so I think that may be part of the problem here. I have this version of TagLab installed and working on all of our other machines except this one with the 5090 GPU. I installed it in a conda environment as usual, and even tried installing the CPU version, since I suspect the GPU is part of the problem. When using some of the semi-automatic tools I get a "CUDA out of memory" message, even after zooming in on the image, along with the following error output:

C:\Users\Restoration\Documents\GitHub\TagLab\models\isegm\inference\utils.py:22: FutureWarning: You are using torch.load with weights_only=False (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for weights_only will be flipped to True. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via torch.serialization.add_safe_globals. We recommend you start setting weights_only=True for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
state_dict = torch.load(checkpoint, map_location='cpu')
C:\Users\Restoration\.conda\envs\taglab11\Lib\site-packages\torch\cuda\__init__.py:235: UserWarning:
NVIDIA GeForce RTX 5090 with CUDA capability sm_120 is not compatible with the current PyTorch installation.
The current PyTorch install supports CUDA capabilities sm_50 sm_60 sm_61 sm_70 sm_75 sm_80 sm_86 sm_90.
If you want to use the NVIDIA GeForce RTX 5090 GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/

warnings.warn(
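The warning can be read mechanically: the installed PyTorch wheel lists the compute capabilities it was compiled for, and the RTX 5090 reports sm_120, which is not among them. A minimal standalone sketch of that check (the arch list is copied from the warning above; in a live session these values would come from `torch.cuda.get_arch_list()` and `torch.cuda.get_device_capability()`):

```python
def sm_tag(capability):
    """Convert a (major, minor) compute capability tuple to an 'sm_XY' tag."""
    major, minor = capability
    return f"sm_{major}{minor}"

# Capabilities this PyTorch wheel was compiled for (from the warning text).
compiled_archs = ["sm_50", "sm_60", "sm_61", "sm_70",
                  "sm_75", "sm_80", "sm_86", "sm_90"]

# An RTX 5090 (Blackwell) reports compute capability (12, 0), i.e. sm_120.
device_capability = (12, 0)

supported = sm_tag(device_capability) in compiled_archs
print(supported)  # → False: the wheel ships no sm_120 kernels
```

So this is an installation mismatch rather than an actual memory problem; the "CUDA out of memory" message is a downstream symptom of the unsupported build.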

Is there a workaround for this?

Thanks!
