Possible memory leak issue #8

@ArvinKhoshboresh

Description

I've been trying to use this package to solve a very large linear assignment problem (a 1M x 1M cost matrix, though very sparse). Solving it directly causes a segmentation fault, even when passing cardinality_check=False.

To get around this, I cut my problem up into many smaller problems of varying sizes (10k x 10k down to 300 x 300); this is suboptimal but feasible for my use-case. My program starts solving the smaller problems but eventually crashes, and never at any one particular problem.
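The splitting step described above can be sketched like this (a hypothetical helper, not code from sslap; the real workload would call sslap.auction_solve on each extracted sub-matrix):

```python
def split_into_blocks(n, max_block):
    # Partition the index range [0, n) into consecutive chunks of at most
    # max_block indices; each chunk defines one smaller square sub-problem.
    return [(s, min(s + max_block, n)) for s in range(0, n, max_block)]

# For a 1M x 1M problem, carve out 10k x 10k sub-problems. Each
# (start, stop) pair would select cost[start:stop, start:stop], and
# sslap.auction_solve would then run on that block.
blocks = split_into_blocks(1_000_000, 10_000)
```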

To isolate the problem, I wrote the following minimal reproduction, stripped of the rest of my code:

import numpy as np
from scipy.sparse import random as sparse_random
import sslap

size = 1000
density = 0.01
loop_count = 100

for i in range(loop_count):
    # Generate a random sparse cost matrix, then densify it before solving
    dense_matrix = sparse_random(
        size, size, density=density, format='coo', data_rvs=np.random.rand
    ).todense()
    result = sslap.auction_solve(dense_matrix, cardinality_check=False, fast=True)
    print(f"Loop {i + 1} completed")

When this runs, memory usage climbs very quickly. Our machine has 200 GB of RAM plus an even larger swap, and the process quickly exhausts both.

Notably, this only happens when passing a matrix that originated as a sparse matrix, not a dense one. I believe it still happens with a dense matrix, but the leak appears to be much slower.
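One way to confirm the growth independently of the solver is to log the process's peak RSS on each iteration (a sketch using the Unix-only resource module; the list allocation here is just a stand-in for the real sslap.auction_solve call):

```python
import resource

def peak_rss_mb():
    # ru_maxrss is reported in KiB on Linux (in bytes on macOS);
    # this assumes Linux.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

baseline = peak_rss_mb()
for i in range(5):
    _ = [0.0] * 100_000  # stand-in for sslap.auction_solve(...)
    print(f"loop {i + 1}: peak RSS ~{peak_rss_mb():.1f} MB "
          f"(started at {baseline:.1f} MB)")
```

If the leak is in the solver, the per-iteration peak should keep rising instead of plateauing after the first few loops.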
