Conversation

@Delaunay (Collaborator):

No description provided.

@Delaunay Delaunay requested a review from bouthilx January 19, 2023 14:44
@Delaunay Delaunay requested a review from bouthilx January 19, 2023 19:43
return i


def has_pytorch():
Member:

🤨

self.pool_config = state["pool_config"]
backend = self.pool_config.get("backend")
n_workers = self.pool_config.get("n_workers", -1)
self.pool = PoolExecutor.BACKENDS.get(backend, ThreadPool)(n_workers)
Member:

I'm a bit unsure about this part. If the object is serialized and passed to the subprocess, the deserialization step will have the effect of creating another pool of n_workers, no?

Collaborator (Author):

yea

Collaborator (Author):

We may be able to pass a queue instead to avoid creating multiple pools,
but nesting executors in general is a bit of a no-no.
