
Dask only uses ~40% of CPU on Windows #9073

@carlosg-m

Description


Describe the issue: After multiple attempts, including running the LocalCluster in "processes" mode with multiple combinations of n_workers and threads_per_worker, and using Dask DataFrames with map_partitions, Dask Bags, and Dask Arrays on simple snippets of code, CPU usage does not go over 40%. Only 16 cores are working, and even those are capped below 100%.
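
For reference, one of the "processes" mode configurations I tried looked roughly like this (the worker/thread counts shown are just one of the combinations I varied):

from dask.distributed import Client, LocalCluster

# processes=True forces separate worker processes rather than threads;
# I tried several n_workers / threads_per_worker combinations with the same result.
cluster = LocalCluster(processes=True, n_workers=24, threads_per_worker=1)
client = Client(cluster)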

Minimal Complete Verifiable Example:

import dask.array as da
import numpy as np
from dask.distributed import Client, LocalCluster

# Create a local Dask cluster
cluster = LocalCluster(n_workers=24, threads_per_worker=1)
client = Client(cluster)

# Create a large random Dask array (lazy; nothing is computed yet)
dask_array = da.random.random((1000000, 100000))

# Perform a computation on the Dask array
result = dask_array.sum()

# Compute and get the result
print(result.compute())

# Close the Dask client and cluster
client.close()
cluster.close()
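
I saw the same behavior with the Dask DataFrame / map_partitions variant mentioned above. A rough sketch of that test (the column name, data size, and partition count here are illustrative, not the exact values I used):

import numpy as np
import pandas as pd
import dask.dataframe as dd
from dask.distributed import Client, LocalCluster

# Same cluster setup as the array example above
cluster = LocalCluster(n_workers=24, threads_per_worker=1)
client = Client(cluster)

# Build a synthetic DataFrame and split it into many partitions
pdf = pd.DataFrame({"x": np.random.random(10_000_000)})
ddf = dd.from_pandas(pdf, npartitions=240)

# Apply a simple per-partition NumPy computation, then aggregate
result = ddf.map_partitions(lambda part: part["x"] ** 2).sum().compute()
print(result)

client.close()
cluster.close()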

Anything else we need to know?:

(Two screenshots attached.)

Environment:

  • Dask version: 2025.4.1
  • Python version: 3.13.3
  • Operating System: Windows 11 24H2
  • Install method (conda, pip, source): pip
  • CPU: 13th Gen Intel(R) Core(TM) i9-13950HX, 2200 MHz, 24 Cores, 32 Threads
  • Workstation: Laptop

Metadata

Assignees

No one assigned

    Labels

    bug (Something is broken), discussion (Discussing a topic with no specific actions yet)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
