[FLINK-34152] Add an option to scale memory when downscaling #786

Merged: 7 commits merged into apache:main on Mar 7, 2024

Conversation

@mxm (Contributor) commented Feb 28, 2024

This adds an option to increase heap and managed memory when removing TaskManagers. The scaling is applied after adjusting the memory pools (heap, metaspace, network, managed) and only affects heap memory.

The reason for adding this functionality is that memory tuning increases the likelihood of running into memory-constrained scenarios when downscaling. As a precaution, we temporarily increase the total cluster memory proportionally to the removed TaskManagers, up to the maximum allowed TaskManager memory.
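To make the "proportionally to the removed TaskManagers" idea concrete, here is a minimal, self-contained Java sketch of the computation, not the PR's actual implementation; the class and method names are illustrative only, and the cap is assumed to be a per-TaskManager memory limit such as the container memory limit.

```java
/**
 * Illustrative sketch (not the code from this PR) of compensating memory on
 * downscale: when the number of TaskManagers shrinks, the per-TaskManager
 * memory grows by the same factor so the total cluster memory stays roughly
 * constant, capped at the maximum allowed TaskManager memory.
 */
public final class DownscaleMemoryScalingSketch {

    /**
     * @param currentTmMemoryBytes memory currently assigned to each TaskManager
     * @param currentNumTms        TaskManager count before downscaling
     * @param targetNumTms         TaskManager count after downscaling
     * @param maxTmMemoryBytes     upper bound per TaskManager (assumed container limit)
     * @return per-TaskManager memory to use after downscaling
     */
    static long scaledTmMemory(
            long currentTmMemoryBytes, int currentNumTms, int targetNumTms, long maxTmMemoryBytes) {
        if (targetNumTms >= currentNumTms) {
            // Not a downscale, so no compensation is applied.
            return currentTmMemoryBytes;
        }
        // Keep total memory constant: scale each remaining TaskManager by the downscale factor.
        double scaleFactor = (double) currentNumTms / targetNumTms;
        long proposed = (long) Math.ceil(currentTmMemoryBytes * scaleFactor);
        // Never exceed the maximum allowed TaskManager memory.
        return Math.min(proposed, maxTmMemoryBytes);
    }

    public static void main(String[] args) {
        long fourGiB = 4L << 30;
        long eightGiB = 8L << 30;
        // Downscaling from 10 to 5 TaskManagers doubles per-TM memory, capped at 8 GiB.
        System.out.println(scaledTmMemory(fourGiB, 10, 5, eightGiB)); // 8589934592
    }
}
```

The cap matters because the compensation is only a temporary safety margin: once memory tuning observes the downscaled job, the per-TaskManager memory can be trimmed back down based on actual usage.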

mxm added 2 commits February 28, 2024 15:14
This adds an option to increase heap and managed memory when removing TaskManagers. The scaling is applied after adjusting the memory pools (heap, metaspace, network, managed) and only affects heap and managed memory.

The reason for adding this functionality is that memory tuning increases the likelihood of running into memory-constrained scenarios when downscaling. As a precaution, we temporarily increase the memory up to the maximum allowed TaskManager memory.
@mxm (Contributor, Author) commented Mar 7, 2024

@1996fanrui I'm merging, but I'll take any comments you might have when reviewing this later on.

mxm merged commit d526174 into apache:main on Mar 7, 2024
131 checks passed
mxm deleted the memory-tuning branch on March 7, 2024 at 15:56
@1996fanrui (Member) commented

> @1996fanrui I'm merging, but I'll take any comments you might have when reviewing this later on.

Thanks @mxm for the improvement and the ping! Sorry for the late response here; I just finished my vacation and will take a look in the next two days.
