
Conversation

@fstagni (Contributor) commented Jun 25, 2025

BEGINRELEASENOTES

*WMS
NEW: InProcessComputingElement takes RAM requirements into consideration for job matching
NEW: PoolComputingElement takes RAM requirements into consideration when subdividing the pool
NEW: SingularityComputingElement can enforce cgroups v2 (CG2) RAM limits

ENDRELEASENOTES
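
The Singularity item above refers to cgroups v2 limits. As a rough illustration only (not this PR's implementation; the cgroup path, helper name, and limit value are assumptions), enforcing a CG2 RAM cap boils down to writing the byte limit into the cgroup's memory.max file:

```python
from pathlib import Path

def setMemoryMax(cgroupPath, limitBytes):
    # cgroups v2 exposes the hard memory cap as the 'memory.max' file
    Path(cgroupPath, "memory.max").write_text(str(limitBytes))

# e.g. an 8 GiB cap on a hypothetical per-job cgroup:
# setMemoryMax("/sys/fs/cgroup/dirac_job_123", 8 * 1024**3)
```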

  • This does not change anything at the matching level, where a Tag-based mechanism was already present. It should be checked anyway (see the sketch after this list).
  • I have kept, for the moment, the definition of MaxRAM in GB. Maybe MB would be better? The value has to be an integer.
  • This is a large PR for v8, and I am considering moving it to v9. What do you think?
  • Check the TS -> WMS propagation of requirements (TS = Transformation System).
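
For reference, a minimal sketch of how a RAM requirement could be read from such tags, assuming the "<n>GB" / "<n>GB_MAX" convention discussed in this thread; the helper name is made up:

```python
import re

def ramGBFromTags(tags):
    """Hypothetical helper: requested RAM in GB (integer), 0 if untagged."""
    for tag in tags:
        match = re.fullmatch(r"(\d+)GB(_MAX)?", tag)
        if match:
            return int(match.group(1))
    return 0

assert ramGBFromTags(["MultiProcessor", "4GB"]) == 4
assert ramGBFromTags(["MultiProcessor"]) == 0  # no RAM tag -> default 0
```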

@DIRACGridBot DIRACGridBot added the alsoTargeting:integration Cherry pick this PR to integration after merge label Jun 25, 2025
# Excerpt from the reviewed diff: the slot is refused (None) when the
# host cannot satisfy the requested RAM.
if availableMemory < requestedMemory:
    return None

return requestedMemory
Contributor commented:
I might have overlooked something: do we specify a default RAM value for the jobs?

Because if I understand correctly, getRAMInUse() depends on the value returned by _getMemoryForJobs(), which would be 0 if neither the *GB nor the *GB_MAX tag is specified.
So if we have n jobs with no "RAM" tag, getRAMInUse() will always be 0 (a minimal illustration follows).
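
A minimal illustration of the concern, with made-up names (this is not the PR's code), assuming getRAMInUse() just sums a per-job value that defaults to 0:

```python
def getMemoryForJob(tags):
    # stand-in for _getMemoryForJobs(): 0 when no "<n>GB" tag is present
    return next((int(t[:-2]) for t in tags if t.endswith("GB") and t[:-2].isdigit()), 0)

jobs = [["MultiProcessor"], [], ["WholeNode"]]  # n jobs, none with a RAM tag
ramInUse = sum(getMemoryForJob(tags) for tags in jobs)
assert ramInUse == 0  # the pool believes no RAM is in use
```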

@fstagni (author) commented:

There's no obvious default value for RAM (for processors we specify 1, which is a slightly more obvious default value). So, yes, as it stands the default value for RAM is 0.

Contributor commented:

RAM and NumberOfProcessors should be treated in exactly the same way: both are resources with integer values, with default values, that sum up over the jobs on the same node. So maybe common methods (or a class) could be introduced for those, and other tags could eventually be added, e.g. disk space. (A sketch of such a shared abstraction follows.)
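
Something like the following, sketched under assumed names (this is a suggestion, not existing DIRAC API), could cover RAM, processors, and later disk space uniformly:

```python
class IntegerResource:
    """One integer-valued, summable node resource (RAM, processors, ...)."""

    def __init__(self, name, total, default=0):
        self.name = name        # e.g. "RAM" or "NumberOfProcessors"
        self.total = total      # node capacity
        self.default = default  # per-job value when the job specifies nothing
        self.perJob = {}        # jobID -> claimed amount

    def inUse(self):
        return sum(self.perJob.values())

    def claim(self, jobID, amount=None):
        amount = self.default if amount is None else amount
        if self.inUse() + amount > self.total:
            return False        # not enough of this resource left on the node
        self.perJob[jobID] = amount
        return True

ram = IntegerResource("RAM", total=16)                             # GB, default 0
procs = IntegerResource("NumberOfProcessors", total=8, default=1)  # default 1
```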

Member commented:

Having RAM in GB isn't fine-grained enough for an integer value; having less than 1 GB per core is sensible on some systems and for some jobs.

Member commented:

Could this be made MB instead?
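
For what it's worth, the granularity problem in two lines (illustrative values):

```python
requestMB = 512                # half a GB per core
requestGB = requestMB // 1024  # -> 0: the request disappears at integer-GB granularity
assert (requestGB, requestMB) == (0, 512)
```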

@fstagni fstagni requested a review from sfayer June 26, 2025 09:49
@sfayer (Member) left a comment:

Hi,

I've done a quick review: please let me know if you need anything else from me on this. (I didn't quite hear what you said about it in the most recent BiLD meeting: there was some local noise just as you were talking about it.)

The other CEs (Singularity/InProcess) should probably set the same memory setting based on the tags too, if it has not already been set by the PoolCE, to cover cases where the PoolCE isn't in use.

Regards,
Simon
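
A sketch of that suggestion, with assumed parameter and helper names (not the PR's code): the non-Pool CEs would derive MaxRAM from the tags only when the PoolCE has not already filled it in:

```python
import re

def ensureMemoryLimit(ceParameters, tags):
    """Set ceParameters['MaxRAM'] from a '<n>GB' tag unless already set."""
    if not ceParameters.get("MaxRAM"):  # untouched by PoolCE (standalone CE)
        for tag in tags:
            match = re.fullmatch(r"(\d+)GB(_MAX)?", tag)
            if match:
                ceParameters["MaxRAM"] = int(match.group(1))
                break
    return ceParameters.get("MaxRAM", 0)

params = {}
ensureMemoryLimit(params, ["4GB"])
assert params["MaxRAM"] == 4
```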

@fstagni fstagni force-pushed the 80_Pool_RAM branch 3 times, most recently from 6c09d92 to ee00569 on July 2, 2025 12:21
@fstagni fstagni force-pushed the 80_Pool_RAM branch 2 times, most recently from 2db1187 to 608d57f on July 14, 2025 15:17
@fstagni fstagni force-pushed the 80_Pool_RAM branch 3 times, most recently from 2883631 to f2e8f9e on October 14, 2025 14:29
@fstagni fstagni marked this pull request as draft October 14, 2025 14:30
@fstagni fstagni force-pushed the 80_Pool_RAM branch 3 times, most recently from b8b1635 to e41e706 on October 15, 2025 08:13
@fstagni fstagni force-pushed the 80_Pool_RAM branch 3 times, most recently from f859666 to 3ce23be on October 16, 2025 13:58
@fstagni fstagni marked this pull request as ready for review October 16, 2025 14:08
@fstagni (author) commented Nov 13, 2025:

This PR is replaced by #8366, which targets v9.0. As said, these are a bit too many changes for the 8.0 branch.

@fstagni fstagni closed this Nov 13, 2025