Description
Describe the bug
Currently the article report loops through all submissions of a journal, runs many sub-queries for each of them, and stores all of the results in memory before outputting anything. This makes it both a CPU and memory hog, and the only workaround is to increase the resource limits for the application.
We definitely need to stop storing all the records in memory, especially because the only reason for doing so is formatting: generating the headers and knowing how many cells to skip.
To solve that, we can simply pre-read the maximum counts the formatting depends on (e.g. the maximum number of authors per submission), emit the headers up front, and stream each row as soon as it is built.
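Something like the following illustrates the idea. It's a minimal sketch using plain PDO rather than the plugin's actual DAO calls; the table and column names are assumptions based on the OJS 3.3 schema, and the CSV columns are simplified:

```php
<?php
// Sketch only: pre-read the maximum author count so the headers can be
// written up front, then stream every row instead of buffering the report.
// Table/column names are assumptions based on the OJS 3.3 schema.
$pdo = new PDO('mysql:host=localhost;dbname=ojs', 'ojs_user', 'secret');

// One cheap aggregate query tells us how many author columns are needed.
$maxAuthors = (int) $pdo->query(
    'SELECT MAX(n) FROM (
        SELECT COUNT(a.author_id) AS n
        FROM publications p
        LEFT JOIN authors a ON a.publication_id = p.publication_id
        GROUP BY p.submission_id
    ) AS counts'
)->fetchColumn();

// Headers can now be emitted immediately, nothing has to be buffered.
$out = fopen('php://output', 'w');
$headers = ['Submission ID'];
for ($i = 1; $i <= $maxAuthors; $i++) {
    $headers[] = "Author $i";
}
fputcsv($out, $headers);

// Each submission row is written as soon as it is read, then discarded.
$authorsStmt = $pdo->prepare(
    'SELECT a.email FROM publications p
     JOIN authors a ON a.publication_id = p.publication_id
     WHERE p.submission_id = ? ORDER BY a.seq'
);
$submissions = $pdo->query('SELECT submission_id FROM submissions');
while (($submissionId = $submissions->fetchColumn()) !== false) {
    $authorsStmt->execute([$submissionId]);
    $authors = $authorsStmt->fetchAll(PDO::FETCH_COLUMN);
    // Pad with empty cells so every row matches the header width.
    fputcsv($out, array_merge([$submissionId], array_pad($authors, $maxAuthors, '')));
}
```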
Another, less interesting, option (it could be nice if getting the totals turned out to be really expensive) would be to write everything to a temporary file while keeping track of the counts, then re-format the output (add the headers and spans) in a second pass.
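A rough sketch of that two-pass variant, where `writeReport()` and the `$rows` structure are hypothetical stand-ins for however the report iterates submissions:

```php
<?php
// Sketch of the two-pass alternative: stream rows into a temporary file while
// tracking the widest row, then add the headers and re-pad the rows when
// copying to the real output. The $rows structure here is hypothetical.
function writeReport(iterable $rows, $output): void
{
    $tmp = tmpfile();
    $maxAuthors = 0;

    // Pass 1: dump the raw rows, remembering the largest author count seen.
    foreach ($rows as $row) {
        $maxAuthors = max($maxAuthors, count($row['authors']));
        fputcsv($tmp, array_merge([$row['id']], $row['authors']));
    }

    // Pass 2: the totals are known now, so the header can be written and
    // every buffered row re-padded to the same width.
    $headers = ['Submission ID'];
    for ($i = 1; $i <= $maxAuthors; $i++) {
        $headers[] = "Author $i";
    }
    fputcsv($output, $headers);

    rewind($tmp);
    while (($row = fgetcsv($tmp)) !== false) {
        fputcsv($output, array_pad($row, 1 + $maxAuthors, ''));
    }
    fclose($tmp);
}

// Usage (illustrative): writeReport($rowsFromTheDatabase, fopen('php://output', 'w'));
```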
It's better to assume the number of submissions can be large (thousands), so generating the report offline (with a job) and with pagination (to avoid buffering a large result set) sounds like a sane idea.
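Inside such a job, the paginated read could look roughly like this; keyset pagination (seeking on `submission_id`) is used so large offsets stay cheap, and the connection details, query and page size are purely illustrative:

```php
<?php
// Sketch of the paginated read an offline job could do: only one page of
// submissions is in memory at a time. Keyset pagination on submission_id
// keeps large offsets cheap. Query, page size and columns are illustrative.
$pdo = new PDO('mysql:host=localhost;dbname=ojs', 'ojs_user', 'secret');
$out = fopen('/tmp/article-report.csv', 'w');
$pageSize = 500;

$stmt = $pdo->prepare(
    "SELECT submission_id, context_id, date_submitted
     FROM submissions
     WHERE context_id = :contextId AND submission_id > :lastId
     ORDER BY submission_id
     LIMIT $pageSize"
);

$lastId = 0;
do {
    $stmt->execute([':contextId' => 1, ':lastId' => $lastId]);
    $page = $stmt->fetchAll(PDO::FETCH_ASSOC);
    foreach ($page as $submission) {
        // Per-row processing/formatting would happen here before writing.
        fputcsv($out, $submission);
        $lastId = $submission['submission_id'];
    }
} while (count($page) === $pageSize);

fclose($out);
```

With a fixed page size the memory footprint stays constant no matter how many submissions the journal has.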
Reducing the number of sub-queries would be great, but it's probably not feasible with the current code. If this is a heavily used report, rewriting it in pure SQL would bring nice performance gains.
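As an illustration of that direction, a single grouped query can replace the per-submission author sub-queries; `GROUP_CONCAT` is MySQL-specific and the joins assume the OJS 3.3 schema, so treat this as a sketch of the technique rather than the report's real column set:

```php
<?php
// Sketch of a single aggregated query replacing per-submission sub-queries.
// GROUP_CONCAT is MySQL-specific and the joins assume the OJS 3.3 schema
// (submissions / publications / authors); adjust for the real report columns.
$pdo = new PDO('mysql:host=localhost;dbname=ojs', 'ojs_user', 'secret', [
    // Unbuffered queries let rows be streamed instead of loaded all at once.
    PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false,
]);

$sql = <<<SQL
SELECT s.submission_id,
       COUNT(a.author_id)                   AS author_count,
       GROUP_CONCAT(a.email ORDER BY a.seq) AS author_emails
FROM submissions s
JOIN publications p ON p.publication_id = s.current_publication_id
LEFT JOIN authors a ON a.publication_id = p.publication_id
WHERE s.context_id = :contextId
GROUP BY s.submission_id
SQL;

$stmt = $pdo->prepare($sql);
$stmt->execute([':contextId' => 1]);

$out = fopen('php://output', 'w');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row);
}
```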
What application are you using?
OJS 3.3