Description
I'm using pylhe to loop over several LHE files, each containing 100K events. Running the snippet below on an lxplus machine (CentOS Linux release 7.9.2009), iterations become progressively slower, and eventually the job is killed for using too much memory.
```python
import pylhe
import time

afile = "/afs/cern.ch/work/b/bfontana/public/Singlet_TManualV3_all_M280p00_ST0p14_L463p05_K1p00_cmsgrid_final.lhe"
atime = time.time()
for ievt, evt in enumerate(pylhe.read_lhe(afile)):  # same with pylhe.read_lhe_with_attributes
    if ievt % 5000 == 0:
        print(time.time() - atime)
        atime = time.time()
        print(' - {} events processed'.format(ievt))
```
The significant slowdown starts around iteration 40K to 50K. Since `read_lhe` returns a generator, I would expect memory usage to stay flat rather than grow with the number of events consumed.

Is the above behavior expected? I'm using Python 3.5.6 (GCC 6.2.0).
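To separate "iterations get slower" from "memory keeps growing", it may help to track the Python heap alongside the timing. Below is a minimal diagnostic sketch using the standard-library `tracemalloc` module; `iterate_with_memory_report` and `fake_events` are hypothetical names introduced here for illustration, and the stand-in generator would be replaced by `pylhe.read_lhe(afile)` in the real test:

```python
import time
import tracemalloc

def iterate_with_memory_report(events, step=5000):
    """Consume an event generator, reporting wall time and traced Python
    heap usage every `step` events. A well-behaved generator should show
    roughly flat memory; steady growth suggests state is accumulating."""
    tracemalloc.start()
    atime = time.time()
    count = 0
    for ievt, evt in enumerate(events):
        if ievt % step == 0:
            current, peak = tracemalloc.get_traced_memory()
            print('{:>8d} events: {:6.2f} s elapsed, {:10.1f} KiB current heap'
                  .format(ievt, time.time() - atime, current / 1024))
            atime = time.time()
        count = ievt + 1
    tracemalloc.stop()
    return count

# Stand-in generator (hypothetical; swap in pylhe.read_lhe(afile)):
def fake_events(n):
    for i in range(n):
        yield {'eventinfo': i}

iterate_with_memory_report(fake_events(20000), step=5000)
```

If the "current heap" column climbs steadily across the run, the memory is being retained on the Python side rather than released between events, which would match the behavior described above.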