This repository is a work in progress to build a neural network that distinguishes signal from background. It consists of an Ntuplizer and the learner (which contains the actual implementation of the PNN).
The recommended release for this analyzer is CMSSW_13_3_0 or later. Commands to set up the repo:
```
cmsrel CMSSW_13_3_0
cd CMSSW_13_3_0/src
cmsenv
git clone [email protected]:JavierGarciadeCastro/PNN.git
scram b -j 8
```
The code consists of two folders:
- Ntuplizer
  - plugins/: contains the plugins (EDAnalyzers) defined in .cc files; these are where MiniAOD is converted to flat ntuples.
  - python/: contains cfi files that set up the sequences run with the plugins in plugins/. A sequence is a specific configuration of the parameters for one of those plugins; a single plugin may have several sequences, defined in the same file or across multiple files.
  - test/: contains cfg files to run the sequences defined in python/.
- learner
  - learner.py: core of the PNN implementation (work in progress).
  - make_env.sh and setup.sh: scripts to set up the virtual environment that provides torch for the PNN.
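For illustration, a sequence in Ntuplizer/python/ is typically a configured instance of one of the plugins. The module label, plugin name, and parameter names below are hypothetical placeholders, not the actual ones in this repo:

```python
import FWCore.ParameterSet.Config as cms

# Hypothetical cfi fragment: "Ntuplizer" and the parameters below are
# illustrative placeholders for whatever the real plugin expects.
ntuples = cms.EDAnalyzer("Ntuplizer",
    muonCollection = cms.InputTag("slimmedMuons"),
    isSignal       = cms.bool(False),
)
```

A cfg file in test/ then imports this fragment, attaches it to a cms.Process, and points it at the input MiniAOD files.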
To run the full chain, first set up the CMSSW environment and a grid proxy:
```
cmsenv
voms-proxy-init --voms cms
```
First, run the Ntuplizer to obtain the ntuples you want to feed to the PNN, for both signal and background. For now there is only one file of each: the signal is H to ZdZd and the background is Drell-Yan (DY).
```
cmsRun test/runNtuplizer_bkg_cfg.py
cmsRun test/runNtuplizer_signal_cfg.py
```
This should produce two ROOT files, Ntuples_bkg.root and Ntuples_signal.root. Next, move these files to the learner folder (to be automated in the future):
```
cd ..
mv Ntuplizer/Ntuples*.root learner
```
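The move step above could be automated with a small helper. This is a sketch using only the Python standard library; the function name and directory defaults are illustrative, and the script is not part of the repo:

```python
import glob
import shutil
from pathlib import Path


def move_ntuples(src_dir="Ntuplizer", dst_dir="learner"):
    """Move every Ntuples*.root file from src_dir into dst_dir.

    Returns the list of file names that were moved.
    """
    Path(dst_dir).mkdir(exist_ok=True)
    moved = []
    for f in glob.glob(str(Path(src_dir) / "Ntuples*.root")):
        shutil.move(f, dst_dir)
        moved.append(Path(f).name)
    return moved
```

Calling `move_ntuples()` from the repo root would then replace the manual `mv` command.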
Now set up the environment for the learner to run:
```
sh make_env.sh
source setup.sh
```
Lastly, run the learner to train the neural network (for now this only reads the ROOT files and converts them to torch tensors):
```
python3 learner.py
```
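What learner.py does at this stage can be sketched roughly as follows. Here the branch arrays are faked with NumPy stand-ins (in the real script they are read from the two ROOT files, e.g. with uproot), and the branch names pt and eta are hypothetical:

```python
import numpy as np
import torch

# Stand-ins for ntuple branches; in learner.py these would come from
# Ntuples_signal.root and Ntuples_bkg.root instead of random numbers.
rng = np.random.default_rng(0)
signal_pt  = rng.normal(50.0, 10.0, size=100)
signal_eta = rng.uniform(-2.4, 2.4, size=100)
bkg_pt     = rng.normal(30.0, 10.0, size=100)
bkg_eta    = rng.uniform(-2.4, 2.4, size=100)

# Stack branches into an (n_events, n_features) array; label signal=1, bkg=0.
features = np.concatenate([
    np.stack([signal_pt, signal_eta], axis=1),
    np.stack([bkg_pt, bkg_eta], axis=1),
])
labels = np.concatenate([np.ones(100), np.zeros(100)])

# Convert to torch tensors, ready to feed a network.
X = torch.from_numpy(features).float()
y = torch.from_numpy(labels).float()
print(X.shape, y.shape)  # torch.Size([200, 2]) torch.Size([200])
```

Once real training is added, X and y would be wrapped in a DataLoader and passed to the PNN.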