Commit f320ba4
Merge branch 'main' into 694-fix-corrupted-instrument-synergy-tutorial
2 parents af63621 + 0272643

10 files changed: 258 additions & 511 deletions


changelog.d/687.attribution.rst
Lines changed: 1 addition & 0 deletions

@@ -0,0 +1 @@
+Lucien Mauviard

changelog.d/687.fixed.rst
Lines changed: 3 additions & 0 deletions

@@ -0,0 +1,3 @@
+Removed the old synthesise method from `synthesise.pyx`.
+
+Fixed the Signal synthesise method to better handle the background.

changelog.d/688.added.rst
Lines changed: 1 addition & 0 deletions

@@ -0,0 +1 @@
+Add documentation about installing X-PSI on the Jean-Zay supercomputer.

changelog.d/688.attribution.rst
Lines changed: 1 addition & 0 deletions

@@ -0,0 +1 @@
+Pierre Stammler

docs/source/Example_job.rst
Lines changed: 50 additions & 6 deletions

@@ -3,12 +3,12 @@
 Example job
 ===========
 
-For both jobs you will need these
+For the three jobs below you will need these
 `auxiliary files <https://zenodo.org/record/7113931>`_ inside the ``model_data/``
 directory.
 
-Snellius
---------
+Snellius (SURF)
+---------------
 
 The following job script ``job.sh`` is an example for analysis on the Snellius system (see :ref:`hpcsystems`).
 
@@ -58,8 +58,8 @@ number of processes to spawn as a flag argument.
 
 Finally, note that only the root process will generate output for inspection.
 
-Helios
-------
+Helios (API)
+------------
 
 For Helios, we can use the following type of job script:
 
@@ -109,4 +109,48 @@ For Helios, we can use the following type of job script:
     #Clean the scratch automatically here.
     #But remember to remove it manually on each node if the main program ends by crashing.
     rm -rf $OUTPUT_FOLDER
-
+
+
+Jean-Zay (IDRIS)
+----------------
+
+For Jean-Zay, a job script like the following can be prepared. First check your active project account with ``idrproj``.
+
+.. code-block:: bash
+
+    #!/bin/bash
+    #SBATCH --account=nameproject@cpu
+    #SBATCH --job-name=XPSI3_BBTest
+    #SBATCH --time=20:00:00
+    #SBATCH --partition=cpu_p1
+    #SBATCH --qos=qos_cpu-t3
+    #SBATCH --ntasks=200
+    #SBATCH --ntasks-per-node=40
+    #SBATCH --hint=nomultithread
+    #SBATCH --mail-user=myemail@mailservice.com
+    #SBATCH --mail-type=END,FAIL
+
+    module purge
+    module load miniforge/24.9.0
+    module load intel-all/19.0.4
+    module load gsl/2.5
+
+    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$WORK/Softwares/MultiNest/MultiNest_v3.12_CMake/multinest/lib
+    export LD_PRELOAD=$MKLROOT/lib/intel64/libmkl_core.so:$MKLROOT/lib/intel64/libmkl_sequential.so
+
+    conda activate $WORK/conda/xpsi
+
+    cd $WORK
+    mkdir TestBBXPSIrun/
+    cd TestBBXPSIrun/
+    mkdir examples/
+    cd examples/
+    mkdir examples_modeling_tutorial/
+    cd examples_modeling_tutorial/
+    cp -r $WORK/Softwares/xpsi/examples/examples_modeling_tutorial/* ./
+    mkdir run
+
+    srun python TestRun_BB.py > out1 2> err1
+
+    cp -r out1 err1 run $WORK/Softwares/xpsi/examples/examples_modeling_tutorial/.
+
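The Jean-Zay header above is a fixed block of ``#SBATCH`` directives. As a side note, a hypothetical helper (not part of the X-PSI docs or this commit) can generate such a header programmatically, which helps when resubmitting the same analysis with different task counts:

```python
def sbatch_header(account, job_name, walltime, ntasks, ntasks_per_node,
                  partition="cpu_p1", qos="qos_cpu-t3"):
    """Build a SLURM job-script header like the Jean-Zay example above.

    All directive names are standard SLURM options; the defaults mirror
    the example script. This helper is illustrative only.
    """
    directives = [
        ("account", account),
        ("job-name", job_name),
        ("time", walltime),
        ("partition", partition),
        ("qos", qos),
        ("ntasks", ntasks),
        ("ntasks-per-node", ntasks_per_node),
        ("hint", "nomultithread"),
    ]
    lines = ["#!/bin/bash"]
    lines += [f"#SBATCH --{key}={value}" for key, value in directives]
    return "\n".join(lines)

# Reproduce the header of the example script above:
print(sbatch_header("nameproject@cpu", "XPSI3_BBTest", "20:00:00", 200, 40))
```

With ``--ntasks=200`` and ``--ntasks-per-node=40``, SLURM will allocate five full ``cpu_p1`` nodes, matching the example.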

docs/source/HPC_systems.rst
Lines changed: 109 additions & 3 deletions

@@ -224,16 +224,122 @@ If the above works, we can then continue building X-PSI:
     git clone https://github.com/xpsi-group/xpsi.git
     cd xpsi
     CC=$(which cc) pip install .
-
+
 Batch usage
 ^^^^^^^^^^^
 
 For example job scripts, see the Helios example in :ref:`example_job`.
 
+Jean-Zay (IDRIS)
+----------------
+
+`Jean-Zay <http://www.idris.fr/eng/jean-zay/index.html>`_ is the French national supercomputer.
+
+Installation
+^^^^^^^^^^^^
+
+To install X-PSI on Jean-Zay, first check whether your IDRIS account has several projects, with the command ``idrproj``. The installation applies only to your currently active project, so you may have to switch projects and redo the complete installation.
+
+For any installation, it is preferable to proceed in the ``$WORK`` directory of your project, because of the small storage quota of ``$HOME``.
+
+First prepare your modules:
+
+.. code-block:: bash
+
+    module purge
+    module load miniforge/24.9.0
+    module load cmake/3.21.3
+    module load intel-all/19.0.4
+    module load gsl/2.5
+
+We intend here to install X-PSI with the Intel compilers. It is preferable to avoid recent CMake versions because of their dependency on the GCC compiler.
+
+Now, let's prepare the conda environment for X-PSI by installing the required packages; they must be listed explicitly:
+
+.. code-block:: bash
+
+    cd $WORK
+    mkdir conda
+    conda create -p $WORK/conda/xpsi python'>=3.9.0' numpy'<2.0.0' cython'~=3.0.11' matplotlib'==3.9.2' scipy wrapt gsl pytest getdist tqdm nestcheck fgivenx astropy'>=5.2,<7.0.0' emcee ultranest 'h5py<3.16.0' cmap
+
+Then point to the Intel compilers. If needed, give their full paths explicitly:
+
+.. code-block:: bash
+
+    export CC=icc
+    export CXX=icpc
+    export FC=ifort
+    #export CC=/gpfslocalsys/intel/parallel_studio_xe_2019_update4_cluster_edition/compilers_and_libraries_2019.4.243/linux/bin/intel64/icc
+    #export CXX=/gpfslocalsys/intel/parallel_studio_xe_2019_update5_cluster_edition/compilers_and_libraries_2019.4.243/linux/bin/intel64/icpc
+    #export FC=/gpfslocalsys/intel/parallel_studio_xe_2019_update5_cluster_edition/compilers_and_libraries_2019.4.243/linux/bin/intel64/ifort
+
+Then install mpi4py:
+
+.. code-block:: bash
+
+    cd $WORK
+    mkdir Softwares
+    cd Softwares
+    wget https://github.com/mpi4py/mpi4py/releases/download/4.0.3/mpi4py-4.0.3.tar.gz
+    tar zxvf mpi4py-4.0.3.tar.gz
+    cd mpi4py-4.0.3
+    python setup.py build
+    python setup.py install
+
+Now that the environment is set, MultiNest can be installed:
+
+.. code-block:: bash
+
+    cd $WORK/Softwares
+    git clone https://github.com/farhanferoz/MultiNest.git ./MultiNest
+    cd MultiNest/MultiNest_v3.12_CMake/multinest/
+    mkdir build
+    cd build
+    cmake -DCMAKE_INSTALL_PREFIX=~/pathtosoftwares/MultiNest \
+          -DCMAKE_{C,CXX}_FLAGS="-O3 -xCORE-AVX512 -mkl" \
+          -DCMAKE_Fortran_FLAGS="-O3 -xCORE-AVX512 -mkl" \
+          -DCMAKE_C_COMPILER=mpiicc \
+          -DCMAKE_CXX_COMPILER=mpiicpc \
+          -DCMAKE_Fortran_COMPILER=mpiifort ..
+    make
+    ls ../lib
+
+Then install its Python interface:
+
+.. code-block:: bash
+
+    cd $WORK/Softwares
+    git clone https://github.com/JohannesBuchner/PyMultiNest.git ./pymultinest
+    cd pymultinest
+    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$WORK/Softwares/MultiNest/MultiNest_v3.12_CMake/multinest/lib
+    export LD_PRELOAD=$MKLROOT/lib/intel64/libmkl_core.so:$MKLROOT/lib/intel64/libmkl_sequential.so
+    python setup.py install
+
+Finally, X-PSI can be installed:
+
+.. code-block:: bash
+
+    cd $WORK/Softwares
+    git clone https://github.com/xpsi-group/xpsi.git
+    cd xpsi/
+    LDSHARED="icc -shared" CC=icc pip install .
+    #LDSHARED="/gpfslocalsys/intel/parallel_studio_xe_2019_update4_cluster_edition/compilers_and_libraries_2019.4.243/linux/bin/intel64/icc -shared" CC="/gpfslocalsys/intel/parallel_studio_xe_2019_update4_cluster_edition/compilers_and_libraries_2019.4.243/linux/bin/intel64/icc" pip install .
+
+    # To check your installation
+    cd ../
+    python -c "import xpsi"
+
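Beyond ``python -c "import xpsi"``, a slightly fuller sanity check can confirm that the other pieces installed above (mpi4py, PyMultiNest) are importable from the activated environment. This is a hypothetical convenience sketch, not part of the X-PSI documentation:

```python
import importlib

def check_imports(names):
    """Map each module name to True if it can be imported, else False."""
    status = {}
    for name in names:
        try:
            importlib.import_module(name)
            status[name] = True
        except ImportError:
            status[name] = False
    return status

# Modules the Jean-Zay installation above is expected to provide:
for name, ok in check_imports(["xpsi", "mpi4py", "pymultinest"]).items():
    print(name, "OK" if ok else "MISSING")
```

Note that importing ``pymultinest`` also exercises the ``LD_LIBRARY_PATH``/``LD_PRELOAD`` settings, since it loads the MultiNest shared library at import time.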
+Batch usage
+^^^^^^^^^^^
+
+For example job scripts, see the Jean-Zay example in :ref:`example_job`.
+
 .. _CALMIPsystem:
 
-CALMIP
-------------------------------------
+CALMIP (U. of Toulouse)
+-----------------------
 
 `CALMIP <https://www.calmip.univ-toulouse.fr>`_ is the supercomputer of `Université Fédérale de Toulouse <https://www.univ-toulouse.fr>`_

docs/source/Synthetic_data.ipynb
Lines changed: 74 additions & 59 deletions

Large diffs are not rendered by default.

docs/source/install.rst
Lines changed: 1 addition & 1 deletion

@@ -428,7 +428,7 @@ already present in the Mac OS:
     brew install llvm
 
-Also use homebrew to install ``cmake``, ``gfortran`` (with ``gcc``) and ``open-mpi`.
+Also use homebrew to install ``cmake``, ``gfortran`` (with ``gcc``) and ``open-mpi``.
 
 .. code-block:: bash

xpsi/Signal.py
Lines changed: 13 additions & 2 deletions

@@ -888,6 +888,10 @@ def synthesise(self,
         if (format == 'TXT') and (backscal_ratio != 1.0):
             raise ValueError('TXT format not supported for backscal_ratio != 1.')
 
+        # Otherwise, if a phase-averaged background was provided, use it
+        elif len( data_BKG.shape ) == 1:
+            BKG = BKG.sum( axis = 1 )
+
         # Write expected background
         kwargs = {'channels':self._data.channels,
                   'counts':BKG,
@@ -911,8 +915,15 @@ def _write(self, format, **kwargs):
         if format == 'TXT':
             self._write_TXT(**kwargs)
         elif format == 'FITS':
-            if len(self._data.phases) > 2:
-                self._write_EVT(**kwargs)
+            # Case of 2D counts
+            if len(kwargs['counts'].shape) == 2:
+                # If counts are phase-resolved
+                if kwargs['counts'].shape[1] > 1:
+                    self._write_EVT(**kwargs)
+                # If counts are phase-averaged
+                else:
+                    kwargs['counts'] = kwargs['counts'][:,0]
+                    self._write_PHA(**kwargs)
             else:
                 self._write_PHA(**kwargs)
         else:
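The ``_write`` change above dispatches on the shape of the counts array rather than on the data's phase grid. A standalone sketch of that rule (with placeholder writer names standing in for the private ``_write_EVT``/``_write_PHA`` methods) behaves as follows:

```python
import numpy as np

def choose_writer(counts):
    """Mirror the FITS branch of _write above: 2D counts with more than
    one phase column are event data (EVT); a single phase column is
    collapsed to 1D and written as a spectrum (PHA); 1D counts are
    already a spectrum. Writer names are illustrative placeholders."""
    if counts.ndim == 2:
        if counts.shape[1] > 1:        # phase-resolved
            return "EVT", counts
        return "PHA", counts[:, 0]     # phase-averaged: drop phase axis
    return "PHA", counts

writer, out = choose_writer(np.ones((5, 1)))
print(writer, out.shape)  # PHA (5,)
```

This fixes the earlier behavior, which inferred the output format from ``self._data.phases`` and therefore mishandled the phase-averaged case.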
