Commit 9501540

Landman Bester (landmanbester) and co-authors

V0.0.6 dev (#132)
* update stimela in pyproject.toml * add 3.12 to classifiers in pyproject.toml * add jax version of psf convolve * modify kclean to work with undersized psf * fix wsum normalisation in subminor. avoid circular import. lower kclean test threshold * clean up external masking logic in kclean * report which threading layer was in use for numba after sara * add multi-stokes predict * don't double normalise wgt per band in L2-reweight steps * Add joint correlation gridding (#131) * start adding joint polarisation products * compute natural grad with jax in hci * attempt to limit number of jax cpu threads per process * don't limit XLA cpu device count * use first element of tuple produced by cg in hci * tweaks * get grid app working with joint pol products. Add beams table to fits hdu * try IFT pol model with approx likelihood * convolving correlations is hard * polfusion * fiddle with BEAMS subtable * add some checks * Fix BEAMS subtable so that beam displays correctly in CARTA * small tweaks to parser * fix failing tests * use 90-PA in add_beampars * reinstate nearsest neighbour gridding for weighting tests * Also test FS in test_polproducts * fix flag ncorr for averaging routines, update Dockerfile, remove spotless cab * cache MFS PSFPARS in dds, make psfpars_mfs optional in dds2fits * simple transient injection in hci worker (no pol parameters) * update restore worker to work with multiple times and correlations * fix typos in restore worker, remove some unused imports * add complex convolve test when gridding correlations directly * render wsclean models to mds * remove product from fits name in model2comps * typo * fix axis order in model2comps, write out cube as sanity check * fix field delection in degrid worker * do not drop ncorr on flag before weight_data in hci * add inject-transients to hci config * pass mask into counts to weigts in hci * start adding polclean functionality. 
Don't double normalise restored image by wsum * add fskclean test --------- Co-authored-by: Landman Bester <lbester@com08.science.kat.ac.za> * transpose stokes axis in hci worker * convert beam params to degrees and add during header construction * add psf-relative-size to hci. only make it big enough to fit the main lobe by default * cast PA to degrees in fits header * also for MFS beam pars * redefine pa as 90-pa for consistency with fits conventions * filter extreme counts in hci worker * mimic no-use-weights-as-taper * allow single precision weighting * noop when filter-counts-level is None * only grid half the uv-plane inside weighting functions * change uv bins in weighting code, use floor instead of round * check bounds in both compute_counts and counts_to_weights * keep uvw and freq in double precision * add better tests for weighting code. don't use fatmath when computing weights. tidy up and add min_padding parameter to hci worker * add option to write out weight grid in hci worker * add arbitrary stokes outputs * make zarr output of hci worker compatible with what is expected in breifast * add NAXIS keywords to fits headers. update product checks in init * fix indentation in hci * use log.info instead of file=log * fix failing tests * reverse sign of X in hci output * do not use fastmath in wavelets * init_factor -> init-factor * log.info -> print in utils * normalise by wsum when writing zarr in hci. remove manual encoding * base divergence check on whether both rmax and rms increase, not just rms * add header ordering in hci comment. fix weighting when k>0 * enforce proper ordering of header parameters when converting WCS to a header * convert pa to degrees in stokes2im * update pyscilog * pyscilog master -> main * resort to nearest neighbour gridding to compute the weights if all else fails * divorce pyscilog. 
start adding rephasing to hci worker * add logging.py * update actions and set up dependabot --------- Co-authored-by: Landman Bester <lbester@com08.science.kat.ac.za>
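Several of the items above concern the imaging-weight pipeline: grid only half the uv-plane, use floor instead of round for uv binning, and check bounds in both compute_counts and counts_to_weights. As a rough illustration only, not pfb-imaging's actual implementation, the combined idea can be sketched as below; the function signatures and grid parameters are hypothetical:

```python
import numpy as np

def compute_counts(u, v, nx, ny, du, dv):
    """Histogram uv samples onto a grid using floor-based binning."""
    # Exploit conjugate symmetry (V(-u,-v) = conj(V(u,v)) for a real sky):
    # map every sample into the v >= 0 half-plane, so only half is gridded.
    flip = v < 0
    u = np.where(flip, -u, u)
    v = np.where(flip, -v, v)
    counts = np.zeros((nx, ny))
    x = np.floor(u / du).astype(int) + nx // 2  # floor, not round
    y = np.floor(v / dv).astype(int)
    ok = (x >= 0) & (x < nx) & (y >= 0) & (y < ny)  # explicit bounds check
    np.add.at(counts, (x[ok], y[ok]), 1.0)
    return counts

def counts_to_weights(counts, u, v, du, dv):
    """Uniform-style weights: inverse of the gridded sample density."""
    # Same half-plane mapping and binning as compute_counts, so each
    # sample looks up the cell it was counted into.
    flip = v < 0
    u = np.where(flip, -u, u)
    v = np.where(flip, -v, v)
    nx, ny = counts.shape
    x = np.floor(u / du).astype(int) + nx // 2
    y = np.floor(v / dv).astype(int)
    ok = (x >= 0) & (x < nx) & (y >= 0) & (y < ny)
    w = np.zeros(u.shape, dtype=float)
    w[ok] = 1.0 / counts[x[ok], y[ok]]
    return w
```

Using the same floor-based binning in both functions is what makes the bounds checks consistent: a sample either falls in a counted cell or is excluded from weighting entirely.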
1 parent 1ad9849 commit 9501540

55 files changed: +4629 additions, -1698 deletions

.github/dependabot.yml

Lines changed: 33 additions & 0 deletions

@@ -0,0 +1,33 @@
+version: 2
+updates:
+  # Monitor GitHub Actions
+  - package-ecosystem: "github-actions"
+    directory: "/"
+    schedule:
+      interval: "weekly"
+      day: "monday"
+      time: "09:00"
+    open-pull-requests-limit: 10
+    reviewers:
+      - "@bester"
+    assignees:
+      - "@bester"
+    commit-message:
+      prefix: "ci"
+      include: "scope"
+
+  # Monitor Python dependencies
+  - package-ecosystem: "pip"
+    directory: "/"
+    schedule:
+      interval: "weekly"
+      day: "monday"
+      time: "09:00"
+    open-pull-requests-limit: 10
+    reviewers:
+      - "@bester"
+    assignees:
+      - "@bester"
+    commit-message:
+      prefix: "deps"
+      include: "scope"

.github/workflows/ci.yml

Lines changed: 12 additions & 12 deletions

@@ -11,7 +11,7 @@ on:
   pull_request:

 env:
-  POETRY_VERSION: 1.5
+  POETRY_VERSION: 1.8.4

 jobs:
   should-run:
@@ -82,24 +82,24 @@ jobs:
         python-version: ["3.10", "3.11", "3.12"]

     steps:
+      - name: Checkout source
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+
       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v5
         with:
           python-version: ${{ matrix.python-version}}

       - name: Install poetry
-        uses: abatilo/actions-poetry@v2
+        uses: abatilo/actions-poetry@v3
         with:
          poetry-version: ${{ env.POETRY_VERSION }}

       - name: Check poetry install
         run: poetry --version

-      - name: Checkout source
-        uses: actions/checkout@v3
-        with:
-          fetch-depth: 0
-
       - name: Restore repo times
         uses: chetan/git-restore-mtime-action@v2

@@ -113,7 +113,7 @@ jobs:
          echo "timestamp=$(/bin/date -u '+%Y%m%d%H%M%S')" >> $GITHUB_OUTPUT

       - name: Cache Numba Kernels
-        uses: actions/cache@v3
+        uses: actions/cache@v4
         with:
           key: numba-cache-${{ matrix.python-version }}-${{ steps.numba-key.outputs.timestamp }}
           restore-keys: numba-cache-${{ matrix.python-version }}-
@@ -124,7 +124,7 @@ jobs:

       - name: Load cached CASA Measures Data
         id: load-cached-casa-measures
-        uses: actions/cache@v3
+        uses: actions/cache@v4
         with:
           key: casa-measures-${{ hashFiles('measures_dir.txt')}}
           path: |
@@ -163,7 +163,7 @@ jobs:
           python-version: "3.10"

       - name: Install poetry
-        uses: abatilo/actions-poetry@v2
+        uses: abatilo/actions-poetry@v3
         with:
           poetry-version: ${{ env.POETRY_VERSION }}

@@ -177,7 +177,7 @@ jobs:
         run: poetry build

       - name: Publish distribution to PyPI
-        uses: pypa/gh-action-pypi-publish@27b31702a0e7fc50959f5ad993c78deac1bdfc29
+        uses: pypa/gh-action-pypi-publish@v1.12.2
         with:
           user: __token__
           password: ${{ secrets.PYPI_API_TOKEN }}

Dockerfile

Lines changed: 0 additions & 1 deletion

@@ -13,5 +13,4 @@ RUN apt -y update && \

 RUN python -m pip install -U pip setuptools wheel && \
     python -m pip install -U pfb-imaging@git+https://github.com/ratt-ru/pfb-imaging@main && \
-    python -m pip install numpy==1.22 && \
     python -m pip cache purge

README.rst

Lines changed: 2 additions & 4 deletions

@@ -11,8 +11,6 @@ Install the package by cloning and running

 :code:`$ pip install -e pfb-imaging/`

-Note casacore needs to be installed on the system for this to work.
-
 You will probably may also need to update pip eg.

 :code:`$ pip install -U pip setuptools wheel`
@@ -94,6 +92,6 @@ Documentation for each worker is listed under
 Acknowledgement
 ~~~~~~~~~~~~~~~~~

-If you find any of this useful please cite (for now)
+If you find any of this useful please cite

-https://arxiv.org/abs/2101.08072
+https://arxiv.org/abs/2412.10073

pfb/__init__.py

Lines changed: 7 additions & 4 deletions

@@ -9,7 +9,10 @@ def set_envs(nthreads, ncpu):
     os.environ["VECLIB_MAXIMUM_THREADS"] = str(nthreads)
     os.environ["NPY_NUM_THREADS"] = str(nthreads)
     os.environ["NUMBA_NUM_THREADS"] = str(nthreads)
-    os.environ["JAX_PLATFORMS"] = 'cpu'
+    # os.environ["JAX_PLATFORMS"] = 'cpu'
+    # os.environ["XLA_FLAGS"] = ("--xla_cpu_multi_thread_eigen=true "
+    #                            "--xla_force_host_platform_device_count=4 "
+    #                            f"intra_op_parallelism_threads={str(nthreads)}")
     os.environ["JAX_ENABLE_X64"] = 'True'
     # this may be required for numba parallelism
     # find python and set LD_LIBRARY_PATH
@@ -52,14 +55,14 @@ def set_client(nworkers, log, stack=None, host_address=None,
     host_address = host_address or os.environ.get("DASK_SCHEDULER_ADDRESS")
     if host_address is not None:
         from distributed import Client
-        print("Initialising distributed client.", file=log)
+        log.info("Initialising distributed client.")
         if stack is not None:
             client = stack.enter_context(Client(host_address))
         else:
             client = Client(host_address)
     else:
         from dask.distributed import Client, LocalCluster
-        print("Initialising client with LocalCluster.", file=log)
+        log.info("Initialising client with LocalCluster.")
         dask.config.set({
             'distributed.comm.compression': {
                 'on': True,
@@ -80,7 +83,7 @@ def set_client(nworkers, log, stack=None, host_address=None,

     client.wait_for_workers(nworkers)
     dashboard_url = client.dashboard_link
-    print(f"Dask Dashboard URL at {dashboard_url}", file=log)
+    log.info(f"Dask Dashboard URL at {dashboard_url}")

     return client
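The set_envs change above hinges on an ordering subtlety: thread-count environment variables only take effect if they are set before numpy, numba, or jax are first imported, which is presumably why pfb-imaging sets them in __init__.py. A minimal standalone sketch of the pattern follows; the function name is illustrative, the last three variables appear in the diff, and the BLAS-related ones are assumptions mirroring common numerical backends:

```python
import os

def set_thread_env(nthreads: int) -> None:
    """Cap per-process thread pools for common numerical backends.

    Must run BEFORE importing numpy/numba/jax, otherwise those
    libraries will already have sized their thread pools.
    """
    for var in (
        "OMP_NUM_THREADS",          # assumption: OpenMP backends
        "OPENBLAS_NUM_THREADS",     # assumption: OpenBLAS
        "MKL_NUM_THREADS",          # assumption: Intel MKL
        "VECLIB_MAXIMUM_THREADS",   # from the diff
        "NPY_NUM_THREADS",          # from the diff
        "NUMBA_NUM_THREADS",        # from the diff
    ):
        os.environ[var] = str(nthreads)
    # enable double precision in jax, as in the diff
    os.environ["JAX_ENABLE_X64"] = "True"

set_thread_env(4)
```

Because the values are plain strings read once at import time, changing them later in the same process generally has no effect, so a package-level __init__.py is the natural place for this call.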
