
Nexus workflow module: Handling time-dependent NXtransformations? #96

Open
@SimonHeybrock

Description


When component positions in a NeXus file are not fixed, this is represented as one or more transformations whose value dataset is replaced by a time series (NXlog). In this case it is not possible to compute, e.g., a unique sample_position or position coordinate when loading.

The current workflows simply fail: scippnexus.compute_positions will not create a detector position coordinate; instead, the surrounding DataGroup holds a time-dependent position DataArray. This is deliberately not multiplied into the pixel position offsets, since doing so would lead to massive memory use, and it would also produce a time-dependent coordinate, which Scipp does not allow unless the data itself depends on that dimension, which it does not.
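To make the failure mode concrete, here is a minimal sketch of what loading such a detector might look like (the file path, group path, and the exact name and dims of the time-dependent entry are assumptions, not taken from a real file):

```python
# Hedged sketch: load a detector group and compute positions. With
# time-dependent NXtransformations, compute_positions does not attach a
# per-pixel 'position' coordinate; the DataGroup instead carries a
# time-dependent position DataArray.
import scippnexus as snx

filename = 'experiment.nxs'  # hypothetical path
with snx.File(filename) as f:
    detector = f['entry/instrument/detector'][()]  # hypothetical group path

detector = snx.compute_positions(detector)
position = detector.get('position')
if position is not None and 'time' in position.dims:
    print('Time-dependent position, no pixel position coordinate:', position.sizes)
```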

In practice, I think data reduction will have to be performed for various time intervals, with constant positions within each interval. We need to facilitate handling this, extracting the relevant information from the various parts of the file: determining a unique position coordinate for each interval, as well as loading the corresponding chunk of events. The current GenericNeXusWorkflow and related components do not support this, so we need to figure out a convenient and safe mechanism to make it work. There are a bunch of cases:

  • Detector moves, e.g., to cover different angles.
  • Detector moves to a different distance.
  • Sample rotates (SXD).

What many of these have in common is that intermediate workflow results may have to be combined into a single final result. For example, a single moving detector could be interpreted as an instrument with multiple detector banks.
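If the per-position (or per-interval) results are directly compatible, the combination step could be as simple as the following sketch (assuming histograms on a common binning; in general, e.g. when normalization is involved, numerators and denominators may need to be combined separately):

```python
import scipp as sc

def combine_results(results: list) -> sc.DataArray:
    # Treat each detector position (or time interval) as a virtual detector
    # bank: concatenate along a new 'section' dimension, then sum over it.
    combined = sc.concat(results, dim='section')
    return combined.sum('section')
```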

For now, I think we should focus on the case of step-wise movement but keep the continuous scanning case in the back of our minds.

I think there are roughly two fundamentally different approaches that could be considered:

  1. Keep time-dependent positions; when performing coordinate transformations, look up each neutron event's time to obtain the corresponding position and sample_position. Conceptually this would mean changing the current `def position(event_id): ...` to `def position(event_id, event_time_zero): ...`. As mentioned above, spelling the position out as a per-event array can be prohibitively large, so this needs to be implemented as a sort of lookup function (see the sketch after this list).
  2. Split the run into "constant" sections, reduce each individually, and accumulate/combine the results in later steps, once the position dependence has been removed.
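A rough sketch of the lookup idea from option 1 (the NXlog layout, the variable names, and the 'previous' value semantics are assumptions):

```python
import numpy as np
import scipp as sc

def position_at(event_time_zero: sc.Variable,
                log_time: sc.Variable,
                log_position: sc.Variable) -> sc.Variable:
    # log_time and log_position are the time axis and (vector) values of the
    # transformation's NXlog. For each event_time_zero, pick the last log
    # entry at or before it ('previous' semantics).
    idx = np.searchsorted(log_time.values, event_time_zero.values, side='right') - 1
    idx = np.clip(idx, 0, len(log_time.values) - 1)
    # In a real implementation this would be evaluated per chunk or per time
    # interval, to avoid materializing a large per-event position array.
    return sc.vectors(dims=event_time_zero.dims,
                      values=log_position.values[idx],
                      unit=log_position.unit)
```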

I believe that even if we want to do 1.), there will be cases where 2.) is required for scientific reasons, e.g., because the Q-resolution may differ, so the results from different detector positions should not actually be merged directly. We should therefore consider focusing fully on 2.), to see how far that approach gets us.
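A minimal sketch of option 2, assuming the interval edges and the constant per-interval positions have already been extracted from the NXlog (names and the reduction callable are placeholders, not an existing API):

```python
import scipp as sc

def reduce_in_intervals(events: sc.DataArray, edges: sc.Variable,
                        positions: list, reduce_section) -> list:
    # events: binned event data with an 'event_time_zero' event coordinate.
    # edges: N+1 interval boundaries between which the position is constant.
    # positions: the constant detector position for each of the N intervals.
    # reduce_section: the existing reduction applied to a single section.
    sections = events.bin(event_time_zero=edges)  # adds an 'event_time_zero' dim
    results = []
    for i, pos in enumerate(positions):
        section = sections['event_time_zero', i].copy()
        section.coords['position'] = pos  # constant within this interval
        results.append(reduce_section(section))
    # Combining the per-interval results is left to a later workflow step
    # (see the combination sketch above).
    return results
```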

Relevant instruments:

  • NMX moves detectors freely in space. Probably not during a measurement, or only on a long time scale?
  • Bifrost rotates the detector bank. I think this will be done during measurements, but probably on a longer time scale?
  • NMX, Bifrost, C-SPEC, MAGIC, ... will rotate the sample. Does this affect the effective sample position? We may be able to assume that the position is unchanged, but the NXlogs may tell a different story, since the sample might be mounted on a motion stage.

