Description
I have some patient data that I am converting from ECAT to NIfTI. The data was originally acquired as DICOM, and then converted to ECAT. I understand this is a little weird, but my colleagues who analyse the data always prefer to evaluate the ECAT versions (either because of the software they use or just a historical thing).
I take the ECAT files and convert them to NIfTI to apply my own processing on them. So far, the conversion has been working well, I have converted hundreds of scans.
However, today I noticed the following when reading the subheader.
ValueError("Unable to determine pixel data type from value: 2 extracted from {'DATA_TYPE': 2, 'NUM_DIMENSIONS': 3, 'X_DIMENSION': 128, 'Y_DIMENSION': 128, 'Z_DIMENSION': 64, 'X_OFFSET': 0.0, 'Y_OFFSET': 0.0, 'Z_OFFSET': 0.0, 'RECON_ZOOM': 0.0, 'SCALE_FACTOR': 0.004170419182628393, 'IMAGE_MIN': 0, 'IMAGE_MAX': 32767, 'X_PIXEL_SIZE': 0.20250001549720764, 'Y_PIXEL_SIZE': 0.20250001549720764, 'Z_PIXEL_SIZE': 0.24249999225139618, 'FRAME_DURATION': 60.0, 'FRAME_START_TIME': 0.0, 'FILTER_CODE': 0, 'X_RESOLUTION': 0.0, 'Y_RESOLUTION': 0.0, 'Z_RESOLUTION': 0.0, 'NUM_R_ELEMENTS': 0.0, 'NUM_ANGLES': 0.0, 'Z_ROTATION_ANGLE': 0.0, 'DECAY_CORR_FCTR': 1.0031609535217285, 'PROCESSING_CODE': 0, 'GATE_DURATION': 0.0, 'R_WAVE_OFFSET': 0.0, 'NUM_ACCEPTED_BEATS': 0, 'FILTER_CUTOFF_FREQUENCY': (), 'FILTER_RESOLUTION': 0.0, 'FILTER_RAMP_SLOPE': 0.0, 'FILTER_ORDER': 0, 'FILTER_SCATTER_FRACTION': 0.0, 'FILTER_SCATTER_SLOPE': 0.0, 'ANNOTATION': '', 'MT_1_1': 0.0, 'MT_1_2': 0.0, 'MT_1_3': 0.0, 'MT_2_1': 0.0, 'MT_2_2': 0.0, 'MT_2_3': 0.0, 'MT_3_1': 0.0, 'MT_3_2': 0.0, 'MT_3_3': 0.0, 'RFILTER_CUTOFF': 0.0, 'RFILTER_RESOLUTION': 0.0, 'RFILTER_CODE': 0, 'RFILTER_ORDER': 0, 'ZFILTER_CUTOFF': 0.0, 'ZFILTER_RESOLUTION': 0.0, 'ZFILTER_CODE': 0, 'ZFILTER_ORDER': 0, 'MT_1_4': 0.0, 'MT_2_4': 0.0, 'MT_3_4': 0.0, 'SCATTER_TYPE': 0, 'RECON_TYPE': 0, 'RECON_VIEWS': 0, 'FILL(87)': (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0), 'FILL(48)': (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)}")
I did some research on this and found that this is expected behaviour: the currently supported data types are 5 and 6.
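For reference, here is my understanding of the ECAT 7 DATA_TYPE codes expressed as a numpy dtype map. This table is my own reconstruction from the ECAT 7 format description, not something from pypet2bids, so please correct me if it's wrong — notably, code 2 would be the little-endian VAX integer format, not the big-endian one used for code 6:

```python
import numpy

# ECAT 7 DATA_TYPE codes mapped to numpy dtypes -- my own reconstruction
# from the ECAT 7 format description, not code from pypet2bids.
ECAT_DTYPES = {
    1: numpy.dtype("u1"),   # ByteData
    2: numpy.dtype("<i2"),  # VAX_Ix2: little-endian signed 16-bit
    3: numpy.dtype("<i4"),  # VAX_Ix4: little-endian signed 32-bit
    # 4 (VAX_Rx4) is a VAX float with no direct numpy equivalent
    5: numpy.dtype(">f4"),  # IEEE float, big-endian
    6: numpy.dtype(">i2"),  # SunShort: big-endian signed 16-bit
    7: numpy.dtype(">i4"),  # SunLong: big-endian signed 32-bit
}

def pixel_dtype(code: int) -> numpy.dtype:
    """Return the numpy dtype for an ECAT DATA_TYPE code, or raise."""
    try:
        return ECAT_DTYPES[code]
    except KeyError:
        raise ValueError(f"Unsupported ECAT DATA_TYPE: {code}") from None
```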
PET2BIDS/pypet2bids/pypet2bids/read_ecat.py
Lines 412 to 429 in f7a7ffe
if subheader_type_number == 7:
    image_size = [
        subheader["X_DIMENSION"],
        subheader["Y_DIMENSION"],
        subheader["Z_DIMENSION"],
    ]
    # check subheader for pixel datatype
    dt_val = subheader["DATA_TYPE"]
    if dt_val == 5:
        formatting = ">f4"
        pixel_data_type = numpy.dtype(formatting)
    elif dt_val == 6:
        # >H is unsigned short e.g. >u2, reverting to int16 e.g. i2 to align with commit 9beee53
        pixel_data_type = ">i2"
    else:
        raise ValueError(
            f"Unable to determine pixel data type from value: {dt_val} extracted from {subheader}"
        )
I looked into this (with the help of AI) and found that nibabel "supports" the other data types. I use quotes because I don't know whether those commits were fully tested, although a test specifically for negative values was added in the pull request linked in the comment.
If I update the local version of my read_ecat.py to consider that dt_val == 2 is also valid (as shown in the code snippet below) is there anything that I should be careful of?
dt_val = subheader["DATA_TYPE"]
if dt_val == 5:
    formatting = ">f4"
    pixel_data_type = numpy.dtype(formatting)
elif dt_val == 6:
    pixel_data_type = ">i2"
elif dt_val == 2:
    pixel_data_type = ">i2"  # Not sure about endianness here but should be big-endian
else:
    raise ValueError(
        f"Unable to determine pixel data type from value: {dt_val} extracted from {subheader}"
    )

Thank you in advance for your help!