
Conversation

@mstoeckl (Contributor) commented Jan 1, 2026

This PR implements memory limits for the Radiance .hdr image decoder. The current decoder both parses and retains (in HdrMetadata::custom_attributes) essentially the full header of the file, which can be arbitrarily long, so memory limits need to be in place before header decoding begins.

The actual implementation and memory accounting are rather awkward, so I've tried to simplify them by overestimating peak memory usage during header decoding and by eliminating some allocations for the signature and dimension lines of the header. In practice, the HDR images I've found have less than a kilobyte of extra information in their headers, so I don't expect constant-factor overestimates during header decoding to be an issue.
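(A minimal sketch of that accounting style, assuming the image crate's Limits type and its reserve_usize bookkeeping; the helper name, attribute tuple type, and cost formula are illustrative, not the PR's actual code.)

```rust
use image::{ImageResult, Limits};

// Charge a parsed attribute line against the allocation limit *before*
// retaining it, deliberately overestimating the cost.
fn retain_attribute(
    limits: &mut Limits,
    custom_attributes: &mut Vec<(String, String)>,
    key: &str,
    value: &str,
) -> ImageResult<()> {
    let cost = key.len() + value.len() + std::mem::size_of::<(String, String)>();
    limits.reserve_usize(cost)?;
    custom_attributes.push((key.to_owned(), value.to_owned()));
    Ok(())
}
```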

It would be possible to simplify the memory usage estimation somewhat by changing the HdrMetadata public API to retain Vec<u8>s instead of Strings, but I think that would make the rest of the decoder logic more complicated. (And it would be yet another API break.)

Edit: my initial implementation undercounted/miscalculated some allocations in HdrMetadata::update_header_info, so I've updated the PR to account for memory closer to the place where it is allocated. The changes may make the code a bit easier to follow, but also ~40 lines longer.

The Radiance HDR decoder retains a decoded form of the complete image
header, which can be arbitrarily large; it also buffers a full scanline
in memory to decode the newer RLE variant. This change introduces a new
function to create a decoder that errors if too much memory is used.
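(A sketch of the scanline accounting the commit message describes, again assuming Limits::reserve_usize; the function name and RGBE representation are illustrative.)

```rust
use image::{ImageResult, Limits};

// The new-style RLE format is decoded one full scanline of RGBE samples
// at a time, so reserve that buffer against the limit before allocating it.
fn alloc_rgbe_scanline(limits: &mut Limits, width: usize) -> ImageResult<Vec<[u8; 4]>> {
    limits.reserve_usize(width * std::mem::size_of::<[u8; 4]>())?;
    Ok(vec![[0u8; 4]; width])
}
```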
@fintelia (Contributor) commented Jan 6, 2026

For a very long time I've wanted to remove PngDecoder::with_limits in favor of the same ImageDecoder::set_limits that the rest of our decoders use. So I'm not especially thrilled about adding a with_limits constructor to the HDR decoder.
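(For reference, a sketch of that uniform pattern with the PNG decoder, assuming the current Limits/set_limits API; the 64 MiB budget is arbitrary.)

```rust
use image::codecs::png::PngDecoder;
use image::{ImageDecoder, ImageResult, Limits};
use std::fs::File;
use std::io::BufReader;

fn decode_with_budget(path: &str) -> ImageResult<Vec<u8>> {
    // Construct first, then apply limits through the shared trait method.
    let mut decoder = PngDecoder::new(BufReader::new(File::open(path)?))?;
    let mut limits = Limits::default();
    limits.max_alloc = Some(64 * 1024 * 1024); // 64 MiB allocation budget
    decoder.set_limits(limits)?;
    let mut buf = vec![0u8; decoder.total_bytes() as usize];
    decoder.read_image(&mut buf)?;
    Ok(buf)
}
```

The HDR decoder is the odd one out here because it parses the header at construction time, which is why this PR reaches for a constructor instead.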

To understand the tradeoff: how common are custom attributes, and how much space do they generally consume? If we knew that the attributes would never take more than, say, 100 KB, then we could avoid making the limit configurable via a constructor. (Our allocation limits already implicitly ignore tiny fixed-size allocations.)

@mstoeckl (Contributor, Author) commented Jan 7, 2026

> To understand the tradeoff: how common are custom attributes, and how much space do they generally consume? If we knew that the attributes would never take more than, say, 100 KB, then we could avoid making the limit configurable via a constructor. (Our allocation limits already implicitly ignore tiny fixed-size allocations.)

The longest headers I've been able to find (looking online, and considering programs that can produce .hdr images) run to a bit more than 2.5 kilobytes (current record at https://radsite.lbl.gov/radiance/pub/libraries/Rad_skies.tar.Z), and occur in very old images produced by the Radiance tools. These images often retain the chain of commands used to generate them (oconv -f ..., rpict -vf ..., etc.), which can get long if commands like pcompos were used to combine multiple images into one; some of these commands are also indented. Some of the Radiance programs may also produce fields, like CAMERA and CAPDATE, which aren't explicitly mentioned in the spec. Newer images tend to be produced by conversion from some other format, and may have a comment (# Made with 100% pure HDR Shop), possibly some free-form text without a leading #, maybe an EXPOSURE line, and rarely anything else in the header.

I think even a 4kB limit on the total header length would be unlikely to cause spurious decoding failures in practice.
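(A minimal sketch of what that could look like: bound the reader itself so header parsing can never consume or retain more than the cap. HEADER_CAP and the function are illustrative, not part of this PR.)

```rust
use std::io::{BufRead, BufReader, Read};

const HEADER_CAP: u64 = 4 * 1024; // 4 kB, per the estimate above

// Read header lines from a reader hard-capped at HEADER_CAP bytes, so a
// header longer than the cap surfaces as an unexpected EOF.
fn read_capped_header<R: Read>(input: R) -> std::io::Result<Vec<String>> {
    let mut limited = BufReader::new(input.take(HEADER_CAP));
    let mut lines = Vec::new();
    loop {
        let mut line = String::new();
        if limited.read_line(&mut line)? == 0 {
            // EOF: either a truncated file or a header over the cap; a real
            // decoder would report a dedicated error here.
            break;
        }
        if line == "\n" {
            break; // a blank line terminates a Radiance header
        }
        lines.push(line);
    }
    Ok(lines)
}
```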
