5 changes: 5 additions & 0 deletions docs/source/details/backendconfig.rst
@@ -46,6 +46,11 @@ Using the Streaming API (i.e. ``SeriesImpl::readIteration()``) will do this auto
Parsing eagerly might be very expensive for a Series with many iterations, but will avoid bugs by forgotten calls to ``Iteration::open()``.
In complex environments, calling ``Iteration::open()`` on an already open iteration does no harm (and does not incur additional runtime cost for additional ``open()`` calls).

The key ``resizable`` can be passed to ``Dataset`` options.
If set to ``{"resizable": true}``, this declares that the ``Extent`` of a ``Dataset`` may be increased via ``resetDataset()`` at a later time, i.e., after the ``Dataset`` has first been declared (and potentially written).
For HDF5, resizable Datasets come with a performance penalty.
For JSON and ADIOS2, all datasets are resizable, independent of this option.
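For instance, a minimal JSON options string for a resizable dataset (this is the exact key described above; the surrounding ``Dataset`` constructor call is shown in the test changes below) could look like:

```json
{
  "resizable": true
}
```

A ``Dataset`` declared with this option can later be grown, e.g. from extent ``{5, 5}`` to ``{10, 5}``, by calling ``resetDataset()`` with the larger extent.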

Configuration Structure per Backend
-----------------------------------

2 changes: 1 addition & 1 deletion include/openPMD/Dataset.hpp
@@ -1,4 +1,4 @@
/* Copyright 2017-2021 Fabian Koller
/* Copyright 2017-2021 Fabian Koller, Franz Poeschel, Axel Huebl
*
* This file is part of openPMD-api.
*
23 changes: 18 additions & 5 deletions src/IO/HDF5/HDF5IOHandler.cpp
@@ -275,13 +275,23 @@ HDF5IOHandlerImpl::createDataset(Writable* writable,
name = auxiliary::replace_last(name, "/", "");

auto config = nlohmann::json::parse( parameters.options );

// general
bool is_resizable_dataset = false;
if( config.contains( "resizable" ) )
{
is_resizable_dataset = config.at( "resizable" ).get< bool >();
}

// HDF5 specific
if( config.contains( "hdf5" ) &&
config[ "hdf5" ].contains( "dataset" ) )
{
auxiliary::TracingJSON datasetConfig{
config[ "hdf5" ][ "dataset" ] };

/*
* @todo Read options from config here.
* @todo Read more options from config here.
*/
auto shadow = datasetConfig.invertShadow();
if( shadow.size() > 0 )
@@ -317,7 +327,11 @@ HDF5IOHandlerImpl::createDataset(Writable* writable,
num_elements *= val;
}

hid_t space = H5Screate_simple(static_cast< int >(dims.size()), dims.data(), dims.data());
std::vector< hsize_t > max_dims( dims.begin(), dims.end() );
if( is_resizable_dataset )
max_dims.assign( dims.size(), H5F_UNLIMITED );

hid_t space = H5Screate_simple(static_cast< int >(dims.size()), dims.data(), max_dims.data());
VERIFY(space >= 0, "[HDF5] Internal error: Failed to create dataspace during dataset creation");

/* enable chunking on the created dataspace */
@@ -429,8 +443,7 @@ HDF5IOHandlerImpl::extendDataset(Writable* writable,
VERIFY(
ndims >= 0,
"[HDF5]: Internal error: Failed to retrieve dimensionality of "
"dataset "
"during dataset read." );
"dataset during dataset read." );
hid_t propertyList = H5Dget_create_plist( dataset_id );
std::vector< hsize_t > chunkExtent( ndims, 0 );
int chunkDimensionality =
@@ -439,7 +452,7 @@ HDF5IOHandlerImpl::extendDataset(Writable* writable,
{
throw std::runtime_error(
"[HDF5] Cannot extend datasets unless written with chunked "
"layout (currently unsupported)." );
"layout." );
}
}

10 changes: 8 additions & 2 deletions test/SerialIOTest.cpp
@@ -3836,7 +3836,8 @@ extendDataset( std::string const & ext )
}
// only one iteration written anyway
write.setIterationEncoding( IterationEncoding::variableBased );
Dataset ds1{ Datatype::INT, { 5, 5 } };

Dataset ds1{ Datatype::INT, { 5, 5 }, "{ \"resizable\": true }" };
Dataset ds2{ Datatype::INT, { 10, 5 } };

// array record component -> array record component
@@ -3953,7 +3954,12 @@ TEST_CASE( "extend_dataset", "[serial]" )
extendDataset( "bp" );
#endif
#if openPMD_HAVE_HDF5
// extendDataset( "h5" );
// extensible datasets require chunking
// skip this test if chunking is disabled
if( auxiliary::getEnvString( "OPENPMD_HDF5_CHUNKS", "auto" ) != "none" )
{
extendDataset("h5");
}
#endif
}
