This project uses semantic versioning. This changelog follows the principles of Keep a Changelog.
- Fixed issue where changes introduced in 0.14.0 meant that the ``dtool verify`` command stopped working on datasets created using ``dtool-s3`` < 0.14.0. This fix means that ``dtool verify`` works as expected on all ``dtool-s3`` datasets. For more details please see #16. Thanks to Antoine Sanner for raising this issue.
- Added support for datasets containing file names with non-ASCII characters. This feature has the potential to introduce issues if one has a proto dataset created using an earlier version of ``dtool-s3`` and one then tries to freeze it with this version of dtool. It is not anticipated that anyone will encounter this scenario, as proto datasets are more or less ephemeral when datasets are copied to S3. This feature fixes #14. Thanks to Johannes L. Hörmann and Lars Pastewka for reporting this issue.
- Converted generic ``DTOOL_S3_DATASET_PREFIX`` config key into endpoint-specific ``DTOOL_S3_DATASET_PREFIX_<BUCKET NAME>`` parameter.
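For illustration, such a per-bucket prefix could be set in the dtool configuration file (typically ``~/.dtool/dtool.json``); the bucket name ``my-bucket`` and the prefix value below are hypothetical examples, not part of dtool-s3:

```json
{
    "DTOOL_S3_DATASET_PREFIX_my-bucket": "projects/"
}
```

With a key like this, datasets written to the bucket ``my-bucket`` would be stored under the given object-path prefix, while other buckets remain unaffected.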
- Support for presigned URLs when using ``dtool publish``. To enable this feature one needs to set the ``DTOOL_S3_PUBLISH_EXPIRY`` setting to the number of seconds one wants the dataset to be accessible for.
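As a sketch, the setting could be added to the dtool configuration file (typically ``~/.dtool/dtool.json``); the expiry value below is only an example:

```json
{
    "DTOOL_S3_PUBLISH_EXPIRY": 86400
}
```

Here 86400 seconds corresponds to the published dataset being accessible for 24 hours.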
- Fixed long-standing issue with ``created_at`` and ``frozen_at`` admin metadata being returned as strings rather than floats. Many thanks to Johannes L. Hörmann for reporting and fixing. See #13.
- Added ability to specify custom endpoints thanks to Lars Pastewka
- Added ability to specify user prefix for object path thanks to Lars Pastewka
- Added support for tags.
- Added ``dtool_s3.storagebroker.delete_key()`` method
- Added ``dtool_s3.storagebroker.get_tag_key()`` method
- Added ``dtool_s3.storagebroker.list_tags()`` method
- Added ``boto3.exceptions.S3UploadFailedError`` to the list of exceptions that trigger a retry of a file upload
- Added more robust logic for retrying interrupted ``put_item`` calls in the ``S3StorageBroker`` thanks to Adam Carrgilson. Code now retries when encountering ``botocore.errorfactory.NoSuchUpload`` and ``botocore.exceptions.EndpointConnectionError``.
- Added support for dataset annotations
- Added debug logging
- Added optimisation to improve speed when copying data from S3 object storage
- Cache environment variable changed from ``DTOOL_S3_CACHE_DIRECTORY`` to ``DTOOL_CACHE_DIRECTORY``
- Default cache directory changed from ``~/.cache/dtool/s3`` to ``~/.cache/dtool``
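A minimal sketch of migrating a shell environment to the renamed variable; the path shown is just the documented default, and the old variable name comes from the entry above:

```shell
# The old, broker-specific variable is no longer used:
unset DTOOL_S3_CACHE_DIRECTORY

# The new variable is shared by all dtool storage brokers:
export DTOOL_CACHE_DIRECTORY="${HOME}/.cache/dtool"

echo "${DTOOL_CACHE_DIRECTORY}"
```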
- Fixed defect where multipart upload files did not have the md5sum in the manifest
- Added writing of ``admin_metadata`` as the content of ``admin_metadata_key``
- Added ``storage_broker_version`` to structure parameters
- Added inheritance from ``dtoolcore.storagebroker.BaseStorageBroker`` class
- Overrode ``get_text`` method on ``BaseStorageBroker`` class
- Overrode ``put_text`` method on ``BaseStorageBroker`` class
- Overrode ``get_admin_metadata_key`` method on ``BaseStorageBroker`` class
- Overrode ``get_readme_key`` method on ``BaseStorageBroker`` class
- Overrode ``get_manifest_key`` method on ``BaseStorageBroker`` class
- Overrode ``get_overlay_key`` method on ``BaseStorageBroker`` class
- Overrode ``get_structure_key`` method on ``BaseStorageBroker`` class
- Overrode ``get_dtool_readme_key`` method on ``BaseStorageBroker`` class
- Overrode ``get_size_in_bytes`` method on ``BaseStorageBroker`` class
- Overrode ``get_utc_timestamp`` method on ``BaseStorageBroker`` class
- Overrode ``get_hash`` method on ``BaseStorageBroker`` class
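The entries above describe a template-method design: the base class implements generic behaviour in terms of hooks such as ``get_text``, ``put_text``, and the various ``get_*_key`` methods, which a concrete broker overrides. A minimal sketch of that pattern follows; the ``DictStorageBroker`` in-memory backend is hypothetical and for illustration only, not part of dtool-s3 or dtoolcore:

```python
class BaseStorageBroker:
    """Generic broker: higher-level helpers built on overridable hooks."""

    def get_text(self, key):
        raise NotImplementedError()

    def put_text(self, key, contents):
        raise NotImplementedError()

    def get_readme_key(self):
        raise NotImplementedError()

    def put_readme(self, contents):
        # Generic behaviour expressed via the overridable hooks.
        self.put_text(self.get_readme_key(), contents)

    def get_readme_content(self):
        return self.get_text(self.get_readme_key())


class DictStorageBroker(BaseStorageBroker):
    """Hypothetical in-memory broker overriding only the hooks."""

    def __init__(self):
        self._store = {}

    def get_text(self, key):
        return self._store[key]

    def put_text(self, key, contents):
        self._store[key] = contents

    def get_readme_key(self):
        return "readme.yml"


broker = DictStorageBroker()
broker.put_readme("description: example")
print(broker.get_readme_content())  # -> description: example
```

The benefit of this arrangement is that a storage backend only needs to supply key naming and raw get/put of text, and inherits the rest of the broker behaviour from the base class.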
- Made download to ``DTOOL_S3_CACHE_DIRECTORY`` more robust
- Added ``http_enable`` method to the ``S3StorageBroker`` class, to allow publishing of datasets
- README.rst
- ``dtoolcore`` dependency in ``setup.py``
Initial release.