Overview of the Feature Request
After some discussion in https://groups.google.com/u/1/g/dataverse-community/c/ZJRAuy-1PKE the conclusion was that :MaxFileUploadSizeInBytes limits file size in general. While allowing users to upload small files (say, up to 1GB) through the Dataverse UI is fine, and limiting the amount of data one can upload is also fine, this creates a problem for tools meant for uploading large data: python dvuploader should use direct upload to S3, but it is still limited by the :MaxFileUploadSizeInBytes value.
A simple fix on a Dataverse instance is to increase :MaxFileUploadSizeInBytes, but this might give users the wrong impression that they can upload large data via the UI.
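For reference, the dvuploader workflow in question looks roughly like the sketch below. This is only a sketch based on the usage shown in the python-dvuploader README; the URL, API token, and persistent identifier are placeholders. The point is that even though the bytes go straight to S3, the upload is still rejected once a file exceeds :MaxFileUploadSizeInBytes.

```python
# Sketch of a large-data upload with python dvuploader (direct S3 upload).
# All values below are placeholders.
from dvuploader import DVUploader, File

files = [
    File(filepath="./large_dataset.tar.gz"),  # e.g. a file well above 1GB
]

uploader = DVUploader(files=files)
uploader.upload(
    api_token="XXXX-XXXX-XXXX-XXXX",             # placeholder API token
    dataverse_url="https://demo.dataverse.org",  # placeholder instance URL
    persistent_id="doi:10.5072/FK2/XXXXXX",      # placeholder dataset PID
)
# Even with direct S3 upload, this currently fails for files larger than
# :MaxFileUploadSizeInBytes.
```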
What kind of user is the feature intended for?
This is intended for depositors of large data who would like to use a tool like python dvuploader, which is the recommended tool for large data uploads.
What existing behavior do you want changed?
:MaxFileUploadSizeInBytes should not restrict direct S3 uploads.
Any brand new behavior do you want to add to Dataverse?
Since we are already able to set different upload limits per type of storage via `curl -X PUT -d '{"default":"2147483648","s3":"2147483648"}' http://localhost:8080/api/admin/settings/:MaxFileUploadSizeInBytes`,
we should be able to have options like the following (a configuration sketch follows the list):
- general upload limit - the current :MaxFileUploadSizeInBytes
- UI upload limit - the value shown to the user when using the upload option on the Dataverse instance
- direct upload limit - the value that limits direct S3 uploads via dvuploader
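To make the proposal concrete, here is a hypothetical sketch of how such per-context limits could be set through the existing admin settings API. Only per-storage-driver keys (such as "default" and "s3") exist today; the "ui" and "s3_direct" keys below are invented purely to illustrate the idea.

```python
# Hypothetical sketch: per-context upload limits in :MaxFileUploadSizeInBytes.
# Only per-storage-driver keys (e.g. "default", "s3") exist today; the "ui"
# and "s3_direct" keys are invented here to illustrate the proposal.
import json
import requests

BASE = "http://localhost:8080"  # local instance, as in the curl example above

proposed_limits = {
    "default": str(2 * 1024**3),     # general upload limit: 2 GB
    "ui": str(1 * 1024**3),          # limit shown/enforced in the web UI: 1 GB
    "s3_direct": str(50 * 1024**3),  # direct S3 uploads via dvuploader: 50 GB
}

resp = requests.put(
    f"{BASE}/api/admin/settings/:MaxFileUploadSizeInBytes",
    data=json.dumps(proposed_limits),
)
resp.raise_for_status()
```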
As a minimal version, the current :MaxFileUploadSizeInBytes could stay as it is and be the limit for both direct and general uploads, but it would be nice to show the user a different, smaller value in the UI so we would not encourage users to upload large data via the UI.
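A hypothetical sketch of that minimal version, assuming a new display-only setting (the name :UIFileUploadSizeDisplayInBytes is invented here for illustration and does not exist in Dataverse):

```python
# Hypothetical minimal version: keep :MaxFileUploadSizeInBytes as the hard
# limit for every upload path, and add a display-only value for the UI.
# :UIFileUploadSizeDisplayInBytes is an invented name, not an existing setting.
import requests

BASE = "http://localhost:8080"

# Hard limit for both UI and direct S3 uploads (unchanged behavior): 50 GB
requests.put(
    f"{BASE}/api/admin/settings/:MaxFileUploadSizeInBytes",
    data=str(50 * 1024**3),
).raise_for_status()

# Smaller value advertised in the UI, steering large-data depositors to dvuploader: 1 GB
requests.put(
    f"{BASE}/api/admin/settings/:UIFileUploadSizeDisplayInBytes",
    data=str(1 * 1024**3),
).raise_for_status()
```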