
Commit c8097b5

Use Markdown readme instead of RST to use native mermaid diagrams
1 parent 9846a80 commit c8097b5

6 files changed: +232, -229 lines changed

Makefile (-2)

This file was deleted.

README.md (+231)

@@ -0,0 +1,231 @@

# django-s3file

A lightweight file upload input for Django and Amazon S3.

Django-S3File allows you to upload files directly to AWS S3, effectively
bypassing your application server. This allows you to avoid long-running
requests from large file uploads. This is particularly helpful if you
run your service on AWS Lambda or Heroku, where you have a hard request
limit.

[![PyPi Version](https://img.shields.io/pypi/v/django-s3file.svg)](https://pypi.python.org/pypi/django-s3file/)
[![Test Coverage](https://codecov.io/gh/codingjoe/django-s3file/branch/master/graph/badge.svg)](https://codecov.io/gh/codingjoe/django-s3file)
[![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://raw.githubusercontent.com/codingjoe/django-s3file/master/LICENSE)

## Features

- lightweight: less than 200 lines
- no JavaScript or Python dependencies (no jQuery)
- easy integration
- works just like the built-in file input
- extendable JavaScript API

## For the Nerds

```mermaid
sequenceDiagram
    autonumber
    actor Browser
    Browser->>S3: POST large file
    activate S3
    S3->>Browser: RESPONSE AWS S3 key
    Browser->>Middleware: POST AWS S3 key
    activate Middleware
    Middleware->>S3: GET AWS S3 key
    S3->>Middleware: RESPONSE large file promise
    deactivate S3
    Middleware->>Django: request incl. large file promise
    deactivate Middleware
    activate Django
    opt only if the file is processed by Django
        Django-->>S3: GET large file
        activate S3
        S3-->>Django: RESPONSE large file
        deactivate S3
    end
    Django->>Browser: RESPONSE success
    deactivate Django
```

In a nutshell, we can bypass Django completely and have AWS handle the
upload or any processing. Of course, if you want to do something with
your file in Django, you can do so, just like before, with the added
advantage that your file is served from within your datacenter.

## Installation

Make sure you have [Amazon S3
storage](http://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html)
set up correctly.

Just install S3File using `pip`.

```bash
pip install django-s3file
# or
pipenv install django-s3file
```

Add the S3File app and middleware in your settings:

```python
# settings.py

INSTALLED_APPS = (
    '...',
    's3file',
    '...',
)

MIDDLEWARE = (
    '...',
    's3file.middleware.S3FileMiddleware',
    '...',
)
```

## Usage

S3File automatically replaces Django's `ClearableFileInput` widget; you
do not need to alter your code at all.

The `ClearableFileInput` widget is only automatically replaced when the
`DEFAULT_FILE_STORAGE` setting is set to `django-storages`'
`S3Boto3Storage`, or when the dummy `FileSystemStorage` is enabled.

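For example, a plain `FileField` form keeps working unchanged once the
storage backend is configured. The snippet below is a minimal sketch:
the `storages.backends.s3boto3.S3Boto3Storage` dotted path comes from
django-storages, while the form and field names are purely illustrative.

```python
# settings.py -- point Django at django-storages' S3 backend
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

# forms.py -- hypothetical form; no S3File-specific code is needed,
# the ClearableFileInput widget is swapped out automatically.
from django import forms


class UploadForm(forms.Form):
    file = forms.FileField()
```
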
### Setting up the AWS S3 bucket

#### Upload folder

S3File uploads to a single folder. Files are later moved by Django when
they are saved to the `upload_to` location.

It is recommended to [set up
expiration](http://docs.aws.amazon.com/AmazonS3/latest/dev/intro-lifecycle-rules.html)
for that folder, to ensure that old and unused file uploads don't
accumulate and produce costs.

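The lifecycle rule can be created in the S3 console or, for example, via
boto3. The following is only a sketch, assuming uploads land in the
default `tmp/s3file/` prefix and that a seven-day expiration is
acceptable; the bucket name and rule ID are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Expire stale staged uploads in the S3File prefix of a hypothetical bucket.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-upload-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-s3file-tmp-uploads",
                "Filter": {"Prefix": "tmp/s3file/"},
                "Status": "Enabled",
                "Expiration": {"Days": 7},
            }
        ]
    },
)
```
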
The default folder name is `tmp/s3file`. You can change it via the
`S3FILE_UPLOAD_PATH` setting.

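For example (the value shown is the default, just written out explicitly):

```python
# settings.py
S3FILE_UPLOAD_PATH = "tmp/s3file"  # prefix that your lifecycle rule should expire
```
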
#### CORS policy

You will need to allow `POST` from all origins. Just add the following
to your CORS policy.

```json
[
    {
        "AllowedHeaders": [
            "*"
        ],
        "AllowedMethods": [
            "POST"
        ],
        "AllowedOrigins": [
            "*"
        ],
        "ExposeHeaders": [],
        "MaxAgeSeconds": 3000
    }
]
```

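If you manage the bucket from code rather than from the console, the
same policy can be applied with boto3; this is only a sketch with a
placeholder bucket name.

```python
import boto3

s3 = boto3.client("s3")

# Apply the CORS policy above to a hypothetical bucket.
s3.put_bucket_cors(
    Bucket="my-upload-bucket",
    CORSConfiguration={
        "CORSRules": [
            {
                "AllowedHeaders": ["*"],
                "AllowedMethods": ["POST"],
                "AllowedOrigins": ["*"],
                "ExposeHeaders": [],
                "MaxAgeSeconds": 3000,
            }
        ]
    },
)
```
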
### Progress Bar

S3File emits progress signals that can be used to display a progress
bar. A signal named `progress` is emitted both for each individual file
input and for the form as a whole.

The progress signal carries the following details:

```javascript
console.log(event.detail)

{
  progress: 0.4725307607171312, // total upload progress of either a form or a single input
  loaded: 1048576, // bytes uploaded so far
  total: 2219064, // total bytes to upload
  currentFile: File {…}, // file object
  currentFileName: "text.txt", // name of the file currently being uploaded
  currentFileProgress: 0.47227834703299176, // upload progress of that file
  originalEvent: ProgressEvent {…} // the original XHR onprogress event
}
```

The following example implements a Bootstrap progress bar for the upload
progress of an entire form.

```html
<div class="progress">
  <div class="progress-bar" role="progressbar" style="width: 0%;" aria-valuenow="0" aria-valuemin="0" aria-valuemax="100">0%</div>
</div>
```

```javascript
(function () {
  var form = document.getElementsByTagName('form')[0]
  var progressBar = document.getElementsByClassName('progress-bar')[0]

  form.addEventListener('progress', function (event) {
    // event.detail.progress is a value between 0 and 1
    var percent = Math.round(event.detail.progress * 100)

    progressBar.setAttribute('style', 'width:' + percent + '%')
    progressBar.setAttribute('aria-valuenow', percent)
    progressBar.innerText = percent + '%'
  })
})()
```

### Using S3File in development

Using S3File in development can be helpful, especially if you want to
use the progress signals described above. Therefore, S3File comes with
an AWS S3 dummy backend. It behaves similarly to the real S3 storage
backend and is automatically enabled if the `DEFAULT_FILE_STORAGE`
setting is set to `FileSystemStorage`.

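A minimal development configuration might therefore look like this; the
dotted path is Django's built-in `FileSystemStorage`, everything else
stays as in production.

```python
# settings.py (development only)
DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
```
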
To prevent users from accidentally using the `FileSystemStorage` and the
insecure S3 dummy backend in production, there is also an additional
deployment check that will error when you run Django's deployment check
suite:

```shell
python manage.py check --deploy
```

We recommend always running the deployment check suite as part of your
deployment pipeline.

### Uploading multiple files

Django has limited support for [uploading multiple
files](https://docs.djangoproject.com/en/stable/topics/http/file-uploads/#uploading-multiple-files).
S3File fully supports this feature. The custom middleware ensures that
files are accessible via `request.FILES`, even though they have been
uploaded to AWS S3 directly and not to your Django application server.

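A sketch following the multiple-upload pattern from the Django
documentation linked above; the class, field, and view names are
illustrative and not part of S3File's API.

```python
from django import forms


class MultipleFileInput(forms.ClearableFileInput):
    # Recent Django versions require explicitly opting in to multiple selection.
    allow_multiple_selected = True


class MultipleFileField(forms.FileField):
    # Clean either a single file or a list of files, as in Django's docs.
    def __init__(self, *args, **kwargs):
        kwargs.setdefault("widget", MultipleFileInput())
        super().__init__(*args, **kwargs)

    def clean(self, data, initial=None):
        single_file_clean = super().clean
        if isinstance(data, (list, tuple)):
            return [single_file_clean(d, initial) for d in data]
        return [single_file_clean(data, initial)]


class UploadForm(forms.Form):
    files = MultipleFileField()


def upload_view(request):
    form = UploadForm(request.POST, request.FILES)
    if form.is_valid():
        # The middleware has already fetched the S3 objects, so the files
        # show up in request.FILES just like a direct upload.
        for f in request.FILES.getlist("files"):
            ...  # process each file
```
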
### Using optimized S3Boto3Storage

Since `S3Boto3Storage` supports storing data from any other file object,
it uses a generalized `_save` function. This leads to the frontend
uploading the file to S3 and the storage then copying it byte-by-byte to
perform a move operation, just to rename the uploaded object. For large
files this leads to additional loading times for the user.

That's why S3File provides an optimized version of this method at
`storages_optimized.S3OptimizedUploadStorage`. It uses the more
efficient `copy` method from S3, given that we know that we only copy
from one S3 location to another.

```python
from s3file.storages_optimized import S3OptimizedUploadStorage


class MyStorage(S3OptimizedUploadStorage):  # subclass and use like any other storage
    default_acl = 'private'
```

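To actually use the subclass, point your file storage setting at it; the
dotted path below assumes the class lives in a hypothetical
`myapp/storages.py` module and is only an illustration.

```python
# settings.py
DEFAULT_FILE_STORAGE = 'myapp.storages.MyStorage'  # hypothetical module path
```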
