For a full example, see [node-addon-example's package.json](https://github.com/springmeyer/node-addon-example/blob/master/package.json).

The location your native module is placed after a build. This should be an empty directory without other JavaScript files.

Note: This property supports variables based on [Versioning](#versioning).
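
For instance (an illustrative sketch; the module name and directory layout are hypothetical, and `{node_abi}`, `{platform}` and `{arch}` are among the variables listed under [Versioning](#versioning)), a `module_path` using such variables could look like:

```js
{
  "binary": {
    "module_name": "your_module",
    "module_path": "./lib/binding/{node_abi}-{platform}-{arch}"
  }
}
```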

###### host (and host.endpoint)

An object with at least one key, `endpoint`, defining the remote location where you've published tarball binaries (must be `https`, not `http`).

It is highly recommended that you use Amazon S3. The reasons are:

Why then not require S3? Because while some applications using node-pre-gyp need to distribute binaries as large as 20-30 MB, others might have very small binaries and might wish to store them in a GitHub repo. This is not recommended, but if an author really wants to do this in a way that works, it should be possible.

It should also be mentioned that there is an optional and entirely separate npm module called [node-pre-gyp-github](https://github.com/bchr02/node-pre-gyp-github) which is intended to complement node-pre-gyp and be installed along with it. It provides the ability to store and publish your binaries within your repository's GitHub Releases if you would rather not use S3 directly. Installation and usage instructions can be found [here](https://github.com/bchr02/node-pre-gyp-github), but the basic premise is that instead of using the `node-pre-gyp publish` command you would use `node-pre-gyp-github publish`.

This looks like:
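
A minimal sketch, assuming a placeholder bucket endpoint (substitute your own URL):

```js
{
  "binary": {
    "host": {
      "endpoint": "https://your_bucket.s3-us-west-1.amazonaws.com"
    }
  }
}
```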

##### The `host` object has other optional S3 properties

If you are not using a standard s3 path like `bucket_name.s3(.-)region.amazonaws.com`, you might get an error on `publish` because node-pre-gyp extracts the region and bucket from the `host` url. For example, you may have an on-premises s3-compatible storage server, or may have configured a DNS record redirecting to an s3 endpoint. In these cases, you can explicitly set the `region` and `bucket` properties to tell node-pre-gyp to use these values instead of guessing from the `host` property. The following values can be used in the `binary` section:

###### bucket

Your S3 bucket name.

###### region

Your S3 server region.

###### s3ForcePathStyle

Set `s3ForcePathStyle` to true if the endpoint url should not be prefixed with the bucket name. If false (default), the server endpoint would be constructed as `bucket_name.your_server.com`.

For example, using an alternate S3-compatible host:

```js
{
  "binary": {
    "host": {
      "endpoint": "https://play.min.io",
      "bucket": "node-pre-gyp-production",
      "region": "us-east-1",
      "s3ForcePathStyle": true
    }
  }
}
```

##### The `binary` object has optional properties

###### remote_path

#### 9) One more option

It may be that you want to work with multiple s3 buckets: one for development, one for staging, and one for production. Such an arrangement makes it less likely to accidentally overwrite a production binary. It also allows the production environment to have more restrictive permissions than development or staging while still enabling publishing when developing and testing.

To use this option, set `staging_host` and/or `development_host` using settings similar to those used for `host`.
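
As a sketch, assuming placeholder bucket endpoints and the object form of `host` described above, the `binary` section might contain:

```js
{
  "binary": {
    "host": {
      "endpoint": "https://your_bucket-production.s3-us-west-1.amazonaws.com"
    },
    "staging_host": {
      "endpoint": "https://your_bucket-staging.s3-us-west-1.amazonaws.com"
    },
    "development_host": {
      "endpoint": "https://your_bucket-development.s3-us-west-1.amazonaws.com"
    }
  }
}
```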

Once a development and/or staging host is defined, the "publish" and "unpublish" commands will default to the lowest of the alternate hosts (development or, if not present, staging), while the "install" and "info" commands will default to the production host (specified by `host`).

To explicitly choose a host, use the command-line option `--s3_host=development`, `--s3_host=staging`, or `--s3_host=production`, or set the environment variable `node_pre_gyp_s3_host` to `development`, `staging`, or `production`. Note that the environment variable takes priority over the command line.

This setup allows installing from development or staging by specifying `--s3_host=development` or `--s3_host=staging`, and it requires specifying `--s3_host=production` in order to publish to, or unpublish from, production, making accidental errors less likely.
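
As a convenience, these flags could be wrapped in npm scripts; a sketch with hypothetical script names:

```js
{
  "scripts": {
    "upload-binary-staging": "node-pre-gyp package publish --s3_host=staging",
    "upload-binary-production": "node-pre-gyp package publish --s3_host=production"
  }
}
```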