
Commit c188957

feat: Break down s3-compatible configurations
1 parent 9ec2240 commit c188957


contents/docs/cdp/batch-exports/s3.md

Lines changed: 24 additions & 2 deletions
@@ -15,7 +15,7 @@ With batch exports, data can be exported to an S3 bucket.
 1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already.
 2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the **Destinations** tab in your PostHog instance.
 3. Search for **S3**.
-4. Click the **+ Create** button.
+4. Click the **+ Create** button.
 5. Fill in the necessary [configuration details](#s3-configuration).
 6. Finalize the creation by clicking on "Create".
 7. Done! The batch export will schedule its first run on the start of the next period.
@@ -68,4 +68,26 @@ We intend to add support for other common formats, and format-specific configura
 
 ### S3-compatible blob storage
 
-PostHog S3 batch exports may also export data to an S3-compatible blob storage like [MinIO](https://github.com/minio/minio). Simply set the *Endpoint URL* to your blob storage's host and port, for example: `https://my-minio-storage:9000`.
+PostHog S3 batch exports may also export data to an S3-compatible blob storage like [MinIO](https://github.com/minio/minio), [Cloudflare R2](https://www.cloudflare.com/developer-platform/products/r2/), or [Google Cloud Storage (GCS)](https://cloud.google.com/storage). Here we describe the configuration tweaks required for the S3-compatible blob storage destinations that we have tested.
+
+#### MinIO
+* Set the *Endpoint URL* configuration to your MinIO instance's host and port, for example: `https://my-minio-storage:9000`.
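Any S3 client can be pointed at the same endpoint to confirm it accepts your credentials before the export's first run. A minimal boto3 sketch, with the endpoint, bucket name, and keys below as placeholders:

```python
# Minimal sketch: confirm an S3-compatible endpoint accepts your credentials
# before configuring the batch export. Endpoint, bucket, and keys are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://my-minio-storage:9000",  # same value as the *Endpoint URL* field
    aws_access_key_id="MINIO_ACCESS_KEY",
    aws_secret_access_key="MINIO_SECRET_KEY",
)

# A successful write/list round trip means the endpoint, credentials, and bucket are usable.
s3.put_object(Bucket="posthog-exports", Key="healthcheck.txt", Body=b"ok")
print(s3.list_objects_v2(Bucket="posthog-exports", Prefix="healthcheck")["KeyCount"])
```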
+
+#### Cloudflare R2
+* Set the *Endpoint URL* configuration to the following after replacing your account ID: `https://<ACCOUNT_ID>.r2.cloudflarestorage.com`.
+* From the *Region* dropdown, select the Cloudflare R2 region that corresponds to your bucket, such as "Automatic (AUTO)".
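The same kind of check works for R2. A sketch assuming boto3, with the account ID, bucket name, and R2 API token keys as placeholders:

```python
# Minimal sketch: verify an R2 bucket is reachable over the S3-compatible API.
# ACCOUNT_ID, bucket name, and keys are placeholders.
import boto3

account_id = "<ACCOUNT_ID>"  # your Cloudflare account ID

r2 = boto3.client(
    "s3",
    endpoint_url=f"https://{account_id}.r2.cloudflarestorage.com",  # *Endpoint URL* field
    region_name="auto",  # corresponds to the "Automatic (AUTO)" region option
    aws_access_key_id="R2_ACCESS_KEY_ID",
    aws_secret_access_key="R2_SECRET_ACCESS_KEY",
)

# Raises botocore.exceptions.ClientError if the endpoint, keys, or bucket are wrong.
r2.head_bucket(Bucket="posthog-exports")
print("R2 endpoint and credentials look good")
```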
+
+#### Google Cloud Storage (GCS)
+Access to GCS for batch exports follows a similar process to accessing [BigQuery](/docs/cdp/batch-exports/bigquery),
+as a Service Account is required:
+1. Follow the steps in the [BigQuery documentation](/docs/cdp/batch-exports/bigquery) to create a Service Account.
+2. Create an [HMAC key for your Service Account](https://cloud.google.com/storage/docs/authentication/managing-hmackeys#console).
+3. Grant the Service Account the `Storage Object User` role or a custom role with at least the following permissions:
+    * `storage.multipartUploads.abort`
+    * `storage.multipartUploads.create`
+    * `storage.multipartUploads.list`
+    * `storage.multipartUploads.listParts`
+    * `storage.objects.create`
+    * `storage.objects.delete`
+4. Use the HMAC key's access key and secret key as the *AWS Access Key ID* and *AWS Secret Access Key* respectively when configuring your batch export.
+5. Finally, set the *Endpoint URL* configuration to: `https://storage.googleapis.com`.
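Because GCS exposes an S3-interoperable XML API at `https://storage.googleapis.com`, the HMAC key pair can be used anywhere AWS-style credentials are expected. A sketch of checking the key and bucket with boto3, where the bucket name and key values are placeholders:

```python
# Minimal sketch: confirm the Service Account's HMAC key can write to the bucket
# through GCS's S3-interoperable endpoint. Bucket and key values are placeholders.
import boto3

gcs = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",  # *Endpoint URL* field
    aws_access_key_id="GOOG_HMAC_ACCESS_KEY_ID",    # HMAC access key -> *AWS Access Key ID*
    aws_secret_access_key="GOOG_HMAC_SECRET",       # HMAC secret -> *AWS Secret Access Key*
)

# Exercises storage.objects.create and storage.objects.delete from the role in step 3.
gcs.put_object(Bucket="posthog-exports", Key="healthcheck.txt", Body=b"ok")
gcs.delete_object(Bucket="posthog-exports", Key="healthcheck.txt")
print("HMAC credentials can write to the bucket")
```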
