contents/docs/cdp/batch-exports/s3.md
24 additions & 2 deletions
@@ -15,7 +15,7 @@ With batch exports, data can be exported to an S3 bucket.
1. Subscribe to the data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already.
2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the **Destinations** tab in your PostHog instance.
3. Search for **S3**.
- 4. Click the **+ Create** button.
+ 4. Click the **+ Create** button.
5. Fill in the necessary [configuration details](#s3-configuration).
6. Finalize the creation by clicking on "Create".
7. Done! The batch export will schedule its first run at the start of the next period.
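Before the export is created, the bucket name, region, and key pair that go into step 5's configuration can be sanity-checked outside PostHog. The following is a minimal sketch, not part of the PostHog docs, assuming the `boto3` library and placeholder names (`my-posthog-exports`, the key pair); it only confirms the credentials can write to the bucket.

```python
# Hypothetical pre-flight check for the values entered in step 5.
# boto3 and all names below are placeholders, not PostHog settings.
import boto3

s3 = boto3.client(
    "s3",
    region_name="us-east-1",              # region you will pick in the form
    aws_access_key_id="AKIA...",          # key pair you will paste into PostHog
    aws_secret_access_key="<secret>",
)

# Fails fast if the bucket does not exist or the key cannot reach it.
s3.head_bucket(Bucket="my-posthog-exports")

# A small write/delete round trip mirrors what a batch export run needs to do.
s3.put_object(Bucket="my-posthog-exports", Key="posthog-preflight/ping", Body=b"ok")
s3.delete_object(Bucket="my-posthog-exports", Key="posthog-preflight/ping")
print("bucket reachable and writable")
```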
@@ -68,4 +68,26 @@ We intend to add support for other common formats, and format-specific configura
### S3-compatible blob storage

- PostHog S3 batch exports may also export data to an S3-compatible blob storage like [MinIO](https://github.com/minio/minio). Simply set the *Endpoint URL* to your blob storage's host and port, for example: `https://my-minio-storage:9000`.
+ PostHog S3 batch exports can also export data to S3-compatible blob storage like [MinIO](https://github.com/minio/minio), [Cloudflare R2](https://www.cloudflare.com/developer-platform/products/r2/), or [Google Cloud Storage (GCS)](https://cloud.google.com/storage). Below are the configuration tweaks required for the S3-compatible destinations we have tested.
+
+ #### MinIO
+ * Set the *Endpoint URL* configuration to your MinIO instance's host and port, for example: `https://my-minio-storage:9000`.
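As a quick way to confirm the endpoint is reachable with the credentials you plan to use, here is a minimal sketch assuming `boto3` and placeholder bucket and key names; the same check works for Cloudflare R2 below by swapping in the R2 endpoint and its key pair.

```python
# Connectivity check against an S3-compatible endpoint (here: MinIO).
# Endpoint, credentials, and bucket are placeholders, not values from this doc.
import boto3

client = boto3.client(
    "s3",
    endpoint_url="https://my-minio-storage:9000",  # same value as the Endpoint URL field
    aws_access_key_id="minio-access-key",
    aws_secret_access_key="minio-secret-key",
)

client.put_object(Bucket="posthog-exports", Key="healthcheck", Body=b"ok")
resp = client.list_objects_v2(Bucket="posthog-exports", Prefix="healthcheck")
print("objects visible:", resp.get("KeyCount", 0))
```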
+
+ #### Cloudflare R2
+ * Set the *Endpoint URL* configuration to the following, after replacing `<ACCOUNT_ID>` with your account ID: `https://<ACCOUNT_ID>.r2.cloudflarestorage.com`.
+ * From the *Region* dropdown, select the Cloudflare R2 region that corresponds to your bucket, such as "Automatic (AUTO)".
+
+ #### Google Cloud Storage (GCS)
+ Access to GCS for batch exports follows a similar process to accessing [BigQuery](/docs/cdp/batch-exports/bigquery),
+ as a Service Account is required:
+ 1. Follow the steps in the [BigQuery documentation](/docs/cdp/batch-exports/bigquery) to create a Service Account.
+ 2. Create an [HMAC key for your Service Account](https://cloud.google.com/storage/docs/authentication/managing-hmackeys#console).
+ 3. Grant the Service Account the `Storage Object User` role or a custom role with at least the following permissions:
+    * `storage.multipartUploads.abort`
+    * `storage.multipartUploads.create`
+    * `storage.multipartUploads.list`
+    * `storage.multipartUploads.listParts`
+    * `storage.objects.create`
+    * `storage.objects.delete`
+ 4. Use the HMAC key's access key and secret key as the *AWS Access Key ID* and *AWS Secret Access Key* respectively when configuring your batch export.
+ 5. Finally, set the *Endpoint URL* configuration to: `https://storage.googleapis.com`.
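Once the HMAC key from step 2 exists, it can be exercised against GCS's S3-compatible XML API before wiring it into PostHog. This is a minimal sketch, assuming `boto3`, a placeholder bucket name, and that the role from step 3 is already granted; the calls roughly mirror the `storage.objects.*` and `storage.multipartUploads.*` permissions listed above.

```python
# Exercise a GCS HMAC key through the S3-compatible endpoint from step 5.
# Bucket name and key values are placeholders.
import boto3

gcs = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",
    aws_access_key_id="GOOG1E...",          # HMAC access key from step 2
    aws_secret_access_key="<hmac-secret>",  # HMAC secret from step 2
)

bucket = "my-gcs-export-bucket"

# storage.objects.create / storage.objects.delete
gcs.put_object(Bucket=bucket, Key="posthog-preflight/ping", Body=b"ok")
gcs.delete_object(Bucket=bucket, Key="posthog-preflight/ping")

# storage.multipartUploads.create / storage.multipartUploads.abort:
# batch exports upload files in parts, so these should succeed too.
mpu = gcs.create_multipart_upload(Bucket=bucket, Key="posthog-preflight/parts")
gcs.abort_multipart_upload(
    Bucket=bucket, Key="posthog-preflight/parts", UploadId=mpu["UploadId"]
)
print("HMAC key can create objects and multipart uploads")
```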