* chore(firestore-bigquery-changetracker): bump version
* fix(firestore-bigquery-export): added ts-expect-error and TODOs in the import script
* feat: try to immediately write to bq first
* chore: remove legacy backfill code
* feat: add max enqueue attempts param
* test: add flags to test, remove unused resource
* feat: add backup to gcs
* chore(firestore-bigquery-export): temporarily disable GCS
* chore: bump ext version
* fix(firestore-bigquery-export): comment out unused role for now and use logging
* fix(firestore-bigquery-export): implemented RC changes including logging keys
* chore(firestore-bigquery-export): update README and CHANGELOG
* chore(firestore-bigquery-export): update CHANGELOG
firestore-bigquery-export/README.md (+2 -6)
@@ -126,8 +126,6 @@ To install an extension, your project must be on the [Blaze (pay as you go) plan

* Collection path: What is the path of the collection that you would like to export? You may use `{wildcard}` notation to match a subcollection of all documents in a collection (for example: `chatrooms/{chatid}/posts`). Parent Firestore Document IDs from `{wildcards}` can be returned in `path_params` as a JSON formatted string.

-* Enable logging failed exports: If enabled, the extension will log event exports that failed to enqueue to Cloud Logging, to mitigate data loss.
-
* Enable Wildcard Column field with Parent Firestore Document IDs: If enabled, creates a column containing a JSON object of all wildcard ids from a documents path.

* Dataset ID: What ID would you like to use for your BigQuery dataset? This extension will create the dataset, if it doesn't already exist.
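The `{wildcard}` notation in the Collection path parameter determines both which documents are matched and which parent document IDs end up in the `path_params` column. The sketch below illustrates the general idea only; it is not the extension's actual code, and the helper name `extractPathParams` is made up for illustration:

```ts
// Minimal sketch of how {wildcard} segments in a collection path template can be
// matched against a concrete document path to produce the JSON object that the
// path_params column described above would hold. Illustrative only.
function extractPathParams(template: string, docPath: string): Record<string, string> | null {
  const templateSegs = template.split("/");
  const pathSegs = docPath.split("/");
  // A document path has one extra trailing segment (the document ID) after the collection path.
  if (pathSegs.length !== templateSegs.length + 1) return null;

  const params: Record<string, string> = {};
  for (let i = 0; i < templateSegs.length; i++) {
    const seg = templateSegs[i];
    const wildcard = seg.match(/^\{(.+)\}$/);
    if (wildcard) {
      params[wildcard[1]] = pathSegs[i]; // capture the parent document ID
    } else if (seg !== pathSegs[i]) {
      return null; // literal segment does not match this document path
    }
  }
  return params;
}

// Example for the template shown above:
console.log(extractPathParams("chatrooms/{chatid}/posts", "chatrooms/room42/posts/post1"));
// => { chatid: "room42" }, serialized to a JSON string for path_params
```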
@@ -158,18 +156,16 @@ essential for the script to insert data into an already partitioned table.)

* Exclude old data payloads: If enabled, table rows will never contain old data (document snapshot before the Firestore onDocumentUpdate event: `change.before.data()`). The reduction in data should be more performant, and avoid potential resource limitations.

-* Use Collection Group query: Do you want to use a [collection group](https://firebase.google.com/docs/firestore/query-data/queries#collection-group-query) query for importing existing documents? You have to enable collectionGroup query if your import path contains subcollections. Warning: A collectionGroup query will target every collection in your Firestore project that matches the 'Existing documents collection'. For example, if you have 10,000 documents with a subcollection named: landmarks, this will query every document in 10,000 landmarks collections.
-
* Cloud KMS key name: Instead of Google managing the key encryption keys that protect your data, you control and manage key encryption keys in Cloud KMS. If this parameter is set, the extension will specify the KMS key name when creating the BQ table. See the PREINSTALL.md for more details.

+* Maximum number of enqueue attempts: This parameter will set the maximum number of attempts to enqueue a document to cloud tasks for export to BigQuery. If the maximum number of attempts is reached, the failed export will be handled according to the `LOG_FAILED_EXPORTS` parameter.
+

**Cloud Functions:**

* **fsexportbigquery:** Listens for document changes in your specified Cloud Firestore collection, then exports the changes into BigQuery.

-* **fsimportexistingdocs:** Imports existing documents from the specified collection into BigQuery. Imported documents will have a special changelog with the operation of `IMPORT` and the timestamp of epoch.
-
* **syncBigQuery:** A task-triggered function that gets called on BigQuery sync

* **initBigQuerySync:** Runs configuration for syncing with BigQuery
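The new "Maximum number of enqueue attempts" parameter, combined with the logging behaviour referenced by `LOG_FAILED_EXPORTS`, describes a retry-then-log flow when a document cannot be handed off to Cloud Tasks. A minimal sketch of that flow, under stated assumptions, is below; the environment-variable names and the generic `enqueue` callback are illustrative, not confirmed identifiers from the extension's source:

```ts
import { logger } from "firebase-functions";

// Sketch: retry enqueueing an export payload up to MAX_ENQUEUE_ATTEMPTS times,
// then fall back to Cloud Logging when failed-export logging is enabled.
// Parameter names and the enqueue callback are assumptions for illustration only.
const MAX_ENQUEUE_ATTEMPTS = Number(process.env.MAX_ENQUEUE_ATTEMPTS ?? "3");
const LOG_FAILED_EXPORTS = process.env.LOG_FAILED_EXPORTS === "yes";

async function enqueueWithRetry(
  enqueue: (payload: Record<string, unknown>) => Promise<void>,
  payload: Record<string, unknown>
): Promise<boolean> {
  for (let attempt = 1; attempt <= MAX_ENQUEUE_ATTEMPTS; attempt++) {
    try {
      await enqueue(payload); // hand the document change off to the export task queue
      return true;
    } catch (err) {
      logger.warn(`Enqueue attempt ${attempt}/${MAX_ENQUEUE_ATTEMPTS} failed`, err);
    }
  }
  if (LOG_FAILED_EXPORTS) {
    // Record the failed export so the row can be recovered or replayed later.
    logger.error("Failed to enqueue document for BigQuery export", { payload });
  }
  return false;
}
```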