
Release 11/03/2025 #2324

Merged (6 commits, Mar 11, 2025)
8 changes: 7 additions & 1 deletion firestore-bigquery-export/CHANGELOG.md
@@ -1,6 +1,12 @@
+## Version 0.1.59
+
+docs - remove references to lifecycle backfill feature.
+
+docs - correct typo in maximum dispatches per second.
+
## Version 0.1.58

-feat - move to Node.js 20 runtimes
+feat - move to Node.js 20 runtimes.

## Version 0.1.57

6 changes: 2 additions & 4 deletions firestore-bigquery-export/POSTINSTALL.md
@@ -101,14 +101,12 @@ For PowerShell script:

### _(Optional)_ Import existing documents

-If you chose _not_ to automatically import existing documents when you installed this extension, you can backfill your BigQuery dataset with all the documents in your collection using the import script.
+You can backfill your BigQuery dataset with all the documents in your collection using the import script.

-If you don't either enable automatic import or run the import script, the extension only exports the content of documents that are created or changed after installation.
+If you don't run the import script, the extension only exports the content of documents that are created or changed after installation.

The import script can read all existing documents in a Cloud Firestore collection and insert them into the raw changelog table created by this extension. The script adds a special changelog for each document with the operation of `IMPORT` and the timestamp of epoch. This is to ensure that any operation on an imported document supersedes the `IMPORT`.

-**Warning:** Make sure to not run the import script if you enabled automatic backfill during the extension installation, as it might result in data loss.

**Important:** Run the import script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.

Learn more about using the import script to [backfill your existing collection](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).
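The epoch-timestamped `IMPORT` rows described in this hunk give the changelog latest-wins semantics. The sketch below is a hypothetical, simplified Python model of that behavior (the real extension resolves this inside BigQuery; the row values here are illustrative):

```python
from datetime import datetime, timezone

# Hypothetical changelog rows modeled on the raw changelog table.
# IMPORT entries carry the epoch timestamp, so any real write to the
# same document has a later timestamp and supersedes them.
EPOCH = datetime.fromtimestamp(0, tz=timezone.utc)

rows = [
    {"document_id": "users/alice", "timestamp": EPOCH,
     "operation": "IMPORT", "data": '{"plan": "free"}'},
    {"document_id": "users/alice",
     "timestamp": datetime(2025, 3, 11, tzinfo=timezone.utc),
     "operation": "UPDATE", "data": '{"plan": "pro"}'},
    {"document_id": "users/bob", "timestamp": EPOCH,
     "operation": "IMPORT", "data": '{"plan": "free"}'},
]

def latest_change_per_document(rows):
    """Pick the most recent changelog entry for each document."""
    latest = {}
    for row in rows:
        current = latest.get(row["document_id"])
        if current is None or row["timestamp"] > current["timestamp"]:
            latest[row["document_id"]] = row
    return latest

latest = latest_change_per_document(rows)
# The UPDATE supersedes the epoch-timestamped IMPORT for users/alice,
# while users/bob (imported only) keeps its IMPORT row.
```

Because the `IMPORT` timestamp is the epoch, it can never outrank a genuine write, which is why imported history never shadows live changes.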
8 changes: 2 additions & 6 deletions firestore-bigquery-export/PREINSTALL.md
@@ -37,15 +37,11 @@ Before installing this extension, you'll need to:

#### Import existing documents

-There are two ways to import existing Firestore documents into BigQuery - the backfill feature and the import script.
-
-To import documents that already exist at installation time into BigQuery, answer **Yes** when the installer asks "Import existing Firestore documents into BigQuery?" The extension will export existing documents as part of the installation and update processes.
-
-Alternatively, you can run the external [import script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md) to backfill existing documents. If you plan to use this script, answer **No** when prompted to import existing documents.
+To import existing documents you can run the external [import script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).

**Important:** Run the external import script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.

-If you don't either enable automatic import or run the import script, the extension only exports the content of documents that are created or changed after installation.
+Without use of this import script, the extension only exports the content of documents that are created or changed after installation.

#### Transform function

10 changes: 3 additions & 7 deletions firestore-bigquery-export/README.md
@@ -45,15 +45,11 @@ Before installing this extension, you'll need to:

#### Import existing documents

-There are two ways to import existing Firestore documents into BigQuery - the backfill feature and the import script.
-
-To import documents that already exist at installation time into BigQuery, answer **Yes** when the installer asks "Import existing Firestore documents into BigQuery?" The extension will export existing documents as part of the installation and update processes.
-
-Alternatively, you can run the external [import script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md) to backfill existing documents. If you plan to use this script, answer **No** when prompted to import existing documents.
+To import existing documents you can run the external [import script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).

**Important:** Run the external import script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.

-If you don't either enable automatic import or run the import script, the extension only exports the content of documents that are created or changed after installation.
+Without use of this import script, the extension only exports the content of documents that are created or changed after installation.

#### Transform function

@@ -281,7 +277,7 @@ essential for the script to insert data into an already partitioned table.)
Note: Cluster columns must be top-level, non-repeated columns of one of the following types: BIGNUMERIC, BOOL, DATE, DATETIME, GEOGRAPHY, INT64, NUMERIC, RANGE, STRING, TIMESTAMP. Clustering will not be added if a field with an invalid type is present in this parameter.
Available schema extensions table fields for clustering include: `document_id, document_name, timestamp, event_id, operation, data`.

-* Maximum number of synced documents per second: This parameter will set the maximum number of syncronised documents per second with BQ. Please note, any other external updates to a Big Query table will be included within this quota. Ensure that you have a set a low enough number to compensate. Defaults to 10.
+* Maximum number of synced documents per second: This parameter will set the maximum number of syncronised documents per second with BQ. Please note, any other external updates to a Big Query table will be included within this quota. Ensure that you have a set a low enough number to compensate. Defaults to 100.

* View Type: Select the type of view to create in BigQuery. A regular view is a virtual table defined by a SQL query. A materialized view persists the results of a query for faster access, with either incremental or non-incremental updates. Please note that materialized views in this extension come with several important caveats and limitations - carefully review the pre-install documentation before selecting these options to ensure they are appropriate for your use case.
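The per-second cap behind the "Maximum number of synced documents per second" parameter can be pictured as a simple fixed-window throttle. This is a hypothetical illustration only; the extension's real rate limiting is enforced by its task queue, and the class below is not part of its code:

```python
import time

class SyncThrottle:
    """Allow at most `max_per_second` syncs in any one-second window.

    A simplified, hypothetical model of the per-second cap; it does not
    reflect the extension's actual task-queue implementation.
    """

    def __init__(self, max_per_second=100):
        self.max_per_second = max_per_second
        self.window_start = float("-inf")  # forces a fresh window on first use
        self.count = 0

    def try_acquire(self, now=None):
        now = time.monotonic() if now is None else now
        if now - self.window_start >= 1.0:
            # Start a new one-second window.
            self.window_start = now
            self.count = 0
        if self.count < self.max_per_second:
            self.count += 1
            return True
        return False  # over the cap for this window; caller should retry later
```

The docs' advice to "set a low enough number to compensate" maps to choosing `max_per_second` so that this quota also absorbs any external writes to the same table.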

4 changes: 2 additions & 2 deletions firestore-bigquery-export/extension.yaml
@@ -13,7 +13,7 @@
# limitations under the License.

name: firestore-bigquery-export
-version: 0.1.58
+version: 0.1.59
specVersion: v1beta

displayName: Stream Firestore to BigQuery
@@ -339,7 +339,7 @@ params:
This parameter will set the maximum number of syncronised documents per
second with BQ. Please note, any other external updates to a Big Query
table will be included within this quota. Ensure that you have a set a low
-enough number to compensate. Defaults to 10.
+enough number to compensate. Defaults to 100.
type: string
validationRegex: ^([1-9]|[1-9][0-9]|[1-4][0-9]{2}|500)$
validationErrorMessage: Please select a number between 1 and 500
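The `validationRegex` above accepts exactly the whole numbers 1 through 500, matching the validation error message. A quick check of the alternation (hypothetical test code, not part of the extension):

```python
import re

# ^([1-9]|[1-9][0-9]|[1-4][0-9]{2}|500)$ breaks down as:
#   [1-9]          -> 1 through 9
#   [1-9][0-9]     -> 10 through 99
#   [1-4][0-9]{2}  -> 100 through 499
#   500            -> exactly 500
PATTERN = re.compile(r"^([1-9]|[1-9][0-9]|[1-4][0-9]{2}|500)$")

accepted = [n for n in range(0, 600) if PATTERN.match(str(n))]
# `accepted` holds exactly the integers 1 through 500; values such as
# "0", "501", or zero-padded strings like "05" are rejected.
```

The parameter is declared `type: string`, which is why a regex rather than a numeric range enforces the bounds.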