* [PubSubLiteToBigTable](/java/src/main/java/com/google/cloud/dataproc/templates/pubsublite#1-pubsublite-to-bigtable) (blogpost [link](https://medium.com/google-cloud/stream-data-from-pub-sub-lite-to-bigtable-using-dataproc-serverless-2c8816f40581)) **Deprecated and will be removed in Q1 2025**
* [RedshiftToGCS](/java/src/main/java/com/google/cloud/dataproc/templates/databases#executing-redshift-to-gcs-template) **Deprecated and will be removed in Q1 2025**
java/src/main/java/com/google/cloud/dataproc/templates/databases/README.md (0 additions & 35 deletions)
@@ -173,41 +173,6 @@ You can replace the ```casscon``` with your catalog name if it is passed. This i

Make sure that either ```cassandratobq.input.query``` or both ```cassandratobq.input.keyspace``` and ```cassandratobq.input.table``` are provided. Providing all three properties at the same time, or none of them, will throw an error.
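As a sketch of the either/or rule above (keyspace, table, and bucket values are placeholders, and the `start.sh` launcher invocation follows the repo's usual convention rather than being copied from this README):

```shell
# Option A: supply keyspace + table (cassandratobq.input.query must then be omitted).
# All property values below are placeholders.
bin/start.sh \
-- --template CASSANDRATOBQ \
   --templateProperty cassandratobq.input.keyspace=my_keyspace \
   --templateProperty cassandratobq.input.table=my_table

# Option B: supply only the query instead of keyspace + table:
#   --templateProperty cassandratobq.input.query='select * from my_keyspace.my_table'
# Supplying all three properties, or none of them, throws an error.
```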
-  --templateProperty redshift.gcs.temp.query='select * from global_temp.temporary_view_name'
-  ```
-  These properties apply Spark SQL transformations while loading data into Cloud Storage.
-  Keep in mind that the name of the Spark temporary view and the name of the table in the query must match exactly; otherwise the job fails with a "Table or view not found" error.
## Executing Mongo to Cloud Storage template

Template for exporting a MongoDB collection to files in Google Cloud Storage. It supports writing JSON, CSV, Parquet and Avro formats.
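A hedged sketch of launching this template — the property names below follow the `mongo.gcs.*` naming convention used elsewhere in this repo but are assumptions here, and every value is a placeholder:

```shell
# Export a MongoDB collection to Avro files in Cloud Storage.
# Property names are assumed from repo conventions; values are placeholders.
bin/start.sh \
-- --template MONGOTOGCS \
   --templateProperty mongo.gcs.input.uri=mongodb://my-host:27017 \
   --templateProperty mongo.gcs.input.database=my_database \
   --templateProperty mongo.gcs.input.collection=my_collection \
   --templateProperty mongo.gcs.output.format=avro \
   --templateProperty mongo.gcs.output.location=gs://my-bucket/output/
```

The `output.format` property would accept any of the supported formats listed above (json, csv, parquet, avro).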
-  --templateProperty text.bigquery.temp.query='select * from global_temp.temporary_view_name'
-  ```
-  These properties apply Spark SQL transformations while loading data into BigQuery.
-  Keep in mind that the name of the Spark temporary view and the name of the table in the query must match exactly; otherwise the job fails with a "Table or view not found" error.
-
-## 8. Deltalake To Iceberg
+## 7. Deltalake To Iceberg
`deltalake.version.as_of` is an optional parameter which defaults to `0`, meaning only the latest change is picked up. The example below shows how you can pass the value if you require time travel based on a version number.
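As an illustrative sketch (the `DELTALAKETOICEBERG` template name and version value are assumptions, not taken from this README):

```shell
# Time travel: read the Delta table as of version 3 instead of the latest change.
# Template name and version value are placeholders for illustration.
bin/start.sh \
-- --template DELTALAKETOICEBERG \
   --templateProperty deltalake.version.as_of=3
```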