@@ -144,3 +144,143 @@ Feature: PostgreSQL - Verify data transfer from BigQuery source to PostgreSQL si
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table
+
+  @BQ_SOURCE_TEST @Postgresql_Required @POSTGRESQL_TEST_TABLE @Plugin-1526
+  Scenario: To verify data is getting transferred from BigQuery source to PostgreSQL sink with Advanced operations Update for table key
+ Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "PostgreSQL" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "bqOutputMultipleDatatypesSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "PostgreSQL"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
+    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Replace input plugin property: "dbSchemaName" with value: "schema"
+    Then Select radio button plugin property: "operationName" with value: "UPDATE"
+ Then Click on the Add Button of the property: "relationTableKey" with value:
+      | PostgreSQLTableKey |
+    Then Validate "PostgreSQL" plugin properties
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Verify the preview of pipeline is "success"
+    Then Click on preview data for PostgreSQL sink
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table
+
+  @BQ_SOURCE_TEST @Postgresql_Required @POSTGRESQL_TEST_TABLE @Plugin-1526
+  Scenario: To verify data is getting transferred from BigQuery source to PostgreSQL sink with Advanced operations Upsert for table key
+ Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "PostgreSQL" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "bqOutputMultipleDatatypesSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "PostgreSQL"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
+    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Replace input plugin property: "dbSchemaName" with value: "schema"
+    Then Select radio button plugin property: "operationName" with value: "UPSERT"
+ Then Click on the Add Button of the property: "relationTableKey" with value:
+      | PostgreSQLTableKey |
+    Then Validate "PostgreSQL" plugin properties
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Verify the preview of pipeline is "success"
+    Then Click on preview data for PostgreSQL sink
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table
+
+  @BQ_SOURCE_TEST @Postgresql_Required @POSTGRESQL_TEST_TABLE @CONNECTION @Plugin-1526
+  Scenario: To verify data is getting transferred from BigQuery source to PostgreSQL sink successfully using Connection
+ Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "PostgreSQL" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Validate "BigQuery" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "PostgreSQL"
+    And Click plugin property: "switch-useConnection"
+    And Click on the Browse Connections button
+    And Click on the Add Connection button
+    Then Click plugin property: "connector-PostgreSQL"
+    And Enter input plugin property: "name" with value: "connection.name"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
+    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Click on the Test Connection button
+    And Verify the test connection is successful
+    Then Click on the Create button
+    Then Select connection: "connection.name"
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Replace input plugin property: "dbSchemaName" with value: "schema"
+    Then Validate "PostgreSQL" plugin properties
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Verify the preview of pipeline is "success"
+    Then Click on preview data for PostgreSQL sink
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+ Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table