
Commit c5a3d26

e2e tests PostgreSQL Sink
1 parent 2f51d2d commit c5a3d26
File tree

3 files changed: +200 −0 lines changed

postgresql-plugin/src/e2e-test/features/postgresql/sink/PostgresqlRunTime.feature

+149
@@ -144,3 +144,152 @@ Feature: PostgreSQL - Verify data transfer from BigQuery source to PostgreSQL si
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table

  @POSTGRESQL_SOURCE_TEST @Postgresql_Required @POSTGRESQL_SINK_TEST @Plugin-1526
  Scenario: To verify data is getting transferred from PostgreSQL source to PostgreSQL sink with Advanced operations Upsert
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "PostgreSQL" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Connect plugins: "PostgreSQL" and "PostgreSQL2" to establish connection
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "sourceRef"
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
    Then Click on the Get Schema button
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "PostgreSQL2"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "dbSchemaName" with value: "schema"
    Then Select radio button plugin property: "operationName" with value: "upsert"
    Then Click on the Add Button of the property: "relationTableKey" with value:
      | PostgreSQLTableKey |
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Verify the preview of pipeline is "success"
    Then Click on preview data for PostgreSQL sink
    Then Close the preview data
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to target table is equal to the values from source table
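The Upsert operation selected above corresponds, at the database level, to PostgreSQL's `INSERT ... ON CONFLICT ... DO UPDATE` semantics keyed on the Table Key (`relationTableKey`, here `col2`). A minimal sketch of that statement shape follows; the table and column names are illustrative, and the plugin's actual SQL generation may differ:

```python
def build_upsert_sql(table, columns, key_columns):
    """Sketch of an upsert statement keyed on the relation table key.

    Illustrative only: shows the ON CONFLICT shape, not the exact SQL
    the PostgreSQL sink plugin emits.
    """
    cols = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    keys = ", ".join(key_columns)
    # Non-key columns are overwritten from the incoming (EXCLUDED) row.
    updates = ", ".join(
        f"{c} = EXCLUDED.{c}" for c in columns if c not in key_columns
    )
    return (
        f"INSERT INTO {table} ({cols}) VALUES ({placeholders}) "
        f"ON CONFLICT ({keys}) DO UPDATE SET {updates}"
    )
```

With the key column from `pluginParameters.properties` (`col2`), a row matching on `col2` is updated in place and a non-matching row is inserted, which is why the scenario can validate target values against the source without first truncating the table.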

  @POSTGRESQL_SOURCE_TEST @Postgresql_Required @POSTGRESQL_SINK_TEST @Plugin-1526
  Scenario: To verify data is getting transferred from PostgreSQL source to PostgreSQL sink with Advanced operations Update for table key
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "PostgreSQL" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Connect plugins: "PostgreSQL" and "PostgreSQL2" to establish connection
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "sourceRef"
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
    Then Click on the Get Schema button
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "PostgreSQL2"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "dbSchemaName" with value: "schema"
    Then Select radio button plugin property: "operationName" with value: "update"
    Then Click on the Add Button of the property: "relationTableKey" with value:
      | PostgreSQLTableKey |
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Verify the preview of pipeline is "success"
    Then Click on preview data for PostgreSQL sink
    Then Close the preview data
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to target table is equal to the values from source table
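The Update operation differs from Upsert in that it only modifies target rows whose Table Key matches an incoming record and inserts nothing. A rough sketch of the corresponding statement shape (illustrative names, not the plugin's actual SQL generator):

```python
def build_update_sql(table, columns, key_columns):
    """Sketch of an UPDATE statement keyed on the relation table key.

    Parameter placeholders follow the order: non-key columns first
    (SET clause), then key columns (WHERE clause).
    """
    assignments = ", ".join(f"{c} = %s" for c in columns if c not in key_columns)
    predicate = " AND ".join(f"{c} = %s" for c in key_columns)
    return f"UPDATE {table} SET {assignments} WHERE {predicate}"
```

Because unmatched source rows are silently dropped by an update, this scenario's final validation only holds when every source key already exists in the target table, which the test data setup is expected to guarantee.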

  @POSTGRESQL_SOURCE_TEST @POSTGRESQL_SINK_TEST @Postgresql_Required @CONNECTION @Plugin-1526
  Scenario: To verify data is getting transferred from PostgreSQL source to PostgreSQL sink successfully using Connection
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "PostgreSQL" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Connect plugins: "PostgreSQL" and "PostgreSQL2" to establish connection
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "sourceRef"
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
    Then Click on the Get Schema button
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "PostgreSQL2"
    And Click plugin property: "switch-useConnection"
    And Click on the Browse Connections button
    And Click on the Add Connection button
    Then Click plugin property: "connector-PostgreSQL"
    And Enter input plugin property: "name" with value: "connection.name"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Click on the Test Connection button
    And Verify the test connection is successful
    Then Click on the Create button
    Then Select connection: "connection.name"
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "dbSchemaName" with value: "schema"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Verify the preview of pipeline is "success"
    Then Click on preview data for PostgreSQL sink
    Then Close the preview data
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to target table is equal to the values from source table
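The reusable connection created and tested above bundles the same host, port, and database fields the earlier scenarios entered per-plugin; under the hood a PostgreSQL JDBC connection resolves them into a connection URL. A hedged sketch of that composition (the plugin may add further options to the URL it actually builds):

```python
def build_jdbc_url(host, port, database, arguments=None):
    """Compose a standard PostgreSQL JDBC URL.

    Sketch only; `arguments` is an optional dict of extra connection
    arguments appended as query parameters, e.g. {"fetchsize": "1000"}.
    """
    url = f"jdbc:postgresql://{host}:{port}/{database}"
    if arguments:
        url += "?" + "&".join(f"{k}={v}" for k, v in arguments.items())
    return url
```

Testing the connection before Create, as the scenario does, catches bad host/port/credential values once, instead of at validation time in every plugin that reuses the connection.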

postgresql-plugin/src/e2e-test/features/postgresql/sink/PostgresqlRunTimeMacro.feature

+48
@@ -136,3 +136,51 @@ Feature: PostgreSQL - Verify data transfer to PostgreSQL sink with macro argumen
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table

  @BQ_SOURCE_TEST @Postgresql_Required @POSTGRESQL_TEST_TABLE
  Scenario: To verify data is getting transferred from BigQuery source to PostgreSQL sink using connection arguments and operations as macro
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "PostgreSQL" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Click on the Macro button of Property: "projectId" and set the value to: "bqProjectId"
    Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "bqDatasetProjectId"
    Then Click on the Macro button of Property: "dataset" and set the value to: "bqDataset"
    Then Click on the Macro button of Property: "table" and set the value to: "bqTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Click on the Macro button of Property: "connectionArguments" and set the value to: "PostgreSQLConnectionArguments"
    Then Click on the Macro button of Property: "operationName" and set the value to: "PostgreSQLOperationName"
    Then Click on the Macro button of Property: "tableName" and set the value to: "PostgreSQLTableName"
    Then Click on the Macro button of Property: "dbSchemaName" and set the value to: "PostgreSQLSchemaName"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Enter runtime argument value "projectId" for key "bqProjectId"
    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
    Then Enter runtime argument value "dataset" for key "bqDataset"
    Then Enter runtime argument value "bqSourceTable" for key "bqTable"
    Then Enter runtime argument value "PostgreSQLConnectionArgumentsList" for key "PostgreSQLConnectionArguments"
    Then Enter runtime argument value "PostgreSQLOperationName" for key "PostgreSQLOperationName"
    Then Enter runtime argument value "targetTable" for key "PostgreSQLTableName"
    Then Enter runtime argument value "schema" for key "PostgreSQLSchemaName"
    Then Run the Pipeline in Runtime with runtime arguments
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table
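The macro steps above defer property values to run time: each property is saved as a `${key}` placeholder, and the "Enter runtime argument value" steps supply the value substituted for that key when the run starts. A minimal sketch of that substitution, assuming plain non-nested `${...}` macros (CDAP's real macro evaluator also supports macro functions, which this does not handle):

```python
import re

def resolve_macros(value, runtime_args):
    """Replace ${key} placeholders with their runtime argument values.

    Sketch of simple macro substitution; raises KeyError for a
    placeholder with no matching runtime argument.
    """
    return re.sub(r"\$\{([^}]+)\}", lambda m: runtime_args[m.group(1)], value)
```

For example, with the scenario's runtime arguments, `${PostgreSQLTableName}` resolves to the `targetTable` value before the sink plugin is configured for the run.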

postgresql-plugin/src/e2e-test/resources/pluginParameters.properties

+3
@@ -74,6 +74,9 @@ invalidBoundingQuery=SELECT MIN(id),MAX(id) FROM table
invalidBoundingQueryValue=select;
invalidTable=table
#POSTGRESQL Valid Properties
PostgreSQLConnectionArgumentsList=fetchsize=1000
PostgreSQLOperationName=insert
PostgreSQLTableKey=col2
connectionArgumentsList=[{"key":"queryTimeout","value":"-1"}]
connectionTimeout=150
numberOfSplits=2
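The new `PostgreSQLConnectionArgumentsList=fetchsize=1000` entry feeds the `connectionArguments` macro as a list of `key=value` pairs. A small sketch of splitting such a string into a dict, assuming semicolon-separated pairs (the separator is an assumption here; check the plugin's documented format for multiple arguments):

```python
def parse_connection_arguments(raw):
    """Parse "k1=v1;k2=v2" style connection arguments into a dict.

    Sketch only; splits on ';' between pairs and on the first '=' in
    each pair, so values themselves may contain '='.
    """
    pairs = [p for p in raw.split(";") if p]
    return dict(p.split("=", 1) for p in pairs)
```

A `fetchsize` of 1000 asks the JDBC driver to stream rows in batches of 1000 rather than materializing the whole result set, which is why it appears in a sink e2e test exercising connection arguments.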
