Commit 459c18e

Merge pull request #720 from cloudsufi/Wrangler-UI-e2e
Wrangler UI features
2 parents: f52bcac + bf51004

15 files changed: +741 -1 lines changed
Lines changed: 121 additions & 0 deletions
@@ -0,0 +1,121 @@
# Copyright © 2024 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Runtime Scenarios for datatype parsers

  @BQ_SOURCE_TS_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using parse timestamp directive
    Given Open Wrangler connections page
    Then Click plugin property: "addConnection" button
    Then Click plugin property: "bqConnectionRow"
    Then Enter input plugin property: "name" with value: "bqConnectionName"
    Then Replace input plugin property: "projectId" with value: "projectId"
    Then Enter input plugin property: "datasetProjectId" with value: "projectId"
    Then Override Service account details in Wrangler connection page if set in environment variables
    Then Click plugin property: "testConnection" button
    Then Verify the test connection is successful
    Then Click plugin property: "connectionCreate" button
    Then Verify the connection with name: "bqConnectionName" is created successfully
    Then Select connection data row with name: "dataset"
    Then Select connection data row with name: "bqSourceTable"
    Then Verify connection datatable is displayed for the data: "bqSourceTable"
    Then Expand dropdown column: "update_date" and apply directive: "Parse" as "SIMPLEDATE" with: "yyyy-MM-dd" option
    Then Expand dropdown column: "create_date" and apply directive: "Parse" as "SIMPLEDATE" with: "yyyy-MM-dd" option
    Then Enter directive from CLI "parse-timestamp :time"
    Then Enter directive from CLI "parse-as-currency :price :newprice"
    Then Enter directive from CLI "format-as-currency :newprice :format_price"
    Then Enter directive from CLI "diff-date :create_date :update_date :diff_date"
    Then Enter directive from CLI "timestamp-to-datetime :update_date"
    Then Enter directive from CLI "rename :newprice :id"
    Then Click Create Pipeline button and choose the type of pipeline as: "Batch pipeline"
    Then Verify plugin: "BigQueryTable" node is displayed on the canvas with a timeout of 120 seconds
    Then Expand Plugin group in the LHS plugins list: "Sink"
    Then Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Click plugin property: "useConnection"
    Then Click on the Browse Connections button
    Then Select connection: "bqConnectionName"
    Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Connect plugins: "Wrangler" and "BigQuery2" to establish connection
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_Timestamp"
    Given Open Wrangler connections page
    Then Expand connections of type: "BigQuery"
    Then Open action menu for connection: "bqConnectionName" of type: "BigQuery"
    Then Select action: "Delete" for connection: "bqConnectionName" of type: "BigQuery"
    Then Click plugin property: "Delete" button
    Then Verify connection: "bqConnectionName" of type: "BigQuery" is deleted successfully

  @BQ_SOURCE_DATETIME_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using parse datetime directive
    Given Open Wrangler connections page
    Then Click plugin property: "addConnection" button
    Then Click plugin property: "bqConnectionRow"
    Then Enter input plugin property: "name" with value: "bqConnectionName"
    Then Replace input plugin property: "projectId" with value: "projectId"
    Then Enter input plugin property: "datasetProjectId" with value: "projectId"
    Then Override Service account details in Wrangler connection page if set in environment variables
    Then Click plugin property: "testConnection" button
    Then Verify the test connection is successful
    Then Click plugin property: "connectionCreate" button
    Then Verify the connection with name: "bqConnectionName" is created successfully
    Then Select connection data row with name: "dataset"
    Then Select connection data row with name: "bqSourceTable"
    Then Verify connection datatable is displayed for the data: "bqSourceTable"
    Then Expand dropdown column: "timestamp" and apply directive: "Parse" with directive type: "DATETIME" and select: "Custom_Format" and enter: "yyyy-MM-dd'T'HH:mm:ssX'['z']'"
    Then Enter directive from CLI "current-datetime :create_date"
    Then Enter directive from CLI "datetime-to-timestamp :timestamp"
    Then Enter directive from CLI "format-datetime :create_date 'y'"
    Then Enter directive from CLI "format-date :timestamp yyyy-mm-dd"
    Then Enter directive from CLI "rename timestamp timecolumn"
    Then Click Create Pipeline button and choose the type of pipeline as: "Batch pipeline"
    Then Verify plugin: "BigQueryTable" node is displayed on the canvas with a timeout of 120 seconds
    Then Expand Plugin group in the LHS plugins list: "Sink"
    Then Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Click plugin property: "useConnection"
    Then Click on the Browse Connections button
    Then Select connection: "bqConnectionName"
    Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Connect plugins: "Wrangler" and "BigQuery2" to establish connection
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_DatetimeNew"
    Given Open Wrangler connections page
    Then Expand connections of type: "BigQuery"
    Then Open action menu for connection: "bqConnectionName" of type: "BigQuery"
    Then Select action: "Delete" for connection: "bqConnectionName" of type: "BigQuery"
    Then Click plugin property: "Delete" button
    Then Verify connection: "bqConnectionName" of type: "BigQuery" is deleted successfully
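The timestamp scenario above chains several date/time directives. As a rough illustration of what that chain does to one record, here is a minimal Python sketch; the column names and sample values are hypothetical, and the exact Wrangler semantics (e.g. whether `diff-date` reports milliseconds, or the timezone `parse-timestamp` assumes) may differ:

```python
from datetime import datetime, timezone

# Hypothetical input row mirroring the scenario's columns.
row = {"time": 1719304200, "price": "$1,234.56",
       "create_date": "2024-01-01", "update_date": "2024-03-01"}

# parse-timestamp :time -- epoch seconds to a datetime (UTC assumed here)
row["time"] = datetime.fromtimestamp(row["time"], tz=timezone.utc)

# parse-as-currency :price :newprice -- strip currency formatting to a number
row["newprice"] = float(row["price"].replace("$", "").replace(",", ""))

# diff-date :create_date :update_date :diff_date -- difference between two
# parsed dates (milliseconds assumed)
fmt = "%Y-%m-%d"
delta = (datetime.strptime(row["update_date"], fmt)
         - datetime.strptime(row["create_date"], fmt))
row["diff_date"] = int(delta.total_seconds() * 1000)

# rename :newprice :id -- move the value under a new column name
row["id"] = row.pop("newprice")
```

The `ExpectedDirective_parse_Timestamp` file the scenario validates against would hold the rows produced by the real directive chain, not this approximation.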
Lines changed: 71 additions & 0 deletions
@@ -0,0 +1,71 @@
# Copyright © 2024 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Wrangler - Run time scenarios for parse csv using UI

  @BQ_SOURCE_CSV_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using parse csv directive
    Given Open Wrangler connections page
    Then Click plugin property: "addConnection" button
    Then Click plugin property: "bqConnectionRow"
    Then Enter input plugin property: "name" with value: "bqConnectionName"
    Then Replace input plugin property: "projectId" with value: "projectId"
    Then Enter input plugin property: "datasetProjectId" with value: "projectId"
    Then Override Service account details in Wrangler connection page if set in environment variables
    Then Click plugin property: "testConnection" button
    Then Verify the test connection is successful
    Then Click plugin property: "connectionCreate" button
    Then Verify the connection with name: "bqConnectionName" is created successfully
    Then Select connection data row with name: "dataset"
    Then Select connection data row with name: "bqSourceTable"
    Then Verify connection datatable is displayed for the data: "bqSourceTable"
    Then Expand dropdown column: "body" and apply directive: "Parse" as "CSV" with: "Comma" option
    Then Expand dropdown column: "body_3" and apply directive: "FillNullOrEmptyCells" as "shubh"
    Then Enter directive from CLI "rename body_1 new_id"
    Then Enter directive from CLI "quantize body_4 body_q 1:2=20,3:4=40"
    Then Expand dropdown column: "body_4" and apply directive: "ChangeDataType" as "Integer"
    Then Enter directive from CLI "columns-replace s/^new_//g"
    Then Enter directive from CLI "set-headers :abc"
    Then Enter directive from CLI "change-column-case uppercase"
    Then Enter directive from CLI "cleanse-column-names "
    Then Enter directive from CLI "split-to-rows :id '#'"
    Then Click Create Pipeline button and choose the type of pipeline as: "Batch pipeline"
    Then Verify plugin: "BigQueryTable" node is displayed on the canvas with a timeout of 120 seconds
    Then Expand Plugin group in the LHS plugins list: "Sink"
    Then Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Click plugin property: "useConnection"
    Then Click on the Browse Connections button
    Then Select connection: "bqConnectionName"
    Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Connect plugins: "Wrangler" and "BigQuery2" to establish connection
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_csv"
    Given Open Wrangler connections page
    Then Expand connections of type: "BigQuery"
    Then Open action menu for connection: "bqConnectionName" of type: "BigQuery"
    Then Select action: "Delete" for connection: "bqConnectionName" of type: "BigQuery"
    Then Click plugin property: "Delete" button
    Then Verify connection: "bqConnectionName" of type: "BigQuery" is deleted successfully
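The CSV scenario parses the `body` column into `body_1..body_n`, fills empty cells, and buckets values with `quantize`. A minimal Python sketch of those three steps, with a hypothetical input value (real Wrangler naming and range semantics may differ):

```python
import csv
import io

# Hypothetical record whose "body" column holds a comma-delimited string.
row = {"body": "1,alpha,,3"}

# Parse as CSV with the Comma option: each field becomes body_1, body_2, ...
parts = next(csv.reader(io.StringIO(row["body"])))
for i, value in enumerate(parts, start=1):
    row[f"body_{i}"] = value

# FillNullOrEmptyCells on body_3 with a default value
if not row["body_3"]:
    row["body_3"] = "shubh"

# quantize body_4 body_q 1:2=20,3:4=40 -- map numeric ranges to bucket values
value = float(row["body_4"])
row["body_q"] = 20 if 1 <= value <= 2 else 40 if 3 <= value <= 4 else None
```

Using the `csv` module (rather than a plain `split(",")`) keeps quoted fields containing commas intact, which is presumably what the UI's "Comma" parse option does as well.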
Lines changed: 70 additions & 0 deletions
@@ -0,0 +1,70 @@
# Copyright © 2024 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Parse as excel

  @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using parse Excel directive
    Given Open Wrangler connections page
    Then Click plugin property: "addConnection" button
    Then Click plugin property: "bqConnectionRow"
    Then Enter input plugin property: "name" with value: "bqConnectionName"
    Then Replace input plugin property: "projectId" with value: "projectId"
    Then Enter input plugin property: "datasetProjectId" with value: "projectId"
    Then Override Service account details in Wrangler connection page if set in environment variables
    Then Click plugin property: "testConnection" button
    Then Verify the test connection is successful
    Then Click plugin property: "connectionCreate" button
    Then Verify the connection with name: "bqConnectionName" is created successfully
    Then Select connection data row with name: "dataset"
    Then Select connection data row with name: "bqSourceTableExcel"
    Then Verify connection datatable is displayed for the data: "bqSourceTableExcel"
    Then Enter directive from CLI "parse-as-excel :body '0' true"
    Then Expand dropdown column: "name" and apply directive: "CopyColumn" as "copiedname"
    Then Enter directive from CLI "merge name bkd uniquenum ','"
    Then Enter directive from CLI "rename bkd rollno"
    Then Expand dropdown column: "fwd" and apply directive: "DeleteColumn"
    Then Select checkbox on two columns: "id" and "rollno"
    Then Expand dropdown column: "id" and apply directive: "SwapTwoColumnNames"
    Then Enter directive from CLI "split-to-rows :name 'o'"
    Then Enter directive from CLI "filter-rows-on condition-false rollno !~ '2.0'"
    Then Click Create Pipeline button and choose the type of pipeline as: "Batch pipeline"
    Then Verify plugin: "BigQueryTable" node is displayed on the canvas with a timeout of 120 seconds
    Then Expand Plugin group in the LHS plugins list: "Sink"
    Then Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Click plugin property: "useConnection"
    Then Click on the Browse Connections button
    Then Select connection: "bqConnectionName"
    Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Connect plugins: "Wrangler" and "BigQuery2" to establish connection
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_excel"
    Given Open Wrangler connections page
    Then Expand connections of type: "BigQuery"
    Then Open action menu for connection: "bqConnectionName" of type: "BigQuery"
    Then Select action: "Delete" for connection: "bqConnectionName" of type: "BigQuery"
    Then Click plugin property: "Delete" button
    Then Verify connection: "bqConnectionName" of type: "BigQuery" is deleted successfully
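Among the Excel-scenario directives, `merge`, `rename`, and `split-to-rows` are easy to misread, so a minimal Python sketch of their row-level effect may help; the row values here are hypothetical and the real directives operate on the parsed Excel columns:

```python
# Hypothetical row as it might look after parse-as-excel :body '0' true
row = {"id": "1", "name": "romy", "bkd": "2.0"}

# merge name bkd uniquenum ',' -- join two columns into a new one with a separator
row["uniquenum"] = row["name"] + "," + row["bkd"]

# rename bkd rollno -- same value, new column name
row["rollno"] = row.pop("bkd")

# split-to-rows :name 'o' -- emit one output row per fragment of the split value
rows = [dict(row, name=part) for part in row["name"].split("o")]
```

Note that `split-to-rows` multiplies the row count, which is why the scenario's expected file (`ExpectedDirective_parse_excel`) must account for the fan-out.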
Lines changed: 71 additions & 0 deletions
@@ -0,0 +1,71 @@
# Copyright © 2024 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: parse as fixed length

  @BQ_SOURCE_FXDLEN_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using parse fixedlength directive
    Given Open Wrangler connections page
    Then Click plugin property: "addConnection" button
    Then Click plugin property: "bqConnectionRow"
    Then Enter input plugin property: "name" with value: "bqConnectionName"
    Then Replace input plugin property: "projectId" with value: "projectId"
    Then Enter input plugin property: "datasetProjectId" with value: "projectId"
    Then Override Service account details in Wrangler connection page if set in environment variables
    Then Click plugin property: "testConnection" button
    Then Verify the test connection is successful
    Then Click plugin property: "connectionCreate" button
    Then Verify the connection with name: "bqConnectionName" is created successfully
    Then Select connection data row with name: "dataset"
    Then Select connection data row with name: "bqSourceTable"
    Then Verify connection datatable is displayed for the data: "bqSourceTable"
    Then Expand dropdown column: "fixedlength" and apply directive: "Parse" as "FIXEDLENGTH" with: "2,4,5,3" option
    Then Enter directive from CLI "split-url url"
    Then Enter directive from CLI "write-as-csv :url_protocol"
    Then Enter directive from CLI "url-encode :url"
    Then Enter directive from CLI "url-decode :url"
    Then Expand dropdown column: "fixedlength" and apply directive: "Encode" as "Base32"
    Then Expand dropdown column: "fixedlength_encode_base32" and apply directive: "Decode" as "Base32"
    Then Enter directive from CLI "split-to-columns :url_query '='"
    Then Enter directive from CLI "rename fixedlength_2 id"
    Then Enter directive from CLI "filter-rows-on condition-true fixedlength_4 !~ 'XYZ'"
    Then Click Create Pipeline button and choose the type of pipeline as: "Batch pipeline"
    Then Verify plugin: "BigQueryTable" node is displayed on the canvas with a timeout of 120 seconds
    Then Expand Plugin group in the LHS plugins list: "Sink"
    Then Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Click plugin property: "useConnection"
    Then Click on the Browse Connections button
    Then Select connection: "bqConnectionName"
    Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Connect plugins: "Wrangler" and "BigQuery2" to establish connection
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_FixedLengthnew"
    Given Open Wrangler connections page
    Then Expand connections of type: "BigQuery"
    Then Open action menu for connection: "bqConnectionName" of type: "BigQuery"
    Then Select action: "Delete" for connection: "bqConnectionName" of type: "BigQuery"
    Then Click plugin property: "Delete" button
    Then Verify connection: "bqConnectionName" of type: "BigQuery" is deleted successfully
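The fixed-length scenario slices one column by the widths `2,4,5,3` and then exercises the URL directives. A minimal Python sketch of the slicing and of `split-url`/`url-encode`/`url-decode`/`split-to-columns`, with hypothetical input values (the real directives may name output columns differently):

```python
from urllib.parse import quote, unquote, urlparse

# Hypothetical row: a 14-character fixed-width value (2+4+5+3) and a URL.
row = {"fixedlength": "AB1234helloXYZ", "url": "https://example.com/a?x=1"}

# Parse as FIXEDLENGTH with "2,4,5,3": slice into fixedlength_1..fixedlength_4
start = 0
for i, width in enumerate([2, 4, 5, 3], start=1):
    row[f"fixedlength_{i}"] = row["fixedlength"][start:start + width]
    start += width

# split-url url -- derive protocol/query columns (subset shown)
parsed = urlparse(row["url"])
row["url_protocol"] = parsed.scheme
row["url_query"] = parsed.query

# url-encode then url-decode should round-trip the value
encoded = quote(row["url"], safe="")
assert unquote(encoded) == row["url"]

# split-to-columns :url_query '=' -- one new column per '='-separated part
row["url_query_1"], row["url_query_2"] = row["url_query"].split("=", 1)
```

The later `filter-rows-on condition-true fixedlength_4 !~ 'XYZ'` step would then keep or drop rows based on whether the last slice matches `XYZ`.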
