
Commit 3e2ef32

longvu-db, scottsand-db, tlm365, johanl-db, allisonport-db authored
[Spark] Add Delta Connect Python Client for First 4.0 Preview branch (#3201)
#### Which Delta project/connector is this regarding?

- [X] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)

## Description

The [Delta Connect Python Client PR](#3136), but merged into branch-4.0-preview1 instead of master.

## How was this patch tested?

Added unit tests.

## Does this PR introduce _any_ user-facing changes?

No.

---------

Signed-off-by: Tai Le Manh <[email protected]>
Signed-off-by: Shawn Chang <[email protected]>
Co-authored-by: Scott Sandre <[email protected]>
Co-authored-by: Tai Le Manh <[email protected]>
Co-authored-by: Johan Lasperas <[email protected]>
Co-authored-by: Allison Portis <[email protected]>
Co-authored-by: Tom van Bussel <[email protected]>
Co-authored-by: Jiaheng Tang <[email protected]>
Co-authored-by: Sumeet Varma <[email protected]>
Co-authored-by: Venki Korukanti <[email protected]>
Co-authored-by: Avril Aysha <[email protected]>
Co-authored-by: Shawn Chang <[email protected]>
Co-authored-by: Shawn Chang <[email protected]>
Co-authored-by: Hao Jiang <[email protected]>
1 parent 650dd8b commit 3e2ef32

20 files changed: +583 −19 lines

.github/workflows/spark_master_test.yaml

Lines changed: 11 additions & 2 deletions
@@ -57,22 +57,31 @@ jobs:
           pipenv --python 3.9 install
           pipenv run pip install flake8==3.9.0
           pipenv run pip install black==23.9.1
+          pipenv run pip install importlib_metadata==3.10.0
           pipenv run pip install mypy==1.8.0
           pipenv run pip install mypy-protobuf==3.3.0
           pipenv run pip install cryptography==37.0.4
           pipenv run pip install twine==4.0.1
           pipenv run pip install wheel==0.33.4
           pipenv run pip install setuptools==41.1.0
           pipenv run pip install pydocstyle==3.0.0
-          pipenv run pip install pandas==1.4.4
-          pipenv run pip install pyarrow==8.0.0
+          pipenv run pip install pandas==2.0.0
+          pipenv run pip install pyarrow==10.0.0
+          pipenv run pip install pypandoc==1.3.3
           pipenv run pip install numpy==1.21
+          pipenv run pip install grpcio==1.62.0
+          pipenv run pip install grpcio-status==1.62.0
+          pipenv run pip install googleapis-common-protos==1.56.4
+          pipenv run pip install protobuf==4.25.1
+          pipenv run pip install googleapis-common-protos-stubs==2.2.0
+          pipenv run pip install grpc-stubs==1.24.11
           pipenv run pip install pyspark==4.0.0.dev1
       - name: Run Spark Master tests
         # when changing TEST_PARALLELISM_COUNT make sure to also change it in spark_test.yaml
         # NOTE: in this branch, the default sparkVersion is the SPARK_MASTER_VERSION
         run: |
           TEST_PARALLELISM_COUNT=2 build/sbt -DsparkVersion=master "++ ${{ matrix.scala }}" clean connectServer/test
           TEST_PARALLELISM_COUNT=2 build/sbt -DsparkVersion=master "++ ${{ matrix.scala }}" clean connectServer/assembly connectClient/test
+          TEST_PARALLELISM_COUNT=2 RUN_DELTA_CONNECT_TESTS=true pipenv run python run-tests.py
           TEST_PARALLELISM_COUNT=2 pipenv run python run-tests.py --group spark
         if: steps.git-diff.outputs.diff
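
The new RUN_DELTA_CONNECT_TESTS variable is what opts run-tests.py into the Delta Connect Python suite. A minimal sketch of that gating pattern (the helper name here is hypothetical; the real logic lives in run-tests.py and may differ):

import os

def should_run_delta_connect_tests() -> bool:
    # The workflow above exports RUN_DELTA_CONNECT_TESTS=true before invoking run-tests.py.
    return os.environ.get("RUN_DELTA_CONNECT_TESTS", "").lower() == "true"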

dev/delta-connect-gen-protos.sh

Lines changed: 1 addition & 0 deletions
@@ -71,6 +71,7 @@ for f in `find gen/proto/python/delta/connect -name "*.py*"`; do
   if [[ $f == *_pb2.py || $f == *_pb2_grpc.py ]]; then
     sed \
       -e "s/DESCRIPTOR, 'delta.connect/DESCRIPTOR, 'delta.connect.proto/g" \
+      -e 's/from delta.connect import/from delta.connect.proto import/g' \
       $f > $f.tmp
     mv $f.tmp $f
   elif [[ $f == *.pyi ]]; then
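
This extra sed expression fixes up cross-file imports in the generated protobuf stubs: protoc emits them as if the messages lived in delta.connect, but the files are checked in under delta/connect/proto. The pb2 diffs below show the resulting change; in a generated *_pb2.py file it looks like this:

# Emitted by protoc, before the rewrite:
# from delta.connect import base_pb2 as delta_dot_connect_dot_base__pb2

# After the rewrite, matching the on-disk package layout:
from delta.connect.proto import base_pb2 as delta_dot_connect_dot_base__pb2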

dev/requirements.txt

Lines changed: 6 additions & 1 deletion
@@ -1,9 +1,14 @@
 # Linter
-mypy==0.982
+mypy==1.8.0
 flake8==3.9.0
 
 # Code Formatter
 black==23.9.1
 
+# Spark Connect (required)
+grpcio>=1.62.0
+grpcio-status>=1.62.0
+googleapis-common-protos>=1.56.4
+
 # Spark and Delta Connect python proto generation plugin (optional)
 mypy-protobuf==3.3.0

examples/python/delta_connect.py

Lines changed: 100 additions & 0 deletions
#
# Copyright (2024) The Delta Lake Project Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

"""
To run this example you must follow these steps:

Requirements:
- Java 17
- Spark 4.0.0-preview1+
- delta-spark (Python package) 4.0.0rc1+ and pyspark 4.0.0.dev1+

(1) Start a local Spark Connect server using this command:
sbin/start-connect-server.sh \
  --packages org.apache.spark:spark-connect_2.13:4.0.0-preview1,io.delta:delta-connect-server_2.13:{DELTA_VERSION},io.delta:delta-spark_2.13:{DELTA_VERSION},com.google.protobuf:protobuf-java:3.25.1 \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog" \
  --conf "spark.connect.extensions.relation.classes"="org.apache.spark.sql.connect.delta.DeltaRelationPlugin" \
  --conf "spark.connect.extensions.command.classes"="org.apache.spark.sql.connect.delta.DeltaCommandPlugin"
* Be sure to replace DELTA_VERSION with the version you are using

(2) Set the SPARK_REMOTE environment variable to point to your local Spark server:
export SPARK_REMOTE="sc://localhost:15002"

(3) Run this file, e.g. python3 examples/python/delta_connect.py
"""

import os
import shutil

from pyspark.sql import SparkSession
from delta.tables import DeltaTable

filePath = "/tmp/delta_connect"
tableName = "delta_connect_table"

def assert_dataframe_equals(df1, df2):
    # Sort the collected rows so the comparison is order-insensitive.
    assert sorted(df1.collect()) == sorted(df2.collect())

def cleanup(spark):
    shutil.rmtree(filePath, ignore_errors=True)
    spark.sql(f"DROP TABLE IF EXISTS {tableName}")

# --------------------- Set up Spark Connect spark session ------------------------

assert os.getenv("SPARK_REMOTE"), "Must point to Spark Connect server using SPARK_REMOTE"

spark = SparkSession.builder \
    .appName("delta_connect") \
    .remote(os.getenv("SPARK_REMOTE")) \
    .getOrCreate()

# Clean up any previous runs
cleanup(spark)

# -------------- Try reading non-existent table (should fail with an exception) ----------------

# Using forPath
try:
    DeltaTable.forPath(spark, filePath).toDF().show()
except Exception as e:
    assert "DELTA_MISSING_DELTA_TABLE" in str(e)
else:
    assert False, "Expected exception to be thrown for missing table"

# Using forName
try:
    DeltaTable.forName(spark, tableName).toDF().show()
except Exception as e:
    assert "DELTA_MISSING_DELTA_TABLE" in str(e)
else:
    assert False, "Expected exception to be thrown for missing table"

# ------------------------ Write basic table and check that results match ----------------------

# By table name
spark.range(5).write.format("delta").saveAsTable(tableName)
assert_dataframe_equals(DeltaTable.forName(spark, tableName).toDF(), spark.range(5))
assert_dataframe_equals(spark.read.format("delta").table(tableName), spark.range(5))
assert_dataframe_equals(spark.sql(f"SELECT * FROM {tableName}"), spark.range(5))

# By table path
spark.range(10).write.format("delta").save(filePath)
assert_dataframe_equals(DeltaTable.forPath(spark, filePath).toDF(), spark.range(10))
assert_dataframe_equals(spark.read.format("delta").load(filePath), spark.range(10))
assert_dataframe_equals(spark.sql(f"SELECT * FROM delta.`{filePath}`"), spark.range(10))

# ---------------------------------- Clean up ----------------------------------------
cleanup(spark)

python/delta/connect/__init__.py

Lines changed: 19 additions & 0 deletions
# Copyright (2024) The Delta Lake Project Authors.
# (Apache License 2.0 header, identical to the example file above.)

from delta.connect.tables import DeltaTable

__all__ = ['DeltaTable']
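
With this __init__.py in place, the Connect client class is importable from the package root:

# Equivalent to: from delta.connect.tables import DeltaTable
from delta.connect import DeltaTable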

python/delta/connect/plan.py

Lines changed: 55 additions & 0 deletions
# Copyright (2024) The Delta Lake Project Authors.
# (Apache License 2.0 header, identical to the example file above.)

from typing import Optional

import delta.connect.proto as proto

from pyspark.sql.connect.client import SparkConnectClient
from pyspark.sql.connect.plan import LogicalPlan
import pyspark.sql.connect.proto as spark_proto


class DeltaLogicalPlan(LogicalPlan):
    """Base class for Delta plan nodes: wraps Delta-specific protos in the
    extension field of Spark Connect's generic Relation/Command messages."""

    def __init__(self, child: Optional[LogicalPlan]) -> None:
        super().__init__(child)

    def plan(self, session: SparkConnectClient) -> spark_proto.Relation:
        plan = spark_proto.Relation()
        plan.extension.Pack(self.to_delta_relation(session))
        return plan

    def to_delta_relation(self, session: SparkConnectClient) -> proto.DeltaRelation:
        ...

    def command(self, session: SparkConnectClient) -> spark_proto.Command:
        command = spark_proto.Command()
        command.extension.Pack(self.to_delta_command(session))
        return command

    def to_delta_command(self, session: SparkConnectClient) -> proto.DeltaCommand:
        ...


class DeltaScan(DeltaLogicalPlan):
    """Leaf node representing a scan of a Delta table."""

    def __init__(self, table: proto.DeltaTable) -> None:
        super().__init__(None)
        self._table = table

    def to_delta_relation(self, client: SparkConnectClient) -> proto.DeltaRelation:
        relation = proto.DeltaRelation()
        relation.scan.table.CopyFrom(self._table)
        return relation
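
DeltaLogicalPlan rides on Spark Connect's extension mechanism: the Delta-specific message is packed into the google.protobuf.Any extension field of a generic Relation (or Command), which the server-side DeltaRelationPlugin later unpacks. A rough sketch of that round trip using the standard protobuf Any API (the unpack half is what the Scala server plugin does, shown here in Python for illustration; the table path is hypothetical):

import delta.connect.proto as proto
import pyspark.sql.connect.proto as spark_proto

# Client side: pack a Delta-specific relation into the generic Relation.
delta_relation = proto.DeltaRelation()
delta_relation.scan.table.path.path = "/tmp/delta_connect"  # hypothetical path
relation = spark_proto.Relation()
relation.extension.Pack(delta_relation)

# Server side (conceptually): detect the extension type and unpack it.
unpacked = proto.DeltaRelation()
assert relation.extension.Is(proto.DeltaRelation.DESCRIPTOR)
relation.extension.Unpack(unpacked)
assert unpacked.scan.table.path.path == "/tmp/delta_connect"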
python/delta/connect/proto/__init__.py

Lines changed: 19 additions & 0 deletions

# Copyright (2024) The Delta Lake Project Authors.
# (Apache License 2.0 header, identical to the example file above.)

from delta.connect.proto.base_pb2 import *
from delta.connect.proto.commands_pb2 import *
from delta.connect.proto.relations_pb2 import *

python/delta/connect/proto/commands_pb2.py

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@
 _sym_db = _symbol_database.Default()
 
 
-from delta.connect import base_pb2 as delta_dot_connect_dot_base__pb2
+from delta.connect.proto import base_pb2 as delta_dot_connect_dot_base__pb2
 
 
 DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(

python/delta/connect/proto/relations_pb2.py

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@
 _sym_db = _symbol_database.Default()
 
 
-from delta.connect import base_pb2 as delta_dot_connect_dot_base__pb2
+from delta.connect.proto import base_pb2 as delta_dot_connect_dot_base__pb2
 
 
 DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(

python/delta/connect/tables.py

Lines changed: 88 additions & 0 deletions
# Copyright (2024) The Delta Lake Project Authors.
# (Apache License 2.0 header, identical to the example file above.)

from typing import Dict, Optional

from delta.connect.plan import DeltaScan
import delta.connect.proto as proto
from delta.tables import DeltaTable as LocalDeltaTable

from pyspark.sql.connect.dataframe import DataFrame
from pyspark.sql.connect.plan import LogicalPlan, SubqueryAlias
from pyspark.sql.connect.session import SparkSession


class DeltaTable(object):
    def __init__(
        self,
        spark: SparkSession,
        path: Optional[str] = None,
        tableOrViewName: Optional[str] = None,
        hadoopConf: Dict[str, str] = dict(),
        plan: Optional[LogicalPlan] = None
    ) -> None:
        self._spark = spark
        self._path = path
        self._tableOrViewName = tableOrViewName
        self._hadoopConf = hadoopConf
        # Default to a plain scan of the table unless a plan (e.g. an alias) is given.
        if plan is not None:
            self._plan = plan
        else:
            self._plan = DeltaScan(self._to_proto())

    def toDF(self) -> DataFrame:
        return DataFrame(self._plan, session=self._spark)

    def alias(self, aliasName: str) -> "DeltaTable":
        return DeltaTable(
            self._spark,
            self._path,
            self._tableOrViewName,
            self._hadoopConf,
            SubqueryAlias(self._plan, aliasName)
        )

    @classmethod
    def forPath(
        cls,
        sparkSession: SparkSession,
        path: str,
        hadoopConf: Dict[str, str] = dict()
    ) -> "DeltaTable":
        assert sparkSession is not None
        return DeltaTable(sparkSession, path=path, hadoopConf=hadoopConf)

    @classmethod
    def forName(
        cls, sparkSession: SparkSession, tableOrViewName: str
    ) -> "DeltaTable":
        assert sparkSession is not None
        return DeltaTable(sparkSession, tableOrViewName=tableOrViewName)

    def _to_proto(self) -> proto.DeltaTable:
        result = proto.DeltaTable()
        if self._path is not None:
            result.path.path = self._path
        if self._tableOrViewName is not None:
            result.table_or_view_name = self._tableOrViewName
        return result


# Reuse the docstrings of the classic (non-Connect) DeltaTable so help() matches.
DeltaTable.__doc__ = LocalDeltaTable.__doc__
DeltaTable.toDF.__doc__ = LocalDeltaTable.toDF.__doc__
DeltaTable.alias.__doc__ = LocalDeltaTable.alias.__doc__
DeltaTable.forPath.__func__.__doc__ = LocalDeltaTable.forPath.__doc__
DeltaTable.forName.__func__.__doc__ = LocalDeltaTable.forName.__doc__
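
The two constructors differ only in which field of proto.DeltaTable they populate, which is easy to see by inspecting _to_proto() directly. A small sketch, assuming a Connect session like the one in the example file above:

from pyspark.sql.connect.session import SparkSession
from delta.connect.tables import DeltaTable

spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()

# forName populates table_or_view_name ...
print(DeltaTable.forName(spark, "delta_connect_table")._to_proto())

# ... while forPath populates the nested path message.
print(DeltaTable.forPath(spark, "/tmp/delta_connect")._to_proto())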
Lines changed: 15 additions & 0 deletions

(New file containing only the standard Apache License 2.0 header, identical to the example file above.)
