
Commit a21b49f

jakepenzak and cpcloud authored
chore(deps): update delta-spark to 3.3.0 for local and remote pyspark (#11123)
## Description of changes

1. Upgrade `delta-spark` from v3.2.1 to [v3.3.0](https://github.com/delta-io/delta/releases/tag/v3.3.0) for pyspark backend testing in GitHub Actions.
2. Upgrade the `delta-spark` [Maven package](https://mvnrepository.com/artifact/io.delta/delta-spark) from v2.1.0 to v3.3.0 in the spark-connect container 🐳
   - Ensures consistency with local pyspark testing and [compatibility](https://docs.delta.io/latest/releases.html#compatibility-with-apache-spark) with pyspark v3.5.5.
3. Upgrade the spark-connect configuration to enable proper Delta and catalog functionality.

Changes **2** and **3** together resolved the spark-connect testing issues from #11120.

Co-authored-by: Phillip Cloud <[email protected]>
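The compatibility point above (delta-spark 3.3.x pairing with pyspark 3.5.x, per Delta's published support matrix) can be captured in a tiny version check. This is a hypothetical helper for illustration, not code from this PR:

```python
# Hypothetical compatibility check based on Delta's support matrix
# (delta-spark 3.2.x and 3.3.x both target Spark 3.5.x).
DELTA_TO_SPARK_MINOR = {"3.2": "3.5", "3.3": "3.5"}


def is_compatible(delta_version: str, pyspark_version: str) -> bool:
    """Return True when this delta-spark minor line supports this pyspark minor."""
    delta_minor = ".".join(delta_version.split(".")[:2])
    spark_minor = ".".join(pyspark_version.split(".")[:2])
    return DELTA_TO_SPARK_MINOR.get(delta_minor) == spark_minor
```

Under this sketch, both the old pin (3.2.1) and the new pin (3.3.0) are compatible with pyspark 3.5.5, so the upgrade is a drop-in change for the local matrix.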
1 parent a416755

File tree: 5 files changed (+9, −10 lines)

.github/workflows/ibis-backends.yml — 3 additions, 3 deletions

```diff
@@ -561,20 +561,20 @@ jobs:
           pyspark-minor-version: "3.5"
           tag: local
           deps:
-            - delta-spark==3.2.1
+            - delta-spark==3.3.0
         - python-version: "3.13"
           pyspark-minor-version: "3.5"
           tag: local
           deps:
             - setuptools==75.1.0
-            - delta-spark==3.2.1
+            - delta-spark==3.3.0
         - python-version: "3.12"
           pyspark-minor-version: "3.5"
           SPARK_REMOTE: "sc://localhost:15002"
           tag: remote
           deps:
             - setuptools==75.1.0
-            - delta-spark==3.2.1
+            - delta-spark==3.3.0
             - googleapis-common-protos
             - grpcio
             - grpcio-status
```

compose.yaml — 1 addition, 1 deletion

```diff
@@ -593,7 +593,7 @@ services:
     image: bitnami/spark:3.5.5
     ports:
       - 15002:15002
-    command: /opt/bitnami/spark/sbin/start-connect-server.sh --name ibis_testing --packages org.apache.spark:spark-connect_2.12:3.5.3,org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.6.1,io.delta:delta-core_2.12:2.1.0
+    command: /opt/bitnami/spark/sbin/start-connect-server.sh --name ibis_testing --packages org.apache.spark:spark-connect_2.12:3.5.3,org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.6.1,io.delta:delta-spark_2.12:3.3.0
     healthcheck:
       test:
         - CMD-SHELL
```
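Note the artifact coordinate change here: Delta Lake renamed its Spark artifact from `delta-core` to `delta-spark` in the 3.x line, so the container's `--packages` list must use the new name, not just a new version. A minimal sketch of assembling that comma-separated value (coordinates taken from this diff):

```python
# Build the --packages value passed to start-connect-server.sh;
# coordinates are the ones from this compose.yaml change.
packages = ",".join(
    [
        "org.apache.spark:spark-connect_2.12:3.5.3",
        "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.6.1",
        "io.delta:delta-spark_2.12:3.3.0",  # was io.delta:delta-core_2.12:2.1.0
    ]
)
```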

docker/spark-connect/conf.properties — 2 additions, 0 deletions

```diff
@@ -10,3 +10,5 @@ spark.sql.session.timeZone=UTC
 spark.sql.streaming.schemaInference=true
 spark.ui.enabled=false
 spark.ui.showConsoleProgress=false
+spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension
+spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog
```
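These two properties are the standard pair for enabling Delta Lake on a Spark session: `spark.sql.extensions` loads Delta's SQL extension, and `spark.sql.catalog.spark_catalog` swaps in the Delta-aware catalog implementation. Each line of a properties file like this maps to one Spark conf entry; a minimal parsing sketch (file contents inlined from the diff above; splitting only on the first `=` since values may themselves contain `=`):

```python
# Parse key=value properties lines into a dict of Spark conf entries.
properties = """\
spark.sql.streaming.schemaInference=true
spark.ui.enabled=false
spark.ui.showConsoleProgress=false
spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension
spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog
"""

conf = {}
for line in properties.splitlines():
    if line.strip():
        # split on the first "=" only, then strip whitespace on both sides
        key, value = map(str.strip, line.split("=", 1))
        conf[key] = value
```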

ibis/backends/pyspark/tests/conftest.py — 2 additions, 0 deletions

```diff
@@ -288,6 +288,8 @@ def connect(*, tmpdir, worker_id, **kw):
         )
     ).open(mode="r") as config_file:
         for line in config_file:
+            if "delta" in line:
+                continue
             config = config.config(*map(str.strip, line.strip().split("=", 1)))

     config = (
```
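The new guard skips delta-related properties when the test harness builds its session config, presumably because the spark-connect container already loads Delta via `--packages` and its own conf.properties, so re-applying those keys client-side is unnecessary. A standalone sketch of that filtering, with hypothetical sample lines standing in for the config file:

```python
# Mirror of the conftest loop: drop delta-related config lines,
# apply the rest as (key, value) pairs.
sample_lines = [
    "spark.sql.session.timeZone=UTC",
    "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension",
    "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog",
    "spark.ui.enabled=false",
]

applied = []
for line in sample_lines:
    if "delta" in line:
        continue  # handled server-side; skip on the client
    applied.append(tuple(map(str.strip, line.strip().split("=", 1))))
```

The `"delta" in line` test matches the substring anywhere in the line, so it also catches the catalog entry, whose Delta class name appears only in the value.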

ibis/backends/tests/test_export.py — 1 addition, 6 deletions

```diff
@@ -27,7 +27,7 @@
     SnowflakeProgrammingError,
     TrinoUserError,
 )
-from ibis.conftest import CI, IS_SPARK_REMOTE
+from ibis.conftest import IS_SPARK_REMOTE

 pd = pytest.importorskip("pandas")
 pa = pytest.importorskip("pyarrow")
@@ -494,11 +494,6 @@ def test_to_pyarrow_decimal(backend, dtype, pyarrow_dtype):
     )
 @pytest.mark.notyet(["clickhouse"], raises=Exception)
 @pytest.mark.notyet(["mssql"], raises=PyDeltaTableError)
-@pytest.mark.xfail_version(
-    pyspark=["pyspark<4"],
-    condition=CI and IS_SPARK_REMOTE,
-    reason="not supported until pyspark 4",
-)
 def test_roundtrip_delta(backend, con, alltypes, tmp_path, monkeypatch):
     if con.name == "pyspark":
         pytest.importorskip("delta")
```

0 commit comments
