[Build] Run the Spark master PYTHON tests using the Spark 4.0 RC4 #4513
Merged: allisonport-db merged 7 commits into delta-io:master from allisonport-db:run-python-tests-spark-master-2025 on May 9, 2025.

Commits
- 0e81d63 Try to run python tests
- c7d6d71 mypy
- e1cf82e Mypy + fix getQueryContext error
- b976a38 Upgrade pandas
- ad178c5 Ignore all to shorten line
- 996a505 Fix a missed version + update python deps to match Spark
- 46bd83c Upgrade required black version
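One commit above fixes a `getQueryContext` error. The actual fix is not shown in this diff; as a hedged sketch, a common way to handle an exception method that only exists in newer PySpark versions is to guard the attribute lookup (the class names below are hypothetical stand-ins, not PySpark types):

```python
# Hedged sketch: guard an optional exception API such as getQueryContext(),
# which newer PySpark exceptions expose but older ones may not.
# FakeError / WithCtx are illustrative stand-ins, not real PySpark classes.

class FakeError(Exception):
    """Stand-in for an exception without getQueryContext()."""

class WithCtx(Exception):
    """Stand-in for an exception that does expose getQueryContext()."""
    def getQueryContext(self):
        return ["ctx"]

def query_contexts(err: Exception):
    # getattr with a default avoids AttributeError on older exception types.
    get_ctx = getattr(err, "getQueryContext", None)
    return get_ctx() if callable(get_ctx) else []

assert query_contexts(FakeError("boom")) == []
assert query_contexts(WithCtx()) == ["ctx"]
```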
New workflow file (85 added lines):

```yaml
name: "Delta Spark Master Python"
on: [push, pull_request]
jobs:
  test:
    name: "DSP"
    runs-on: ubuntu-24.04
    strategy:
      matrix:
        # These Scala versions must match those in the build.sbt
        scala: [2.13.13]
    env:
      SCALA_VERSION: ${{ matrix.scala }}
    steps:
      - uses: actions/checkout@v3
      - uses: technote-space/get-diff-action@v4
        id: git-diff
        with:
          PATTERNS: |
            **
            .github/workflows/**
            !kernel/**
            !connectors/**
      - name: install java
        uses: actions/setup-java@v3
        with:
          distribution: "zulu"
          java-version: "17"
      - name: Cache Scala, SBT
        uses: actions/cache@v3
        with:
          path: |
            ~/.sbt
            ~/.ivy2
            ~/.cache/coursier
            !~/.cache/coursier/v1/https/repository.apache.org/content/groups/snapshots
          # Change the key if dependencies are changed. For each key, GitHub Actions will
          # cache the above directories when we use the key for the first time. After that,
          # each run will just use the cache. The cache is immutable, so we need to use a
          # new key when trying to cache new stuff.
          key: delta-sbt-cache-spark-master-scala${{ matrix.scala }}
      - name: Install Job dependencies
        # TODO: update pyspark installation once Spark preview is formally released
        run: |
          sudo apt-get update
          sudo apt-get install -y make build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev libncursesw5-dev xz-utils tk-dev libffi-dev liblzma-dev python3-openssl git
          sudo apt install libedit-dev
          curl -LO https://github.com/bufbuild/buf/releases/download/v1.28.1/buf-Linux-x86_64.tar.gz
          mkdir -p ~/buf
          tar -xvzf buf-Linux-x86_64.tar.gz -C ~/buf --strip-components 1
          rm buf-Linux-x86_64.tar.gz
          sudo apt install python3-pip --fix-missing
          sudo pip3 install pipenv==2024.4.1
          curl https://pyenv.run | bash
          export PATH="~/.pyenv/bin:$PATH"
          eval "$(pyenv init -)"
          eval "$(pyenv virtualenv-init -)"
          pyenv install 3.9
          pyenv global system 3.9
          pipenv --python 3.9 install
          # Pin pip to 24.0. By default `pyenv.run` installs the latest pip version
          # available. From version 24.1, `pip` doesn't allow installing python packages
          # with a version string containing `-`. In the Delta-Spark case, the generated
          # pypi package has `-SNAPSHOT` in its version (e.g. `3.3.0-SNAPSHOT`), as the
          # version is picked up from the `version.sbt` file.
          pipenv run pip install pip==24.0 setuptools==69.5.1 wheel==0.43.0
          pipenv run pip install flake8==3.9.0
          pipenv run pip install black==23.9.1
          pipenv run pip install mypy==1.8.0
          pipenv run pip install mypy-protobuf==3.3.0
          pipenv run pip install cryptography==37.0.4
          pipenv run pip install twine==4.0.1
          pipenv run pip install wheel==0.33.4
          pipenv run pip install setuptools==41.1.0
          pipenv run pip install pydocstyle==3.0.0
          pipenv run pip install pandas==2.0.0
          pipenv run pip install pyarrow==8.0.0
          pipenv run pip install numpy==1.21
          pipenv run pip install https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc4-bin//pyspark-4.0.0.tar.gz
        if: steps.git-diff.outputs.diff
      - name: Run Python tests
        # when changing TEST_PARALLELISM_COUNT make sure to also change it in spark_master_test.yaml
        run: |
          echo 'ThisBuild / version := "4.0.0-SNAPSHOT"' > version.sbt
          TEST_PARALLELISM_COUNT=4 USE_SPARK_MASTER=true pipenv run python run-tests.py --group spark-python
        if: steps.git-diff.outputs.diff
```
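The workflow pins `pip==24.0` because pip 24.1 started rejecting version strings like `3.3.0-SNAPSHOT`, which are not valid PEP 440 versions. A minimal sketch of that distinction, using a deliberately rough regex (this is an illustration, not pip's actual validation code):

```python
import re

def is_pep440_like(version: str) -> bool:
    # Very rough approximation of PEP 440: dot-separated numerals with
    # optional pre-release (aN/bN/rcN), post-release, and dev segments.
    # A bare "-SNAPSHOT" suffix (from version.sbt) does not fit this shape,
    # which is why pip >= 24.1 refuses to install such a package.
    return re.fullmatch(
        r"\d+(\.\d+)*((a|b|rc)\d+)?(\.post\d+)?(\.dev\d+)?", version
    ) is not None

assert is_pep440_like("4.0.0")
assert is_pep440_like("4.0.0.dev1")
assert not is_pep440_like("3.3.0-SNAPSHOT")
```

Pinning pip below 24.1 sidesteps the check without renaming the snapshot artifacts.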
Changed file (linter requirements):

```diff
@@ -1,5 +1,5 @@
 # Linter
-mypy==0.982
+mypy==1.8.0
 flake8==3.9.0

 # Code Formatter
```
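The mypy upgrade from 0.982 to 1.8.0 pairs with the "Ignore all to shorten line" commit: newer mypy versions can flag additional error codes on a line, and listing each code in a `# type: ignore[...]` comment can push the line past the length limit. A hypothetical sketch of the trade-off (the actual line the commit touched is not shown in this diff):

```python
# Hypothetical illustration: a function typed as Any, assigned to an int variable.
# Newer mypy may report several error codes here; a blanket ignore keeps the
# line short at the cost of suppressing all codes, not just the known ones.
from typing import Any

def read_config() -> Any:
    ...  # stand-in for an untyped helper; returns None at runtime

# Before: value: int = read_config()  # type: ignore[assignment,no-any-return]
value: int = read_config()  # type: ignore

assert value is None  # the stub body returns None at runtime
```

The blanket form is less precise than per-code ignores, which is why it is usually reserved for lines where the explicit list would be unwieldy.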
Review comment: DSMP