@@ -26,7 +26,7 @@ runtime 9.1 LTS or later.
 
 ## Checking your code for common issues
 
-Run `./lint.sh` from the project root directory to run various code style checks.
+Run `make dev-lint` from the project root directory to run various code style checks.
 These are based on the use of `prospector`, `pylint` and related tools.
 
 ## Setting up your build environment
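As a sketch of the check flow above, a small wrapper can run the style-check command and report whether it passed. Only `make dev-lint` comes from the change above; the wrapper itself is illustrative and not part of the project:

```python
import subprocess
import sys

def run_checks(command=("make", "dev-lint")):
    """Run a style-check command and report whether it passed.

    Sketch only: `make dev-lint` is the target named above; any
    command list can be substituted for it.
    """
    result = subprocess.run(command, capture_output=True, text=True)
    return result.returncode == 0

# Demonstrated with a stand-in command so the sketch runs anywhere:
print(run_checks((sys.executable, "-c", "pass")))  # True
```

A non-zero exit status from any of the underlying tools (`prospector`, `pylint`) makes the wrapper report failure.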
@@ -45,6 +45,11 @@ Our recommended mechanism for building the code is to use a `conda` or `pipenv`
 
 But it can be built with any Python virtualization environment.
 
+### Spark dependencies
+The builds have been tested against Spark 3.2.1. This requires the OpenJDK 1.8.56 or later version of Java 8.
+The Databricks runtimes use the Azul Zulu version of OpenJDK 8 and we have used these in local testing.
+These are not installed automatically by the build process, so you will need to install them separately.
+
 ### Building with Conda
 To build with `conda`, perform the following commands:
 - `make create-dev-env` from the main project directory to create your conda environment, if using
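The Java 8 requirement introduced above can be sanity-checked against the banner printed by `java -version`. This helper is an illustrative sketch, not project code; the parsing rule and the minimum version tuple are assumptions:

```python
import re

def java_version_ok(banner, minimum=(1, 8)):
    """Return True if a `java -version` banner meets a minimum version.

    Illustrative only: parses the quoted version from a line such as
    'openjdk version "1.8.0_312"'.
    """
    match = re.search(r'"(\d+)\.(\d+)', banner)
    if not match:
        return False
    return (int(match.group(1)), int(match.group(2))) >= minimum

print(java_version_ok('openjdk version "1.8.0_312"'))  # True
print(java_version_ok('java version "1.7.0_80"'))      # False
```

Note that newer banners such as `"11.0.2"` also compare as at least `(1, 8)` under this rule.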
@@ -70,7 +75,7 @@ To build with `pipenv`, perform the following commands:
 - Run `make dist` from the main project directory
 - The resulting wheel file will be placed in the `dist` subdirectory
 
-The resulting build has been tested against Spark 3.0.1
+The resulting build has been tested against Spark 3.2.1
 
 ## Creating the HTML documentation
 
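To confirm that the `make dist` step above actually produced a wheel, something like this illustrative helper (not part of the project) can be used:

```python
from pathlib import Path

def find_wheel(dist_dir="dist"):
    """Return the newest wheel in the given directory, or None.

    Sketch only: assumes wheels land in the `dist` subdirectory as
    described above.
    """
    wheels = sorted(Path(dist_dir).glob("*.whl"),
                    key=lambda p: p.stat().st_mtime)
    return wheels[-1] if wheels else None

print(find_wheel())  # None until `make dist` has been run
```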