Describe the bug
I am trying to upgrade my spark-submit job in Azure Databricks to use the newest .NET for Apache Spark version. I have upgraded the version number in db-init.sh, upgraded my NuGet package, and updated the spark-submit parameters. I see there was a recent change to downloadDriverFile because the signature of fetchFile changed; I'm guessing I just missed something in the upgrade.
To Reproduce
Steps to reproduce the behavior:
- Create an app that uses Microsoft.Spark v2.1
- Set up db-init.sh to pull the 2.1 version:
  https://github.com/dotnet/spark/releases/download/v2.0.0/Microsoft.Spark.Worker.netcoreapp3.1.linux-x64-2.1.0.tar.gz
- Upload the artifacts to dbfs
- Create a Databricks job using spark-submit params similar to:
  `["--class","org.apache.spark.deploy.dotnet.DotnetRunner","dbfs:/FileStore/OO/OO-TestApp/microsoft-spark-3-2_2.12-2.1.0.jar","dbfs:/FileStore/OO/OO-TestApp/SparkTestApp.zip","SparkTestApp"]`
- Use a 10.4 LTS cluster (includes Apache Spark 3.2.1, Scala 2.12)
Standard error:

```
Warning: Ignoring non-Spark config property: libraryDownload.sleepIntervalSeconds
Warning: Ignoring non-Spark config property: libraryDownload.timeoutSeconds
Warning: Ignoring non-Spark config property: eventLog.rolloverIntervalSeconds
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.fetchFile$default$7()Z
	at org.apache.spark.deploy.dotnet.DotnetRunner$.downloadDriverFile(DotnetRunner.scala:222)
	at org.apache.spark.deploy.dotnet.DotnetRunner$.main(DotnetRunner.scala:77)
	at org.apache.spark.deploy.dotnet.DotnetRunner.main(DotnetRunner.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:956)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1045)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1054)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
log4j output:

```
22/04/11 16:23:30 INFO AzureNativeFileSystemStore: URI scheme: wasbs, using https for connections
22/04/11 16:23:30 INFO NativeAzureFileSystem: Delete with limit configurations: deleteFileCountLimitEnabled=false, deleteFileCountLimit=-1
22/04/11 16:23:30 INFO DBFS: Initialized DBFS with DBFSV2 as the delegate.
22/04/11 16:23:31 INFO Utils: Fetching dbfs:/FileStore/OO/OO-TestApp/microsoft-spark-3-2_2.12-2.1.0.jar to /local_disk0/tmp/spark-ba7cd3c5-19f9-4461-b80e-f0e7dc01fcda/fetchFileTemp4257510112956878374.tmp
22/04/11 16:23:31 WARN SparkConf: The configuration key 'spark.akka.frameSize' has been deprecated as of Spark 1.6 and may be removed in the future. Please use the new key 'spark.rpc.message.maxSize' instead.
22/04/11 16:23:31 INFO DotnetRunner: Copying user file dbfs:/FileStore/OO/OO-TestApp/SparkTestApp.zip to /databricks/driver
22/04/11 16:23:31 INFO ShutdownHookManager: Shutdown hook called
22/04/11 16:23:31 INFO ShutdownHookManager: Deleting directory /local_disk0/tmp/spark-ba7cd3c5-19f9-4461-b80e-f0e7dc01fcda
```
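For what it's worth, the `NoSuchMethodError` on `fetchFile$default$7()Z` looks like a binary incompatibility rather than a config problem: Scala compiles a default argument into a synthetic accessor method (here, the default value of `fetchFile`'s 7th parameter), and the Databricks runtime's `Utils.fetchFile` appears to have been compiled with a different parameter list, so that synthetic method simply doesn't exist at run time. A minimal sketch of the mechanism (the parameter names below are hypothetical placeholders, not Spark's actual signature):

```scala
// Sketch: how Scala encodes default arguments in bytecode.
// A method whose 7th parameter has a default value gets a synthetic
// accessor named <method>$default$7 on the companion object. Callers
// compiled against one version of the class invoke that accessor by
// name; if the runtime's copy was compiled with a different parameter
// list, the accessor is absent and the JVM throws NoSuchMethodError.
object Utils {
  // Placeholder parameters a..e stand in for Spark's real ones.
  def fetchFile(url: String, a: Int, b: Int, c: Int, d: Int, e: Int,
                shouldUntar: Boolean = true): String = url
}

object Demo extends App {
  // Reflectively list the methods Scala generated on the object.
  val names = Utils.getClass.getMethods.map(_.getName).toSet
  println(names.contains("fetchFile$default$7")) // the synthetic accessor exists
}
```

So even with the right 2.1.0 artifacts in place, a microsoft-spark jar built against OSS Spark 3.2 can fail this way if the cluster's Spark fork diverges on that internal method.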