Description
What happened?
Attempting to use the Spark "uber jar" job server when not in local mode fails with a NoSuchFileException.
python -m apache_beam.examples.wordcount \
--output ./data_test/ \
--runner=SparkRunner \
--spark_submit_uber_jar \
--spark_master_url=spark://spark-master:7077 \
--environment_type=LOOPBACK
This was reported in https://stackoverflow.com/a/66342031/4278032; I was able to reproduce the issue locally.
ERROR:root:Exception from the cluster: java.nio.file.NoSuchFileException: /tmp/tmp029vgh_0.jar
A workaround is to mount shared storage at /tmp on all workers and on the machine submitting the job. That suggests the jar is not copied to every node that requires it.
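For reference, the shared-storage workaround can be sketched as a bind mount in a container setup. This is a minimal illustration only; the service names, image, and host directory below are assumptions and do not come from the original report:

```yaml
# Hypothetical docker-compose fragment: bind-mount one shared host directory
# as /tmp in both the master and worker containers, so the staged uber jar
# written to /tmp by the job submission is visible to all Spark nodes.
services:
  spark-master:
    image: bitnami/spark
    volumes:
      - ./shared-tmp:/tmp
  spark-worker:
    image: bitnami/spark
    volumes:
      - ./shared-tmp:/tmp
```

The same directory must also be visible at /tmp wherever the Beam pipeline itself is launched, since that is where the temporary jar (e.g. /tmp/tmp029vgh_0.jar above) is written.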
Issue Priority
Priority: 2
Issue Component
Component: runner-spark