Description
When both Apache Spark and ClickHouse4j are on the classpath, the following error occurs:
java.lang.NoSuchMethodError: 'void net.jpountz.lz4.LZ4BlockInputStream.<init>(java.io.InputStream, net.jpountz.lz4.LZ4FastDecompressor, java.util.zip.Checksum, boolean)'
org.apache.spark.io.LZ4CompressionCodec.compressedInputStream(CompressionCodec.scala:153)
org.apache.spark.sql.execution.SparkPlan.decodeUnsafeRows(SparkPlan.scala:367)
org.apache.spark.sql.execution.SparkPlan.$anonfun$executeCollect$1(SparkPlan.scala:391)
The project declares the following dependencies:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.version}</artifactId>
    <version>${spark.version}</version>
</dependency>
<dependency>
    <groupId>cc.blynk.clickhouse</groupId>
    <artifactId>clickhouse4j</artifactId>
</dependency>
The root cause is the re-packaged net.jpountz classes bundled inside the clickhouse4j JAR, which shadow the lz4-java version Spark expects.
Could you please consider either of the following (sketches of both options follow below):
- relocating the re-packaged classes in the clickhouse4j JAR (https://maven.apache.org/plugins/maven-shade-plugin/examples/class-relocation.html)
- or setting the scope of the lz4 dependency to provided and not bundling the classes at all