Commit de38457

#71: Publish fails on scaladoc generation (#72)
* removed failing `ScalaDoc` class references
* added a documentation-generation check to the `build.yml` workflow
1 parent ac7d53b commit de38457
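
Background: a Scaladoc `[[...]]` reference must resolve to a symbol visible to the `doc` task; a link target Scaladoc cannot resolve (such as a bare Java class name, or a private method) produces a warning or error that breaks `sbt doc` and therefore the publish. A minimal sketch of the broken vs. fixed doc comment, assuming standard Scala 2 Scaladoc behaviour:

/** Broken: `[[ConcurrentHashMap]]` asks Scaladoc to link to a symbol it
  * cannot resolve, failing doc generation.
  * Fixed: plain code markup `ConcurrentHashMap` is rendered as code and
  * never looked up as a link target.
  */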

File tree: 3 files changed (+6 −5 lines)

- .github/workflows/build.yml
- spark-commons/src/main/scala/za/co/absa/spark/commons/OncePerSparkSession.scala
- spark-commons/src/main/scala/za/co/absa/spark/commons/sql/functions.scala


.github/workflows/build.yml

Lines changed: 1 addition & 1 deletion
@@ -33,4 +33,4 @@ jobs:
         with:
           java-version: "[email protected]"
       - name: Build and run tests
-        run: sbt test
+        run: sbt test doc
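
With `doc` appended, CI now regenerates the API documentation on every run, so a broken Scaladoc reference fails the pull-request build instead of surfacing later in the publish step. A stricter local variant (an assumption for illustration, not part of this commit) is to make Scaladoc warnings fatal in build.sbt:

// build.sbt (sketch): fail the `doc` task on Scaladoc warnings
Compile / doc / scalacOptions += "-Xfatal-warnings"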

spark-commons/src/main/scala/za/co/absa/spark/commons/OncePerSparkSession.scala

Lines changed: 4 additions & 3 deletions
@@ -22,11 +22,11 @@ import java.util.concurrent.ConcurrentHashMap
 
 /**
  * Abstract class to help attach/register UDFs and similar object only once to a spark session.
- * This is done by using the companion object's registry [[ConcurrentHashMap]] that holds already
+ * This is done by using the companion object's registry `ConcurrentHashMap` that holds already
  * instantiated classes thus not running the method [[register]] again on them.
  *
  * Usage: extend this abstract class and implement the method [[register]]. On initialization the
- * [[register]] method gets called by the [[registerMe]] method if the class + spark session
+ * [[register]] method gets called by the [[za.co.absa.spark.commons.OncePerSparkSession$.registerMe]] method if the class + spark session
  * combination is unique. If it is not unique [[register]] will not get called again.
  * This way we ensure only single registration per spark session.
  *
@@ -51,7 +51,8 @@ object OncePerSparkSession {
     )
   }
 
-  private def registerMe(library: OncePerSparkSession, spark: SparkSession): Unit = {
+  protected def registerMe(library: OncePerSparkSession, spark: SparkSession): Unit = {
+    // the function is `protected` to make it visible to `ScalaDoc`
     Option(registry.putIfAbsent(makeKey(library, spark), Unit))
       .getOrElse(library.register(spark))
   }
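
To make the registration pattern concrete, here is a minimal sketch of a subclass. The signatures are assumptions read off this diff (`register(spark: SparkSession)` from the `library.register(spark)` call, and a constructor receiving the session so registration can run on initialization); the class name and UDF are hypothetical.

import org.apache.spark.sql.SparkSession
import za.co.absa.spark.commons.OncePerSparkSession

// Hypothetical subclass: attaches one UDF to the session.
class UpperUdfRegistration(implicit spark: SparkSession) extends OncePerSparkSession {
  override def register(spark: SparkSession): Unit =
    spark.udf.register("commons_upper", (s: String) => Option(s).map(_.toUpperCase).orNull)
}

implicit val spark: SparkSession = SparkSession.builder().master("local[*]").getOrCreate()

// Constructing it twice registers the UDF only once: after the first
// construction the companion object's ConcurrentHashMap already holds
// the (class, session) key, so `register` is skipped.
new UpperUdfRegistration
new UpperUdfRegistration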

spark-commons/src/main/scala/za/co/absa/spark/commons/sql/functions.scala

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@ object functions {
   // scalastyle:on
 
   /**
-    * Similarly to [[col]] function evaluates the column based on the provided column name. But here, it can be a full
+    * Similarly to `col` function evaluates the column based on the provided column name. But here, it can be a full
     * path even of nested fields. It also evaluates arrays and maps where the array index or map key is in brackets `[]`.
     *
     * @param fullColName The full name of the column, where segments are separated by dots and array indexes or map keys
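
The documented function's name is not visible in this hunk; `col_of_path` is used below as an assumption. A short usage sketch of the path semantics described in the comment, with hypothetical DataFrame columns:

import org.apache.spark.sql.DataFrame
import za.co.absa.spark.commons.sql.functions.col_of_path  // name assumed, not shown in this hunk

def selectNested(df: DataFrame): DataFrame =
  df.select(
    col_of_path("address.city"),      // nested struct field via dotted path
    col_of_path("phones[0]"),         // array element, index in brackets
    col_of_path("attributes[owner]")  // map value, key in brackets
  )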
