@@ -21,9 +21,6 @@ If you wish to use `docker compose`, perform the following steps before deployin

 The above will result in a new MarkLogic instance with a single node.

-Alternatively, if you would like to test against a 3-node MarkLogic cluster with a load balancer in front of it,
-run `docker compose -f docker-compose-3nodes.yaml up -d --build`.
-
 ## Deploying the test application

 To deploy the test application, first create `./gradle-local.properties` and add the following to it:
@@ -59,18 +56,18 @@ To configure the SonarQube service, perform the following steps:
 7. Click on "Use the global setting" and then "Create project".
 8. On the "Analysis Method" page, click on "Locally".
 9. In the "Provide a token" panel, click on "Generate". Copy the token.
-10. Add `systemProp.sonar.token=your token pasted here` to `gradle-local.properties` in the root of your project, creating
+10. Add `systemProp.sonar.login=your token pasted here` to `gradle-local.properties` in the root of your project, creating
 that file if it does not exist yet.

 To run SonarQube, run the following Gradle tasks using Java 17, which will run all the tests with code coverage and
 then generate a quality report with SonarQube:

     ./gradlew test sonar

-If you do not add `systemProp.sonar.token` to your `gradle-local.properties` file, you can specify the token via the
+If you do not add `systemProp.sonar.login` to your `gradle-local.properties` file, you can specify the token via the
 following:

-    ./gradlew test sonar -Dsonar.token=paste your token here
+    ./gradlew test sonar -Dsonar.login=paste your token here

 When that completes, you will see a line like this near the end of the logging:
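The token step in the hunk above amounts to appending a single line to `gradle-local.properties`. A minimal sketch, using the `sonar.login` property name this diff standardizes on (the token value is a placeholder):

```shell
# Append the SonarQube token property to gradle-local.properties,
# creating the file if it does not exist yet (">>" creates it on first use).
echo "systemProp.sonar.login=paste-your-token-here" >> gradle-local.properties
```

Since this file holds a credential, it is typically kept out of version control.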
@@ -87,25 +84,6 @@ You can also force Gradle to run `sonar` if any tests fail:

     ./gradlew clean test sonar --continue

-## Accessing MarkLogic logs in Grafana
-
-This project's `docker-compose-3nodes.yaml` file includes
-[Grafana, Loki, and promtail services](https://grafana.com/docs/loki/latest/clients/promtail/) for the primary reason of
-collecting MarkLogic log files and allowing them to be viewed and searched via Grafana.
-
-Once you have run `docker compose`, you can access Grafana at http://localhost:3000. Follow these instructions to
-access MarkLogic logging data:
-
-1. Click on the hamburger in the upper left hand corner and select "Explore", or simply go to
-http://localhost:3000/explore.
-2. Verify that "Loki" is the default data source - you should see it selected in the upper left hand corner below
-the "Home" link.
-3. Click on the "Select label" dropdown and choose `job`. Click on the "Select value" label for this filter and
-select `marklogic` as the value.
-4. Click on the blue "Run query" button in the upper right hand corner.
-
-You should now see logs from all 3 nodes in the MarkLogic cluster.
-
 # Testing with PySpark

 The documentation for this project
@@ -120,7 +98,7 @@ This will produce a single jar file for the connector in the `./build/libs` dire

 You can then launch PySpark with the connector available via:

-    pyspark --jars build/libs/marklogic-spark-connector-2.4-SNAPSHOT.jar
+    pyspark --jars marklogic-spark-connector/build/libs/marklogic-spark-connector-2.5-SNAPSHOT.jar

 The below command is an example of loading data from the test application deployed via the instructions at the top of
 this page.
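Since the jar now builds under the `marklogic-spark-connector` subproject and the version suffix changes over time, a glob avoids hard-coding the path. A sketch, assuming you launch from the repository root; the `echo` prints the command rather than running it:

```shell
# Pick up whatever connector jar "./gradlew clean shadowJar" produced,
# so the version suffix (e.g. 2.5-SNAPSHOT) need not be hard-coded.
JAR=$(ls marklogic-spark-connector/build/libs/marklogic-spark-connector-*.jar 2>/dev/null | head -n 1)
echo "pyspark --jars $JAR"
```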
@@ -164,7 +142,7 @@ For a quick test of writing documents, use the following:

 ```
-spark.read.option("header", True).csv("src/test/resources/data.csv")\
+spark.read.option("header", True).csv("marklogic-spark-connector/src/test/resources/data.csv")\
     .repartition(2)\
     .write.format("marklogic")\
     .option("spark.marklogic.client.uri", "spark-test-user:spark@localhost:8000")\
@@ -196,7 +174,7 @@ The Spark master GUI is at <http://localhost:8080>. You can use this to view det

 Now that you have a Spark cluster running, you just need to tell PySpark to connect to it:

-    pyspark --master spark://NYWHYC3G0W:7077 --jars build/libs/marklogic-spark-connector-2.4-SNAPSHOT.jar
+    pyspark --master spark://NYWHYC3G0W:7077 --jars marklogic-spark-connector/build/libs/marklogic-spark-connector-2.5-SNAPSHOT.jar

 You can then run the same commands as shown in the PySpark section above. The Spark master GUI will allow you to
 examine details of each of the commands that you run.
@@ -215,12 +193,16 @@ You will need the connector jar available, so run `./gradlew clean shadowJar` if
 You can then run a test Python program in this repository via the following (again, change the master address as
 needed); note that you run this outside of PySpark, and `spark-submit` is available after having installed PySpark:

-    spark-submit --master spark://NYWHYC3G0W:7077 --jars build/libs/marklogic-spark-connector-2.4-SNAPSHOT.jar src/test/python/test_program.py
+    spark-submit --master spark://NYWHYC3G0W:7077 --jars marklogic-spark-connector/build/libs/marklogic-spark-connector-2.5-SNAPSHOT.jar marklogic-spark-connector/src/test/python/test_program.py

 You can also test a Java program. To do so, first move the `com.marklogic.spark.TestProgram` class from `src/test/java`
-to `src/main/java`. Then run `./gradlew clean shadowJar` to rebuild the connector jar. Then run the following:
+to `src/main/java`. Then run the following:

-    spark-submit --master spark://NYWHYC3G0W:7077 --class com.marklogic.spark.TestProgram build/libs/marklogic-spark-connector-2.4-SNAPSHOT.jar
+```
+./gradlew clean shadowJar
+cd marklogic-spark-connector
+spark-submit --master spark://NYWHYC3G0W:7077 --class com.marklogic.spark.TestProgram build/libs/marklogic-spark-connector-2.5-SNAPSHOT.jar
+```

 Be sure to move `TestProgram` back to `src/test/java` when you are done.