Security issues in Charmed Apache Spark can be reported through [LaunchPad](https://wiki.ubuntu.com/DebuggingSecurity). Please do not file GitHub issues about security issues.
## Tutorial tests
The tutorial (`docs/tutorial/`) is tested end-to-end using [Spread](https://github.com/canonical/spread) inside a Multipass VM. Shell commands are extracted directly from the Markdown sources, so the tutorial itself is the test.
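As a rough illustration of that extraction step (the real extractor is part of the test suite; the `sed` one-liner and the page path below are assumptions for illustration only):

```shell
# Minimal sketch: print the commands found inside ```shell fences of one
# tutorial page. The path is a placeholder; the actual extractor in the
# test suite also handles annotations and multi-page ordering.
sed -n '/^```shell$/,/^```$/{/^```/d;p;}' docs/tutorial/page.md
```
<imports>
</imports>
<test>
tmp=$(mktemp -d)
cat > "$tmp/page.md" <<'EOF'
Some intro text.

```shell
echo hello
echo world
```

More text.
EOF
out=$(sed -n '/^```shell$/,/^```$/{/^```/d;p;}' "$tmp/page.md")
expected=$(printf 'echo hello\necho world')
[ "$out" = "$expected" ]
</test>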
Run the full tutorial suite (extract + run Spread tests):
```bash
tox -e tutorial
```
Generate only the test scripts from the Markdown tutorial pages, without running them (no VM needed):
```bash
tox -e tutorial-extract
```
Both commands must be run from the `python/` directory (or via `cd python && tox …`). See [python/tests/tutorial/TESTING.md](python/tests/tutorial/TESTING.md) for prerequisites, run modes, debug tips, and the full annotation reference.
## Contributing
Canonical welcomes contributions to Charmed Apache Spark. Please check out our [contribution guidelines](python/CONTRIBUTING.md) if you're interested in contributing to the solution. If you truly enjoy working on open-source projects like this one and you would like to be part of the OSS revolution, please don't forget to check out the [career opportunities](https://canonical.com/careers/all) we have at [Canonical](https://canonical.com/).
Wait for the commands to finish running and check the list of enabled add-ons:
```shell
microk8s status --wait-ready
```
The MicroK8s setup is complete.
For Apache Spark jobs to run on top of Kubernetes, a set of resources (a ServiceAccount, associated Roles, RoleBindings, etc.) needs to be created and configured.
To simplify this task, the Charmed Apache Spark solution offers the `spark-client` snap. Install the snap:
```shell
sudo snap install spark-client --channel 3.4/edge
```
Let's create a Kubernetes namespace to use as a playground in this tutorial.
```shell
kubectl create namespace spark
```
We will now create a ServiceAccount that will be used to run the Spark jobs. This can be done using the `spark-client` snap, which creates the necessary Roles, RoleBindings, and other configuration along with the ServiceAccount itself:
```shell
spark-client.service-account-registry create \
--username spark --namespace spark
```
This command does a number of things in the background.
These resources can be viewed with `kubectl get` commands as follows:
```shell
kubectl get serviceaccounts -n spark
kubectl get roles -n spark
kubectl get rolebindings -n spark
```

We'll use `juju` to deploy and manage the Spark History Server and a number of other components.
To install and configure a `juju` client using a snap:
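The snippet that followed is missing from this extract; a typical snap installation of the `juju` client looks like the following (exact channel and flags may differ from what the tutorial prescribes, so treat this as a sketch):

```shell
# Install the juju client from the snap store
# (add a --channel option here if the tutorial pins a specific Juju version)
sudo snap install juju
```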
At this stage, you may want to create a [snapshot](https://documentation.ubuntu.com/multipass/en/latest/reference/command-line-interface/snapshot/#snapshot) of the current state, for which you need to stop the Multipass VM. Exit the VM by pressing `CTRL + D` and stop it:
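The commands themselves are not in this extract; with Multipass they would look like the following (the VM name `charmed-spark` is a placeholder — substitute the name of your own instance):

```shell
# Stop the VM so a snapshot can be taken (VM name is a placeholder)
multipass stop charmed-spark
# Take a snapshot of the stopped instance
multipass snapshot charmed-spark
```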