
Commit 0ea395f

Merge pull request #48 from Mu-Sigma/develop
Corrected a mistake in the description regarding SparkR installation
2 parents: ab5cfd2 + 555096a

File tree

5 files changed: 5 additions (+5), 5 deletions (-5)


DESCRIPTION

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ Description: Enables data scientists to compose pipelines of analysis which cons
     Note - To enable pipelines involving Spark tasks, the package uses the 'SparkR' package.
     The SparkR package needs to be installed to use Spark as an engine within a pipeline. SparkR is distributed natively with Apache Spark and is not distributed on CRAN. The SparkR version needs to directly map to the Spark version (hence the native distribution), and care needs to be taken to ensure that this is configured properly.
     To install SparkR from Github, run the following command if you know the Spark version: 'devtools::install_github('apache/[email protected]', subdir='R/pkg')'.
-    The other option is to install R by running the following terminal commands if Spark has already been installed: '$ export SPARK_HOME=/path/to/spark/directory && cd $SPARK_HOME/R/lib/SparkR/ && R -e "devtools::install('.')"'.
+    The other option is to install SparkR by running the following terminal commands if Spark has already been installed: '$ export SPARK_HOME=/path/to/spark/directory && cd $SPARK_HOME/R/lib/SparkR/ && R -e "devtools::install('.')"'.
     Depends: R (>= 3.4.0), magrittr, pipeR, methods
     Imports: ggplot2, dplyr, futile.logger, RCurl, rlang (>= 0.3.0), proto, purrr, devtools
     Suggests: plotly, knitr, rmarkdown, parallel, visNetwork, rjson, DT, shiny, R.devices, corrplot, car, foreign
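The second install route in the corrected line above chains three steps on one line. A minimal sketch of the same route as a standalone script, assuming Spark is already unpacked locally — the `SPARK_HOME` path below is a placeholder, not a real location:

```shell
#!/bin/sh
# Sketch of the from-source SparkR install described in the DESCRIPTION above.
# SPARK_HOME is a placeholder -- point it at your actual Spark directory.
SPARK_HOME="/path/to/spark/directory"
export SPARK_HOME

# The SparkR sources ship inside the Spark distribution itself.
PKG_DIR="$SPARK_HOME/R/lib/SparkR"

if [ -d "$PKG_DIR" ]; then
  # Install the package from its local source tree.
  cd "$PKG_DIR" && R -e 'devtools::install(".")'
else
  echo "SparkR sources not found under $PKG_DIR" >&2
fi
```

Guarding on the directory keeps the script from failing half-way if `SPARK_HOME` is mis-set, which matters because the installed SparkR version must match the Spark version exactly.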

R/analysisPipelines_package.R

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@
 #' \itemize{
 #' \item devtools::install_github('apache/[email protected]', subdir='R/pkg')
 #' }
-#' The other option is to install R by running the following terminal commands if Spark has already been installed:
+#' The other option is to install SparkR by running the following terminal commands if Spark has already been installed:
 #' \itemize{
 #' \item $ export SPARK_HOME=/path/to/spark/directory
 #' \item $ cd $SPARK_HOME/R/lib/SparkR/

man/analysisPipelines.Rd

Lines changed: 1 addition & 1 deletion
Some generated files are not rendered by default.

vignettes/Analysis_pipelines_for_working_with_sparkR.Rmd

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@ To install from Github, run the following command, if you know the Spark version
 devtools::install_github('apache/[email protected]', subdir='R/pkg')
 ```
 
-The other option is to install R by running the following *terminal* commands if Spark has already been installed.
+The other option is to install SparkR by running the following *terminal* commands if Spark has already been installed.
 
 ```{bash eval = F}
 $ export SPARK_HOME=/path/to/spark/directory
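Because the SparkR version must map directly to the Spark version, it can help to confirm which Spark version a local distribution is before installing. A small sketch, assuming the distribution's standard top-level RELEASE file is present (prebuilt Spark distributions ship one; the `SPARK_HOME` default below is a placeholder):

```shell
#!/bin/sh
# Read the Spark version from the RELEASE file at the top of a binary
# Spark distribution (its first line looks like "Spark 2.4.0 built for ...").
# SPARK_HOME is a placeholder -- set it to your actual Spark directory.
SPARK_HOME="${SPARK_HOME:-/path/to/spark/directory}"

if [ -f "$SPARK_HOME/RELEASE" ]; then
  # Extract the second field of the first "Spark ..." line.
  version=$(awk '/^Spark / {print $2; exit}' "$SPARK_HOME/RELEASE")
  echo "Local Spark version: $version"
else
  echo "No RELEASE file under $SPARK_HOME; is this a Spark distribution?" >&2
fi
```

The printed version is the one to substitute into the `install_github` ref so the two stay in lockstep.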

vignettes/Streaming_pipelines_for_working_Apache_Spark_Structured_Streaming.Rmd

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ To install from Github, run the following command, if you know the Spark version
 devtools::install_github('apache/[email protected]', subdir='R/pkg')
 ```
 
-The other option is to install R by running the following *terminal* commands if Spark has already been installed.
+The other option is to install SparkR by running the following *terminal* commands if Spark has already been installed.
 
 ```{bash eval = F}
 $ export SPARK_HOME=/path/to/spark/directory

0 commit comments