
DOCSP-31186 Add sections for connections #243


Merged
Changes from 2 commits
13 changes: 6 additions & 7 deletions source/getting-started.txt
@@ -48,12 +48,13 @@ Getting Started
Integrations
------------

- You can integrate Spark with third-party platforms to use the {+connector-long+} in various external platforms.
+ The following sections describe some popular third-party platforms that you can
+ integrate Spark and the {+connector-long+} with.

Amazon EMR
~~~~~~~~~~

- Amazon EMR is a managed cluster platform that you can run big data frameworks such as Spark on. To install Spark on an EMR cluster, see
+ Amazon EMR is a managed cluster platform that you can use to run big data frameworks like Spark. To install Spark on an EMR cluster, see
`Getting Started with Amazon EMR <https://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-gs.html>`__ in the AWS documentation.
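
Once Spark is installed on a cluster, the connector is typically pulled in at launch time. The following is a hypothetical sketch, not part of this PR's docs: the connector version, Scala suffix, and connection URIs are assumptions you would replace with values matching your own Spark build and deployment.

```shell
# Sketch: launch the PySpark shell with the MongoDB Spark Connector
# fetched from Maven Central. The 10.4.0 version and _2.12 Scala suffix
# are assumptions; pick the coordinates matching your Spark/Scala build.
./bin/pyspark \
  --packages org.mongodb.spark:mongo-spark-connector_2.12:10.4.0 \
  --conf "spark.mongodb.read.connection.uri=mongodb://127.0.0.1/test.myCollection" \
  --conf "spark.mongodb.write.connection.uri=mongodb://127.0.0.1/test.myCollection"
```

The same `--packages` and `--conf` flags work with `spark-submit` and `spark-shell`, so a cluster such as EMR needs no separate connector install step.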

Databricks
@@ -65,12 +66,10 @@ see `MongoDB <https://docs.databricks.com/aws/en/connect/external-systems/mongod
Docker
~~~~~~

- Docker is an open-source platform that helps developers build, share, and run applications in containers. The following steps guide you through the process of connecting to a Docker container
- and integrating the {+connector-long+} in Docker.
+ Docker is an open-source platform that helps developers build, share, and run applications in containers.

- 1. To start Spark in a Docker container, see `Apache Spark <https://hub.docker.com/r/apache/spark#!>`__ in the Docker documentation and follow the steps provided.
- #. See `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__ to deploy Atlas on Docker.
- #. Select the appropriate language in the tabs under :ref:`Getting Started <pyspark-shell>` and follow the steps provided.
+ - To start Spark in a Docker container, see `Apache Spark <https://hub.docker.com/r/apache/spark#!>`__ in the Docker documentation and follow the steps provided.
+ - See `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__ to deploy Atlas on Docker.
Reviewer comment:
SG: 'Link text should begin with the purpose of the cross-reference'

Suggested change
- See `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__ to deploy Atlas on Docker.
- To learn how to deploy Atlas on Docker, see `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__.
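
The two Docker steps the diff links to can be sketched as shell commands. This is a hypothetical illustration, not content from the PR: the deployment name, image tag, and networking choice are assumptions, and both commands require the Atlas CLI and Docker to be installed locally.

```shell
# Sketch of the Docker integration steps; names and flags are assumptions.

# 1. Deploy a local Atlas instance in a Docker container with the Atlas CLI.
#    "local-spark" is a made-up deployment name; --force skips prompts.
atlas deployments setup local-spark --type local --force

# 2. Start Spark in a Docker container. Using the host network is one way
#    to let the Spark container reach the local Atlas deployment's port.
docker run -it --network host apache/spark /opt/spark/bin/pyspark
```

From the PySpark shell started in step 2, you would then follow the language-specific Getting Started steps to connect to the local deployment.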


Kubernetes
~~~~~~~~~~