
[Docs] Fix outdated docs for standalone cluster
This is now supported!

Author: andrewor14 <[email protected]>
Author: Andrew Or <[email protected]>

Closes apache#2461 from andrewor14/document-standalone-cluster and squashes the following commits:

85c8b9e [andrewor14] Wording change per Patrick
35e30ee [Andrew Or] Fix outdated docs for standalone cluster
andrewor14 committed Sep 19, 2014
1 parent 99b06b6 commit 8af2370
Showing 1 changed file with 4 additions and 2 deletions.
6 changes: 4 additions & 2 deletions docs/spark-standalone.md
@@ -248,8 +248,10 @@ You can also pass an option `--cores <numCores>` to control the number of cores
 
 The [`spark-submit` script](submitting-applications.html) provides the most straightforward way to
 submit a compiled Spark application to the cluster. For standalone clusters, Spark currently
-only supports deploying the driver inside the client process that is submitting the application
-(`client` deploy mode).
+supports two deploy modes. In `client` mode, the driver is launched in the same process as the
+client that submits the application. In `cluster` mode, however, the driver is launched from one
+of the Worker processes inside the cluster, and the client process exits as soon as it fulfills
+its responsibility of submitting the application without waiting for the application to finish.
 
 If your application is launched through Spark submit, then the application jar is automatically
 distributed to all worker nodes. For any additional jars that your application depends on, you
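For context on the passage added above, a minimal sketch of a `spark-submit` invocation using the newly documented `cluster` deploy mode against a standalone master; the application class, jar path, and master host below are placeholders, not part of this commit:

    # Placeholder class, jar path, and master host. --deploy-mode cluster asks the
    # standalone master to launch the driver on one of the cluster's Workers, so the
    # submitting client can exit once the application has been handed off.
    ./bin/spark-submit \
      --class org.example.MyApp \
      --master spark://<master-host>:7077 \
      --deploy-mode cluster \
      /path/to/my-app.jar

With `--deploy-mode client` (the default), the same command would instead run the driver inside the submitting process, as the old wording described.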

