[SPARK-12735] Consolidate & move spark-ec2 to AMPLab managed repository.
Author: Reynold Xin <[email protected]>

Closes apache#10673 from rxin/SPARK-12735.
rxin committed Jan 10, 2016
1 parent 3efd106 commit 5b0d544
Showing 14 changed files with 3 additions and 1,808 deletions.
1 change: 0 additions & 1 deletion .gitignore
@@ -60,7 +60,6 @@ dev/create-release/*final
 spark-*-bin-*.tgz
 unit-tests.log
 /lib/
-ec2/lib/
 rat-results.txt
 scalastyle.txt
 scalastyle-output.xml
3 changes: 0 additions & 3 deletions dev/create-release/release-tag.sh
@@ -64,9 +64,6 @@ git commit -a -m "Preparing Spark release $RELEASE_TAG"
 echo "Creating tag $RELEASE_TAG at the head of $GIT_BRANCH"
 git tag $RELEASE_TAG
 
-# TODO: It would be nice to do some verifications here
-# i.e. check whether ec2 scripts have the new version
-
 # Create next version
 $MVN versions:set -DnewVersion=$NEXT_VERSION | grep -v "no value" # silence logs
 git commit -a -m "Preparing development version $NEXT_VERSION"
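The TODO removed above asked for a post-tag sanity check that version strings across the tree (formerly including the ec2 scripts) match the release being tagged. A hypothetical sketch of such a check in Python; the function name, file list, and version strings here are illustrative assumptions, not part of Spark's release tooling:

```python
def files_missing_version(paths_to_text, version):
    """Return the paths whose contents do NOT contain the expected
    release version string."""
    return [path for path, text in paths_to_text.items()
            if version not in text]

# Hypothetical in-memory "files" standing in for a release checkout.
checkout = {
    "pom.xml": "<version>1.6.0</version>",
    "docs/_config.yml": "SPARK_VERSION: 1.6.0",
    "R/pkg/DESCRIPTION": "Version: 1.5.2",  # deliberately stale
}

print(files_missing_version(checkout, "1.6.0"))  # ['R/pkg/DESCRIPTION']
```

A real check would read the files from disk and fail the tagging script when the returned list is non-empty.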
1 change: 0 additions & 1 deletion dev/create-release/releaseutils.py
@@ -159,7 +159,6 @@ def get_commits(tag):
     "build": CORE_COMPONENT,
     "deploy": CORE_COMPONENT,
     "documentation": CORE_COMPONENT,
-    "ec2": "EC2",
     "examples": CORE_COMPONENT,
     "graphx": "GraphX",
     "input/output": CORE_COMPONENT,
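The entry removed above is part of a larger table in releaseutils.py that maps bracketed commit-title tags (e.g. [GRAPHX], [BUILD]) to component names. A minimal sketch of how such a lookup might work; the helper name, the `CORE_COMPONENT` value, and the trimmed-down table are assumptions for illustration, not Spark's exact code:

```python
import re

CORE_COMPONENT = "Spark Core"

# Hypothetical trimmed-down mapping; after this commit the real table
# no longer contains an "ec2" entry.
KNOWN_COMPONENTS = {
    "build": CORE_COMPONENT,
    "deploy": CORE_COMPONENT,
    "documentation": CORE_COMPONENT,
    "examples": CORE_COMPONENT,
    "graphx": "GraphX",
    "input/output": CORE_COMPONENT,
}

def components_from_title(title):
    """Extract bracketed tags like [GRAPHX] from a commit title and
    map each recognized tag to its component name."""
    tags = re.findall(r"\[([^\]]+)\]", title.lower())
    return [KNOWN_COMPONENTS[t] for t in tags if t in KNOWN_COMPONENTS]

print(components_from_title("[SPARK-12735][GRAPHX] Fix pregel bug"))  # ['GraphX']
```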
2 changes: 1 addition & 1 deletion dev/lint-python
@@ -19,7 +19,7 @@
 
 SCRIPT_DIR="$( cd "$( dirname "$0" )" && pwd )"
 SPARK_ROOT_DIR="$(dirname "$SCRIPT_DIR")"
-PATHS_TO_CHECK="./python/pyspark/ ./ec2/spark_ec2.py ./examples/src/main/python/ ./dev/sparktestsupport"
+PATHS_TO_CHECK="./python/pyspark/ ./examples/src/main/python/ ./dev/sparktestsupport"
 PATHS_TO_CHECK="$PATHS_TO_CHECK ./dev/run-tests.py ./python/run-tests.py ./dev/run-tests-jenkins.py"
 PEP8_REPORT_PATH="$SPARK_ROOT_DIR/dev/pep8-report.txt"
 PYLINT_REPORT_PATH="$SPARK_ROOT_DIR/dev/pylint-report.txt"
9 changes: 0 additions & 9 deletions dev/sparktestsupport/modules.py
@@ -406,15 +406,6 @@ def contains_file(self, filename):
     should_run_build_tests=True
 )
 
-ec2 = Module(
-    name="ec2",
-    dependencies=[],
-    source_file_regexes=[
-        "ec2/",
-    ]
-)
-
-
 yarn = Module(
     name="yarn",
     dependencies=[],
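The block deleted above removed the ec2 test module from dev/sparktestsupport/modules.py. Each Module there declares `source_file_regexes` so the test runner can decide, via `contains_file` (visible in the hunk header), which modules a changed file belongs to. A simplified sketch of that pattern; this stand-in class and the example filenames are illustrative, not Spark's actual implementation:

```python
import re

class Module:
    """Simplified stand-in for the Module class in
    dev/sparktestsupport/modules.py (illustrative only)."""
    def __init__(self, name, dependencies, source_file_regexes):
        self.name = name
        self.dependencies = dependencies
        self.source_file_regexes = source_file_regexes

    def contains_file(self, filename):
        # A file belongs to this module if it matches any declared regex.
        return any(re.match(p, filename) for p in self.source_file_regexes)

yarn = Module(name="yarn", dependencies=[], source_file_regexes=["yarn/"])

print(yarn.contains_file("yarn/src/main/scala/Client.scala"))  # True
print(yarn.contains_file("ec2/spark_ec2.py"))                  # False
```

With the ec2 module gone, changes under ec2/ no longer map to any module, which is the point of moving those scripts out of the repository.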
2 changes: 0 additions & 2 deletions docs/_layouts/global.html
@@ -98,8 +98,6 @@
 <li><a href="spark-standalone.html">Spark Standalone</a></li>
 <li><a href="running-on-mesos.html">Mesos</a></li>
 <li><a href="running-on-yarn.html">YARN</a></li>
-<li class="divider"></li>
-<li><a href="ec2-scripts.html">Amazon EC2</a></li>
 </ul>
 </li>
 
2 changes: 0 additions & 2 deletions docs/cluster-overview.md
@@ -53,8 +53,6 @@ The system currently supports three cluster managers:
   and service applications.
 * [Hadoop YARN](running-on-yarn.html) -- the resource manager in Hadoop 2.
 
-In addition, Spark's [EC2 launch scripts](ec2-scripts.html) make it easy to launch a standalone
-cluster on Amazon EC2.
 
 # Submitting Applications
 
192 changes: 0 additions & 192 deletions docs/ec2-scripts.md

This file was deleted.

5 changes: 2 additions & 3 deletions docs/index.md
@@ -64,7 +64,7 @@ To run Spark interactively in a R interpreter, use `bin/sparkR`:
     ./bin/sparkR --master local[2]
 
 Example applications are also provided in R. For example,
 
     ./bin/spark-submit examples/src/main/r/dataframe.R
 
 # Launching on a Cluster
@@ -73,7 +73,6 @@ The Spark [cluster mode overview](cluster-overview.html) explains the key concep
 Spark can run both by itself, or over several existing cluster managers. It currently provides several
 options for deployment:
 
-* [Amazon EC2](ec2-scripts.html): our EC2 scripts let you launch a cluster in about 5 minutes
 * [Standalone Deploy Mode](spark-standalone.html): simplest way to deploy Spark on a private cluster
 * [Apache Mesos](running-on-mesos.html)
 * [Hadoop YARN](running-on-yarn.html)
@@ -103,7 +102,7 @@ options for deployment:
 * [Cluster Overview](cluster-overview.html): overview of concepts and components when running on a cluster
 * [Submitting Applications](submitting-applications.html): packaging and deploying applications
 * Deployment modes:
-  * [Amazon EC2](ec2-scripts.html): scripts that let you launch a cluster on EC2 in about 5 minutes
+  * [Amazon EC2](https://github.com/amplab/spark-ec2): scripts that let you launch a cluster on EC2 in about 5 minutes
   * [Standalone Deploy Mode](spark-standalone.html): launch a standalone cluster quickly without a third-party cluster manager
   * [Mesos](running-on-mesos.html): deploy a private cluster using
     [Apache Mesos](http://mesos.apache.org)
4 changes: 0 additions & 4 deletions ec2/README

This file was deleted.

34 changes: 0 additions & 34 deletions ec2/deploy.generic/root/spark-ec2/ec2-variables.sh

This file was deleted.

25 changes: 0 additions & 25 deletions ec2/spark-ec2

This file was deleted.
