@@ -50,7 +50,7 @@ To create a Spark distribution like those distributed by the
to be runnable, use `./dev/make-distribution.sh` in the project root directory. It can be configured
with Maven profile settings and so on like the direct Maven build. Example:

-    ./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
+    ./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pmesos -Pyarn

For more information on usage, run `./dev/make-distribution.sh --help`

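As a rough sketch of how the amended command fits into a workflow (the `custom-spark` name is the one from the example above, and the `spark-*-bin-custom-spark.tgz` pattern assumes make-distribution.sh's default `spark-<version>-bin-<name>.tgz` naming):

    # Build a runnable distribution with the profiles shown in the example,
    # then look for the generated archive in the project root.
    ./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pmesos -Pyarn
    ls spark-*-bin-custom-spark.tgz
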
@@ -105,13 +105,17 @@ By default Spark will build with Hive 1.2.1 bindings.

## Packaging without Hadoop Dependencies for YARN

-The assembly directory produced by `mvn package` will, by default, include all of Spark's
-dependencies, including Hadoop and some of its ecosystem projects. On YARN deployments, this
-causes multiple versions of these to appear on executor classpaths: the version packaged in
+The assembly directory produced by `mvn package` will, by default, include all of Spark's
+dependencies, including Hadoop and some of its ecosystem projects. On YARN deployments, this
+causes multiple versions of these to appear on executor classpaths: the version packaged in
the Spark assembly and the version on each node, included with `yarn.application.classpath`.
-The `hadoop-provided` profile builds the assembly without including Hadoop-ecosystem projects,
+The `hadoop-provided` profile builds the assembly without including Hadoop-ecosystem projects,
like ZooKeeper and Hadoop itself.

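The `hadoop-provided` profile described above is not shown as a command here; a minimal sketch, assuming it is combined with the YARN profile (the exact profile combination is illustrative):

    # Build an assembly that leaves out Hadoop and its ecosystem projects (e.g. ZooKeeper),
    # relying instead on the versions the cluster provides via yarn.application.classpath.
    ./build/mvn -Pyarn -Phadoop-provided -DskipTests clean package
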
+## Building with Mesos support
+
+    ./build/mvn -Pmesos -DskipTests clean package
+
## Building for Scala 2.10
To produce a Spark package compiled with Scala 2.10, use the `-Dscala-2.10` property:

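As a minimal sketch of the Scala 2.10 build referred to above (the `./dev/change-scala-version.sh` step and the `-Pyarn` profile are assumptions, not part of this change):

    # Switch the build to Scala 2.10, then compile with the corresponding property.
    ./dev/change-scala-version.sh 2.10
    ./build/mvn -Pyarn -Dscala-2.10 -DskipTests clean package
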
@@ -263,17 +267,17 @@ The run-tests script also can be limited to a specific Python version or a speci

## Running R Tests

-To run the SparkR tests you will need to install the R package `testthat`
-(run `install.packages(testthat)` from R shell). You can run just the SparkR tests using
+To run the SparkR tests you will need to install the R package `testthat`
+(run `install.packages(testthat)` from R shell). You can run just the SparkR tests using
the command:

    ./R/run-tests.sh

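As a sketch of the `testthat` setup described above, run from a shell rather than an interactive R session (the CRAN mirror URL is an assumption; note that `install.packages` expects the package name as a quoted string):

    # Install testthat non-interactively, then run the SparkR tests.
    R -e 'install.packages("testthat", repos = "https://cloud.r-project.org")'
    ./R/run-tests.sh
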
## Running Docker-based Integration Test Suites

-In order to run Docker integration tests, you have to install the `docker` engine on your box.
-The instructions for installation can be found at [the Docker site](https://docs.docker.com/engine/installation/).
-Once installed, the `docker` service needs to be started, if not already running.
+In order to run Docker integration tests, you have to install the `docker` engine on your box.
+The instructions for installation can be found at [the Docker site](https://docs.docker.com/engine/installation/).
+Once installed, the `docker` service needs to be started, if not already running.
On Linux, this can be done by `sudo service docker start`.

    ./build/mvn install -DskipTests
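
Once the install step above has completed and the docker daemon is running, the Docker-based suites are typically run through their Maven profile; a sketch, assuming the `docker-integration-tests` profile and module name used by Spark around this time (check them against your checkout):

    # Run the Docker-based integration suites against the locally installed artifacts;
    # requires a running docker daemon.
    ./build/mvn test -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.11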