[HOTFIX] [ZEPPELIN-2286] Fix CI and split test matrix entries that often exceed the time limit (50min)

This PR optimizes the CI test matrix while keeping the same test coverage.

from
1. RAT (1~2min)
2. All modules with -Pbuild-distr flag, Spark 2.1 and Scala 2.11 (40~50min)
3. All modules with -Pbuild-distr flag, Spark 2.0 and Scala 2.11 (40~50min)
4. Spark 1.6 and Scala 2.10 (7~9min)
5. Spark 1.6 and Scala 2.11 (7~8min)
6. Selenium (20~23min)
7. python2 (6~7min)
8. python3 (6~7min)
Total: 128~156min

to
1. RAT (1~2min)
2. Core modules without interpreters (14~15min)
3. Selenium (20~23min)
4. All interpreters except for spark, livy (9~11min)
5. Spark 2.1 and Scala 2.11, livy (7~9min)
6. Spark 2.0 and Scala 2.11 (7~8min)
7. Spark 1.6 and Scala 2.10 (7~8min)
8. Spark 1.6 and Scala 2.11 (7~8min)
9. python2 (6~7min)
10. python3 (6~7min)
Total: 84~98min
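
Each entry above is a separate Travis job whose env line sets PROFILE, BUILD_FLAG, TEST_FLAG, MODULES and TEST_PROJECTS. Roughly, a job expands them into two Maven invocations like the sketch below; the build command matches the `install:` step visible in the diff, while the test command is an assumption about the script step, which is not part of this diff:

```sh
# install phase: build the selected modules (matches the install: step in .travis.yml)
mvn $BUILD_FLAG $MODULES $PROFILE -B

# script phase (sketch, not shown in this diff): run only the selected tests
mvn $TEST_FLAG $MODULES $PROFILE $TEST_PROJECTS -B
```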

Improvement | Hot fix.

* [x] - Optimize CI test matrix

https://issues.apache.org/jira/browse/ZEPPELIN-2286

CI green

* Do the license files need an update? no
* Are there breaking changes for older versions? no
* Does this need documentation? no

Author: Lee moon soo <[email protected]>

Closes apache#2162 from Leemoonsoo/split_ci_metrics and squashes the following commits:

08fb8ea [Lee moon soo] restore zeppelin-server/pom.xml
1b61b2c [Lee moon soo] adjust order of test considering travis scheduling
5234bfa [Lee moon soo] Livy 0.2 test does not work
0e3040a [Lee moon soo] remove explicit LIVY_VER
ec7af74 [Lee moon soo] add -DfailIfNoTests=false
957443f [Lee moon soo] try exclude test in different way
867b877 [Lee moon soo] set livy 0.3 explicitly
4ac3097 [Lee moon soo] other way to exclude spark from core module test
9958f78 [Lee moon soo] exclude spark test from core module test
04eebcb [Lee moon soo] fix profiles
39b7b65 [Lee moon soo] fix option
abe195a [Lee moon soo] add missing env
2e994a6 [Lee moon soo] fix travis.yml
abb54f9 [Lee moon soo] add test profile that test interpretes
08fcddc [Lee moon soo] try differnt way pass params
a80c94e [Lee moon soo] try differnt way set global env
57ffb38 [Lee moon soo] exclude interpreters does not reqruied by zeppelin-server integration test
05bf826 [Lee moon soo] Revert "assume spark interpreter may not exists in certain test metrics"
5d8d15c [Lee moon soo] include root pom in -pl
27da1cb [Lee moon soo] assume spark interpreter may not exists in certain test metrics
78784a8 [Lee moon soo] configure surefire plugin for zeppelin-server
939e0c7 [Lee moon soo] try set scala.version
d5340d0 [Lee moon soo] set fork count 1
76ee8fa [Lee moon soo] Define scala.binary.version
0654623 [Lee moon soo] Prevent download spark distribution when unnecessary
4c8ffd2 [Lee moon soo] Move out spark and livy test to separate test metrics

(cherry picked from commit 641863d)
Signed-off-by: Lee moon soo <[email protected]>
Leemoonsoo committed Mar 20, 2017
1 parent 9f11353 commit 607cdb5
1 changed file: .travis.yml (26 additions, 13 deletions)
@@ -33,39 +33,52 @@ addons:
packages:
- r-base-dev

+env:
+global:
+# Interpreters does not required by zeppelin-server integration tests
+- INTERPRETERS='!hbase,!pig,!jdbc,!file,!flink,!ignite,!kylin,!python,!lens,!cassandra,!elasticsearch,!bigquery,!alluxio,!scio,!livy'
+
matrix:
include:
# Test License compliance using RAT tool
- jdk: "oraclejdk7"
env: SCALA_VER="2.11" SPARK_VER="2.0.2" HADOOP_VER="2.6" PROFILE="-Prat" BUILD_FLAG="clean" TEST_FLAG="org.apache.rat:apache-rat-plugin:check" TEST_PROJECTS=""

-# Test all modules with spark 2.1.0 and scala 2.11
+# Test core modules
- jdk: "oraclejdk7"
-env: SCALA_VER="2.11" SPARK_VER="2.1.0" HADOOP_VER="2.6" PROFILE="-Pspark-2.1 -Phadoop-2.6 -Ppyspark -Psparkr -Pscalding -Phelium-dev -Pexamples -Pscala-2.11" BUILD_FLAG="package -Pbuild-distr -DskipRat" TEST_FLAG="verify -Pusing-packaged-distr -DskipRat" TEST_PROJECTS=""
+env: SCALA_VER="2.11" SPARK_VER="2.1.0" HADOOP_VER="2.6" PROFILE="-Pscalding -Phelium-dev -Pexamples -Pscala-2.11" BUILD_FLAG="package -Pbuild-distr -DskipRat" TEST_FLAG="verify -Pusing-packaged-distr -DskipRat" MODULES="-pl ${INTERPRETERS}" TEST_PROJECTS="-Dtest='!ZeppelinSparkClusterTest,!org.apache.zeppelin.spark.*' -DfailIfNoTests=false"

-# Test all modules with spark 2.0.2 and scala 2.11
+# Test selenium with spark module for 1.6.3
- jdk: "oraclejdk7"
-env: SCALA_VER="2.11" SPARK_VER="2.0.2" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Psparkr -Pscalding -Phelium-dev -Pexamples -Pscala-2.11" BUILD_FLAG="package -Pbuild-distr -DskipRat" TEST_FLAG="verify -Pusing-packaged-distr -DskipRat" TEST_PROJECTS=""
+env: TEST_SELENIUM="true" SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Phelium-dev -Pexamples" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.AbstractFunctionalSuite -DfailIfNoTests=false"

-# Test spark module for 1.6.3 with scala 2.10
+# Test interpreter modules
- jdk: "oraclejdk7"
-env: SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.10" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+env: SCALA_VER="2.10" PROFILE="-Pscalding" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl $(echo .,zeppelin-interpreter,${INTERPRETERS} | sed 's/!//g')" TEST_PROJECTS=""

-# Test spark module for 1.6.3 with scala 2.11
+# Test spark module for 2.1.0 with scala 2.11, livy
- jdk: "oraclejdk7"
-env: SCALA_VER="2.11" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.11 -Dscala.version=2.11.7 -Dscala.binary.version=2.11" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+env: SCALA_VER="2.11" SPARK_VER="2.1.0" HADOOP_VER="2.6" PROFILE="-Pspark-2.1 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.11" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark,livy" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.*,org.apache.zeppelin.livy.* -DfailIfNoTests=false"

-# Test selenium with spark module for 1.6.3
+# Test spark module for 2.0.2 with scala 2.11
- jdk: "oraclejdk7"
-env: TEST_SELENIUM="true" SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Phelium-dev -Pexamples" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.AbstractFunctionalSuite -DfailIfNoTests=false"
+env: SCALA_VER="2.11" SPARK_VER="2.0.2" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.11" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+
+# Test spark module for 1.6.3 with scala 2.10
+- jdk: "oraclejdk7"
+env: SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.10" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.*,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+
+# Test spark module for 1.6.3 with scala 2.11
+- jdk: "oraclejdk7"
+env: SCALA_VER="2.11" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.11" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"

# Test python/pyspark with python 2
- jdk: "oraclejdk7"
-env: PYTHON="2" SCALA_VER="2.10" SPARK_VER="1.6.1" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark" BUILD_FLAG="package -pl spark,python -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
+env: PYTHON="2" SCALA_VER="2.10" SPARK_VER="1.6.1" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark" BUILD_FLAG="package -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"

# Test python/pyspark with python 3
- jdk: "oraclejdk7"
-env: PYTHON="3" SCALA_VER="2.11" SPARK_VER="2.0.0" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Pscala-2.11" BUILD_FLAG="package -pl spark,python -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
+env: PYTHON="3" SCALA_VER="2.11" SPARK_VER="2.0.0" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Pscala-2.11" BUILD_FLAG="package -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"

# Test livy with spark 1.5.2 and hadoop 2.6
- jdk: "oraclejdk7"
@@ -84,7 +97,7 @@ install:
- mvn $BUILD_FLAG $MODULES $PROFILE -B

before_script:
-- travis_retry ./testing/downloadSpark.sh $SPARK_VER $HADOOP_VER
+- if [[ -n $SPARK_VER ]]; then travis_retry ./testing/downloadSpark.sh $SPARK_VER $HADOOP_VER; fi
- if [[ -n $LIVY_VER ]]; then ./testing/downloadLivy.sh $LIVY_VER; fi
- if [[ -n $LIVY_VER ]]; then export LIVY_HOME=`pwd`/livy-server-$LIVY_VER; fi
- if [[ -n $LIVY_VER ]]; then export SPARK_HOME=`pwd`/spark-$SPARK_VER-bin-hadoop$HADOOP_VER; fi
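
A note on the INTERPRETERS variable introduced in this diff: Maven (3.2.1 and later) lets `-pl` exclude modules when they are prefixed with `!`, so the core-modules job passes the list as-is to leave the interpreters out, while the interpreter-only job strips the `!` with sed so the same list becomes an inclusion list. A minimal sketch, with the module list abbreviated for illustration:

```sh
INTERPRETERS='!hbase,!pig,!livy'   # abbreviated; the full list is in the diff above

# Core-modules job: build the reactor without these modules
mvn package -pl "${INTERPRETERS}" -B

# Interpreter job: dropping the '!' makes the same list select exactly these modules
mvn test -pl "$(echo .,zeppelin-interpreter,${INTERPRETERS} | sed 's/!//g')" -B
```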
