
Commit 78d5d4d

Authored by HyukjinKwon, committed by shivaram on Sep 8, 2016
[SPARK-17200][PROJECT INFRA][BUILD][SPARKR] Automate building and testing on Windows (currently SparkR only)
## What changes were proposed in this pull request?

This PR adds build automation on Windows with the [AppVeyor](https://www.appveyor.com/) CI tool. Currently, it only runs the tests for SparkR, as we have been having some issues with testing Windows-specific PRs (e.g. apache#14743 and apache#13165) and a hard time verifying them. One concern is that this build depends on [steveloughran/winutils](https://github.com/steveloughran/winutils) (maintained by a Hadoop PMC member) for a pre-built Hadoop bin package.

## How was this patch tested?

Manually: https://ci.appveyor.com/project/HyukjinKwon/spark/build/88-SPARK-17200-build-profile

This takes roughly 40 minutes. Some tests already fail, as noted in apache#14743 (comment).

Author: hyukjinkwon <[email protected]>

Closes apache#14859 from HyukjinKwon/SPARK-17200-build.
1 parent f0d21b7 commit 78d5d4d

File tree: 3 files changed (+350, −0 lines)
 

appveyor.yml

+56 lines

```yaml
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

version: "{build}-{branch}"

shallow_clone: true

platform: x64
configuration: Debug

branches:
  only:
    - master

only_commits:
  files:
    - R/

cache:
  - C:\Users\appveyor\.m2

install:
  # Install maven and dependencies
  - ps: .\dev\appveyor-install-dependencies.ps1
  # Required package for R unit tests
  - cmd: R -e "install.packages('testthat', repos='http://cran.us.r-project.org')"
  - cmd: R -e "packageVersion('testthat')"
  - cmd: R -e "install.packages('e1071', repos='http://cran.us.r-project.org')"
  - cmd: R -e "packageVersion('e1071')"
  - cmd: R -e "install.packages('survival', repos='http://cran.us.r-project.org')"
  - cmd: R -e "packageVersion('survival')"

build_script:
  - cmd: mvn -DskipTests -Phadoop-2.6 -Psparkr -Phive -Phive-thriftserver package

test_script:
  - cmd: .\bin\spark-submit2.cmd --conf spark.hadoop.fs.default.name="file:///" R\pkg\tests\run-all.R

notifications:
  - provider: Email
    on_build_success: false
    on_build_failure: false
    on_build_status_changed: false
```
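The `notifications` section in this file silences all e-mail. As a hedged sketch (the recipient address below is hypothetical, and the field names follow AppVeyor's Email provider settings), failure-only notifications would look roughly like:

```yaml
notifications:
  - provider: Email
    to:
      - [email protected]       # hypothetical recipient
    on_build_success: false        # stay quiet on green builds
    on_build_failure: true         # mail when a build breaks
    on_build_status_changed: true  # and when the status flips back
```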

dev/appveyor-guide.md

+168 lines

# AppVeyor Guides

Currently, SparkR on Windows is tested with [AppVeyor](https://ci.appveyor.com). This page describes how to set up AppVeyor for Spark and how to run, check the status of, and stop the build via this tool. The AppVeyor documentation is available [here](https://www.appveyor.com/docs); please refer to it for full details.

### Setting up AppVeyor

#### Sign up for AppVeyor

- Go to https://ci.appveyor.com, and then click "SIGN UP FOR FREE".

<img width="196" alt="2016-09-04 11 07 48" src="https://cloud.githubusercontent.com/assets/6477701/18228809/2c923aa4-7299-11e6-91b4-f39eff5727ba.png">

- As Apache Spark is an open-source project, click "FREE - for open-source projects".

<img width="379" alt="2016-09-04 11 07 58" src="https://cloud.githubusercontent.com/assets/6477701/18228810/2f674e5e-7299-11e6-929d-5c2dff269ddc.png">

- Click "Github".

<img width="360" alt="2016-09-04 11 08 10" src="https://cloud.githubusercontent.com/assets/6477701/18228811/344263a0-7299-11e6-90b7-9b1c7b6b8b01.png">

#### After signing up, go to your profile to link GitHub and AppVeyor

- Click your account and then click "Profile".

<img width="204" alt="2016-09-04 11 09 43" src="https://cloud.githubusercontent.com/assets/6477701/18228803/12a4b810-7299-11e6-9140-5cfc277297b1.png">

- Enable the link with GitHub by clicking "Link Github account".

<img width="256" alt="2016-09-04 11 09 52" src="https://cloud.githubusercontent.com/assets/6477701/18228808/23861584-7299-11e6-9352-640a9c747c83.png">

- Click "Authorize application" on the GitHub site.

<img width="491" alt="2016-09-04 11 10 05" src="https://cloud.githubusercontent.com/assets/6477701/18228814/5cc239e0-7299-11e6-8aeb-71305e22d930.png">

#### Add Spark as a project to enable the builds

- Go to the PROJECTS menu.

<img width="97" alt="2016-08-30 12 16 31" src="https://cloud.githubusercontent.com/assets/6477701/18075017/2e572ffc-6eac-11e6-8e72-1531c81717a0.png">

- Click "NEW PROJECT" to add Spark.

<img width="144" alt="2016-08-30 12 16 35" src="https://cloud.githubusercontent.com/assets/6477701/18075026/3ee57bc6-6eac-11e6-826e-5dd09aeb0e7c.png">

- Since we will use GitHub here, click the "GITHUB" button and then click "Authorize Github" so that AppVeyor can access the GitHub logs (e.g. commits).

<img width="517" alt="2016-09-04 11 10 22" src="https://cloud.githubusercontent.com/assets/6477701/18228819/9a4d5722-7299-11e6-900c-c5ff6b0450b1.png">

- Click "Authorize application" on GitHub (the above step will pop up this page).

<img width="484" alt="2016-09-04 11 10 27" src="https://cloud.githubusercontent.com/assets/6477701/18228820/a7cfce02-7299-11e6-8ec0-1dd7807eecb7.png">

- Come back to https://ci.appveyor.com/projects/new and then add "spark".

<img width="738" alt="2016-09-04 11 10 36" src="https://cloud.githubusercontent.com/assets/6477701/18228821/b4b35918-7299-11e6-968d-233f18bc2cc7.png">

#### Check that the events that are supposed to run the build actually trigger it

- Click the "PROJECTS" menu.

<img width="97" alt="2016-08-30 12 16 31" src="https://cloud.githubusercontent.com/assets/6477701/18075017/2e572ffc-6eac-11e6-8e72-1531c81717a0.png">

- Click the Spark project.

<img width="707" alt="2016-09-04 11 22 37" src="https://cloud.githubusercontent.com/assets/6477701/18228828/5174cad4-729a-11e6-8737-bb7b9e0703c8.png">

### Checking the status, restarting and stopping the build

- Click the "PROJECTS" menu.

<img width="97" alt="2016-08-30 12 16 31" src="https://cloud.githubusercontent.com/assets/6477701/18075017/2e572ffc-6eac-11e6-8e72-1531c81717a0.png">

- Locate "spark" and click it.

<img width="707" alt="2016-09-04 11 22 37" src="https://cloud.githubusercontent.com/assets/6477701/18228828/5174cad4-729a-11e6-8737-bb7b9e0703c8.png">

- Here, we can check the status of the current build. "HISTORY" shows the past build history.

<img width="709" alt="2016-09-04 11 23 24" src="https://cloud.githubusercontent.com/assets/6477701/18228825/01b4763e-729a-11e6-8486-1429a88d2bdd.png">

- If the build is stopped, the "RE-BUILD COMMIT" button appears. Click this button to restart the build.

<img width="176" alt="2016-08-30 12 29 41" src="https://cloud.githubusercontent.com/assets/6477701/18075336/de618b52-6eae-11e6-8f01-e4ce48963087.png">

- If the build is running, the "CANCEL BUILD" button appears. Click this button to cancel the current build.

<img width="158" alt="2016-08-30 1 11 13" src="https://cloud.githubusercontent.com/assets/6477701/18075806/4de68564-6eb3-11e6-855b-ee22918767f9.png">

### Specifying the branch for building and setting the build schedule

Note: It seems the configurations in the UI and in `appveyor.yml` are mutually exclusive, according to the [documentation](https://www.appveyor.com/docs/build-configuration/#configuring-build).

- Click the settings button on the right.

<img width="1010" alt="2016-08-30 1 19 12" src="https://cloud.githubusercontent.com/assets/6477701/18075954/65d1aefa-6eb4-11e6-9a45-b9a9295f5085.png">

- Set the default branch to build as above.

<img width="422" alt="2016-08-30 12 42 25" src="https://cloud.githubusercontent.com/assets/6477701/18075416/8fac36c8-6eaf-11e6-9262-797a2a66fec4.png">

- Specify the branch in order to exclude builds on other branches.

<img width="358" alt="2016-08-30 12 42 33" src="https://cloud.githubusercontent.com/assets/6477701/18075421/97b17734-6eaf-11e6-8b19-bc1dca840c96.png">

- Set a Crontab expression to start the build regularly. AppVeyor uses the Crontab expression syntax of [atifaziz/NCrontab](https://github.com/atifaziz/NCrontab/wiki/Crontab-Expression); please refer to the examples [here](https://github.com/atifaziz/NCrontab/wiki/Crontab-Examples).

<img width="471" alt="2016-08-30 12 42 43" src="https://cloud.githubusercontent.com/assets/6477701/18075450/d4ef256a-6eaf-11e6-8e41-74e38dac8ca0.png">
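For illustration, NCrontab uses the standard five-field crontab syntax (minute, hour, day-of-month, month, day-of-week), so plausible schedules might look like the following; these examples are ours, not part of the Spark setup:

```
0 4 * * *     # once a day at 04:00
0 */6 * * *   # every six hours, on the hour
0 4 * * 1-5   # at 04:00 on weekdays only
```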
### Filtering commits and Pull Requests

Currently, AppVeyor is only used for SparkR, so the build is only triggered when R code is changed.

This is specified in `appveyor.yml` as below:

```
only_commits:
  files:
    - R/
```

Please refer to https://www.appveyor.com/docs/how-to/filtering-commits for more details.
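As a hypothetical extension (not part of the committed configuration), the same filter could also watch the CI files themselves, so that changes to the build scripts re-trigger the SparkR build:

```
only_commits:
  files:
    - R/
    - appveyor.yml
    - dev/appveyor-install-dependencies.ps1
```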
### Checking the full log of the build

Currently, the console in AppVeyor does not print full details; these have to be checked manually. For example, AppVeyor shows failed tests in the console as below:

```
Failed -------------------------------------------------------------------------
1. Error: union on two RDDs (@test_binary_function.R#38) -----------------------
1: textFile(sc, fileName) at C:/projects/spark/R/lib/SparkR/tests/testthat/test_binary_function.R:38
2: callJMethod(sc, "textFile", path, getMinPartitions(sc, minPartitions))
3: invokeJava(isStatic = FALSE, objId$id, methodName, ...)
4: stop(readString(conn))
```

After downloading the log by clicking the log button as below:

![2016-09-08 11 37 17](https://cloud.githubusercontent.com/assets/6477701/18335227/b07d0782-75b8-11e6-94da-1b88cd2a2402.png)

the details (e.g. exceptions) can be checked as below:

```
Failed -------------------------------------------------------------------------
1. Error: spark.lda with text input (@test_mllib.R#655) ------------------------
org.apache.spark.sql.AnalysisException: Path does not exist: file:/C:/projects/spark/R/lib/SparkR/tests/testthat/data/mllib/sample_lda_data.txt;
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$12.apply(DataSource.scala:376)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$12.apply(DataSource.scala:365)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    ...

1: read.text("data/mllib/sample_lda_data.txt") at C:/projects/spark/R/lib/SparkR/tests/testthat/test_mllib.R:655
2: dispatchFunc("read.text(path)", x, ...)
3: f(x, ...)
4: callJMethod(read, "text", paths)
5: invokeJava(isStatic = FALSE, objId$id, methodName, ...)
6: stop(readString(conn))
```

dev/appveyor-install-dependencies.ps1

+126 lines

```powershell
<#
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
#>

$CRAN = "https://cloud.r-project.org"

Function InstallR {
  if ( -not(Test-Path Env:\R_ARCH) ) {
    $arch = "i386"
  }
  Else {
    $arch = $env:R_ARCH
  }

  $urlPath = ""
  $latestVer = $(ConvertFrom-JSON $(Invoke-WebRequest http://rversions.r-pkg.org/r-release).Content).version
  If ($rVer -ne $latestVer) {
    $urlPath = ("old/" + $rVer + "/")
  }

  $rurl = $CRAN + "/bin/windows/base/" + $urlPath + "R-" + $rVer + "-win.exe"

  # Downloading R
  Start-FileDownload $rurl "R-win.exe"

  # Running R installer
  Start-Process -FilePath .\R-win.exe -ArgumentList "/VERYSILENT /DIR=C:\R" -NoNewWindow -Wait

  $RDrive = "C:"
  echo "R is now available on drive $RDrive"

  $env:PATH = $RDrive + '\R\bin\' + $arch + ';' + 'C:\MinGW\msys\1.0\bin;' + $env:PATH

  # Testing R installation
  Rscript -e "sessionInfo()"
}

Function InstallRtools {
  $rtoolsver = $rToolsVer.Split('.')[0..1] -Join ''
  $rtoolsurl = $CRAN + "/bin/windows/Rtools/Rtools$rtoolsver.exe"

  # Downloading Rtools
  Start-FileDownload $rtoolsurl "Rtools-current.exe"

  # Running Rtools installer
  Start-Process -FilePath .\Rtools-current.exe -ArgumentList /VERYSILENT -NoNewWindow -Wait

  $RtoolsDrive = "C:"
  echo "Rtools is now available on drive $RtoolsDrive"

  if ( -not(Test-Path Env:\GCC_PATH) ) {
    $gccPath = "gcc-4.6.3"
  }
  Else {
    $gccPath = $env:GCC_PATH
  }
  $env:PATH = $RtoolsDrive + '\Rtools\bin;' + $RtoolsDrive + '\Rtools\MinGW\bin;' + $RtoolsDrive + '\Rtools\' + $gccPath + '\bin;' + $env:PATH
  $env:BINPREF = $RtoolsDrive + '/Rtools/mingw_$(WIN)/bin/'
}

# create tools directory outside of Spark directory
$up = (Get-Item -Path ".." -Verbose).FullName
$tools = "$up\tools"
if (!(Test-Path $tools)) {
  New-Item -ItemType Directory -Force -Path $tools | Out-Null
}

# ========================== Maven
Push-Location $tools

$mavenVer = "3.3.9"
Start-FileDownload "https://archive.apache.org/dist/maven/maven-3/$mavenVer/binaries/apache-maven-$mavenVer-bin.zip" "maven.zip"

# extract
Invoke-Expression "7z.exe x maven.zip"

# add maven to environment variables
$env:Path += ";$tools\apache-maven-$mavenVer\bin"
$env:M2_HOME = "$tools\apache-maven-$mavenVer"
$env:MAVEN_OPTS = "-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

Pop-Location

# ========================== Hadoop bin package
$hadoopVer = "2.6.0"
$hadoopPath = "$tools\hadoop"
if (!(Test-Path $hadoopPath)) {
  New-Item -ItemType Directory -Force -Path $hadoopPath | Out-Null
}
Push-Location $hadoopPath

Start-FileDownload "https://github.com/steveloughran/winutils/archive/master.zip" "winutils-master.zip"

# extract
Invoke-Expression "7z.exe x winutils-master.zip"

# add hadoop bin to environment variables
$env:HADOOP_HOME = "$hadoopPath/winutils-master/hadoop-$hadoopVer"

Pop-Location

# ========================== R
$rVer = "3.3.1"
$rToolsVer = "3.4.0"

InstallR
InstallRtools

$env:R_LIBS_USER = 'c:\RLibrary'
if ( -not(Test-Path $env:R_LIBS_USER) ) {
  mkdir $env:R_LIBS_USER
}
```
