[SPARK-6806] [SPARKR] [DOCS] Fill in SparkR examples in programming guide

sqlCtx -> sqlContext
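The renamed handle is what the filled-in guide examples use. As a minimal sketch of that style (not taken from this diff; `sparkR.init`, `sparkRSQL.init`, and `createDataFrame` are assumed from the Spark 1.4-era SparkR API):

```
# Hedged sketch of a guide-style SparkR snippet using the renamed `sqlContext`
# handle (the API calls below are assumptions based on 1.4-era SparkR).
library(SparkR)

sc <- sparkR.init(master = "local[2]", appName = "SparkR-example")
sqlContext <- sparkRSQL.init(sc)

# Build a DataFrame from a local R data.frame and inspect it
df <- createDataFrame(sqlContext, faithful)
printSchema(df)
head(df)

sparkR.stop()
```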

You can check the docs by:

```
$ cd docs
$ SKIP_SCALADOC=1 jekyll serve
```
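By default `jekyll serve` renders the generated site at http://localhost:4000, so the updated SparkR pages can be reviewed there in a browser.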
cc shivaram

Author: Davies Liu <[email protected]>

Closes apache#5442 from davies/r_docs and squashes the following commits:

7a12ec6 [Davies Liu] remove rdd in R docs
8496b26 [Davies Liu] remove the docs related to RDD
e23b9d6 [Davies Liu] delete R docs for RDD API
222e4ff [Davies Liu] Merge branch 'master' into r_docs
89684ce [Davies Liu] Merge branch 'r_docs' of github.com:davies/spark into r_docs
f0a10e1 [Davies Liu] address comments from @shivaram
f61de71 [Davies Liu] Update pairRDD.R
3ef7cf3 [Davies Liu] use + instead of function(a,b) a+b
2f10a77 [Davies Liu] address comments from @cafreeman
9c2a062 [Davies Liu] mention R api together with Python API
23f751a [Davies Liu] Fill in SparkR examples in programming guide
Davies Liu authored and shivaram committed May 23, 2015
1 parent 4583cf4 commit 7af3818
Showing 14 changed files with 706 additions and 323 deletions.
4 changes: 2 additions & 2 deletions R/README.md
@@ -52,7 +52,7 @@ The SparkR documentation (Rd files and HTML files) are not a part of the source
SparkR comes with several sample programs in the `examples/src/main/r` directory.
To run one of them, use `./bin/sparkR <filename> <args>`. For example:

-./bin/sparkR examples/src/main/r/pi.R local[2]
+./bin/sparkR examples/src/main/r/dataframe.R

You can also run the unit-tests for SparkR by running (you need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first):

@@ -63,5 +63,5 @@ You can also run the unit-tests for SparkR by running (you need to install the [
The `./bin/spark-submit` and `./bin/sparkR` can also be used to submit jobs to YARN clusters. You will need to set YARN conf dir before doing so. For example on CDH you can run
```
export YARN_CONF_DIR=/etc/hadoop/conf
-./bin/spark-submit --master yarn examples/src/main/r/pi.R 4
+./bin/spark-submit --master yarn examples/src/main/r/dataframe.R
```
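The unit-test invocation itself sits in the part of R/README.md collapsed between the two hunks above; as a minimal sketch, installing the prerequisite testthat package from an R session could look like this (the CRAN mirror URL is an assumption):

```
# Hedged sketch: install the testthat package required by the SparkR unit tests
# (the repos URL is an assumption, not taken from the diff)
install.packages("testthat", repos = "http://cran.r-project.org")
```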