This repo contains notebook exercises for a workshop that teaches best practices for using Spark, aimed at practicing data scientists and framed around a data scientist's standard workflow. By using Spark's Python and R APIs to present practical applications, the workshop lowers the barrier to entry and makes the technology far more accessible.
Prior experience with Python and the scientific Python stack is beneficial, as is familiarity with data science models and applications. This is not an introduction to machine learning or data science; rather, it is a course for people who are already proficient in these methods at a small scale and want to learn how to apply that knowledge in a distributed setting with Spark.
- Install IRkernel

# Install IRkernel and its dependencies, then register the kernel with Jupyter
install.packages(c('rzmq','repr','IRkernel','IRdisplay'), repos = c('http://irkernel.github.io/', getOption('repos')))
IRkernel::installspec()
# Example: Set this to where Spark is installed
Sys.setenv(SPARK_HOME="/Users/[username]/spark")
# Add SparkR, shipped inside the Spark installation, to R's library search path
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
# If these two lines work, you are all set
library(SparkR)
sc <- sparkR.init(master="local")
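For the Python side of the workshop, a comparable sanity check is sketched below. It assumes the `pyspark` package is importable, e.g. because `$SPARK_HOME/python` is on your `PYTHONPATH` (or, on newer setups, because you ran `pip install pyspark`):

```python
from pyspark import SparkContext

# If these two lines work, the Python side is all set
# (the analogue of sparkR.init above).
sc = SparkContext(master="local", appName="sanity-check")
print(sc.parallelize(range(10)).sum())  # expect 45
```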
Q: How can I find out all the methods that are available on a DataFrame?

- In the IPython console, type `sales.[TAB]`. Autocomplete will show you all the methods that are available.
- To find more information about a specific method, say `.cov`, type `help(sales.cov)`. This will display the API documentation for that method.
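The same introspection can be done programmatically. Below is a minimal, self-contained sketch; the toy `sales` DataFrame and the `SparkSession` builder are illustrative assumptions (on the Spark 1.x releases this workshop targeted, you would build DataFrames from a `SQLContext` instead), while `dir()` and `help()` are plain Python:

```python
from pyspark.sql import SparkSession

# Start (or reuse) a local Spark session; `sales` is a stand-in for
# whatever DataFrame you are working with in the exercises.
spark = SparkSession.builder.master("local[*]").getOrCreate()
sales = spark.createDataFrame(
    [("2015-01-01", 100.0), ("2015-01-02", 125.5)],
    ["date", "amount"],
)

# The programmatic equivalent of `sales.[TAB]` in IPython.
print([m for m in dir(sales) if not m.startswith("_")])

# Show the API documentation for one specific method.
help(sales.cov)
```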
Q: How can I find out more about Spark's Python API, MLlib, GraphX, Spark Streaming, or deploying Spark to EC2?

In the official Spark documentation (https://spark.apache.org/docs/latest/), navigate using the tabs to the following areas in particular:

- Programming Guide > Quick Start, Spark Programming Guide, Spark Streaming, DataFrames and SQL, MLlib, GraphX, SparkR.
- Deploying > Overview, Submitting Applications, Spark Standalone, YARN, Amazon EC2.
- More > Configuration, Monitoring, Tuning Guide.
- Why CPUs aren't getting any faster
- Hadoop: A Brief History
- The State of Spark: And where we are going next
- The Apache Software Foundation Announces Apache Spark as a Top-Level Project: https://blogs.apache.org/foundation/entry/the_apache_software_foundation_announces50
- Distributed Systems for Fun and Profit
- Resilience Engineering: Learning to Embrace Failure
- Chaos Monkey
- Tuning and Debugging in Apache Spark
- `reduceByKey` vs `groupByKey` (a short sketch of the difference follows this list)
- Advanced Spark
- What's the difference between `cache()` and `persist()`? (see the sketch after this list)
- Monitoring and Instrumentation
- https://plot.ly/ipython-notebooks/apache-spark/
- https://plot.ly/python/ipython-notebooks/
- https://plot.ly/python/matplotlib-to-plotly-tutorial/#6.1-Matplotlib-to-Plotly-conversion-basics
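As a companion to the `reduceByKey` vs `groupByKey` link above, here is a minimal PySpark sketch of the difference; the word-count-style data is made up for illustration. `reduceByKey` combines values per key on each partition before the shuffle, while `groupByKey` ships every value across the network first:

```python
from pyspark import SparkContext

sc = SparkContext(master="local[*]", appName="reduce-vs-group")
pairs = sc.parallelize([("a", 1), ("b", 1), ("a", 1), ("a", 1)])

# reduceByKey: values are combined map-side (per partition) before
# the shuffle, so far less data moves across the network.
counts = pairs.reduceByKey(lambda x, y: x + y)

# groupByKey: every (key, value) pair is shuffled, and the sum only
# happens after all values for a key reach a single node.
counts_slow = pairs.groupByKey().mapValues(sum)

print(counts.collect())       # e.g. [('a', 3), ('b', 1)]
print(counts_slow.collect())  # same result, more shuffle traffic
```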
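Likewise for the `cache()` vs `persist()` item: on RDDs, `cache()` is simply shorthand for `persist()` with the default storage level (`MEMORY_ONLY`), whereas `persist()` lets you choose the level explicitly. A minimal sketch:

```python
from pyspark import SparkContext, StorageLevel

sc = SparkContext(master="local[*]", appName="cache-vs-persist")

# cache() is shorthand for persist(StorageLevel.MEMORY_ONLY) on RDDs.
rdd = sc.parallelize(range(1000)).cache()

# persist() lets you pick the storage level, e.g. spilling partitions
# that don't fit in memory to disk instead of recomputing them.
rdd2 = sc.parallelize(range(1000)).persist(StorageLevel.MEMORY_AND_DISK)
```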
- Learning Spark: Lightning-Fast Big Data Analytics
  By Holden Karau, Andy Konwinski, Patrick Wendell, Matei Zaharia
  Publisher: O'Reilly Media, June 2014
  http://shop.oreilly.com/product/0636920028512.do
  Introduction to Spark APIs and underlying concepts.

- Spark Knowledge Base
  By Databricks, Vida Ha, Pat McDonough
  Publisher: Databricks
  http://databricks.gitbooks.io/databricks-spark-knowledge-base
  Spark tips, tricks, and recipes.

- Spark Reference Applications
  By Databricks, Vida Ha, Pat McDonough
  Publisher: Databricks
  http://databricks.gitbooks.io/databricks-spark-reference-applications
  Best practices for large-scale Spark application architecture. Topics include import, export, machine learning, streaming.

- Scala for the Impatient
  By Cay S. Horstmann
  Publisher: Addison-Wesley Professional, March 2012
  http://www.amazon.com/Scala-Impatient-Cay-S-Horstmann/dp/0321774094
  Concise, to the point, and contains good practical tips on using Scala.
- Spark Internals
  By Matei Zaharia (Databricks)
  https://www.youtube.com/watch?v=49Hr5xZyTEA

- Spark on YARN
  By Sandy Ryza (Cloudera)
  https://www.youtube.com/watch?v=N6pJhxCPe-Y

- Spark Programming
  By Pat McDonough (Databricks)
  https://www.youtube.com/watch?v=mHF3UPqLOL8
- Community
  https://spark.apache.org/community.html
  Spark's community page lists meetups, mailing lists, and upcoming Spark conferences.

- Meetups
  http://spark.meetup.com/
  Spark has meetups in the Bay Area, NYC, Seattle, and most major cities around the world.

- Mailing Lists
  https://spark.apache.org/community.html
  The user mailing list covers issues and best practices around using Spark; the dev mailing list is for people who want to contribute to Spark.