The Apache Spark - Essentials training course provides attendees with a solid technical introduction to the Spark architecture and how Spark works. Attendees learn the basic building blocks of Spark, including RDDs and the distributed compute engine, as well as higher-level constructs that provide a simpler and more capable interface, including Spark SQL and DataFrames. This course also covers more advanced capabilities such as the use of Spark Streaming to process streaming data, and provides an overview of Spark Graph Processing (GraphX and GraphFrames) and Spark Machine Learning (SparkML Pipelines). Finally, the class explores possible performance issues, troubleshooting, cluster deployment techniques, and strategies for optimization.
The Apache Spark - Advanced training course teaches attendees advanced Spark skills. Attendees discover how to integrate Spark with Cassandra, cluster data workflows, measure performance, and more.
By attending the Apache Spark - Essentials workshop, attendees will learn to:
- Understand the need for Spark in data processing
- Understand the Spark architecture and how it distributes computations to cluster nodes
- Be familiar with the basic installation, setup, and layout of Spark
- Use Spark for interactive and ad hoc operations
- Use Dataset/DataFrame/Spark SQL to efficiently process structured data
- Understand the basics of RDDs (Resilient Distributed Datasets), including data partitioning, pipelining, and computation
- Understand Spark's data caching and its usage
- Understand performance implications and optimizations when using Spark
- Be familiar with Spark Graph Processing and SparkML machine learning
By attending the Apache Spark - Advanced workshop, attendees will learn to:
- Build on Spark fundamentals to gain a deeper understanding of Spark internals
- Learn the operational tweaks to generate the maximum performance from Spark
- Discover how to use GraphX and MLlib for graph processing and machine learning
For Apache Spark - Essentials
- Knowledge of Python or Scala.
For Apache Spark - Advanced
- Knowledge of Spark programming.