Introduction to Spark 3 with Python Training in Olathe

Enroll in or hire us to teach our Introduction to Spark 3 with Python class in Olathe, Kansas by calling us at 303.377.6176. Like all HSG classes, Introduction to Spark 3 with Python may be offered either onsite or via instructor-led virtual training. Check our public training schedule to see if it is currently scheduled: Public Training Classes
Provided there are enough attendees, Introduction to Spark 3 with Python may be taught at one of our local training facilities.
We offer private customized training for groups of 3 or more attendees.

Course Description

 

This course introduces the Apache Spark distributed computing engine, and is suitable for developers, data analysts, architects, technical managers, and anyone who needs to use Spark in a hands-on manner. It is based on the Spark 3.x release. All examples and labs use Python for programming.

The course provides a solid technical introduction to the Spark architecture and how Spark works. It covers the basic building blocks of Spark (e.g. RDDs and the distributed compute engine), as well as higher-level constructs that provide a simpler and more capable interface (e.g. DataFrames and Spark SQL). It includes in-depth coverage of Spark SQL and DataFrames, which are now the preferred programming API. This includes exploring possible performance issues and strategies for optimization.
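
As a taste of what the DataFrame and Spark SQL labs look like, here is a minimal sketch that runs the same query through the untyped DataFrame DSL and through SQL; the file people.json and its age/department columns are assumed purely for illustration:

    from pyspark.sql import SparkSession

    # Create (or reuse) the SparkSession - the entry point for DataFrames and Spark SQL
    spark = SparkSession.builder.appName("DataFrameIntro").getOrCreate()

    # Load a DataFrame from a JSON file and let Spark infer the schema
    people = spark.read.json("people.json")

    # DataFrame (untyped) query DSL: filtering, grouping, aggregation
    people.filter(people.age > 21).groupBy("department").count().show()

    # The same query expressed as SQL against a temporary view
    people.createOrReplaceTempView("people")
    spark.sql("SELECT department, COUNT(*) AS n FROM people WHERE age > 21 GROUP BY department").show()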

The course also covers more advanced capabilities such as the use of Spark Streaming to process streaming data, and integrating with the Kafka server.
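
The following is a rough sketch of what consuming a Kafka topic with Structured Streaming can look like; the broker address localhost:9092 and the topic name events are assumptions, and the spark-sql-kafka connector package must be made available to Spark:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("KafkaStream").getOrCreate()

    # Read a stream of records from a Kafka topic (broker and topic are assumptions)
    events = (spark.readStream
                   .format("kafka")
                   .option("kafka.bootstrap.servers", "localhost:9092")
                   .option("subscribe", "events")
                   .load())

    # Kafka delivers binary key/value columns; cast the value to a readable string
    messages = events.select(col("value").cast("string").alias("message"))

    # Write the running results to the console - a simple sink for experimentation
    query = messages.writeStream.outputMode("append").format("console").start()
    query.awaitTermination()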

 

The course is very hands-on, with many labs. Participants will interact with Spark through the pyspark shell (for interactive, ad-hoc processing) as well as through programs using the Spark API. After taking this course, you will be ready to work with Spark in an informed and productive manner.
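
For instance, an ad-hoc session in the pyspark shell mentioned above might look like the sketch below; the shell pre-creates the spark (SparkSession) and sc (SparkContext) objects, and sales.csv with its region/amount columns is hypothetical:

    # In the pyspark shell, "spark" and "sc" already exist - no setup code is needed.
    # "sales.csv" and its columns are hypothetical; any small CSV behaves the same way.
    df = spark.read.csv("sales.csv", header=True, inferSchema=True)
    df.printSchema()
    df.groupBy("region").sum("amount").show()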

Course Length: 4 Days
Course Tuition: $1890 (US)

Prerequisites

Working knowledge of some programming language - no Java experience needed

Course Outline

 
Session 1: Introduction to Spark
Overview, Motivations, Spark Systems
Spark Ecosystem
Spark vs. Hadoop
Acquiring and Installing Spark
The Spark Shell, SparkContext
 
Session 2: RDDs and Spark Architecture
RDD Concepts, Lifecycle, Lazy Evaluation
RDD Partitioning and Transformations
Working with RDDs - Creating and Transforming (map, filter, etc.)
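
A minimal sketch of the RDD basics covered in this session, chaining lazy filter and map transformations and triggering them with a collect action:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("RDDBasics").getOrCreate()
    sc = spark.sparkContext

    # Build an RDD from a local collection and chain lazy transformations
    numbers = sc.parallelize(range(1, 11))
    squares_of_evens = numbers.filter(lambda n: n % 2 == 0).map(lambda n: n * n)

    # Nothing runs until an action triggers the job
    print(squares_of_evens.collect())   # [4, 16, 36, 64, 100]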
 
Session 3: Spark SQL, DataFrames, and DataSets
Overview
SparkSession, Loading/Saving Data, Data Formats (JSON, CSV, Parquet, text ...)
Introducing DataFrames (Creation and Schema Inference)
Supported Data Formats (JSON, Text, CSV, Parquet)
Working with the DataFrame (untyped) Query DSL (Column, Filtering, Grouping, Aggregation)
SQL-based Queries
Mapping and Splitting (flatMap(), explode(), and split()) (see the sketch below)
DataFrames vs. RDDs
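
A minimal sketch of the split()/explode() pattern from this session; the tiny in-memory DataFrame and its line column exist only for illustration:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import split, explode

    spark = SparkSession.builder.appName("SplitExplode").getOrCreate()

    # A tiny in-memory DataFrame; the "line" column is just for illustration
    lines = spark.createDataFrame([("the quick brown fox",), ("jumps over the dog",)], ["line"])

    # split() turns each line into an array of words; explode() produces one row per word
    words = lines.select(explode(split(lines.line, " ")).alias("word"))
    words.groupBy("word").count().show()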
 
Session 4: Shuffling Transformations and Performance
Grouping, Reducing, Joining
Shuffling, Narrow vs. Wide Dependencies, and Performance Implications
Exploring the Catalyst Query Optimizer (explain(), Query Plans, Issues with lambdas) (see the sketch below)
The Tungsten Optimizer (Binary Format, Cache Awareness, Whole-Stage Code Gen)
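
A minimal sketch of inspecting query plans with explain(); the orders.parquet file and its columns are assumed for illustration:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("QueryPlans").getOrCreate()

    # "orders.parquet" and its columns are assumed purely to illustrate explain()
    orders = spark.read.parquet("orders.parquet")
    query = orders.filter(col("amount") > 100).groupBy("customer_id").count()

    # Show the parsed, analyzed, optimized (Catalyst) and physical plans
    query.explain(True)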
 
Session 5: Performance Tuning
Caching - Concepts, Storage Type, Guidelines
Minimizing Shuffling for Increased Performance
Using Broadcast Variables and Accumulators (see the sketch below)
General Performance Guidelines
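
A minimal sketch of broadcast variables and accumulators; the country-code lookup table is made up for illustration:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("SharedVariables").getOrCreate()
    sc = spark.sparkContext

    # Broadcast a small lookup table once to each executor instead of shipping it with every task
    country_names = sc.broadcast({"US": "United States", "DE": "Germany"})

    # An accumulator gathers simple counts from the executors back to the driver
    unknown_codes = sc.accumulator(0)

    def lookup(code):
        if code not in country_names.value:
            unknown_codes.add(1)
            return "Unknown"
        return country_names.value[code]

    codes = sc.parallelize(["US", "DE", "FR", "US"])
    print(codes.map(lookup).collect())   # ['United States', 'Germany', 'Unknown', 'United States']
    print(unknown_codes.value)           # 1, once the action above has run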
 
Session 6: Creating Standalone Applications
Core API, SparkSession.Builder
Configuring and Creating a SparkSession
Building and Running Applications - spark-submit (see the sketch below)
Application Lifecycle (Driver, Executors, and Tasks)
Cluster Managers (Standalone, YARN, Mesos)
Logging and Debugging
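
A minimal sketch of a standalone application built around SparkSession.builder; the file name example_app.py and the configuration value are assumptions:

    # example_app.py - a minimal standalone driver program (the file name is an assumption)
    from pyspark.sql import SparkSession

    if __name__ == "__main__":
        # The builder configures and creates (or reuses) the SparkSession for this driver
        spark = (SparkSession.builder
                             .appName("StandaloneExample")
                             .config("spark.sql.shuffle.partitions", "8")
                             .getOrCreate())

        df = spark.range(1, 1001)                       # a simple generated DataFrame of ids 1..1000
        print(df.selectExpr("sum(id) AS total").first())

        spark.stop()

It could then be run locally or handed to a cluster manager with, for example: spark-submit --master local[4] example_app.py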
 
Session 7: Spark Streaming
Introduction and Streaming Basics
Structured Streaming (Spark 2+)
Continuous Applications
Table Paradigm, Result Table
Steps for Structured Streaming
Sources and Sinks
Consuming Kafka Data
Kafka Overview
Structured Streaming - "kafka" format
Processing the Stream


Python Programming Uses & Stats

Python Programming is Used For:
Web Development, Video Games, Desktop GUIs, Software Development
Year Created: 1991
Pros
Easy to Learn:
The learning curve is very mild, and the language is versatile and quick to develop in.
 
Massive Libraries:
You can find a library for basically anything: from web development, through game development, to machine learning.
 
Do More with Less Code:
You can build prototypes and test out ideas much more quickly in Python than in other languages.
Cons

Speed Limitations:

It is an interpreted language and is therefore much slower than compiled languages.

Problems with Threading:

Multi-threaded, CPU-bound programs may be slower than single-threaded ones due to the Global Interpreter Lock (GIL), which allows only one thread to execute at a time.

Weak on Mobile:

Although there are a number of libraries that provide a way to develop for both Android and iOS using Python, neither Android nor iOS currently supports Python as an official programming language.

Python Programming Job Market
Average Salary: $107,000
Job Count: 26,856
Top Job Locations: New York City, Mountain View, San Francisco

Complementary Skills to have along with Python Programming
The potential for career growth, whether you are new to the industry or plan to expand your current skills, depends upon your interests:
  - For building PC or Windows Phone apps, or if you see your future with Microsoft, learn C#
  - For Android apps and cross-platform apps, learn Java
  - If you are an Apple-holic and want to build iOS and macOS apps, choose Objective-C or Swift
  - Interested in game development? C++
  - For data mining or statistics, go with R or MATLAB
  - Building an operating system? C

Take a class with us and receive a book of your choosing for 50% off MSRP.