Spark Project - Learn to Write Spark Applications using Spark 2.0

In this project, we will work through complex scenarios to prepare Spark developers for the kinds of issues that come up in the real world.

Videos

Each project comes with 2-5 hours of micro-videos explaining the solution.

Code & Dataset

Get access to 50+ solved projects with iPython notebooks and datasets.

Project Experience

Add project experience to your LinkedIn/GitHub profiles.

What will you learn

Pivoting data
Dealing with Structs/Schemas
UDFs and abstracting logic into UDFs
Caching and Checkpointing
Clustering, Bucketing, Sorting and Partitioning
Resource Allocation in Spark
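
As a rough illustration of two of the topics above, here is a minimal PySpark sketch of pivoting a DataFrame and wrapping logic in a UDF with the Spark 2 DataFrame API. The column names and sample rows are invented for the example and are not taken from the project dataset.

    # A minimal sketch (not the course solution) of pivoting and a simple UDF.
    # Column names and sample rows are made up for illustration.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import udf
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("pivot-udf-sketch").getOrCreate()

    sales = spark.createDataFrame(
        [(2016, "Jan", 100.0), (2016, "Feb", 80.0), (2017, "Jan", 120.0)],
        ["year", "month", "amount"],
    )

    # Pivot: one row per year, one column per month, amounts summed.
    pivoted = sales.groupBy("year").pivot("month").sum("amount")

    # UDF: wrap arbitrary Python logic in a column expression.
    label = udf(lambda amt: "high" if amt >= 100 else "low", StringType())
    labelled = sales.withColumn("label", label(sales["amount"]))

    pivoted.show()
    labelled.show()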

Project Description

Spark is the go-to framework for today's big data processing, and most companies are adopting it.
However, Spark is notorious for being easy to get started with but difficult to master. Mastering Spark requires knowledge not only of its APIs but also of its internals. Because of this, many developers confronted with production data use cases run into problems that were never covered during training.

This Hackerday picks apart a few of these tasks and scenarios that are rarely discussed in training but can burden developers in practice.

We will look at Spark memory management, cluster resource allocation, clustering, repartitioning, and more.
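
To make the resource-allocation and repartitioning points a little more concrete, here is a hedged sketch. The flag values and the "customer_id" column are placeholders chosen for illustration, not settings or names used in the project.

    # A rough sketch of the tuning topics above. Flag values and the
    # "customer_id" column are placeholders, not recommendations.
    #
    # Resource allocation is typically fixed at submit time, e.g.:
    #   spark-submit \
    #     --num-executors 10 \
    #     --executor-cores 4 \
    #     --executor-memory 8g \
    #     --driver-memory 4g \
    #     my_job.py
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

    # In a real job this DataFrame would usually be read from HDFS/S3 etc.
    df = spark.createDataFrame(
        [(1, 10.0), (1, 5.0), (2, 7.0)],
        ["customer_id", "amount"],
    )

    # Repartition by the key that later joins/aggregations use, and cache the
    # result if several actions reuse it.
    df = df.repartition(200, "customer_id").cache()

    df.groupBy("customer_id").count().show()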

The goal of the Hackerday is to make Spark developers better at their craft and to help those just learning Spark quickly appreciate the depth of the framework. The idea is to go beyond simple use cases into complex scenarios and data pipelines, so that students encounter the issues that come with real-world work.

All sessions will be taught on Spark 2.

Similar Projects

In this project, we will evaluate and demonstrate how to handle unstructured data using Spark.

In this project, we will look at Cassandra: what it is especially suited for in a Hadoop environment, how to integrate it with Spark, and how to install it in our lab environment.

In this project, we will be building and querying an OLAP Cube for Flight Delays on the Hadoop platform.

Curriculum For This Mini Project