Real-Time Log Processing in Kafka for Streaming Architecture

The goal of this Apache Kafka project is to process application log entries in real time, using Kafka as the backbone of a streaming architecture in a microservices context.
Each project comes with 2-5 hours of micro-videos explaining the solution.

Code & Dataset

Get access to 50+ solved projects with iPython notebooks and datasets.

Project Experience

Add project experience to your LinkedIn/GitHub profiles.

Customer Love


Camille St. Omer

Artificial Intelligence Researcher, Quora 'Most Viewed Writer' in 'Data Mining'

I came to the platform with no experience and now I am knowledgeable in Machine Learning with Python. No easy thing I must say, the sessions are challenging and go to the depths. I looked at graduate...


Shailesh Kurdekar

Solutions Architect at Capital One

I have worked for more than 15 years in Java and J2EE and have recently developed an interest in Big Data technologies and Machine learning due to a big need at my workspace. I was referred here by a...

What will you learn

Understanding the roadmap of the project
Kafka as a real-time application and data pipeline builder
Understanding service-oriented architecture and microservices
Role of log files in businesses
Restating the case for real-time processing of log files
Walking through our application and real-time log collection using Flume Log4j appenders
Creating events in Flume
Ingesting data into Kafka by integrating Flume and Kafka
Selecting between Kafka and Flume appenders
Handling massive data in batch and stream processing using the Lambda architecture
Kafka Streams and Kafka Connect
Starting ZooKeeper before starting Kafka
Processing data on the Kafka platform
Steps to use Kafka for a streaming architecture in microservices
Transforming Kafka streams into objects by parsing
Storing the final processed data (HBase, Cassandra, MongoDB)
Extending our architecture in a microservice world
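The parsing step listed above, turning raw log records from the stream into objects, can be sketched as follows. This is a minimal Python illustration, not the project's Kafka Streams code; the Log4j-style line format, field names, and `LogEvent` class are assumptions for the example.

```python
import re
from dataclasses import dataclass
from typing import Optional

# Assumed Log4j-style line format: "2024-01-15 10:22:01 ERROR OrderService - Payment failed"
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) (?P<resource>\S+) - (?P<message>.*)"
)

@dataclass
class LogEvent:
    timestamp: str
    level: str
    resource: str
    message: str

def parse_log_line(line: str) -> Optional[LogEvent]:
    """Parse one raw log line into a LogEvent, or None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return LogEvent(**m.groupdict()) if m else None

event = parse_log_line("2024-01-15 10:22:01 ERROR OrderService - Payment failed")
```

In the project itself this transformation happens inside the stream-processing layer, record by record, rather than as a standalone function.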

Project Description

In our previous Spark project, Real-Time Log Processing using Spark Streaming Architecture, we built on an earlier log-processing topic by using the speed layer of the lambda architecture. We performed real-time processing of log entries from an application using Spark Streaming, storing the final data in an HBase table.

In this Kafka project, we will meet the same objectives using a different set of real-time technologies. The idea is to compare both approaches to real-time data processing, which will soon become mainstream in various industries.

We will be using Kafka for the streaming architecture in a microservice sense.

The major highlight of this big data project is that students will compare the Spark Streaming approach with the Kafka-only approach. This is a great session for developers and analysts as well as architects.

Note: The Cloudera QuickStart VM does not ship with Kafka; we intend to work around that. So come prepared to install Kafka in the Cloudera QuickStart VM.
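A rough sketch of that manual installation follows. The version number and download mirror are assumptions (pick a current release); the start scripts are standard Kafka tooling, and the `--bootstrap-server` flag applies to Kafka 2.2+, while older releases use `--zookeeper` instead.

```shell
# Download and unpack Kafka (version chosen here is an assumption)
wget https://archive.apache.org/dist/kafka/2.8.2/kafka_2.12-2.8.2.tgz
tar -xzf kafka_2.12-2.8.2.tgz && cd kafka_2.12-2.8.2

# Kafka needs ZooKeeper running first
bin/zookeeper-server-start.sh -daemon config/zookeeper.properties

# Then start the Kafka broker itself
bin/kafka-server-start.sh -daemon config/server.properties

# Create a topic to receive the incoming log events
bin/kafka-topics.sh --create --topic logs \
  --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
```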

Similar Projects

In this big data project, we will see how data ingestion and loading is done with the Kafka Connect API, while transformation is done with the Kafka Streams API.

In this big data project, we will embark on real-time data collection and aggregation from a simulated real-time system using Spark Streaming.

The goal of this hadoop project is to apply some data engineering principles to Yelp Dataset in the areas of processing, storage, and retrieval.

Curriculum For This Mini Project

Agenda for the Project
What is Kafka?
Microservices and Its Architecture
Why Do Businesses Need Logs?
Making a case for real-time log processing
Run through the application using Flume Log4j appenders
Using Flume for Events
Getting data into Kafka
Download and Install Kafka
Kafka and Flume Integration
Lambda Architecture
Recap of the Previous Session
Kafka Streams and Kafka Connect
Starting Kafka Agents - ZooKeeper
Kafka Streams
Kafka as a Processing Platform
Steps to use Kafka for Streaming Architecture in Microservices
Kafka Streaming Application
Applying Business Logic on KStream
Parsing the Stream and Transforming into Object
Processed Logs
Resource Counter
Storing the Data into the Destination - HBase, Cassandra, MongoDB
Using Kafka Connect
Example on how to use Kafka Connect
Discussion on using Kafka for Microservices
Resource Counter Process
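The "Resource Counter" step in the curriculum, grouping the parsed log stream by resource and keeping running counts (what Kafka Streams expresses with a grouped `count()` into a KTable), can be sketched in plain Python. The event shape and field name `resource` are assumptions carried over for illustration.

```python
from collections import Counter
from typing import Dict, Iterable

def count_resources(events: Iterable[dict]) -> Dict[str, int]:
    """Count log events per resource, akin to a KTable backing a stream count."""
    counts: Counter = Counter()
    for event in events:
        counts[event["resource"]] += 1
    return dict(counts)

# A tiny simulated stream of already-parsed log events
stream = [
    {"resource": "OrderService", "level": "ERROR"},
    {"resource": "OrderService", "level": "INFO"},
    {"resource": "AuthService", "level": "WARN"},
]
```

In the real pipeline the counts update continuously as records arrive, rather than over a finite list.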