Real-Time Log Processing in Kafka for Streaming Architecture

The goal of this Apache Kafka project is to process application log entries in real time, using Kafka as the backbone of a microservices-oriented streaming architecture.



What will you learn

Understanding the roadmap of the project
Kafka as Real-time app and data pipeline builder
Understanding Service-Oriented architecture as Microservices
Role of Log file in Businesses
Re-state the case for real-time processing of log files
Run through our application and real-time log collection using Flume Log4j appenders
Creating Events in Flume
Ingestion of Data in Kafka by integrating Flume and Kafka
Selecting between Kafka and Flume appenders
Handling Massive data in batch and stream-processing using Lambda architecture
Kafka Stream and Kafka connect
Initiating Zookeeper for starting Kafka
Processing data in Kafka platform
Steps to use Kafka for Streaming Architecture in Microservices
Transforming Kafka streams into objects by parsing
Storing the final processed data (HBase, Cassandra, MongoDB)
Extending our architecture in a microservice world
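The "transforming Kafka streams into objects by parsing" step above can be sketched in plain Python. This is a minimal illustration, not the project's actual code: it assumes log lines arrive in Apache common log format (adapt the pattern to your Log4j layout), and the names `LOG_PATTERN` and `parse_log_line` are hypothetical.

```python
import re
from datetime import datetime

# Apache common log format pattern -- an assumption; adjust to your Log4j layout.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<resource>\S+) \S+" (?P<status>\d{3}) (?P<size>\S+)'
)

def parse_log_line(line: str):
    """Transform a raw log line (e.g. a Kafka record value) into a structured object."""
    match = LOG_PATTERN.match(line)
    if match is None:
        return None  # malformed lines are dropped (or routed to a dead-letter topic)
    event = match.groupdict()
    event["status"] = int(event["status"])
    event["size"] = 0 if event["size"] == "-" else int(event["size"])
    event["timestamp"] = datetime.strptime(event["timestamp"], "%d/%b/%Y:%H:%M:%S %z")
    return event

line = '127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326'
event = parse_log_line(line)
```

In the project itself this logic would sit inside a Kafka Streams `mapValues` step; the parsing itself is the same regardless of the surrounding framework.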

Project Description

In our previous Spark project, Real-Time Log Processing using Spark Streaming Architecture, we built on the topic of log processing by using the speed layer of the lambda architecture. We performed real-time processing of log entries from applications using Spark Streaming, storing the final data in an HBase table.

In this Kafka project, we will meet the same objectives using a different set of real-time technologies. The idea is to compare both approaches to real-time data processing, which is fast becoming mainstream across industries.

We will be using Kafka as the streaming backbone in a microservices context.

The major highlight of this big data project is that students will compare the Spark Streaming approach with the Kafka-only approach. This is a great session for developers, analysts, and architects alike.

Note: The Cloudera QuickStart VM does not ship with Kafka; we will work around that. So come prepared to install Kafka in the Cloudera QuickStart VM.

Similar Projects

In this project, we will show how to build an ETL pipeline on streaming datasets using Kafka.

Hadoop Projects for Beginners - Learn data ingestion from a source using Apache Flume and Kafka to make real-time decisions on incoming data.

Use the dataset on aviation for analytics to simulate a complex real-world big data pipeline based on messaging with AWS Quicksight, Druid, NiFi, Kafka, and Hive.

Curriculum For This Mini Project

Agenda for the Project
What is Kafka?
Microservices and Its Architecture
Why Do Businesses Need Logs?
Making a case for real-time log processing
Run through the application using Flume Log4j appenders
Using Flume for Events
Getting data into Kafka
Download and Install Kafka
Kafka and Flume Integration
Lambda Architecture
Recap of the Previous Session
Kafka Streams and Kafka Connect
Starting Kafka Agents - ZooKeeper
Kafka Streams
Kafka as a Processing Platform
Steps to use Kafka for Streaming Architecture in Microservices
Kafka Streaming Application
Applying Business Logic on KStream
Parsing the Stream and Transforming into Object
Processed Logs
Resource Counter
Storing the Data into the Destination - HBase, Cassandra, MongoDB
Using Kafka Connect
Example on how to use Kafka Connect
Discussion on using Kafka for Microservices
Resource Counter Process
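The "Resource Counter" step in the curriculum is, at heart, a group-and-count aggregation, the kind of state a Kafka Streams `groupBy(...).count()` maintains. The sketch below simulates that logic in plain Python over already-parsed log events; the function name `count_resources` and the event shape are illustrative assumptions, not the project's actual code.

```python
from collections import Counter

def count_resources(events):
    """Aggregate parsed log events into per-resource hit counts,
    mimicking the state a KStream groupBy(resource).count() would keep."""
    counter = Counter()
    for event in events:
        counter[event["resource"]] += 1
    return counter

# A small stream of parsed events (in the project these would come off a Kafka topic).
events = [
    {"resource": "/index.html", "status": 200},
    {"resource": "/login", "status": 302},
    {"resource": "/index.html", "status": 200},
]
counts = count_resources(events)
```

In the real pipeline this running count would be materialized as a state store (or emitted to an output topic) rather than held in a local `Counter`, but the aggregation logic is the same.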