Apache Kafka: Hands-On Training

Course 1266

  • Duration: 2 days
  • Labs: Yes
  • Language: English
  • Level: Foundation

This course provides hands-on experience with Apache Kafka, both from the command line and through programmatic access using Java. Attendees will gain a full appreciation of the publish/subscribe idiom for asynchronous communication, learn how to write producers and consumers, and see how to configure Kafka for reliable data delivery. They will also learn how to build data pipelines and how to process streaming data.

Kafka is an event streaming platform. Businesses and organizations often must capture data in real-time from event sources like databases, sensors, mobile devices, cloud services, social media, and software applications in the form of streams of events. They need to store these event streams durably for later retrieval. They also need to manipulate, process, and react to the event streams in real-time as well as retrospectively. The event streams need to be made available to different destination technologies as needed. Kafka's event streaming implementation ensures a continuous flow and interpretation of data so that the right information is at the right place, at the right time.

Attendees should have Java skills to the level of the Introduction to Java Programming Training course.

This course is for anyone developing Java or Python applications who has core Java SE or Python skills and wants to use Kafka to program in the publish/subscribe idiom and manage fast-moving streaming data.

Learning Tree Exam included at the end of class.

  • Installing a developer Kafka broker
  • Creating a Kafka producer
  • Creating a Kafka consumer
  • Configuring reliable data delivery
  • Processing streaming data

Apache Kafka: Hands-On Training Delivery Methods

  • After-course instructor coaching included
  • Learning Tree end-of-course exam included

Apache Kafka: Hands-On Training Course Benefits

  • Install a Kafka broker for development
  • Write Kafka producers and consumers
  • Leverage Kafka features for reliable data delivery
  • Build Kafka data pipelines
  • Receive and process streaming data using Kafka

Apache Kafka: Hands-On Training Outline

Chapter 1: Apache Kafka (42 slides)

  • Publish/Subscribe Messaging
  • Apache Kafka
  • Installing Kafka
  • Hands-On Exercise 1.1: Installing Kafka (see the sketch below)
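
The exercise covers installing and starting the broker itself. As a companion, the following is a minimal Java sketch, assuming a single-node developer broker is already running on localhost:9092; it creates a topic programmatically and so confirms the installation is reachable from client code. The topic name dev-events is a placeholder.

    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class CreateDevTopic {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Assumes a single-node developer broker listening on localhost:9092.
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // One partition and a replication factor of 1 are enough for development.
                NewTopic topic = new NewTopic("dev-events", 1, (short) 1);
                admin.createTopics(List.of(topic)).all().get();
                System.out.println("Created topic: " + topic.name());
            }
        }
    }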

Chapter 2: Producers and Consumers (51 slides)

  • Constructing a Kafka producer
  • Synchronous and asynchronous messages
  • Configuring producers
  • Serializers and partitions
  • Hands-On Exercise 2.1: Creating a Kafka producer
  • Consumers and consumer groups
  • Creating a Kafka consumer
  • Subscribing to Kafka topics
  • Consuming messages
  • Deserializers
  • Hands-On Exercise 2.2: Creating a Kafka consumer (producer and consumer sketches follow below)
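
As a preview of these exercises, here are minimal Java sketches of a producer and a consumer, assuming a local broker on localhost:9092 and the placeholder topic dev-events. The producer sends asynchronously and reports the result from a callback:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class SimpleProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Asynchronous send: the callback reports the partition and offset, or an error.
                producer.send(new ProducerRecord<>("dev-events", "key-1", "hello kafka"),
                        (metadata, exception) -> {
                            if (exception != null) {
                                exception.printStackTrace();
                            } else {
                                System.out.printf("Wrote to %s-%d at offset %d%n",
                                        metadata.topic(), metadata.partition(), metadata.offset());
                            }
                        });
            } // close() flushes any buffered records before returning
        }
    }

The matching consumer joins a consumer group, subscribes to the topic, and polls in a loop, deserializing keys and values as strings:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class SimpleConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "dev-consumer-group");
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("dev-events"));
                while (true) {
                    // Poll for new records, blocking for up to one second per call.
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("offset=%d key=%s value=%s%n",
                                record.offset(), record.key(), record.value());
                    }
                }
            }
        }
    }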

Chapter 3: Reliable Data Delivery (15 slides)

  • Reliability Guarantees
  • Broker configuration
  • Producers in a reliable system
  • Consumers in a reliable system
  • Hands-On Exercise 3.1: Reliable data delivery (see the configuration sketch below)
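
The following is a minimal Java sketch of producer settings commonly used for reliable delivery, assuming the same local developer broker. Broker- and topic-level settings (such as min.insync.replicas) and consumer offset management are also part of the chapter but are not shown here.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ReliableProducerConfig {
        public static Properties build() {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            // Wait for acknowledgement from all in-sync replicas before a send counts as successful.
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            // Idempotence prevents duplicates when the producer retries after transient failures.
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
            // Retry transient errors; delivery.timeout.ms bounds the total time spent retrying.
            props.put(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE));
            props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, "120000");
            return props;
        }
    }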

Chapter 4: Building Data Pipelines (21 slides)

  • Considerations when building data pipelines
  • Kafka Connect
  • Hands-On Exercise 4.1: Building a data pipeline (see the connector configuration sketch below)
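
As an illustration of the kind of pipeline Kafka Connect supports, the following is a minimal standalone-mode connector configuration, assuming the FileStream example connector that ships with Apache Kafka; the file path and topic name are placeholders.

    # connect-file-source.properties
    name=local-file-source
    connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
    tasks.max=1
    file=/tmp/pipeline-input.txt
    topic=dev-events

A configuration like this is passed to the standalone Connect worker together with its worker properties file; the connector then tails the file and publishes each new line to the topic, with no producer code to write.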

Chapter 5: Stream Processing (51 slides)

  • Stream processing concepts
  • Examples of Kafka Streams
  • Kafka Streams architecture
  • Hands-On Exercise 5.1: Stream processing Twitter data (see the Kafka Streams sketch below)
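
To give a feel for the Kafka Streams API covered in this chapter, here is a minimal Java sketch (not the Twitter exercise itself) that reads from one topic, transforms each value, and writes the result to another topic. The topic names and application id are placeholders.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class UppercaseStream {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-demo");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Read from an input topic, transform each value, and write to an output topic.
            KStream<String, String> source = builder.stream("dev-events");
            source.mapValues(value -> value.toUpperCase())
                  .to("dev-events-uppercase");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            // Close the topology cleanly when the JVM shuts down.
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }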

Course FAQs

Approximately 40% of course time is devoted to hands-on exercises, allowing you to gain extensive experience with Kafka. Exercises include:

  • Installing a Kafka broker for development
  • Writing Kafka producers and consumers
  • Leveraging Kafka features for reliable data delivery
  • Building Kafka data pipelines
  • Receiving and processing streaming data using Kafka

This course assumes a background in Java programming, to the level of Learning Tree Introduction to Java Programming Training.
