
Working with Apache Kafka

Learn to use Kafka as your real-time data pipeline processor for your streaming needs.

Apache Kafka is a real-time data pipeline processor. Scalability, fault tolerance, high performance, and fluid integrations are some of the key hallmarks that make Kafka an integral part of many enterprise data architectures. In this lab-intensive, two-day course, you will learn how to use Kafka to build streaming solutions.

GK# 100684 Vendor# TTDSKFKA2

Who Should Attend?

Experienced Java developers with database experience.

What You'll Learn

Join an engaging hands-on learning environment, where you’ll explore:

  • Overview of Streaming technologies
  • Kafka concepts and architecture
  • Programming using Kafka API
  • Kafka Streams
  • Monitoring Kafka
  • Tuning/Troubleshooting Kafka

This course maintains a 50/50 ratio of hands-on labs to lecture, with engaging instruction, demos, group discussions, labs, and project work.

Course Outline

Introduction to Streaming Systems

  • Fast data
  • Streaming architecture
  • Lambda architecture
  • Message queues
  • Streaming processors

Introduction to Kafka

  • Architecture
  • Comparing Kafka with other queue systems (JMS / MQ)
  • Kafka concepts: Messages, Topics, Partitions, Brokers, Producers, and commit logs
  • Kafka and Zookeeper
  • Producing messages
  • Consuming messages (Consumers and Consumer Groups)
  • Message retention
  • Scaling Kafka

Programming With Kafka

  • Configuration parameters
  • Producer API (sending messages to Kafka)
  • Consumer API (consuming messages from Kafka); a minimal producer/consumer sketch follows this list
  • Commits, Offsets, Seeking
  • Schema with Avro
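
For orientation, here is a minimal sketch in Java (the course's assumed language) that pairs a producer with a consumer using the standard Kafka clients API. The broker address, topic name ("events"), key/value strings, and consumer group id are illustrative placeholders, not part of the course materials.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ProducerConsumerSketch {
        public static void main(String[] args) {
            // Producer: send one message to the (placeholder) "events" topic.
            Properties producerProps = new Properties();
            producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                producer.send(new ProducerRecord<>("events", "user-42", "page_view"),
                        (metadata, exception) -> {
                            if (exception == null) {
                                System.out.printf("sent to %s-%d at offset %d%n",
                                        metadata.topic(), metadata.partition(), metadata.offset());
                            } else {
                                exception.printStackTrace();
                            }
                        });
            } // close() flushes any buffered records

            // Consumer: join a consumer group and read the same topic.
            Properties consumerProps = new Properties();
            consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "events-reader");
            consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
                consumer.subscribe(List.of("events"));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
                consumer.commitSync(); // explicit (manual) offset commit
            }
        }
    }

The explicit commitSync() call corresponds to the commits/offsets topic above; in practice the consumer polls in a loop and you choose between automatic and manual offset commits.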

Kafka Streams

  • Streams overview and architecture
  • Streams use cases and comparison with other platforms
  • Kafka Streams concepts (KStream, KTable, and state stores)
  • KStream operations (transformations, filters, joins, and aggregations); see the sketch after this list
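
As a concrete reference for the bullets above, here is a minimal word-count topology sketch built with the Kafka Streams DSL. The application id and the topic names ("text-lines", "word-counts") are placeholders chosen for illustration.

    import java.util.Arrays;
    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Produced;

    public class WordCountSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-sketch");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // KStream: an unbounded stream of text lines from the (placeholder) input topic.
            KStream<String, String> lines = builder.stream("text-lines");

            // Transformations, a filter, and an aggregation into a KTable backed by a state store.
            KTable<String, Long> wordCounts = lines
                    .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
                    .filter((key, word) -> !word.isEmpty())
                    .groupBy((key, word) -> word)  // re-key the stream by word
                    .count();                      // materialized in a local state store

            // Emit the KTable's changelog to an output topic.
            wordCounts.toStream().to("word-counts", Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

The count() aggregation is what materializes a KTable backed by a local state store, which is the link between the KStream and KTable concepts listed above.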

Administering Kafka

  • Hardware/Software requirements
  • Deploying Kafka
  • Configuration of brokers/topics/partitions/producers/consumers (see the Admin API sketch after this list)
  • Security: how to secure a Kafka cluster and client communications (SASL and Kerberos)
  • Monitoring: monitoring tools
  • Capacity Planning: estimating usage and demand
  • Troubleshooting: failure scenarios and recovery
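
To make the topic/partition configuration bullet concrete, the sketch below uses Kafka's Admin API to create a topic with an explicit partition count, replication factor, and a retention override. The topic name and the specific settings are assumptions for illustration; the same configuration can also be applied with Kafka's command-line topic tooling.

    import java.util.List;
    import java.util.Map;
    import java.util.Properties;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;
    import org.apache.kafka.common.config.TopicConfig;

    public class TopicAdminSketch {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // A (hypothetical) topic with 6 partitions, replication factor 3,
                // and a 7-day retention override.
                NewTopic orders = new NewTopic("orders", 6, (short) 3)
                        .configs(Map.of(TopicConfig.RETENTION_MS_CONFIG,
                                String.valueOf(7L * 24 * 60 * 60 * 1000)));

                admin.createTopics(List.of(orders)).all().get();

                // Confirm what the cluster now knows about.
                admin.listTopics().names().get().forEach(System.out::println);
            }
        }
    }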

Monitoring and Instrumenting Kafka

  • Monitoring Kafka
  • Instrumenting with the Metrics library
  • Instrumenting Kafka applications and monitoring their performance (see the sketch after this list)
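
One way to start instrumenting an application, sketched below, is to read the metrics that every Kafka client already collects and also exposes over JMX. The two metric names printed (record-send-total, request-latency-avg) are standard producer metrics; the topic name and record contents are placeholders.

    import java.util.Map;
    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.Metric;
    import org.apache.kafka.common.MetricName;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ClientMetricsSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("events", "key", "value"));
                producer.flush();

                // Every Kafka client collects internal metrics (also published over JMX).
                Map<MetricName, ? extends Metric> metrics = producer.metrics();
                metrics.forEach((name, metric) -> {
                    if ("record-send-total".equals(name.name()) || "request-latency-avg".equals(name.name())) {
                        System.out.printf("%s (%s) = %s%n", name.name(), name.group(), metric.metricValue());
                    }
                });
            }
        }
    }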

Prerequisites

Before attending this course, you should:

  • Be comfortable with Java
  • Have experience working with databases
  • Be able to navigate the Linux command line
  • Have basic knowledge of Linux editors (such as vi or nano) for editing code