Handling Message Errors and Dead Letter Queues in Apache Kafka ft. Jason Bell

Science & Technology

cnfl.io/podcast-episode-186 | If you've ever wondered what exactly dead letter queues (DLQs) are and how to use them, Jason Bell (Senior DataOps Engineer, Digitalis) has an answer for you. Dead letter queues are a feature of Kafka Connect that act as the destination for messages that fail due to errors like improper message deserialization or improper message formatting. Much of Jason's work is around Kafka Connect and the Kafka Streams API, and in this episode, he explains the fundamentals of dead letter queues, how to use them, and the parameters around them.
For example, when deserializing an Avro message, deserialization can fail if the incoming message is not Avro or does not match the expected wire format, at which point the message is rerouted into the dead letter queue for reprocessing. The Apache Kafka® topic can then be reprocessed with the appropriate converter and the message sent back onto the sink. For a JSON error message, you'll need another JSON connector to process the message out of the dead letter queue before it can be sent back to the sink.
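In Kafka Connect, this routing is enabled through the sink connector's error-handling properties. A minimal sketch of the configuration described above; the topic name and replication factor here are illustrative values, not defaults:

```properties
# Tolerate conversion/deserialization errors instead of failing the task
errors.tolerance=all
# Route failed records to this dead letter queue topic (name is illustrative)
errors.deadletterqueue.topic.name=dlq-orders-sink
errors.deadletterqueue.topic.replication.factor=1
# Attach error context (exception details, original topic/partition/offset)
# to each failed record as headers, for later investigation
errors.deadletterqueue.context.headers.enable=true
```

With `errors.tolerance=all`, failed records go to the DLQ topic rather than stopping the connector; without a DLQ topic configured, tolerated errors are simply dropped.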
The dead letter queue is configurable for handling deserialization exceptions and producer exceptions. When deciding whether this topic is necessary, consider whether the messages are important and whether there is a plan to read them and investigate why the errors occur. In some scenarios, it's important to handle the messages manually, or to have a manual process in place for error messages whose reprocessing continues to fail. For example, payment messages should be dealt with in parallel for a better customer experience.
Jason also shares some key takeaways on the dead letter queue:
If the message is important, such as a payment, you need to deal with the message if it goes into the dead letter queue
To minimize message routing into the dead letter queue, it’s important to ensure successful data serialization at the source
When implementing a dead letter queue, you need a process to consume the message and investigate the errors
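The pattern behind these takeaways can be sketched in plain Python: attempt to deserialize each record, and route failures to a dead letter queue along with error context (mirroring the headers Kafka Connect can attach). All names here are illustrative; this is not a Kafka API, just the routing logic itself.

```python
import json

def process_records(records):
    """Deserialize raw records; route failures to a dead letter queue."""
    delivered, dead_letters = [], []
    for raw in records:
        try:
            # Stand-in for the sink's converter (e.g. a JSON converter)
            delivered.append(json.loads(raw))
        except json.JSONDecodeError as err:
            # Failed records keep the original payload plus error context,
            # so a separate consumer can investigate and reprocess them
            dead_letters.append({"payload": raw, "error": str(err)})
    return delivered, dead_letters

ok, dlq = process_records(['{"amount": 10}', 'not-json'])
```

A real deployment would publish `dead_letters` to a dedicated Kafka topic and run a separate consumer against it, as the takeaways above suggest.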
EPISODE LINKS
► Kafka Connect 101: Error Handling and Dead Letter Queues: cnfl.io/error-handling-connect-101-dead-letter-queues-vodcast
► Capacity Planning your Kafka Cluster: cnfl.io/capacity-planning-your-kafka-cluster-dead-letter-queues-vodcast
► Tales from the Frontline of Apache Kafka DevOps: cnfl.io/kafka-devops-tales-dead-letter-queues-vodcast
► Tweet: Morning morning (yes, I have tea): jasonbelldata/status/1441662910938292234
► Tweet: Kafka dead letter queues: tlberglund/status/1392953469913370626
► Join the Confluent Community: cnfl.io/join-the-confluent-community-dead-letter-queues-vodcast
► Learn Kafka on Confluent Developer: cnfl.io/confluent-developer-dead-letter-queues-vodcast
► Demo: Event-Driven Microservices with Confluent: cnfl.io/event-driven-microservices-demo-dead-letter-queues-vodcast
► Use PODCAST100 to get $100 of free Confluent Cloud usage: cnfl.io/try-confluent-cloud-dead-lettter-queues-vodcast
► Promo code details: cnfl.io/promo-code-details-dead-letter-queues-vodcast
ABOUT CONFLUENT
Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Confluent’s cloud-native offering is the foundational platform for data in motion - designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization. With Confluent, organizations can meet the new business imperative of delivering rich, digital front-end customer experiences and transitioning to sophisticated, real-time, software-driven backend operations. To learn more, please visit www.confluent.io.
#kafkaconnect #apachekafka #kafka #confluent

Comments: 8

  • @Evkayne
    7 months ago

    good episode overall, thanks :)

  • @manideepkumar959
    6 days ago

    My doubt: assume an exception occurs because we are trying to deserialize a JSON message with the Avro deserializer. Once the exception comes, instead of putting the message into the DLQ and having another consumer, can't we deserialize it then and there with another deserializer? Also, lots of casual talk and laughs, unable to concentrate on the subject.

  • @anthonytrisvane4693
    2 years ago

    R.I.P Kafka unless they integrate blockchain

  • @JeffCowan
    2 years ago

    I work in a domain where we are one of the "top Kafka users in the world"... top 5-ish we'll say for now. Literally no one I know internally or externally wants blockchain integrated into Kafka.

  • @anthonytrisvane4693
    2 years ago

    @@JeffCowan that's because the web 3 revolution hasn't begun. Let's speak again in 2 years

  • @nehfi
    1 year ago

    @@anthonytrisvane4693 1 year later, should I wait more? :)

  • @Evkayne
    7 months ago

    what even? really?

  • @Evkayne
    7 months ago

    do i need to wait more? :)