I am new to Confluent Cloud. I have a scenario where we send a huge volume of messages from the Boomi platform to Kafka using Boomi's Kafka connector. I want to know if dead letter queues can be created automatically in case of failure during message transmission from Boomi to Kafka.

I see that with the HTTP Sink connector I could achieve automatic DLT creation on failure, but that topic accepts both error and success messages. So is the dead letter mechanism only applicable to consumers and not producers?

Thanks for your reply.



Abhishek:

Kafka itself does not natively support dead letter queues (DLQs) for producers. Dead letter queues are typically associated with consumer-side processing, allowing failed messages to be redirected for further analysis or handling.
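The consumer-side pattern can be illustrated with a minimal sketch. This is not Boomi or Confluent code; the `process` function, the message shape, and the dead-letter list (standing in for a DLQ topic) are all hypothetical:

```python
# Hedged sketch of consumer-side dead-lettering: messages that fail
# processing are redirected to a DLQ stand-in instead of being dropped.

def process(msg):
    # Pretend business logic; raises on bad input (illustrative only).
    if msg.get("bad"):
        raise ValueError("cannot process")
    return msg["value"].upper()

def consume_with_dlq(messages, dlq):
    # Route messages that fail processing to a dead-letter list.
    # In real Kafka code you would produce to a DLQ topic here,
    # ideally attaching the error as a message header.
    results = []
    for msg in messages:
        try:
            results.append(process(msg))
        except Exception as exc:
            dlq.append({"original": msg, "error": str(exc)})
    return results

dlq = []
out = consume_with_dlq([{"value": "a"}, {"bad": True, "value": "b"}], dlq)
```

Here the good message is processed and the bad one lands in `dlq` with its error attached, which is the behavior a consumer-side DLQ gives you.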

You specifically have to add logic to move the failed messages to a different topic and process them accordingly.
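On the producer side that logic might look like the sketch below. The `send` callable, the "orders" topic name, and the error are assumptions for illustration; with a real client you would wrap the producer's produce/delivery check the same way:

```python
# Hedged sketch of producer-side dead-lettering: if the send to the
# main topic fails, the record is stashed on a DLQ stand-in.

def produce_with_dlq(send, records, dlq):
    # `send` represents a Kafka producer's produce() plus delivery check.
    for rec in records:
        try:
            send("orders", rec)  # hypothetical main topic
        except Exception as exc:
            # Redirect the failed record for later analysis or replay.
            dlq.append({"record": rec, "error": str(exc)})

def flaky_send(topic, rec):
    # Simulated broker rejection for one record (illustrative only).
    if rec == "oversized":
        raise RuntimeError("MessageSizeTooLarge")

dlq = []
produce_with_dlq(flaky_send, ["ok", "oversized", "ok2"], dlq)
```

The successful records go through untouched while the rejected one ends up in `dlq`, so nothing is silently lost even though Kafka gives you no producer-side DLQ for free.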

Steephen:

Abhishek provided a good overview of DLQs for Kafka, and you have already experienced using a DLQ with the HTTP Sink connector. That means DLQ implementation is connector-specific, and I didn't find a DLQ option for Boomi's Kafka connector in its documentation.

But Boomi has a configuration available to guarantee message delivery, which you can try:

Message Delivery Policy - Select the delivery policy to guarantee message delivery between the producer and consumer. At least once indicates that messages are never lost, but can be redelivered.

Please find the reference.