Let's say I have an Order and a Product.
Whenever a product is created I publish a PRODUCT_CREATE event, and for an order it is ORDER_CREATE.
If I use Kafka (or any message broker), should I have one queue per microservice and process events one by one, or should I have a queue per feature?
Let's say I quickly create a product and then an order (almost instantly).
Solution 1: I have one queue. I can check whether the product exists when I try to create the order. If it doesn't, I throw an error because the state is inconsistent.
Solution 2: I have two queues, one for PRODUCT_CREATE and one for ORDER_CREATE. If I try to create an order and the product is not there yet, I can retry until the product appears. At some point I give up and conclude the product does not exist.
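For Solution 1, the single consumer can dispatch both event types from the same queue in order. A minimal sketch, where `products` is a hypothetical in-memory stand-in for whatever store the service uses:

```python
def handle_event(event, products):
    """Process one event from the single shared queue, in arrival order.

    `products` is a hypothetical store (here just a dict keyed by id).
    """
    if event["type"] == "PRODUCT_CREATE":
        products[event["productId"]] = event
        return "product created"
    if event["type"] == "ORDER_CREATE":
        if event["productId"] not in products:
            # With one ordered queue, a missing product means the data
            # really is inconsistent, so fail fast instead of retrying.
            raise ValueError("product does not exist")
        return "order created"
    raise ValueError("unknown event type: " + event["type"])
```

Because the queue preserves order, the ORDER_CREATE handler never has to wait for a product that was created earlier.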
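The retry loop in Solution 2 could look roughly like this. A sketch only: `find_product` is a hypothetical lookup (e.g. a DB query), and the attempt count and backoff values are illustrative:

```python
import time

def handle_order_create(event, find_product, max_attempts=5, base_delay=0.1):
    """Consume an ORDER_CREATE event, retrying until the product exists.

    `find_product` is a hypothetical lookup function; max_attempts and
    base_delay are arbitrary example values, not recommendations.
    """
    for attempt in range(max_attempts):
        product = find_product(event["productId"])
        if product is not None:
            return {"status": "created", "product": product}
        # Exponential backoff before re-checking for the product.
        time.sleep(base_delay * 2 ** attempt)
    # Give up: the product genuinely seems not to exist.
    return {"status": "rejected", "reason": "product not found"}
```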
What is the recommended solution?
I tried Solution 2 with retries. It works, but what if the product truly doesn't exist? It feels like a dirty approach.
And Solution 1 is simple, but I can't process anything in parallel.
1. If the business logic is tightly coupled, Solution 1 is recommended.
Don't worry about parallelism: Kafka provides partitions to increase throughput. Send the PRODUCT_CREATE and ORDER_CREATE events with the same key (productId), so they land on the same partition; multiple consumers can then process different products in parallel while each product's events are still consumed in order.
Suppose your business has five kinds of events: it is impractical to maintain five queues and reconstruct the ordering on the consumer side. So the key question is how tightly your business logic is coupled.
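The effect of keying can be sketched like this. Note Kafka's default partitioner actually uses a murmur2 hash; crc32 is a stand-in here just to keep the example self-contained and deterministic:

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    # Stable hash of the key modulo the partition count.
    # (Kafka's default partitioner uses murmur2, not crc32; any stable
    # hash demonstrates the point: same key -> same partition.)
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Both events for the same product map to the same partition, so a
# consumer of that partition sees PRODUCT_CREATE before ORDER_CREATE.
events = [("p-42", "PRODUCT_CREATE"), ("p-42", "ORDER_CREATE")]
partitions = {partition_for(key, 6) for key, _ in events}
assert len(partitions) == 1  # same key, same partition
```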
2. Another way to improve Solution 2 is a Kafka transaction, which ensures the PRODUCT_CREATE and ORDER_CREATE events are sent to the Kafka brokers atomically.
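A transactional send could look roughly like this. This is a sketch modeled on the transactional API shape of the Kafka clients (begin/commit/abort); the `producer` object and the topic names are assumptions, not a specific library's exact signatures:

```python
def publish_atomically(producer, product_event, order_event):
    """Send both events in one Kafka transaction: either both become
    visible to read-committed consumers, or neither does.

    `producer` is assumed to expose a transactional API
    (begin_transaction / commit_transaction / abort_transaction).
    """
    producer.begin_transaction()
    try:
        # Keying both events by productId keeps per-product ordering.
        producer.produce("products", key=product_event["productId"],
                         value=product_event)
        producer.produce("orders", key=order_event["productId"],
                         value=order_event)
        producer.commit_transaction()
    except Exception:
        # On any failure, abort so neither event is exposed.
        producer.abort_transaction()
        raise
```

For this to help, the consumers must read with the read_committed isolation level, otherwise they can still see events from aborted transactions.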