How can I consume a message in Kafka in all the instances of a service?

275 Views

I have a use case where I need to consume a message in all the instances of a service. Let's say my service is running on 5 instances; then a message coming through Kafka needs to be processed on every instance. Since this data is used in many other APIs, we store it in local memory to serve those APIs.

Since this data is used very frequently, I don't want to store it in Redis or some other global cache, which would increase latency and the cost of network calls.

I want to create a pipeline where any change in the data by a third-party service is propagated to all the instances, so that every instance serves the new data in its APIs.


There is 1 best solution below

Naor Levi

It isn't possible with Kafka's standard consumer-group model, where each message is delivered to only one consumer in a group. It seems that Kafka isn't the right choice for this case.

I can suggest 3 solutions:

  1. You can use Redis, as you mentioned above, trading off a little latency.
  2. If the service instances are running on the same machine, you could use shared memory for all the processes to read from (then it doesn't matter which process received the event).
  3. You can hack something, but it is an anti-pattern and I don't suggest doing so, as it undermines the purpose of the Consumer Group mechanism. It's a total abuse of Kafka.
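A minimal sketch of option 2, using Python's standard `multiprocessing.shared_memory`. This assumes all service processes run on one machine; the segment name `service-cache`, the length-prefix framing, and the snapshot payload are all illustrative choices, not part of any real service:

```python
import json
from multiprocessing import shared_memory

SEGMENT_NAME = "service-cache"  # hypothetical name all processes agree on
SEGMENT_SIZE = 4096             # fixed-size segment, must fit the snapshot

def write_snapshot(data: dict) -> shared_memory.SharedMemory:
    """Writer side: the one process that consumed the Kafka event
    serializes the new state into the shared segment."""
    payload = json.dumps(data).encode()
    shm = shared_memory.SharedMemory(name=SEGMENT_NAME, create=True,
                                     size=SEGMENT_SIZE)
    shm.buf[:4] = len(payload).to_bytes(4, "big")   # 4-byte length prefix
    shm.buf[4:4 + len(payload)] = payload
    return shm

def read_snapshot() -> dict:
    """Reader side: every other process attaches to the segment by name
    and parses the current snapshot."""
    shm = shared_memory.SharedMemory(name=SEGMENT_NAME)
    size = int.from_bytes(shm.buf[:4], "big")
    data = json.loads(bytes(shm.buf[4:4 + size]).decode())
    shm.close()
    return data
```

In a real service you would also need synchronization (e.g. a lock or a versioned double buffer) so readers never observe a half-written snapshot; this sketch omits that for brevity.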

The hack you can do is to consume with a different Consumer Group on each instance (for example, generate a random UUID as the group id when you start polling). Since each instance is then the sole member of its own group, Kafka delivers every message to every instance.
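The per-instance group id trick can be sketched as follows. This builds a consumer configuration using standard librdkafka/confluent-kafka property names; the broker address and topic name are placeholders, and the actual `Consumer` usage is shown only in comments so the sketch stays self-contained:

```python
import uuid

def broadcast_consumer_config(bootstrap_servers: str) -> dict:
    """Build a Kafka consumer config with a unique, per-instance group id.

    Because each instance joins its own one-member consumer group, Kafka
    delivers every message to every instance (the broadcast behaviour
    asked for), at the cost of abusing the consumer-group mechanism.
    """
    return {
        "bootstrap.servers": bootstrap_servers,
        "group.id": f"broadcast-{uuid.uuid4()}",  # unique per process
        "auto.offset.reset": "latest",  # fresh instances skip old history
        # These throwaway groups are never reused, so committed offsets
        # are dead weight; skip auto-committing them.
        "enable.auto.commit": False,
    }

# With the confluent-kafka client installed, each instance would then do:
#   from confluent_kafka import Consumer
#   consumer = Consumer(broadcast_consumer_config("broker:9092"))
#   consumer.subscribe(["data-updates"])      # hypothetical topic name
#   while True:
#       msg = consumer.poll(1.0)
#       if msg is not None and msg.error() is None:
#           update_local_cache(msg.value())   # refresh in-memory data
```

Note the operational downside: every restart leaves a dead consumer group behind on the broker until its metadata expires, which is part of why this counts as an anti-pattern.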