Data Reading From Kafka

I want to read data from a topic using KafkaConsumer. I know I can read data using the seekToBeginning and seekToEnd methods, but if I stop the consumer while reading and restart it after some time, I want Kafka to resume from where it left off. Which methods or settings can I use for this so that I do not have to read all the data again?
Simply do not seek at all.
The behaviour you want is exactly what Kafka manages for you: your consumer belongs to a consumer group, and the Kafka broker stores the last committed offset per consumer group by default. If your consumer dies or is stopped, that offset is kept for one week by default (IIRC). Once you start a consumer in the same consumer group again, it continues reading from the last successfully committed position.
If there is no offset for the consumer group to continue from (e.g. when it starts for the first time), the consumer falls back to either the beginning or the end of the topic. This is controlled by the consumer configuration auto.offset.reset, which you would need to set to earliest. By default it is latest, so keep that in mind.
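Here is a minimal sketch of such a consumer. The topic name my-topic, the group id my-consumer-group, and the broker address localhost:9092 are placeholders for your own setup:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ResumingConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");      // placeholder broker address
        props.put("group.id", "my-consumer-group");            // offsets are stored per group id
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        props.put("enable.auto.commit", "true");                // let the client commit offsets periodically
        props.put("auto.offset.reset", "earliest");             // only used when no committed offset exists yet

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // subscribe() (rather than assign() + seek) lets the group coordinator
            // hand back the last committed offsets when the consumer restarts
            consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```

If you prefer to commit manually, set enable.auto.commit to false and call consumer.commitSync() after processing each batch. Either way, as long as you restart with the same group.id, the next run picks up at the committed offset and you never need to call seekToBeginning yourself.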