I am encountering an issue while working with Change Data Capture using Kafka Connect.
Problem: I am working on a Change Data Capture pipeline, and I'm getting the following error message when trying to process records at the sink end of the pipeline:

Error decoding/mapping Kafka record: SinkRecord{kafkaOffset=186, timestampType=CreateTime}

The sink connector is not able to deliver even a single record to the destination DB (Cassandra).
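To pin down which record is failing and why, Connect's standard dead-letter-queue error handling can capture the offending message; a minimal sketch of the sink-side properties (the DLQ topic name is hypothetical):

errors.tolerance=all
errors.deadletterqueue.topic.name=dlq.whatsapp.user_blocked_status
errors.deadletterqueue.context.headers.enable=true
errors.log.enable=true
errors.log.include.messages=true

Note that the dead-letter queue only covers converter and transform failures, so if the decode error happens inside the sink task itself this may not catch it, but the error logging should still print the full record context.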
Details:
Source Connector: io.debezium.connector.postgresql.PostgresConnector
Sink Connector: com.datastax.oss.kafka.sink.CassandraSinkConnector

Sink connector configuration:
connector.class=com.datastax.oss.kafka.sink.CassandraSinkConnector
loadBalancing.localDc=GCP9
auth.password=******
topics=whatsapp.public.user_blocked_status
tasks.max=1
contactPoints=****
topic.whatsapp.public.user_blocked_status.clickstream.user_blocked_status.deletesEnabled=false
auth.provider=PLAIN
topic.whatsapp.public.user_blocked_status.clickstream.user_blocked_status.consistencyLevel=LOCAL_QUORUM
topic.whatsapp.public.user_blocked_status.clickstream.user_blocked_status.nullToUnset=false
port=9047
insert.mode=insert
topic.whatsapp.public.user_blocked_status.clickstream.user_blocked_status.mapping=user_blocked_status_id=value.user_blocked_status_id,user_glid=value.user_glid,blocked_glid=value.blocked_glid,block_status=value.block_status,blocking_date=value.blocking_date,unblocking_date=value.unblocking_date,mod_id=value.mod_id,user_ip=value.user_ip
key.space=clickstream
table=user_blocked_status
auth.username=cas***
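The mapping above addresses fields as value.<column>, which only works if the record value arrives at the connector as a structured object rather than raw bytes or a plain string. For reference, a minimal sketch of the worker converter settings such a mapping assumes (the JsonConverter choice is an assumption; an Avro converter with a schema registry would work equally well):

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true

With schemas.enable=true, Debezium's JSON output carries an inline schema envelope ({"schema": ..., "payload": ...}) that the sink can use to resolve the mapped fields.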
As per the documentation provided by DataStax (linked below), I tried different ways of serializing the data, but I keep hitting the same failure; the consumer log only shows the task seeking back to the failed batch: Seeking to offset 183 for partition blocked_status-0
https://docs.datastax.com/en/kafka/doc/kafka/kafkaMapKeyPair.html
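One thing I suspect: Debezium wraps each row change in an envelope (before/after/source/op), so the columns the mapping refers to (e.g. value.user_blocked_status_id) actually sit under value.after, not at the top level of the value. A minimal sketch of the stock Debezium unwrap SMT on the source connector, which flattens the envelope so the after-state becomes the record value (drop.tombstones=false keeps delete markers for the sink):

transforms=unwrap
transforms.unwrap.type=io.debezium.transforms.ExtractNewRecordState
transforms.unwrap.drop.tombstones=false

Is this unwrap step required for the DataStax sink mapping to resolve these fields, or is something wrong in the serialization itself?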