Error fetching data for metricset kafka.partition


I have an SSL-enabled Kafka service on an AKS cluster, and I want to collect its metrics with Metricbeat, index them with Elasticsearch, and display them in Kibana.

My Kafka is installed as:

helm install dp bitnami/kafka \
            --set replicaCount=1 \
            --set rbac.create=true \
            --set externalAccess.enabled=true \
            --set externalAccess.autoDiscovery.enabled=true \
            --set auth.clientProtocol=tls \
            --set auth.interBrokerProtocol=tls \
            --set "auth.tls.existingSecrets[0]=dp-ume-kafka-secrets-0" \
            --set auth.tls.password=$PASSWORD -n sslkafka
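With externalAccess enabled, the broker advertises an external TLS listener on port 9094. To confirm that listener is reachable from outside the cluster, I check it roughly like this (the namespace comes from the install command above; the exact service name may differ):

# List the external LoadBalancer service created for the broker
# (release "dp" in namespace "sslkafka", per the helm command above)
kubectl get svc -n sslkafka

# Confirm the advertised address actually answers a TLS handshake on 9094
openssl s_client -connect <kafka-IP>:9094 </dev/null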

In Metricbeat I am enabling the Kafka module as:

- module: kafka
  enabled: true
  metricsets:
    - partition
    - consumergroup
  period: 10s
  hosts: ["<kafka-IP>:9094"]

But I am getting the error below from the Metricbeat pod:

Error fetching data for metricset kafka.consumergroup: error in connect: getting cluster client for advertised broker with address <kafka-IP>:9094: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)

Error fetching data for metricset kafka.partition: error in connect: getting cluster client for advertised broker with address <kafka-IP>:9094: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)
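Since the brokers only accept TLS on 9094, I suspect the module also needs the Beats SSL settings pointed at the same CA and client certificate/key that Kafka was set up with. Something like the sketch below is what I have in mind (the certificate paths are placeholders, and the certs would have to be mounted into the Metricbeat pod), but I am not sure this is the right fix:

- module: kafka
  enabled: true
  metricsets:
    - partition
    - consumergroup
  period: 10s
  hosts: ["<kafka-IP>:9094"]
  ssl.enabled: true
  # placeholder paths - CA and client cert/key mounted into the Metricbeat pod
  ssl.certificate_authorities: ["/etc/metricbeat/certs/ca.crt"]
  ssl.certificate: "/etc/metricbeat/certs/client.crt"
  ssl.key: "/etc/metricbeat/certs/client.key"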

Any suggestions?
