Data from ElasticSearch as input to http_poller


I am trying to do the following: take some data (let's say a single floating-point value) from Kafka and add it to the body of an HTTP call to some REST API, through a Logstash pipeline. Is that really possible? I also tried doing this with a query to Elasticsearch, with no luck. Find my Logstash pipeline below:

input {
  kafka {
    bootstrap_servers => 'kafka:9094'
    topics => ["..."]
    client_id => "..."
    group_id => "..."
    codec => json {}
  }
}

filter {
  http {
    verb => "POST"
    url => "http://some url"
    body_format => "json"
    body => {
      "instances" => "%{instances}"
    }
  }
}

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => "demo_results"
  }
}

In other words, I do not know how to feed the data from Elasticsearch into the HTTP call!

Thanks a lot,

George.


Answer by Val:

What you could do is leverage the scheduling capability of the elasticsearch input, so that it runs at regular intervals and fetches whatever you need.

Then, when you get the data you wanted, you can use the http filter to call your remote API using the data you fetched from Elasticsearch.

input {
  elasticsearch {
    hosts => ["...:9200"]
    index => "demo_data"
    query => '{ "query": { "match": { "instances": 20 } } }'
    size => 1
    # run the query every minute (cron syntax)
    schedule => "* * * * *"
  }
}
filter {
   http {
     verb => "POST"
     url => "http://some url"
     body_format => "json"
     body => {
       "instances" => "%{instances}"
     }
   }
}
output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => "demo_results"
  }
}
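
If you also want to keep what the remote API returns, the http filter can copy the response onto the event before it is indexed. A minimal sketch, assuming the logstash-filter-http plugin's target_body / target_headers options are available in your plugin version (the field names prediction and api_headers are hypothetical, pick your own):

filter {
  http {
    verb => "POST"
    url => "http://some url"
    body_format => "json"
    body => {
      "instances" => "%{instances}"
    }
    # store the API response body and headers on the event
    target_body => "prediction"
    target_headers => "api_headers"
  }
}

The prediction field will then be present on the documents written to the demo_results index by the elasticsearch output.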