I am trying to do the following: take some data (let's say a single floating point value) from Kafka and add it to the body of an HTTP call to some REST API, through a Logstash pipeline. Is that really possible? I was also trying to do this with a query to Elasticsearch, with no luck. Find my Logstash pipeline below:
input {
  kafka {
    bootstrap_servers => 'kafka:9094'
    topics => ["..."]
    client_id => "..."
    group_id => "..."
    codec => json {}
  }
}
filter {
  http {
    verb => "POST"
    url => "http://some url"
    body_format => "json"
    body => {
      "instances" => "%{instances}"
    }
  }
}
output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => "demo_results"
  }
}
In other words, I do not know how to feed the data from Elasticsearch into the http filter call!
Thanks a lot,
George.
What you could do is leverage the scheduling capabilities of the elasticsearch input so that it executes at regular time intervals to fetch whatever you need. Then, once you have the data you wanted, you can use the http filter to call your remote API with the data you fetched from Elasticsearch.
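Putting those two pieces together, here is a minimal sketch of such a pipeline. The index name, query, schedule, and API URL are placeholders you would replace with your own values:

```
input {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => "my-source-index"            # placeholder: index holding your value
    query => '{ "query": { "match_all": {} } }'
    schedule => "* * * * *"               # cron syntax: run once every minute
  }
}
filter {
  http {
    verb => "POST"
    url => "http://some-api/endpoint"     # placeholder: your REST API
    body_format => "json"
    body => {
      "instances" => "%{instances}"       # field taken from the fetched document
    }
  }
}
output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => "demo_results"
  }
}
```

The http filter adds the API response to the event, so the elasticsearch output at the end indexes the enriched document into `demo_results`.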