I am using Elasticsearch, where we create a day-wise index and a huge amount of data is ingested every minute. I want to export a few fields from the index created every day to Google Cloud Storage. I am able to achieve this with the output file as JSON, as shown below:
input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "test"
    query => '
    {
      "_source": ["field1", "field2"],
      "query": {
        "match_all": {}
      }
    }
    '
  }
}

filter {
  mutate {
    rename => {
      "field1" => "test1"
      "field2" => "test2"
    }
  }
}
output {
  google_cloud_storage {
    codec => csv {
      include_headers => true
      columns => [ "test1", "test2" ]
    }
    bucket => "bucketName"
    json_key_file => "creds.json"
    temp_directory => "/tmp"
    log_file_prefix => "logstash_gcs"
    max_file_size_kbytes => 1024
    date_pattern => "%Y-%m-%dT%H:00"
    flush_interval_secs => 600
    gzip => false
    uploader_interval_secs => 600
    include_uuid => true
    include_hostname => true
  }
}
However, how do I export it as a CSV file and send it to Google Cloud Storage?
You should be able to change output_format to plain, but this setting is going to be deprecated. You should remove output_format and use the codec setting instead, which supports a csv output format. If you want to rename your fields, you can add a filter section and mutate/rename the fields however you like. Make sure to also change the columns setting in your csv codec output:
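Below is a minimal sketch of the relevant pipeline sections, assuming the same field names and google_cloud_storage settings shown in the question (bucketName, creds.json, field1/field2 renamed to test1/test2); adjust them to your own setup:

filter {
  mutate {
    # rename the source fields to the column names you want in the CSV
    rename => {
      "field1" => "test1"
      "field2" => "test2"
    }
  }
}

output {
  google_cloud_storage {
    # the csv codec replaces the output_format setting, which is being deprecated
    codec => csv {
      include_headers => true
      # use the renamed field names here so they match the filter above
      columns => [ "test1", "test2" ]
    }
    bucket => "bucketName"
    json_key_file => "creds.json"
  }
}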