I'm having an issue with Logstash where it cannot parse NaN values.
[ERROR][logstash.codecs.json ][main][] JSON parse error, original data now in message field {:message=>"Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at ...}
I have multiple fields coming in as NaN from the data source. They are normally float or int, but they can also be NaN. I've browsed the discussions on Elastic's own forum and could not find an answer.
In particular, this is the same problem someone asked about years ago that was left unanswered: https://discuss.elastic.co/t/json-codec-dealing-with-nan-inf/138611
How can I enable ALLOW_NON_NUMERIC_NUMBERS in Logstash, as the error message suggests?
My Kafka pipeline configuration for Logstash is below. I don't have any other configuration changes in the .yml files; everything else is at its defaults.
input {
  kafka {
    bootstrap_servers => "MY_SERVER"
    topics => [
      "MY_SOURCE_1",
      "MY_SOURCE_2"
    ]
    codec => json
    decorate_events => true
  }
}
filter {
  mutate {
    # copy the Kafka topic name out of @metadata so it is indexed with the event
    add_field => { "[topic_name]" => "%{[@metadata][kafka][topic]}" }
  }
}
output {
  elasticsearch {
    hosts => ["MY_SERVER:9200"]
    data_stream => "true"
  }
}
Thanks in advance.
It looks like there is a problem with the structure of the raw data. To diagnose the issue, I recommend updating the Logstash config and checking the incoming data format. Here is an example logstash.conf for you.
I disabled the JSON codec in the Logstash input section. Save this config and run Logstash from the terminal.
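A minimal sketch of such a debug pipeline (the server and topic placeholders are carried over from your config): codec => plain keeps the raw payload as a string in the message field instead of parsing it, and a stdout output with the rubydebug codec prints each event to the terminal.
input {
  kafka {
    bootstrap_servers => "MY_SERVER"
    topics => [
      "MY_SOURCE_1",
      "MY_SOURCE_2"
    ]
    # plain codec: do not parse the payload as JSON, keep it raw in [message]
    codec => plain
    decorate_events => true
  }
}
output {
  # print each event to the terminal so the raw payload can be inspected
  stdout {
    codec => rubydebug
  }
}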
Check the output on the terminal; if it is not valid JSON, you need to fix the raw data at its source.
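If you cannot fix the producer, one possible workaround (a sketch, not tested against your data) is to keep the plain codec from the debug pipeline above, rewrite the bare NaN tokens into valid JSON null with a mutate/gsub filter, and only then parse the string with the json filter:
filter {
  # replace NaN appearing as a field value (a Jackson extension, not valid JSON) with null;
  # this pattern only covers NaN after a colon, e.g. "price": NaN - adjust if NaN can
  # also appear inside arrays
  mutate {
    gsub => [ "message", ":\\s*NaN", ": null" ]
  }
  # the string is now standard JSON and can be parsed into event fields
  json {
    source => "message"
    remove_field => [ "message" ]
  }
}
Fields that were NaN arrive as null, which Elasticsearch simply leaves unindexed for numeric mappings; if you need a sentinel value instead, replace null with something like 0 in the gsub, keeping in mind that it will then affect aggregations.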