I generate a JSON file every 5 minutes from a Python script and try to push the data to Elasticsearch, but Logstash prints the following message and doesn't push any data to Kibana.

My pipeline: File --> Logstash --> Elasticsearch --> Kibana

JSON file output:

{
    "platform": "Computer",
    "mode": "Live",
    "users": 899
}
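
For context, here is a minimal sketch of the kind of generator script that could produce such a file. The path matches the Logstash config below and the payload matches the sample above; the overwrite-every-five-minutes loop and all other names are assumptions based on the description:

import json
import time

OUTPUT_PATH = "D:/elk/logs_folder/test.json"  # assumed; matches the path in the Logstash config

def write_snapshot():
    # Hypothetical payload; in the real script these values would come from the application.
    payload = {"platform": "Computer", "mode": "Live", "users": 899}
    with open(OUTPUT_PATH, "w") as f:
        json.dump(payload, f, indent=4)  # pretty-printed, as in the sample output above

while True:
    write_snapshot()
    time.sleep(300)  # regenerate the file every 5 minutes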

Log Message:

[INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

My Logstash.conf file:

input {
        file {
                path => "D:/elk/logs_folder/test.json"
                start_position => "beginning"
                sincedb_path => "NUL"
                codec => "json"
        }
}

filter {
  json {
    skip_on_invalid_json => true
    source => "message"
    target => "jsonData"
    add_tag => [ "_message_json_parsed" ]    
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "in_elk_test"
  }

  stdout { }
}

I'm getting the following log output when I run Logstash:

Log:

[INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.9.0) {:es_version=>8}
[WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"in_elk_test"}
[INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2000, "pipeline.sources"=>["D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/logstash-simple.conf"], :thread=>"#<Thread:0x50d159d0@D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.05}
[INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/data/plugins/inputs/file/.sincedb_f2779ebeeb58467d208ce626cfa73491", :path=>["D:/elk/logs_folder/logs1.json"]}
[INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[INFO ][filewatch.observingtail  ][main][223ec84e00c300043960ade7a8b1b9aa2a896b167223b1bf197e641e0ac119cd] START, creating Discoverer, Watch with file and sincedb collections
[INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

After that last statement, Logstash doesn't parse my JSON file and doesn't push any data to the index.


Please help me figure out the issue and how to address it. Thanks in advance!

1 Answer

Val:

[INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/data/plugins/inputs/file/.sincedb_f2779ebeeb58467d208ce626cfa73491", :path=>["D:/elk/logs_folder/logs1.json"]}

In your file input you also need to set a sincedb_path to make sure that you always read your file from the beginning; otherwise, if you've already started Logstash a few times, it will resume reading from the end of the file:

    file {
            path => "D:/elk/logs_folder/test.json"
            start_position => "beginning"   # only applies to files not yet tracked in the sincedb
            sincedb_path => "NUL"           # "NUL" is the Windows null device, so the read position is never persisted
            codec => "json"
    }
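
Once that is in place and Logstash has been restarted, you can verify that documents actually reached the index. Below is a minimal sketch using only the Python standard library; it assumes security is disabled on the local cluster (on a default Elasticsearch 8.x install you would instead need HTTPS and credentials), and the index name is taken from the question:

import json
from urllib.request import urlopen

# Count the documents that the elasticsearch output has written so far.
with urlopen("http://localhost:9200/in_elk_test/_search?size=0") as resp:
    result = json.load(resp)

print("documents indexed:", result["hits"]["total"]["value"])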