I'm running Elastic Stack 8.3.0 and I'm encountering an error in Kibana "Discover" with one of my indices. The index is populated from the following CSV file:
timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success,failureMessage,bytes,sentBytes,grpThreads,allThreads,URL,Latency,Hostname,IdleTime,Connect
2023/09/17 02:23:03.892,167,otcs_user_interface-1,200,,login-browse-upload-download-search-logout 1-1,text,true,,10746,681,1,1,http://{URL},113,{HostName},0,30
2023/09/17 02:23:03.892,113,otcs_user_interface-1-0,302,,login-browse-upload-download-search-logout 1-1,text,true,,2420,143,1,1,http://{URL},113,{HostName},0,30
2023/09/17 02:23:04.008,51,otcs_user_interface-1-1,200,,login-browse-upload-download-search-logout 1-1,text,true,,8326,538,1,1,http://{URL},51,{HostName},0,0
2023/09/17 02:23:04.100,3,Debug Sampler,200,OK,login-browse-upload-download-search-logout 1-1,text,true,,2657,0,1,1,null,0,{HostName},0,0
2023/09/17 02:23:04.104,9,login-2,200,,login-browse-upload-download-search-logout 1-1,text,true,,8041,471,1,1,http://{URL},9,{HostName},0,0
I use Filebeat to collect this CSV log and ship it to Logstash.
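For context, my Filebeat input looks roughly like the sketch below (the input id, paths, host, port, and certificate locations are placeholders for my real values; the tag is what the Logstash pipeline filters on):

filebeat.inputs:
  - type: filestream
    id: otcs-csv-results                                   # hypothetical input id
    paths:
      - /path/to/results/*.csv                             # placeholder path to the CSV log
    tags: ["login-browse-upload-download-search-logout"]   # the tag the Logstash filter checks

output.logstash:
  hosts: ["my-logstash-host:5044"]                         # placeholder host and Beats port
  ssl.certificate_authorities: ["/path/to/ca.crt"]         # placeholders matching the TLS settings
  ssl.certificate: "/path/to/filebeat.crt"                 # on the beats input below
  ssl.key: "/path/to/filebeat.key"

And this is my Logstash pipeline: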
input {
  beats {
    port => "${beats_port}"
    ssl => true
    ssl_certificate_authorities => ["${ssl_certificate_authorities}"]
    ssl_certificate => "${ssl_certificate}"
    ssl_key => "${ssl_key}"
    ssl_verify_mode => "peer"
  }
}
filter {
if "login-browse-upload-download-search-logout" in [tags]{ csv { columns => ["timeStamp", "elapsed", "label", "responseCode", "responseMessage", "threadName", "dataType", "success", "failureMessage", "bytes", "sentBytes", "grpThreads", "allThreads", "URL", "Latency", "Hostname", "IdleTime","Connect"] separator => "," convert => {"elapsed" => "integer"} } date { match => [ "timeStamp", "yyyy/MM/dd HH:mm:ss.SSS" ] target => "timeStamp" } mutate { gsub => ["threadName", " 1-1", ""] } mutate { rename => ["threadName", "TestcaseName"] remove_field => ["responseCode"] remove_field => ["responseMessage"] remove_field => ["dataType"] remove_field => ["bytes"] remove_field => ["sentBytes"] remove_field => ["grpThreads"] remove_field => ["allThreads"] remove_field => ["URL"] remove_field => ["Latency"] remove_field => ["IdleTime"] remove_field => ["Connect"] }
if [label] == "login_logout-testcase" or [label] == "browse-testcase" or [label] == "export_content-testcase" or [label] == "import_content-testcase" or [label] == "search-testcase" or [label] == "login-browse-upload-download-search-logout-testcase" { if [success] == "true" { mutate { add_field => {"Testcase_Status" => "success" } } } else { mutate { add_field => {"Testcase_Status" => "failure" } } }
mutate { rename => ["elapsed", "Testcase_ExecutionTime"] remove_field => ["label"] remove_field => ["success"] remove_field => ["failureMessage"] } } else { mutate { rename => ["label", "Transaction_Name"] rename => ["failureMessage", "Transaction_ErrorMessage"] }
if [success] == "true" { mutate { add_field => {"Transaction_Status" => "success" } } } else { mutate { add_field => {"Transaction_Status" => "failure" } } }
mutate { remove_field => ["elapsed"] remove_field => ["success"] } } } }
output {
  if "login-browse-upload-download-search-logout" in [tags] {
    elasticsearch {
      hosts => ["${elasticsearch_host}:${elasticsearch_port}"]
      index => "otcs_unit_transactions-%{+yyyy.MM.dd}-000001"
      user => "${logstash_writer_username}"
      password => "${logstash_writer_password}"
      ssl => true
      ssl_certificate_verification => true
      cacert => "${cacert}"
      ilm_enabled => true
      ilm_rollover_alias => "otcs_unit_transactions"
      ilm_pattern => "{now/d}-000001"
      ilm_policy => "otcs_unit_transactions_ilm"
    }
  }
}
As you can see, I have a date filter plugin, but I'm not sure whether there is an error in it that prevents Logstash from parsing the timeStamp column into the correct date format. Has anyone encountered this error?
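To isolate just the date pattern, a minimal throwaway pipeline like the sketch below could be run with bin/logstash -f (everything in it except the pattern is invented for the test; it feeds one timestamp string from the CSV above through the same date filter and prints the result):

input {
  generator {
    lines => ["2023/09/17 02:23:03.892"]   # one timestamp value copied from the CSV
    count => 1
  }
}
filter {
  date {
    match => ["message", "yyyy/MM/dd HH:mm:ss.SSS"]   # same pattern as in my pipeline
    target => "timeStamp"
  }
}
output {
  stdout { codec => rubydebug }   # timeStamp should appear as a parsed date; a failure adds a _dateparsefailure tag
}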