I'm going to publish some Docker containers that share a common logging framework (written in Go). The log format is JSON.
There is distinct data in this custom JSON log format that I would like to be indexed/searchable with Kibana. My understanding is that I need to transform/filter this data, but I'm struggling to understand how that's done even after RTFM. Do I have to extract JSON from JSON?
Some example output from a minimal sample application, as seen in the Docker logs:
{"app_name":"SampleApp","app_port":6666,"app_version":"0.0.2","file":"/build/examples/sample/app/runmain/main.go:131","func":"example.com/code/microservices/examples/sample/app/runmain.mainErr.func1","fw_version":"v0.0.1","level":"info","msg":"listening","time":"2024-01-18T20:31:39.163970213+07:00"}
The data makes its way into fluentd and is logged as:
2024-01-18 20:31:39.000000000 +0000 f905b090d278: {"container_id":"f905b090d278ec2cc2f1f912acdbf8787a0a1c91d8ab7b00ad84e9da20c8c147","container_name":"/fervent_jemison","source":"stdout","log":"{\"app_name\":\"SampleApp\",\"app_port\":6666,\"app_version\":\"0.0.2\",\"file\":\"/build/examples/sample/app/runmain/main.go:131\",\"func\":\"example.com/code/microservices/examples/sample/app/runmain.mainErr.func1\",\"fw_version\":\"v0.0.1\",\"level\":\"info\",\"msg\":\"listening\",\"time\":\"2024-01-18T20:31:39.163970213+07:00\"}\r"}
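To illustrate what I mean by "JSON from JSON": the application's line arrives as an escaped string inside the envelope's log key, so getting at the app fields is a two-step decode. A stdlib Go sketch of that unwrapping (field names taken from the record above):

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// outerRecord mirrors the envelope the Docker fluentd log driver emits;
// the application's own JSON line arrives as an escaped string in "log".
type outerRecord struct {
	ContainerID   string `json:"container_id"`
	ContainerName string `json:"container_name"`
	Source        string `json:"source"`
	Log           string `json:"log"`
}

// unwrap parses the envelope, then parses the JSON string embedded in
// its "log" field, returning the application's own fields.
func unwrap(raw string) (map[string]any, error) {
	var outer outerRecord
	if err := json.Unmarshal([]byte(raw), &outer); err != nil {
		return nil, err
	}
	var inner map[string]any
	// Trim the trailing \r Docker captured from the stdout line.
	if err := json.Unmarshal([]byte(strings.TrimSpace(outer.Log)), &inner); err != nil {
		return nil, err
	}
	return inner, nil
}

func main() {
	raw := `{"container_id":"f905b090d278","source":"stdout","log":"{\"level\":\"info\",\"msg\":\"listening\"}\r"}`
	fields, err := unwrap(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(fields["msg"]) // prints: listening
}
```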
My current fluentd configuration:

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<filter docker.**>
  @type parser
  key_name log
  reserve_data true
  <parse>
    @type json
  </parse>
</filter>

<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host es
    port 9200
    user elastic
    password elastic
    logstash_format true
    logstash_prefix fluentd
    logstash_dateformat %Y%m%d
    include_tag_key true
    type_name access_log
    tag_key @log_name
    <buffer>
      flush_interval 1s
    </buffer>
  </store>
  <store>
    @type stdout
  </store>
</match>
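For completeness, the container side would be configured roughly like this (a sketch; the image name and tag option are my placeholders). Note that without an explicit tag option, records arrive tagged with the container ID, as in the fluentd output above, rather than with a docker. prefix:

```yaml
services:
  sampleapp:
    image: sampleapp:0.0.2   # hypothetical image name
    logging:
      driver: fluentd
      options:
        fluentd-address: localhost:24224
        tag: docker.{{.Name}}   # gives records a tag the docker.** filter can match
```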
Initially, I'll deploy this all locally on a single host.
Any tips or further direction would be greatly appreciated. This is a new world to me.