Suppose I have the following JSON document (inspired by this post):
Initial document
{
  "key": "value",
  "ips": [
    {
      "ip": "1.2.3.4",
      "macAddress": "ac:5f:3e:87:d7:1a"
    },
    {
      "ip": "5.6.7.8",
      "macAddress": "ac:5f:3e:87:d7:2a"
    },
    {
      "ip": "9.10.11.12",
      "macAddress": "ac:5f:3e:87:d7:3a"
    },
    {
      "ip": "13.14.15.16",
      "macAddress": "42:12:20:2e:2b:ca"
    }
  ]
}
Now I would like to read every macAddress, pass it to a hash function (e.g. md5sum), and write the result back to the JSON document.
Desired output
{
  "key": "value",
  "ips": [
    {
      "ip": "1.2.3.4",
      "macAddress": "45ee585278a0717c642ff2cb25a8e441"
    },
    {
      "ip": "5.6.7.8",
      "macAddress": "ab47bf90cb9f385127977569e676ce70"
    },
    {
      "ip": "9.10.11.12",
      "macAddress": "a5e9785db428e3956a47776dbd00fc91"
    },
    {
      "ip": "13.14.15.16",
      "macAddress": "f75d61937f70252ff139adee241daab4"
    }
  ]
}
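For reference, each hash above should be the md5sum of the raw MAC address string with no trailing newline (that is what the jq -j flag in my script below emits), for example:

printf '%s' 'ac:5f:3e:87:d7:1a' | md5sum | cut -d ' ' -f1
# 45ee585278a0717c642ff2cb25a8e441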
Currently I have the following shell script, but I think it can be done more elegantly, preferably in a one-liner.
json_doc="{\"key\": \"value\", \"ips\": [{\"ip\":\"1.2.3.4\",\"macAddress\":\"ac:5f:3e:87:d7:1a\"},{\"ip\":\"5.6.7.8\",\"macAddress\":\"ac:5f:3e:87:d7:2a\"},{\"ip\":\"9.10.11.12\",\"macAddress\":\"ac:5f:3e:87:d7:3a\"},{\"ip\":\"13.14.15.16\",\"macAddress\":\"42:12:20:2e:2b:ca\"}]}"
# Hash the macAddress of each .ips entry, collecting the results into a JSON array
ip_list=$(jq -c '.ips[]' <<< "$json_doc" |
    while read -r jsonline; do
        # Extract the raw MAC (jq -j emits no trailing newline) and hash it
        hashmac="$(jq -s -j '.[] | .macAddress' <<<"$jsonline" | md5sum | cut -d ' ' -f1)"
        # Replace the macAddress with its hash
        jq --arg hashmac "$hashmac" -s -r '.[] | .macAddress |= "\($hashmac)"' <<<"$jsonline"
    done | jq -s)
# Update json document with ip list containing hashed mac addresses
jq --argjson ips "$ip_list" '.ips = $ips' <<<"$json_doc"
A variation of peak's answer from the linked question: two invocations of jq, the first to calculate the md5 hashes and the second to reconstruct the result back into the original JSON using reduce.

The second jq invocation should be read carefully. The initial arguments -s -R read the multi-line, non-JSON output produced by the for-loop into jq's context, while the --slurpfile argument is needed to write the calculated hashes back into the original JSON. The slurp action reads the whole file into memory, so this command might not be effective for really large JSON files.
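A sketch of what that pipeline could look like, assuming the document is saved as file.json (the file name and the exact filter wording are a reconstruction from the description above, not the original answer verbatim):

for mac in $(jq -r '.ips[].macAddress' file.json); do
    # hash each raw MAC string, with no trailing newline
    printf '%s' "$mac" | md5sum | cut -d ' ' -f1
done |
jq -s -R --slurpfile orig file.json '
    # turn the raw multi-line input into an array of hash strings
    (split("\n") | map(select(length > 0))) as $hashes
    # walk the original document and overwrite each macAddress in order
    | reduce range(0; $hashes | length) as $i ($orig[0];
        .ips[$i].macAddress = $hashes[$i])'

Run against the sample document, this should print the desired output shown in the question. Note that both -s and --slurpfile load their entire input into memory, which is where the size caveat comes from.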