
Output redirection of real-time stdout doesn't work after piping it into uniq or awk


I'm trying to run

fswatch -tr /home/*/*/public_html | grep --line-buffered -E ".php|.xml" | awk '!seen[$0]++' >> log.txt

or equivalently (by using uniq):

stdbuf -i0 -o0 -e0 fswatch -tr /home/*/*/public_html | grep --line-buffered -E ".php|.xml" | uniq >> log.txt

The goal is to avoid duplicate rows. The pipeline works just fine when it prints to the terminal; however, when I try to redirect the output to log.txt, the file stays blank (or no new rows are appended when using >>).

fswatch is a command that monitors changes to the filesystem in real time. It generates a lot of duplicate events, and uniq (or the awk filter) seems to address that just fine.
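For what it's worth, the behaviour is easy to reproduce without fswatch. In this minimal sketch, the subshell loop is just a hypothetical stand-in for fswatch emitting one new event line per second:

( i=0; while sleep 1; do i=$((i+1)); echo "/tmp/example_$i.php Created"; done ) | awk '!seen[$0]++' >> log.txt &
tail -f log.txt    # stays empty until awk's output buffer fills; drop the ">> log.txt" and the lines appear every second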

Any ideas why the output redirection doesn't work?


2 Answers

William Pursell

awk and uniq are going to buffer their output when writing to a regular file. You can get unbuffered behavior with perl:

... | perl -ne '$|=1; print unless $seen{$_}++'

That is the perl equivalent of awk '!seen[$0]++', but setting $| non-zero makes the output unbuffered. To be more correct you should probably write BEGIN{$|=1} so you're not making the assignment on every line of input, but it's not really necessary.
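Plugged into the original pipeline, that would look something like this (a sketch, keeping the fswatch and grep stages from the question unchanged):

fswatch -tr /home/*/*/public_html | grep --line-buffered -E ".php|.xml" | perl -ne 'BEGIN{$|=1} print unless $seen{$_}++' >> log.txt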

RARE Kpop Manifesto

The formatting didn't look right in a comment, so re-posting it here for clarity:

mawk '!__[$_]--{ print; fflush() }'
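Here _ is an unset awk variable, so $_ is just $0 (the whole line), and !__[$0]-- is true only the first time a line is seen; fflush() then forces mawk to flush its standard output after every printed line, so deduplicated events land in log.txt immediately instead of sitting in a block buffer. Substituted into the original command, it would read roughly as follows (a sketch; gawk also provides fflush()):

fswatch -tr /home/*/*/public_html | grep --line-buffered -E ".php|.xml" | mawk '!__[$_]--{ print; fflush() }' >> log.txt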