Tailing a logfile and processing each line is missing data when converting a file with ffmpeg

StackOverflow https://stackoverflow.com/questions/22916515

  •  29-06-2023

Problem

I am running a script to tail a log file as per the code snippet below. I am running into a problem where the line passed into $line is missing a number of bytes from the beginning when several lines are written to the log file at nearly the same time.

I can check the file afterwards and see that the offending line is complete in the file, so why is it incomplete in the script? Some kind of buffering issue, perhaps?

The processing can sometimes take several seconds to complete; would that make a difference?

#!/bin/bash
tail -F /var/log/mylog.log | while read line
do
   log "$line"
   ffmpeg -i "from.wav" "to.mp3"
done

Full line in file

"12","","765467657","56753763","test"

example logged $line

657","56753763","test"

Update: I have done some more debugging of my code and it seems the processing that is causing the problem is a call to ffmpeg used to convert a wav to mp3. If I swap that with just a sleep then the problem goes away. Could ffmpeg affect the buffer somehow?


Solution 2

Inside the while loop, ffmpeg reads from standard input, which is the same pipe the loop's read is consuming, so it swallows the remaining lines all at once. To prevent this behavior, a common workaround is to redirect ffmpeg's standard input to /dev/null, as shown below:

tail -F /var/log/mylog.log | while read line
do
   log "$line"
   ffmpeg -i "from.wav" "to.mp3" < /dev/null
done

There are also other commands, such as ssh, mplayer, and HandBrakeCLI, that display the same behavior in a while loop.
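The effect is easy to reproduce without ffmpeg. In this minimal sketch (a hypothetical demo, assuming a POSIX shell), `cat` stands in for ffmpeg, since both read from standard input; the temp file plays the role of the log:

```shell
#!/bin/sh
# Demo: a stdin-reading command inside a while-read loop steals the loop's input.
tmp=$(mktemp)
printf 'line1\nline2\nline3\n' > "$tmp"

# Without the redirect, 'cat' consumes the rest of the loop's stdin,
# so only the first line is processed:
broken=$(while read -r line; do
    echo "got: $line"
    cat > /dev/null
done < "$tmp")

# With stdin redirected to /dev/null, every line survives:
fixed=$(while read -r line; do
    echo "got: $line"
    cat > /dev/null < /dev/null
done < "$tmp")

echo "$broken"
echo "$fixed"
rm -f "$tmp"
```

Here `$broken` contains only `got: line1`, while `$fixed` contains all three lines, which matches the truncation the asker saw: the bytes were not lost, they were read by the inner command.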

Other tips

If you are on a platform with a reasonably recent version of GNU Coreutils (e.g. any fairly recent Linux distro), you can use stdbuf to force line buffering.

The example in the stdbuf manpage is highly relevant:

 tail -f access.log | stdbuf -oL cut -d ' ' -f1 | uniq

This will immediately display unique entries from access.log.
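A runnable variant of the manpage example, using a temp file instead of a live access.log (the filenames and sample data here are made up for illustration; stdbuf requires GNU coreutils):

```shell
#!/bin/sh
# Sketch: line-buffer an intermediate filter with stdbuf.
# With a continuous 'tail -f' feed, -oL makes cut flush each line as it
# is produced instead of waiting for a full block buffer.
log=$(mktemp)
printf '1.2.3.4 GET /\n1.2.3.4 GET /x\n5.6.7.8 GET /\n' > "$log"

# stdbuf -oL forces cut's stdout to be line-buffered:
out=$(stdbuf -oL cut -d ' ' -f1 < "$log" | uniq)
echo "$out"
rm -f "$log"
```

With a finite file the final output is the same either way; the buffering mode only changes *when* each line reaches the next stage, which is what matters for a never-ending `tail -f` stream.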

License: CC-BY-SA with attribution
Not affiliated with StackOverflow