How can I split the error.log file of an Apache server into smaller files while reading it line by line continuously?
Question
I am having trouble with the shell script I have written. I am confused by the behaviour of the tail command, and when I view the output of error.log on the terminal, the lines appear with the letter 'e' deleted from words.
Please guide me toward a solution. I want to read the error.log file line by line and, while reading, split every fixed number of lines into small files with suffixes, i.e. log-aa, log-ab, ...; I did this using the split command. After splitting, I want to filter the lines containing the word GET or POST using a regex and store the filtered lines in a new file. Once that file is written, I need to delete all the log-* files.
This is what I have written:
processLine(){
line="$@"
echo $line
tail -f $FILE
}
FILE="/var/log/apache2/error.log"
if [ "$1" == "/var/log/apache2/error.log" ]; then
FILE="/dev/stdin"
else
FILE="$1"
if [ ! -f $FILE ]; then
echo "$FILE : does not exists"
exit 1
elif [ ! -r $FILE ]; then
echo "$FILE: can not read"
exit 2
fi
fi
#BAKIFS=$IFS
#IFS=$(echo -en "\n\b")
exec 3<&0
exec 0<"$FILE"
#sed -e 's/\[debug\].*\(data\-HEAP\)\:\/-->/g' error.log > /var/log/apache2/error.log.1
while read -r line
do
processLine $line
done
exec 0<&3
IFS=$BAKIFS
logfile="/var/log/apache2/error.log"
pattern="bytes"
# read each new line as it gets written
# to the log file
#tail -1 $logfile
tail -fn0 $logfile | while read line ; do
# check each line against our pattern
echo "$line" | grep -i "$pattern"
#sed -e 's/\[debug\].*\(data\-HEAP\)\:/-->/g' error.log >/var/log/apache2/error.log
split -l 1000 error.log log-
FILE2="/var/log/apache2/log-*"
if [ "$1" == "/var/log/apache2/log-*" ]; then
FILE2="/dev/stdin"
else
FILE2="$1"
if [ ! -f $FILE2 ]; then
echo "$FILE : does not exists"
exit 1
elif [ ! -r $FILE2 ]; then
echo "$FILE: can not read"
exit 2
fi
fi
BAKIFS=$IFS
IFS=$(echo -en "\n\b")
exec 3<&0
exec 0<"$FILE2"
while read -r line
do
processLine $line
echo $line >>/var/log/apache2/url.txt
done
exec 0<&3
IFS=$BAKIFS
find . -name "var/log/apache2/logs/log-*.*" -delete
done
exit 0
The code below deletes the files after reading and splitting error.log, but when I add tail -f $FILE it stops deleting them. I want to delete the log-* files once the last line of error.log has been read:
processLine(){
line="$@"
echo $line
}
FILE=""
if [ "$1" == "" ]; then
FILE="/dev/stdin"
else
FILE="$1"
# make sure file exist and readable
if [ ! -f $FILE ]; then
echo "$FILE : does not exists"
exit 1
elif [ ! -r $FILE ]; then
echo "$FILE: can not read"
exit 2
fi
fi
#BAKIFS=$IFS
#IFS=$(echo -en "\n\b")
exec 3<&0
exec 0<"$FILE"
while read -r line
do
processLine $line
split -l 1000 error.log log-
cat log-?? | grep "GET\|POST" > storefile
#tail -f $FILE
done
rm log-??
exec 0<&3
#IFS=$BAKIFS
exit 0
Solution
Your code seems unnecessarily long and complex, and the logic is unclear. It should not have been allowed to grow this big without working correctly.
Consider this:
split -l 1000 error.log log-
cat log-?? | grep "GET\|POST" > storefile
rm log-??
Experiment with this. If these three commands do what you expect, you can add more functionality (e.g. using paths, or checking for the existence of error.log), but don't add code until you have this part working.
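Once those three commands behave as expected, the next step could be sketched roughly as follows. This wraps the same split/grep/rm pipeline in a function together with the existence and readability checks from the original script; the function name and paths are placeholders, not a fixed recipe:

```shell
#!/bin/sh
# filter_log LOG OUT:
# split LOG into 1000-line chunks named log-*, store the lines that
# contain GET or POST in OUT, then delete the chunks again.
filter_log() {
    log="$1"
    out="$2"

    # Make sure the log exists and is readable before doing any work.
    if [ ! -f "$log" ]; then
        echo "$log: does not exist" >&2
        return 1
    elif [ ! -r "$log" ]; then
        echo "$log: cannot be read" >&2
        return 2
    fi

    # Same three commands as above, just with the paths parameterised.
    split -l 1000 "$log" log-
    cat log-?? | grep "GET\|POST" > "$out"
    rm log-??
}
```

As for the "continuously" part of the question: `tail -f` never reaches an end of file, so a loop that waits for the last line will never get there. If the goal is just a live stream of matching lines, something like `tail -F error.log | grep --line-buffered "GET\|POST" >> url.txt` avoids the intermediate log-* files entirely.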