Repetitive use of OptiPNG / JPEGTran / GIFSicle
-
12-12-2019
Question
I don't have enough rep to comment on the contributor's post, so I will ask the question here. There was a great Q&A regarding bulk image optimization here: Bulk replacing with compressed images?
I'd like to set up a cron that runs once a month, which performs the image optimizations suggested in the above link. Given the way the script would work, it'll attempt to re-optimize images - but would those tools be smart enough not to degrade the quality?
Solution
There are two options for image compression:
- Lossless compression
- Lossy compression
The former can be run repeatedly on the same file without further loss/degradation of quality (as it's only really removing metadata). Obviously, after the first run, there is no benefit on subsequent runs.
The latter should not be run repeatedly on the same file, otherwise you will degrade the quality each time. The simplest way to prevent this occurring is to keep an activity log of what has/hasn't been processed.
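A minimal sketch of that activity-log pattern (the file names here are hypothetical): each run diffs the full list of images against the log of already-optimised files, so lossy tools never touch the same file twice.

```shell
#!/bin/bash
# Activity-log pattern: only process files not yet recorded in the log.
tmp=$(mktemp -d)
printf 'a.png\nb.jpg\nc.png\n' | sort > "$tmp/all.log"       # every image found
printf 'a.png\n'               | sort > "$tmp/optimised.log" # already processed
# comm -23 prints lines unique to the first file: the still-to-do set
comm -23 "$tmp/all.log" "$tmp/optimised.log" > "$tmp/process.log"
cat "$tmp/process.log"   # b.jpg and c.png; a.png is skipped
```

This is exactly the approach the standalone script takes with its `.optimised.log` / `.process.log` files.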
We wrote a couple of scripts for MageStack to do this, one as a batch/singular process, the other as an "active" process (that actively watches a directory for changes and processes any changed files by itself).
Here's the standalone script, /microcloud/scripts/acc/image_optimise.sh:
#!/bin/bash

# Abort early if any of the required tools are missing. Note the { } group
# (not a subshell) so that "exit" actually terminates the script.
which sponge >/dev/null 2>&1 || { echo "Error: moreutils must be installed" >&2; exit 1; }
which advpng >/dev/null 2>&1 || { echo "Error: advancecomp must be installed" >&2; exit 1; }
which optipng >/dev/null 2>&1 || { echo "Error: optipng must be installed" >&2; exit 1; }
which jpegtran >/dev/null 2>&1 || { echo "Error: libjpeg-progs must be installed" >&2; exit 1; }
which jfifremove >/dev/null 2>&1 || { echo "Error: jfifremove must be installed" >&2; exit 1; }

function usage() {
  cat <<EOF
$(basename "$0") Usage:
$(basename "$0") [directory]
  directory    Directory containing images to optimise
EOF
  exit 0
}

function cleanup() {
  wait
  end_size=$(du -s . | awk '{print $1}')
  savings=$(( start_size - end_size ))
  megabytes=$(( savings / 1024 ))
  echo -e "\nSaved ${megabytes} MB"
  echo "Removing part files"
  find . -path '*.optipart' -exec rm "{}" \; 2>/dev/null
  exit "${1:-0}"
}

function do_png() {
  advpng -z -q "$1"
  optipng -q "$1"
  return 0
}

function do_jpeg() {
  jpegtran -copy none -optimize -outfile "$1" "$1"
  jfifremove < "$1" | sponge "$1"
  return 0
}

function imgopt() {
  # Strip the .optipart suffix so the case statement sees the real extension
  ext=$(echo "$1" | sed -E 's#\.optipart$##')
  case "$ext" in
    *.[Pp][Nn][Gg] )
      do_png "$1"
      ;;
    *.[Jj][Pp][Ee][Gg] | *.[Jj][Pp][Gg] )
      do_jpeg "$1"
      ;;
    * )
      return 1
      ;;
  esac
}

trap 'cleanup 130' 2 3

[ $# -lt 1 ] && usage
[ ! -d "$1" ] && echo "Error: Directory does not exist" && exit 1

dir=$1
cd "$dir" || exit 1
touch .optimised.log

# Build the list of images still to process: everything found on disk,
# minus everything already recorded in the activity log
find . -regextype sed -regex '.*\.\(png\|jpe\?g\)$' | sed -E 's#\./##g' > .all.log
sort -u .all.log -o .all.log
sort -u .optimised.log -o .optimised.log
comm -23 .all.log .optimised.log > .process.log
rm .all.log

process_total=$(wc -l < .process.log)
total=0
count=0
files=()
start_time=$(date +%s)
threads=$(grep -c '^processor' /proc/cpuinfo)
start_size=$(du -s . | awk '{print $1}')

echo "$process_total images to optimise"

while read -r i; do
  count=$(( count + 1 ))
  files+=( "$i" )
  if [ $count -eq $threads ] || [ $(( process_total - total )) -lt 1000 ]; then
    for f in "${files[@]}"; do
      (
        # Work on a hard-linked copy; only replace the original on success
        cp -al "$f"{,.optipart}
        imgopt "${f}.optipart"
        if [ "$(stat -c %s "${f}.optipart")" -eq 0 ]; then
          rm "${f}.optipart"
        else
          mv "$f"{.optipart,}
          echo "$f" >> .optimised.log
        fi
      ) &
    done
    wait
    total=$(( total + count ))
    count=0
    files=()
  else
    continue
  fi
  if [ $(( total % 1000 )) -eq 0 ]; then
    current_time=$(date +%s)
    elapsed=$(( current_time - start_time ))
    [ $elapsed -gt 0 ] || elapsed=1               # avoid division by zero
    images_per_second=$(( total / elapsed ))
    [ $images_per_second -gt 0 ] || images_per_second=1
    remaining=$(( ( ( process_total / images_per_second ) - elapsed ) / 60 ))
    percentage=$(( ( total * 100 ) / process_total ))
    completion_time=$( date +%r -d "$remaining minutes" )
    echo "$elapsed seconds elapsed. $total/$process_total images optimised (${percentage}%) at $images_per_second images/sec. Est. ~$remaining minutes remaining ($completion_time)"
  fi
done < .process.log

cleanup 0
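To run this monthly from cron, as the question asks, an entry along these lines would work (the media path and log location are placeholders for your own environment):

```
# m h dom mon dow  command
0 2 1 * * /microcloud/scripts/acc/image_optimise.sh /path/to/media >> /var/log/image_optimise.log 2>&1
```

Because completed files are recorded in `.optimised.log`, each monthly run only touches images that have changed or been added since the last run.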
The other utility is mage-watch.sh (the source is too lengthy to post here), but you can download it from https://sys.sonassi.com/scripts/mage-watch.sh
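Since the mage-watch.sh source isn't reproduced here, the general "active" technique can only be sketched; this is an assumption on my part (the real script may differ), using inotifywait from inotify-tools to re-optimise images as they are written:

```shell
#!/bin/bash
# Sketch of an "active" watcher (NOT mage-watch.sh itself): re-optimise
# images as they are written, assuming inotify-tools is installed.
optimise_one() {
  case "$1" in
    *.[Pp][Nn][Gg] )                      optipng -q "$1" ;;
    *.[Jj][Pp][Gg] | *.[Jj][Pp][Ee][Gg] ) jpegtran -copy none -optimize -outfile "$1" "$1" ;;
    * )                                   return 1 ;;  # not an image we handle
  esac
}

# Only start watching when a directory argument is supplied
if [ $# -ge 1 ] && [ -d "$1" ]; then
  inotifywait -m -r -e close_write --format '%w%f' "$1" |
  while read -r file; do
    optimise_one "$file"
  done
fi
```

Note this sketch has no activity log: close_write only fires on files that were actually written, so untouched files are never reprocessed, but a lossy step would still need the log-based guard from the batch script above.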