Question

Hi, I'm looking to see which file is changing in a directory. I'd like to get the md5sum of every file and write it to a text file. Then, after I know a file has changed, I'd like to run it again so I can diff the two output files to see what exactly changed. Here is what I've tried; however, it doesn't work as I need.

Also, not only do I need to get the md5sum of every file in a folder, including subdirectories, I also need it to not follow symlinks.

#!/bin/bash
#

cd /sys/class
for i in $(find . -type f)
do
    ls -lt "$i" >> /home/george/Desktop/before.txt
done
echo "Finished!"

Thank you for any help

===Edit===

I put my actual paths in, as I don't really see a need to hide them. Anyway, running this returned only a few files (output below), which are the files in the top-level folders, meaning it's not going into subdirectories and finding those files too. By the way, sorry, my bash is pretty rusty.

--w------- 1 root root 4096 Jun 20 03:03 ./gpio/export
--w------- 1 root root 4096 Jun 20 03:03 ./gpio/unexport
-rw-r--r-- 1 root root 4096 Jun 20 03:03 ./firmware/timeout
-r--r--r-- 1 root root 4096 Jun 20 03:04 ./drm/version

===Edit2===

I'm not exactly sure why some of these files aren't being found, for instance /sys/class/backlight/intel_backlight/brightness.

And there are many others like that; so many files aren't being found for some reason.


Solution

The cd is unnecessary, and you are already in fact bypassing symlinks: find does not follow symbolic links by default, and -type f matches only regular files. So the loop is unnecessary, too:

find /path/to/directory -type f -exec md5sum {} + >before.txt

If your find is too old to support -exec {} +, try -exec {} \; instead.
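The full before/after workflow might then look like the sketch below. The directory and output file names are placeholders (the directory defaults to the current one here just for illustration); symlinks are not followed because find does not follow them by default.

```shell
#!/bin/sh
# DIR is a placeholder; point it at the directory you want to watch.
DIR="${DIR:-.}"

# First snapshot: one "hash  filename" line per regular file,
# subdirectories included, symlinks not followed.
find "$DIR" -type f -exec md5sum {} + > before.txt

# ... later, after something may have changed ...
find "$DIR" -type f -exec md5sum {} + > after.txt

# Any line present in only one snapshot is a changed, added, or
# removed file.  (diff exits non-zero when the files differ, which
# is expected here.)
diff before.txt after.txt || true
```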

For the md5sum comparison, you could try simply removing identical lines:

fgrep -vxf before.txt after.txt | less

This assumes the list in before.txt will fit into fgrep's pattern memory; but if you are dealing with a few tens of thousands of files at most, it can probably cope. This will not identify files deleted since before.txt, though.
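To also catch deletions, one alternative (not in the original answer) is to sort both snapshots and compare them with comm. A rough sketch, wrapped in a hypothetical helper function:

```shell
#!/bin/sh
# Hypothetical helper: compare two md5sum snapshots, reporting deletions
# as well as additions and changes.
# $1 = "before" snapshot, $2 = "after" snapshot.
compare_snapshots() {
    # comm requires sorted input.
    sort "$1" -o before.sorted
    sort "$2" -o after.sorted
    # -3 suppresses lines common to both files:
    #   column 1 = only in "before" (changed or deleted),
    #   column 2 = only in "after"  (changed or added).
    comm -3 before.sorted after.sorted
}
```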

OTHER TIPS

If your file list is small enough that you can do it all in memory, you might consider sorting before.txt by the hash. If you do the same for after.txt, you can walk both files line by line and identify matches even if a filename has changed. You would also be able to skip over deleted or added files with less trouble than if you had to interpret a plain diff before.txt after.txt.
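The sort-by-hash idea can be sketched with join, which lines up files whose content is unchanged even when they were renamed. The helper name is hypothetical:

```shell
#!/bin/sh
# Hypothetical helper: join two md5sum snapshots on the hash column
# (field 1), so a renamed-but-unchanged file still matches.
# $1 = "before" snapshot, $2 = "after" snapshot.
match_by_hash() {
    # join requires input sorted on the join field.
    sort -k1,1 "$1" > before.byhash
    sort -k1,1 "$2" > after.byhash
    # Output: hash, old filename, new filename.
    join -j 1 before.byhash after.byhash
}
```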

If using the file modification date is an option, you can use ls -lt | head to pick out the newest file and record its timestamp. Then, when you want to check for changes, run ls -lt again and go through anything newer than the date you stored. This should work nicely regardless of file list size, but it is vulnerable to someone modifying the last-modification date (which would require root privileges).
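Rather than parsing ls output, the same timestamp idea can be sketched with find's -newer test against a stamp file (the helper name is hypothetical):

```shell
#!/bin/sh
# Hypothetical helper: list regular files modified more recently than a
# stamp file, without following symlinks.
# $1 = directory to scan, $2 = stamp file holding the reference time.
timestamp_check() {
    find "$1" -type f -newer "$2"
}

# Typical usage: "touch stampfile" after each check, then run
# timestamp_check again later to see what changed since.
```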

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow