A script that will check for duplicate files in a specified folder on your computer. You can use it to deduplicate any kind of user data: old game ROMs that you inherited from various sources, your porn folder, or really any folders of data that are purely for your own use. I've saved a couple of gigabytes on my rather full server, which is a welcome relief.
Code:
#!/usr/bin/env bash
# The maximum number of files this will process without trying to split the
# path into multiple subdirectories.
maxsplit=800
# The list of files and directories to hash, filled in by dividepath below.
declare -a paths
function normalisepath {
    # Tidy a supplied path, i.e. eliminate /./ and component/../ pairs in
    # the middle of it.
    local better=$1
    better=${better//\/.\//\/}                   # collapse /./ to /
    while [[ $better =~ ([^/][^/]*/\.\./) ]]
    do
        better=${better/${BASH_REMATCH[0]}/}     # drop one component/../ pair
    done
    echo "$better"
}
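# For example (made-up path), normalisepath "/srv/media/./roms/old/../snes"
# would echo "/srv/media/roms/snes".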
function dividepath {
    # If a directory holds more than maxsplit files, queue its immediate
    # files individually and recurse into its subdirectories (up to maxdepth
    # levels), so progress messages appear more often while hashing.
    local path=$1
    local maxdepth=$2
    if [ "$maxdepth" -gt 0 ] &&
       [ "$(find "$path" -type f | wc -l)" -gt "$maxsplit" ]
    then
        # Split it: first queue the files directly inside this directory...
        while IFS= read -r file
        do
            if [ -n "$file" ]
            then
                paths+=("$file")
            fi
        done <<< "$(find "$path" -maxdepth 1 -mindepth 1 -type f)"
        # ...then recurse into each subdirectory.
        while IFS= read -r subdir
        do
            if [ -n "$subdir" ]
            then
                dividepath "$subdir" $((maxdepth - 1))
            fi
        done <<< "$(find "$path" -maxdepth 1 -mindepth 1 -type d)"
    else
        paths+=("$path")
    fi
}
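# Build the work list: tidy each supplied path, then split large trees (at
# most two directory levels deep) so the hashing stage shows some progress.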
for path in "$@"
do
    path=$(normalisepath "$path")
    dividepath "$path" 2
done
# For every file, record one line of the form "size hash  filename".
# (stat -c %s is GNU coreutils; on BSD/macOS you'd want stat -f %z instead.)
declare hashes
for path in "${paths[@]}"
do
    # Entries in paths can be single files as well as directories; only
    # announce the directories to keep the output readable.
    if [ -d "$path" ]; then echo "Searching in $path..."; fi
    hashes+="$(find "$path" -type f -exec stat -c %s {} \; -exec sha512sum {} \; | paste - - -d" ")"$'\n'
done
echo "Looking for dups..."
declare dups
dups=`echo -n "$hashes" | cut -f -2 -d " " | sort -g | uniq -d`
if [ ${#dups} -lt 64 ]
then
echo "Well done, no duplicates found"
else
while IFS= read -r dup; do
if [ ${#dup} -ge 64 ]; then
echo "The following files seem to be duplicates:"
echo "$hashes"|grep -F "$dup"|cut -f 3- -d " "
fi
done <<< "$dups"
fi
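To run it, save the script to a file (I'll call it dedup.sh here, but the name is up to you), make it executable, and point it at one or more folders:
Code:
chmod +x dedup.sh
./dedup.sh ~/roms /srv/media/films
Each group of duplicates it finds is printed under a "The following files seem to be duplicates:" line; deciding which copy to keep and delete is left to you.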
It fettles the paths you supply, splitting them into subdirectories if your collection contains lots of files, because otherwise the script can be rather quiet while processing, which is not something I personally like. There's a number, maxsplit, defined near the top which might need fiddling with for your particular use. If your computer or disc is significantly faster than mine you might want to double it, and if you've got big files the hashing will slow things down, so you might want to reduce it to make the script noisier.
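If you'd rather not edit the script each time you tune it, one small tweak (my suggestion, not something the script above does) is to let an environment variable override the default at the top:
Code:
maxsplit=${MAXSPLIT:-800} # use MAXSPLIT from the environment if it's set
Then a one-off noisier run is just MAXSPLIT=200 ./dedup.sh /path/to/collection.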