Shell scripts for day-to-day use...


docbroke

Banned
Joined
Feb 21, 2019
Messages
406
Age
39
Location
India
I like Z for its jump list (quickly jump to directories I frequent):
It's a bit heavyweight for the Pandora, so there I just use an alias (or browse with Thunar and drag and drop the directory into bash).
If you use zsh, use this instead.
If you are on Windows, use this.
I like V too.
I also used the same author's epub reader, but later moved to epr, which also supports images.
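For anyone wondering what the lightweight alternative looks like, here is a minimal sketch of alias-based jumping on a low-powered machine; the directory names and the j helper are only examples, not docbroke's actual setup:

Bash:
# One alias per frequently used directory (paths are just placeholders).
alias proj='cd ~/projects'
alias dl='cd ~/Downloads'

# Or a tiny function that jumps by keyword instead of a full path:
j () {
    case "$1" in
        proj) cd ~/projects ;;
        dl)   cd ~/Downloads ;;
        *)    echo "no shortcut for '$1'" >&2 ;;
    esac
}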
 

ClockworkCoder

Chaotic Neutral
Joined
Jan 21, 2016
Messages
1,654
Location
Menzoberranzan
It's not really a script, but I've just recently moved from using a "bare" git repository to manage my dotfiles to using GNU Stow. The benefit is that my dotfiles are now organised in folders by application name, and it's easier to find, edit, or add to them.


Steps, in short (example using zsh)

Bash:
mkdir -p ~/dotfiles/zsh
mv ~/.zshrc ~/dotfiles/zsh
cd ~/dotfiles && stow zsh
The final step creates a symlink at ~/.zshrc pointing to ~/dotfiles/zsh/.zshrc
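Adding further applications follows the same pattern; this is just a sketch using git as an example package name:

Bash:
# Each application gets its own sub-folder ("package") under ~/dotfiles,
# mirroring where the files should live relative to $HOME.
mkdir -p ~/dotfiles/git
mv ~/.gitconfig ~/dotfiles/git
cd ~/dotfiles && stow git    # creates ~/.gitconfig -> ~/dotfiles/git/.gitconfig

# To remove the symlinks again:
cd ~/dotfiles && stow -D git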

Also, if I need to quickly switch branches, I often use a temporary "work in progress" commit to avoid stashing issues:

Bash:
git-wip-resume () {
        # If the last commit is a "wip" commit, undo it and restore the
        # changes as uncommitted work in the working tree.
        local is_wip
        is_wip=$(git log -1 --oneline | grep -oP "\bwip$")
        if [ "$is_wip" = "wip" ]
        then
                echo "Reverting to previous work in progress"
                git reset --soft HEAD~1
        fi
}

git-wip () {
        # Park all current changes to tracked files in a throwaway "wip" commit.
        git commit -am "wip"
}
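A typical round trip with these helpers might look like this (the branch name is just an example):

Bash:
git-wip                     # park uncommitted changes in a temporary "wip" commit
git checkout other-branch   # do whatever is needed elsewhere...
git checkout -              # ...then come back to the original branch
git-wip-resume              # drop the "wip" commit, leaving the changes uncommitted again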
 

docbroke

Banned
Joined
Feb 21, 2019
Messages
406
Age
39
Location
India
I didn't know about stow, but I've just organized my dotfiles using it. This will be useful when I migrate my dotfiles to the Pyra. Hopefully soon :)
 

docbroke

Banned
Joined
Feb 21, 2019
Messages
406
Age
39
Location
India
Another script I like and use routinely is for playing videos on Kodi from my laptop (sending YouTube links, or playing local videos from the laptop to Kodi running on a Pi connected to the TV):
send_to_kodi
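Scripts like this generally drive Kodi over its JSON-RPC HTTP interface. A rough sketch of that idea follows; this is not send_to_kodi itself, and the host, port, lack of authentication, and the kodi_play helper name are all assumptions for the example:

Bash:
# Assumes Kodi's web server is enabled on the Pi at kodi.local:8080
# with no authentication; adjust host, port and credentials to taste.
kodi_play () {
    curl -s -X POST "http://kodi.local:8080/jsonrpc" \
        -H "Content-Type: application/json" \
        -d "{\"jsonrpc\":\"2.0\",\"method\":\"Player.Open\",\"params\":{\"item\":{\"file\":\"$1\"}},\"id\":1}"
}

# Local file already reachable by Kodi (e.g. over NFS/SMB):
kodi_play "smb://nas/videos/movie.mkv"
# YouTube link via the YouTube add-on (the exact plugin:// URL format
# depends on the add-on version):
kodi_play "plugin://plugin.video.youtube/play/?video_id=VIDEO_ID"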
 

levi

Still fresh, damnit!
Joined
Oct 6, 2008
Messages
12,649
Location
Somewhere off the coast of the EU
A script that will check for duplicated files in the folders you specify on your computer. You can use it to deduplicate any kind of user data, such as old game ROMs that you inherited from various old sources, your porn folder, or really any kind of data in folders that's purely for your own use. I've saved a couple of gigs on my rather full server, which is a welcome relief.

Code:
#!/usr/bin/env bash

maxsplit=800 # The maximum number of files this will process without trying to
# split the path into multiple subdirectories.

declare -a paths

function normalisepath {
    # (i.e. eliminate /../ in the middle of supplied paths)
    local better=$1
    better=${better//\/.\//\/}
    while [[ $better =~ ([^/][^/]*/\.\./) ]]
    do
        better=${better/${BASH_REMATCH[0]}/}
    done
    echo "$better"
}

function dividepath {
    local path=$1
    local maxdepth=$2
    if  [ $maxdepth -gt 0 ] &&
      [ `find "$path" -type f|wc -l` -gt $maxsplit ]
      then
        #  split it
        while IFS= read -r file
        do
            if [ -n "$file" ]
            then
                paths+=("$file")
            fi
        done <<< `find "$path" -maxdepth 1 -mindepth 1 -type f`
        while IFS= read -r path
        do
            if [ -n "$path" ]
            then
                dividepath "$path" $(($maxdepth - 1))
            fi
        done <<< `find "$path" -maxdepth 1 -mindepth 1 -type d`
    else
        paths+=("$path")
    fi
}

for path in "$@"
do
    path=$(normalisepath "$path")
    dividepath "$path" 2
done

declare hashes
for path in "${paths[@]}"
do
    if [ -d "$path" ]; then echo "Searching in $path..."; fi
    # Append a trailing newline so results from separate paths don't run together.
    hashes+="$(find "$path" -type f -exec stat -c %s {} \; -exec sha512sum {} \; | paste - - -d" ")"$'\n'
done

echo "Looking for dups..."
declare dups
dups=`echo -n "$hashes" | cut -f -2 -d " " | sort -g | uniq -d`
if [ ${#dups} -lt 64 ]
then
    echo "Well done, no duplicates found"
else
    while IFS= read -r dup; do
        if [ ${#dup} -ge 64 ]; then
            echo "The following files seem to be duplicates:"
            echo "$hashes"|grep -F "$dup"|cut -f 3- -d " "
        fi
    done <<< "$dups"
fi
It fettles the paths you supply to split them into subdirectories if your collection contains lots of files, because otherwise it can be rather quiet while processing, which is not something I personally like. There's a number, maxsplit, defined near the top which might need fiddling with for your particular use. If your computer or disc is significantly faster than mine you might want to double it, and if you've got big files it'll slow up the hashing, so you might want to reduce it to make it noisier.
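For reference, invocation is just a matter of passing one or more directories; the script filename and the paths below are only an example:

Bash:
# Assuming the script above is saved as finddups.sh and made executable:
chmod +x finddups.sh
./finddups.sh ~/roms /mnt/server/media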
 

levi

Still fresh, damnit!
Joined
Oct 6, 2008
Messages
12,649
Location
Somewhere off the coast of the EU
I must admit, I didn't even consider md5 because for a number of years now, shasumming everything has been quick enough for me. I did check sha256sum versus 512 and to my surprise found it was quicker to do the full 512.

Edit: Just tried it, and it saved 20 seconds off a 93-second sha512sum run, and it didn't flag up any false duplicates on my data either. I think I'll still stick with sha512sum for the extra level of security it gives me, though.
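If anyone wants to repeat the comparison on their own data, something like this works (the path is just an example; output is discarded so only hashing speed is measured):

Bash:
# Compare hash throughput over the same set of files.
time find ~/roms -type f -exec md5sum {} + > /dev/null
time find ~/roms -type f -exec sha256sum {} + > /dev/null
time find ~/roms -type f -exec sha512sum {} + > /dev/null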
 