

Ever been on a machine that just wouldn't respond? As soon as you're root, lower the priority of the offending process ID(s) by using the 'renice' command to push their nice value up to the maximum of 19:

# renice +19 -p <PID>
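A hedged sketch for the common follow-on step of finding the worst CPU hog first, assuming a Linux procps-style ps (the PID variable name and the sort key are purely illustrative):

PID=$(ps -eo pid,pcpu --sort=-pcpu | awk 'NR==2 {print $1}')   # top CPU consumer, skipping the header
renice +19 -p "$PID"                                           # push it to the lowest priority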




View Comments »



You can use the 'uniq' command to handle duplicate lines in a file:

$ cat tst
a
a
b
c
c
d

$ uniq -d tst     # show only the duplicated lines
a
c

$ uniq -u tst     # show only the lines that appear just once
b
d

$ uniq tst        # remove duplicates
a
b
c
d
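Keep in mind that uniq only collapses adjacent duplicate lines, so unsorted input needs a sort first; counting occurrences is the classic combination (unsorted.txt is just a placeholder name):

$ sort unsorted.txt | uniq -c | sort -rn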


View Comments »

Use the following to see the commands you use most often based on your shell history:

history | awk '{print $2}' | sort | uniq -c | sort -rn | head
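Note that this assumes the default history format, where the command name is the second field. If you have HISTTIMEFORMAT set to something like '%F %T ', the timestamp shifts the command name to the fourth field, so you'd need a tweak along these lines:

history | awk '{print $4}' | sort | uniq -c | sort -rn | head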


View Comments »

If you want to tail the errors on another terminal, just push them to a fifo:

$ mkfifo cmderror

$ mycommand 2> cmderror


On your other terminal:

$ tail -f cmderror
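One thing to watch: opening the fifo for writing blocks until something opens it for reading, so mycommand may appear to hang until the tail is started on the other terminal. If keeping the errors on disk is acceptable, a plain log file (cmderror.log is just an example name) avoids that and works with the same tail -f:

$ mycommand 2> cmderror.log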



View Comments »

A longish one-liner (I actually wrote it on one line first) that gives a reasonably accurate sum of your file systems' totals (the mount list alone is never good enough):


df | egrep -v "(Filesystem|\/dev$|shm$|dvd|cdrom)" | awk '{totalu += $2 ; totalf += $4} END {print "Total space in devices: " (totalu/1024/1024) " GB\nFree space total: " (totalf/1024/1024) " GB"}'
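If you have GNU coreutils, df can do the summing itself; a rough equivalent (the --total and -x options are GNU extensions, and the excluded filesystem types are just examples):

df -h --total -x tmpfs -x devtmpfs | tail -n 1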


View Comments »


ls | xargs rm

Sometimes there are so many files in a directory that the rm command doesn't work:
[root@server logs]# rm *
bash: /bin/rm: Argument list too long


In this case the best option is to use ls in conjunction with xargs:
[root@server logs]# ls | xargs rm
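Note that ls | xargs rm falls over on file names containing spaces or quotes. Where find and xargs support the null-separated options (GNU and most modern BSDs do), something like this is safer:

find . -maxdepth 1 -type f -print0 | xargs -0 rm
find . -maxdepth 1 -type f -delete     # GNU find can also delete on its own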



View Comments »

cat file.txt | sort -R | tail -1

Picks a random line from the file.
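With GNU coreutils, shuf does the same job directly (both sort -R and shuf are GNU extensions):

$ shuf -n 1 file.txt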


View Comments »

alias vim='mplayer rocky-theme.ogg >/dev/null 2>&1 & vim'

Play your favourite tune while you code! :)


View Comments »

Make a backup of the existing files, then copy the new files in from the update directory. First cd to proddir, then:

ls /update-200805/ | xargs -n1 -I xxx cp xxx xxx.`date +%Y%m%d` ; cp /update-200805/* .
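If you have GNU cp, a hedged alternative is to let cp create the dated backups itself as it overwrites (--backup and -S are GNU extensions):

cp --backup=simple -S ".$(date +%Y%m%d)" /update-200805/* .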


View Comments »

/usr/sbin/scutil --set ComputerName "COMPUTER NAME HERE"
-or-
/usr/sbin/scutil --set LocalHostName "LOCALHOST NAME HERE"

Alternatively use the --get switch to return them.
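For example:

/usr/sbin/scutil --get ComputerName
/usr/sbin/scutil --get LocalHostName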


View Comments »

/usr/sbin/systemsetup -settimezone "TIMEZONE HERE"


Values can be found in /usr/share/zoneinfo
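For example (the timezone name is only an illustration; if your systemsetup supports it, -listtimezones prints the accepted values):

sudo /usr/sbin/systemsetup -settimezone "Europe/London"
/usr/sbin/systemsetup -listtimezones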


View Comments »

I use the following to list non-system users. It should be fairly portable, though it won't work on systems without the getent command.

alias lsusers='getent passwd | tr ":" " " | awk "\$3 >= $(grep UID_MIN /etc/login.defs | cut -d " " -f 2) { print \$1 }" | sort'
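A simpler hedged variant, if you're happy to hard-code the usual UID_MIN of 1000 and skip the quoting gymnastics (65534 is the conventional 'nobody' UID):

getent passwd | awk -F: '$3 >= 1000 && $3 != 65534 {print $1}' | sort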


View Comments »

List which files under /var are currently held open by running processes. You might want to get rid of the awk, sort, uniq and grep depending on how much info you need:

for dir in $(find /proc -maxdepth 1 -type d -name "*[0-9]");do ls -l $dir/fd/ | awk '{print $11}';done | sort | uniq | grep "^/var/"
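If lsof is installed, it can give you much the same answer with less plumbing (+D descends the whole tree, which can be slow on a large /var):

lsof +D /var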



View Comments »



Count the files in the current directory by type, as reported by the file command:

file -N * | awk -F":" '{type[$2]++}END{ for (i in type) print type[i],i }'


View Comments »

The for loop has a problem with entries containing spaces, whether they're file names or lines in a text file. Common solutions include changing the IFS variable to exclude spaces (so that the for loop only uses tabs and line breaks to separate entries) and piping the entries to a 'while read line; do' loop. However, setting and resetting IFS each time you want to include or exclude spaces is kinda messy, and the 'while read' approach means you can't make any normal use of the read command within the loop. For a once-per-line loop that doesn't present these problems, try:


i=0
numberoflines=$(wc -l < mytextfile.txt)
while [ $i -lt $numberoflines ]; do
    let i++
    contentoflinenumberi="$(sed -n ${i}p mytextfile.txt)"
    read -p "Do you wish to see line ${i}? (y/n) " reply
    if [ "$reply" = "y" ]; then
        echo "$contentoflinenumberi"
    fi
done
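If you prefer the 'while read' form after all, reading from a spare file descriptor keeps stdin free for interactive use inside the loop; a sketch of the same prompt-per-line idea:

while IFS= read -r -u 3 line; do
    read -p "Do you wish to see this line? (y/n) " reply
    if [ "$reply" = "y" ]; then
        echo "$line"
    fi
done 3< mytextfile.txt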


View Comments »


Insert a colon after every two characters of the input, for example to turn a bare hex string like 0123456789ab into 01:23:45:67:89:ab:

# awk 'BEGIN{FS=""}{for(i=1;i<=NF;i+=2){ r=r $i $(i+1)":"}}END{sub(/:$/,"",r);print r}' file


View Comments »

During an ftp session, get multiple files like this (the 'prompt' command toggles off the per-file confirmation so mget fetches everything without asking):


prompt
mget *.tar.gz


View Comments »

I guess anyone who's administered several remote boxes has had the unfortunate problem of (when not thinking straight) taking down the network card on a machine you have no physical access to. The result being that the ssh session you used to connect dies. The typical mistake is to do something like (as root):

ifconfig eth0 down; ifconfig eth0 inet 123.4.5.6; ifconfig eth0 up

The unfortunate result being that the first statement disconnects your session and hangs up the rest of the chain, so the network never comes back up. A nice way around this is to use the bash "disown" builtin command, i.e.:

(sleep 5; ifconfig eth0 inet 123.4.5.6; ifconfig eth0 up)& disown -h $! ; ifconfig eth0 down

In this case you launch a backgrounded task that is disconnected from the session (meaning the ssh session dying won't kill the process), which sleeps for 5 seconds (to give the down a chance to happen), then configures the network card as appropriate and brings it back up. As soon as this launches and is disowned, the final command immediately takes the network card down. If the configuration change keeps the IP address the same, you'll find that after 5 seconds your bash prompt just comes back and the session resumes.


View Comments »

This piece of code lists the size of every file and subdirectory of the current directory, much like du -sch ./* except the output is sorted by size, with larger files and directories at the end of the list. Useful to find where all that space goes.


du -sk ./* | sort -n | awk 'BEGIN{ pref[1]="K"; pref[2]="M"; pref[3]="G";} { total = total + $1; x = $1; y = 1; while( x > 1024 ) { x = (x + 1023)/1024; y++; } printf("%g%s\t%s\n",int(x*10)/10,pref[y],$2); } END { y = 1; while( total > 1024 ) { total = (total + 1023)/1024; y++; } printf("Total: %g%s\n",int(total*10)/10,pref[y]); }'
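If your du and sort are reasonably recent GNU versions, -h on both sides gets you most of the way there with far less awk (no grand total line, though):

du -sh ./* | sort -h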


View Comments »

Grep file(s) and highlight the grep matches using less...


grep query *.php | less +/query
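A hedged variant that keeps grep's own highlighting instead of relying on the less search (needs a grep with --color and a less with -R):

grep --color=always query *.php | less -R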


View Comments »

Add one number per line from stdin until a 0 is found, then print the result.
dc -e '[+pq]sQ0[?d0=Q+lXx]dsXx'
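For example, this should print 6:

printf '1\n2\n3\n0\n' | dc -e '[+pq]sQ0[?d0=Q+lXx]dsXx'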


View Comments »

Extract the list of links from a web page by pulling the References section out of a lynx dump:

lynx -dump http://www.spantz.org | grep -A999 "^References$" | tail -n +3 | awk '{print $2 }'


View Comments »

Mass-renaming files using find and sed:

find -name "*.php3" | sed 's/\(.*\).php3$/mv "&" "\1.php"/' | sh

(this example will rename all .php3 files to .php)
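Where the Perl-based rename utility is available (shipped as rename or prename on Debian-style systems; note that util-linux also ships a different rename with other syntax), the same job can be done without generating shell commands:

rename 's/\.php3$/.php/' *.php3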


View Comments »





