Tip #251
Similar Tips
» Duplicate directory tree
» The poor man's REPL
» Delete old files
» Remove duplicate files
» Remove empty directories

 


ls | xargs rm

Sometimes there are so many files in a directory that the rm command doesn't work: the shell expands the * glob into an argument list longer than the kernel allows, so /bin/rm never even starts.
[root@server logs]# rm *
bash: /bin/rm: Argument list too long


In this case the best option is to use ls in conjunction with xargs, which reads the file names from standard input and invokes rm repeatedly with batches small enough to fit under the limit:
[root@server logs]# ls | xargs rm
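
Note that piping ls breaks on file names containing spaces, quotes or newlines. A safer sketch of the same idea, assuming GNU or BSD find and xargs are available (the -print0/-0 pair passes names separated by NUL bytes instead of whitespace):

[root@server logs]# find . -maxdepth 1 -type f -print0 | xargs -0 rm

Here -maxdepth 1 keeps find from descending into subdirectories and -type f limits the deletion to regular files.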







Comments 

wrong, do not pipe the output of ls, the command is not intended to be used that way and is totally unsafe.

when there are too many files you do:

 for filename in *; do
     rm "$filename:
 done

this is a shell construct and it will handle large lists correctly, and is also safe.
Posted 2009-02-12 23:05:04
for filename in *; do
    rm "$filename":
done

(forgot a quote)
Posted 2009-02-12 23:05:55
crap! wrong again, this is it:

for filename in *; do
    rm "$filename"
done
Posted 2009-02-12 23:07:48
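
The loop above sidesteps the limit because the * glob is expanded inside the shell itself, so no giant argument list is ever handed to a single external command; rm simply runs once per file. A minimal variant of the same idea, as a sketch only (the [ -f ] test and the -- guard are illustrative additions, not part of the comment above):

for filename in *; do
    [ -f "$filename" ] && rm -- "$filename"
done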
If GNU Parallel http://www.gnu.org/software/parallel/ is installed:

ls | parallel -X rm

This will run fewer rm's and can be faster if you remove a lot of files. GNU Parallel is safe for filenames not containing newline (and with -0 it is even safe for those).

Watch the intro video: http://www.youtube.com/watch?v … paiGYxkSuQ
Posted 2010-06-22 04:59:25
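
A sketch of the NUL-delimited form mentioned in the comment above, assuming GNU findutils is installed alongside GNU Parallel (ls cannot emit NUL-separated names, so find -print0 is used as the producer and parallel reads it with -0):

find . -maxdepth 1 -type f -print0 | parallel -0 -X rm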
