We have created over 2 million files:
arauzo@brain:/tmp $ ls | wc
2099630 2099630 43839565
It was by mistake. You know, that commented line that did not get uncommented when it should have. I'm starting to think I do very strange things, and they are becoming a serious stress test for Linux.
Anyway, deleting such a big bunch of files is not so simple. The first thing you probably think of is:
arauzo@neuron2:/tmp$ rm *.net
-bash: /bin/rm: Argument list too long
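(An aside, not in the original post:) that error comes from the kernel, not from rm itself — there is a limit on the total size of the argument list passed to exec(), and a glob over 2 million names blows well past it. You can inspect the limit with getconf:

```shell
# ARG_MAX is the maximum number of bytes of arguments (plus environment)
# the kernel accepts in a single exec() call; the expanded "*.net" glob
# has to fit inside it.
getconf ARG_MAX
```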
Yes, I know the list is long, but I NEED to remove those files. Let's try another thing:
arauzo@neuron2:/tmp$ for f in *.net; do rm -f $f; done
removed `mlp85_57_24-NI6MxY.net'
removed `mlp85_57_24-NbyBLS.net'
removed `mlp85_57_24-Nc7WVw.net'
...
Nice! This works for thousands of files. But now, what happens with our 2 million files?:
arauzo@brain:/tmp $ rm *.net
Connection to brain closed.
...
arauzo@brain:/tmp $ for f in *.net; do rm -f $f; done
Connection to brain closed.
It crashes! :-( It looks like a 'bug' in bash, but it is probably just the glob expansion of 2 million names exhausting the shell's memory and getting the session killed... :-?
Finally, we managed to remove the 2 million files in groups by their prefix: 32*.net, 33*.net, 34*.net ...
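The batch-by-prefix trick can be sketched like this (a self-contained demo in a scratch directory; the file names and the 32..41 prefix range are made up for illustration):

```shell
# Create a scratch directory full of files named <prefix>_file_<n>.net,
# where the prefix runs over a small, known range (hypothetical data).
dir=$(mktemp -d)
for i in $(seq 1 500); do : > "$dir/$((32 + i % 10))_file_$i.net"; done

# Delete in groups by prefix: each glob expands to a list short enough
# to fit under the argument-size limit.
for p in $(seq 32 41); do
    rm -f "$dir/$p"*.net
done

left=$(find "$dir" -name '*.net' | wc -l)   # no .net files remain
rmdir "$dir"
```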
P.S. A more intelligent solution (as it does not need to store the whole list of files anywhere), suggested by wtanaka and redi:
find /tmp -name "*.net" -print0 | xargs -0 rm -f
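This streams the names from find into xargs, which packs them into as few rm invocations as fit under the argument-size limit. A quick self-contained check (in a throwaway directory, with made-up file names):

```shell
dir=$(mktemp -d)
for i in $(seq 1 1000); do : > "$dir/mlp_$i.net"; done

# -print0 / -0 delimit names with NUL bytes, so the pipeline is safe
# even for file names containing spaces or newlines.
find "$dir" -name '*.net' -print0 | xargs -0 rm -f

remaining=$(find "$dir" -name '*.net' | wc -l)   # 0: everything was removed
rmdir "$dir"
```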
The find | xargs pipeline can be 'simplified' to:
find . -name "*.net" -exec rm -f {} \;
P.S.2. The simplification has the overhead of spawning one rm process per file, while xargs spawns one process per batch of files.
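If you want the xargs-style batching without the pipe, POSIX find also accepts '+' as the -exec terminator, which groups many file names into each rm invocation (demo below uses a throwaway directory with made-up names):

```shell
dir=$(mktemp -d)
for i in $(seq 1 200); do : > "$dir/file_$i.net"; done

# Terminating -exec with '+' instead of ';' passes the files in batches,
# like xargs does, instead of forking one rm per file.
find "$dir" -name '*.net' -exec rm -f {} +

count=$(find "$dir" -name '*.net' | wc -l)   # 0 files left
rmdir "$dir"
```

GNU find additionally has a built-in -delete action that avoids spawning rm at all.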