Super fast delete a folder with large number of files

posted Jul 4, 2016, 12:08 AM by Danny Xu   [ updated Jul 6, 2016, 9:20 AM ]

time perl -e 'for(<*>){((stat)[9]<(unlink))}'

time find ./ -type f -delete (the BEST)

time rsync -a --delete blanktest/ test/

rm command              Not capable of deleting a large number of files
find with -exec         14 minutes for half a million files
find with -delete       5 minutes for half a million files
Perl                    1 minute for half a million files
rsync with --delete     2 minutes 56 seconds for half a million files
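The timings above can be reproduced with a small harness like this sketch. The file count is reduced from the article's half a million so it finishes quickly, and the directory name testdir is arbitrary; raise N to approach the numbers in the table:

```shell
#!/bin/sh
# Create a directory full of small files, then time the fastest
# method from the table (find -delete).
N=1000
mkdir -p testdir
i=1
while [ "$i" -le "$N" ]; do
    echo testing > "testdir/$i.txt"
    i=$((i + 1))
done

time find testdir -type f -delete
```

Swap the last line for the Perl or rsync variants to compare all three on your own hardware.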

Nice article. It inspired me to check the results for find -delete, rsync, and perl myself, and I got a different winner: on my PC the leader is find. Setup: Linux 4.2, Ubuntu 14.04, Intel i5 (4 cores), Intel SSD 5xx series, EncFS encryption.

$ time for i in $(seq 1 500000); do echo testing >> $i.txt; done

real 1m13.263s
user 0m7.756s
sys 0m57.268s

Operation was repeated for each test with similar results.

$ time rsync --delete -av ../empty/ ./

real 4m5.197s
user 0m4.308s
sys 1m43.400s

$ time find ./ -delete

real 2m19.819s
user 0m1.044s
sys 0m59.100s

$ time perl -e 'unlink for ( <*> ) '

real 3m17.482s
user 0m2.524s
sys 1m29.196s

You can also do this from Perl directly; note that you need glob to expand the wildcard when removing files:

unlink glob "'/tmp/*.*'";

The extra apostrophes are needed so that filenames containing spaces are treated as part of one pattern string.

Caveats: it won't delete files with no "." in the name, won't delete files with a leading ".", and it does no error reporting.
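If those caveats matter, a find one-liner avoids all three (dotfiles, extensionless names, silent failures). A sketch, using an illustrative directory /tmp/demo:

```shell
# Set up an example directory with names the glob pattern misses:
mkdir -p /tmp/demo/subdir
touch /tmp/demo/.hidden /tmp/demo/noext /tmp/demo/a.txt /tmp/demo/subdir/keep

# Delete every regular file directly under /tmp/demo, including
# dotfiles and names without a "." — -maxdepth 1 keeps it
# non-recursive, and find reports an error if an unlink fails.
find /tmp/demo -maxdepth 1 -type f -delete

ls -A /tmp/demo    # only subdir remains
```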

mkdir empty_dir
rsync -a --delete -P empty_dir/ your_folder/

Note: the trailing "/" on the source path is needed!
Check a folder's size, listed by sub-folder:
du -sh * | sort -h