I have a Django project with a long, long history. It's accumulated far more migrations than I'd like. When we deploy we don't automatically clean the old versions out. Since each deploy can be quite large, and we deploy more than one service on each box, the total size gets out of hand quickly.
This one-liner removes all but the newest 10 directories (in our case the deploy directory contains only directories). I've seen lots of examples online of finding and deleting old files using find with modified/created time. In this case we don't care how old the files are; we only want to keep the most recent N. Reading those posts, there is usually something like this hidden inside somewhere (or, for some reason, a more complicated version of it):
$ ls -t | tail -n +11 | sudo xargs rm -rf
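Before pointing this at real deploys, it's worth a dry run: swapping rm -rf for echo shows what would be deleted. Here's a scratch demo (the directory names and timestamps are invented, and no sudo is needed in a temp dir):

```shell
# Build 12 fake deploy dirs with distinct mtimes, then prune to the newest 10.
demo=$(mktemp -d) && cd "$demo"
for i in $(seq 1 12); do
  d=$(printf 'deploy%02d' "$i")
  mkdir "$d"
  touch -d "@$((1700000000 + i))" "$d"   # GNU touch: give each a distinct mtime
done

# Dry run first: echo instead of rm shows what would go
ls -t | tail -n +11 | xargs echo "would remove:"

# The real thing: keep the newest 10, delete the rest
ls -t | tail -n +11 | xargs rm -rf
ls | wc -l   # 10 directories remain
```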
When ls is used in a pipe like this, the file names are printed one per line, rather than many (nicely colorized) on one line. Below I've added -F to show that they are directories.
-t Sort by time modified (most recently modified first) before sorting the operands by lexicographical order.
$ ls -tF
bd133af/ 07e1a2a/ bce1a5f/ cf5d31c/ af13a3a/ df3b2ef/ 15f5bc6/
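You can see the one-name-per-line behavior directly by piping through cat, which makes ls think it isn't talking to a terminal (the directory names here are made up):

```shell
# ls prints one name per line when stdout is not a terminal,
# which is exactly the shape tail and xargs need.
demo=$(mktemp -d) && cd "$demo"
mkdir aaa bbb ccc

ls | cat      # through a pipe: one name per line
ls | wc -l    # so wc counts names: 3
```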
tail -n +4
We use tail to skip the first 3 lines (+4). Lines in tail are numbered from 1, and +n is the starting line number. If there aren't enough lines, the output is empty.
$ ls -t | tail -n +4
cf5d31c
af13a3a
df3b2ef
15f5bc6
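seq makes the numbering easy to see (not from the original pipeline, just a quick illustration):

```shell
# tail -n +N starts printing AT line N, so it skips N-1 lines
seq 7 | tail -n +4        # prints 4, 5, 6, 7
seq 2 | tail -n +4        # input shorter than the offset: prints nothing
```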
And then the dangerous one... add sudo if necessary:

$ ls -t | tail -n +4 | xargs rm -rfv # show each file as it is deleted
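If you run this from cron it's easy to parameterize. This wrapper is my own sketch (the function name and arguments are made up, not from the post); note it still parses ls output, which is safe here only because deploy directories have hash names with no whitespace:

```shell
# prune_old: keep the newest $2 entries under $1 (default 10).
# Runs in a subshell so the caller's working directory is untouched.
prune_old() {
  local dir=$1 keep=${2:-10}
  (
    cd "$dir" || exit 1
    # xargs -r (GNU) avoids running rm with no arguments at all
    ls -t | tail -n "+$((keep + 1))" | xargs -r rm -rf
  )
}
```

Usage would be something like `prune_old /srv/myapp/releases 10` (the path is hypothetical). If names could contain whitespace, a find-based approach with -print0 would be more robust than parsing ls.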