Question

I have a Linux-based server that I use for some file manipulation, and it serves multiple other servers of mine.

Basically, this server (say server1) takes a .mp3 file as input from the other servers, converts it to other file formats (.wav, .txt and .xml), and sends the requested response back to those servers.

Over time, this folder of mine (say /Somepath/MyInputFolder) has accumulated GBs of data which I want to delete.

I tried the rm -r * command, but it says:

Argument list too long

I also tried rm -r *.mp3 and rm -r *.txt to delete these files separately, but they give the same error.

I also tried this SO question and read this link.

I tried the solution from the above SO question, but got a warning:

find . -name "*.txt" -maxdepth 1 -print0 | xargs -0 rm
find: warning: you have specified the -maxdepth option after a non-option argument -name, but options are not positional (-maxdepth affects tests specified before it as well as those specified after it).  Please specify options before other arguments.

How can I achieve this?


Solution

As already stated in the comments on the answer you linked, you need to put -maxdepth directly after the path, like so:

find . -maxdepth 1 -name "*.txt" -print0 | xargs -0 rm
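The reason this pipeline works where a bare rm * fails is that the shell never expands the whole file list onto a single command line: find streams the names and xargs invokes rm in batches that stay under the kernel's argument-size limit. If you want to clear several extensions in one pass, you can group the -name tests. A sketch, using a throwaway demo directory (the path here is just an example, not the asker's real folder):

```shell
# Demo directory standing in for /Somepath/MyInputFolder (hypothetical path).
demo=/tmp/MyInputFolder_demo
mkdir -p "$demo"
touch "$demo/a.txt" "$demo/b.mp3" "$demo/c.wav" "$demo/keep.log"

# Group the -name tests with \( ... -o ... \); -print0 / -0 handles
# filenames with spaces. xargs -r (a GNU extension) skips running rm
# entirely when nothing matches.
find "$demo" -maxdepth 1 -type f \
    \( -name "*.txt" -o -name "*.mp3" -o -name "*.wav" -o -name "*.xml" \) \
    -print0 | xargs -0 -r rm
```

After this run, only files with other extensions (keep.log above) remain in the directory.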

OTHER TIPS

ls | tail -10000 | xargs rm -rf

Run it several times until the directory is empty.

This has been answered, but I wanted to add this for those who come across this question later.

find has its own -delete option which will delete the files it finds. So you can simply do:

find -maxdepth 1 -name "*.txt" -delete
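For example, a sketch against a throwaway directory (the path is hypothetical; -maxdepth 1 keeps the deletion out of subdirectories):

```shell
demo=/tmp/delete_demo
mkdir -p "$demo/sub"
touch "$demo/x.txt" "$demo/y.mp3" "$demo/sub/z.txt"

# -delete removes each match as find visits it; no external rm is spawned,
# so the argument-list limit never comes into play.
find "$demo" -maxdepth 1 -name "*.txt" -delete
```

Note that -delete is an action and should come after the tests such as -name; with GNU find it also implicitly enables -depth.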

If you are interested in reading more about this issue, I would suggest this link.

I also got the same error. I had over 100,000 (1 lakh) files. Here is the command you need to run repeatedly

rm -f `ls|tail -10000` 

until the directory is empty.
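Rather than re-running that by hand, the batching can be wrapped in a loop. A sketch, which assumes filenames without spaces or newlines (the ls output is word-split by the shell):

```shell
demo=/tmp/batch_demo
mkdir -p "$demo"
cd "$demo"
for i in 1 2 3 4 5 6 7 8 9 10 11 12; do touch "f$i.txt"; done

# Delete in small batches until the directory is empty; each rm invocation
# receives only a handful of arguments, so the limit is never hit.
while [ -n "$(ls)" ]; do
    rm -f $(ls | tail -5)
done
```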

You can also try a plain shell loop:

for m in *; do rm -f "$m"; done

Because the glob is expanded by the shell itself and each rm receives only one argument, this never hits the argument-list limit, though it is slow for very large directories.
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow