Question

I need to save a few thousand files in Google Cloud Storage. Is there a limit on how many files the gsutil ls command will return?

If I am not wrong, the AWS S3 service limits the number of entries displayed with each listing request to a few thousand. To list the next batch, I need to use the API or some complex script.

I want to know if there is any such limit when using the gsutil command-line tool. I could check that myself, but I would have to upload thousands of dummy files!


Solution

The gsutil ls command paginates through bucket listings: each underlying GET request returns a limited number of results (usually 1,000), and gsutil keeps fetching successive batches until it has listed every object in the bucket.
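The client libraries handle this pagination the same way. A minimal sketch using the google-cloud-storage Python package, with "my-bucket" as a placeholder bucket name:

    # Minimal sketch using the google-cloud-storage client ("my-bucket"
    # is a placeholder). Like gsutil ls, the returned iterator transparently
    # issues successive GET requests (up to 1,000 results each by default)
    # until every object in the bucket has been listed.
    from google.cloud import storage

    client = storage.Client()
    for blob in client.list_blobs("my-bucket", page_size=1000):
        print(blob.name)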

Note that if you're planning on building a large collection of objects (say, in the hundreds of thousands or millions), we recommend that you not depend on using bucket listings to keep track of your objects, and instead maintain your own metadata (e.g., in a database that lists all the object names and related details).
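As one way to do that, here is a hypothetical sketch of a local SQLite index that records each object's name and size at upload time, so you can query the database instead of listing the bucket. The table schema and helper function are illustrative assumptions, not part of any Google API.

    # Hypothetical sketch: keep a local SQLite index of uploaded objects
    # so large listings never depend on paging through the bucket. The
    # schema and helper are illustrative, not part of any Google API.
    import sqlite3

    from google.cloud import storage

    db = sqlite3.connect("objects.db")
    db.execute(
        "CREATE TABLE IF NOT EXISTS objects (name TEXT PRIMARY KEY, size INTEGER)"
    )

    def upload_and_record(bucket_name: str, local_path: str, object_name: str) -> None:
        """Upload a file and record its name and size in the local index."""
        bucket = storage.Client().bucket(bucket_name)
        blob = bucket.blob(object_name)
        blob.upload_from_filename(local_path)
        db.execute(
            "INSERT OR REPLACE INTO objects VALUES (?, ?)",
            (object_name, blob.size),
        )
        db.commit()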

Mike Schwartz, Google Cloud Storage team
