Is there a standard way to perform a full and repeated export from Box platform?

StackOverflow https://stackoverflow.com/questions/20854616

23-09-2022

Question

We would like to export ALL of our user-uploaded data from Box into our internal content repository. We would like to do this once (all data) and then repeatedly on an incremental basis to keep the two repositories in sync.

Doing this through the REST API on a user-by-user, folder-by-folder, file-by-file basis seems likely to be time-consuming and error-prone. So I would like to know whether there is a data-dump or mass-export facility we could use for this operation, ideally one we could use to automate the export.


Solution

Keeping an enterprise's worth of files up to date is probably best done the way sync clients keep an individual user up to date: after an initial export, use the /events API to capture the IDs of files that change across the enterprise, and re-download them when it makes sense to.
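A minimal polling sketch of that approach, assuming an enterprise-scoped token and the enterprise event stream (GET /events with stream_type=admin_logs). The access token, the 60-second polling interval, and the event-type and field names filtered on (UPLOAD, EDIT, source.item_id) are assumptions here and should be verified against Box's current API reference before use:

```python
import time
import requests

BOX_API = "https://api.box.com/2.0"
ACCESS_TOKEN = "YOUR_ENTERPRISE_TOKEN"  # placeholder: admin/service-account token

def changed_file_ids(stream_position="now", poll_interval=60):
    """Yield IDs of files that changed, by polling the enterprise event stream."""
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    while True:
        resp = requests.get(
            f"{BOX_API}/events",
            headers=headers,
            params={
                "stream_type": "admin_logs",
                "stream_position": stream_position,
                "limit": 500,
            },
            timeout=30,
        )
        resp.raise_for_status()
        data = resp.json()
        for event in data.get("entries", []):
            source = event.get("source") or {}
            # Only re-download items that were uploaded or edited (assumed event types).
            if event.get("event_type") in ("UPLOAD", "EDIT") and source.get("item_type") == "file":
                yield source.get("item_id")
        # Persist next_stream_position externally so incremental runs resume where they left off.
        stream_position = data.get("next_stream_position", stream_position)
        time.sleep(poll_interval)
```

Persisting the stream position between runs is what makes the export incremental: each run only sees events that occurred after the last position it recorded.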

If you go this route, then depending on how large an enterprise you have, you will likely hit rate limits. You should reach out to Box to work with them on raising those limits. Tuning your own algorithm and minimizing how often you download a file will most likely be key to keeping the solution working well.
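Box signals throttling with HTTP 429 responses, typically including a Retry-After header, so any bulk download loop should honor it. A small hedged sketch (the exponential fallback delay and retry count are arbitrary choices, not Box requirements):

```python
import time
import requests

def get_with_backoff(url, headers=None, params=None, max_retries=5):
    """GET that retries on 429 rate-limit responses, honoring Retry-After."""
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.get(url, headers=headers, params=params, timeout=30)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp
        retry_after = resp.headers.get("Retry-After")
        time.sleep(float(retry_after) if retry_after else delay)
        delay *= 2  # exponential fallback when no Retry-After is provided
    raise RuntimeError(f"Still rate limited after {max_retries} retries: {url}")
```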

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow