Question

I have to back up a huge MongoDB database - it's something around 400GB of data.

When I tried to copy the database from the console, it seemed to run forever (I understand it is a huge amount of data, but after a day of waiting it looked like a hung process).

Has anyone tried to back up or migrate a MongoDB database this big? Any tips?

At the moment I don't need to worry about slowing down the server, etc., so any kind of tip is welcome.


Solution

MongoDB provides several alternatives for backing up data.

1. Mongodump tool

mongodump can be used to back up data from a live server. It can take a backup of an entire cluster, server, database, or collection even while the database is up and running, and it is a reasonable choice when write activity is low (example commands follow the pros and cons below).

Pros:

Simple and quite easy to use.

No need to shut down the server instance.

Cons:

Can slow down the server while the dump is running.

It is a backup of the live data as it is read, not a point-in-time snapshot.

Not a good alternative if your collections reference each other (a linked schema).

You may miss data that is written to a linked collection while the dump is in progress.
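
As a rough illustration, a mongodump/mongorestore run could look like the sketch below; the host, port, database name, and backup paths are placeholders for your own values, and --oplog only applies when dumping from a replica set member.

    # Dump every database on the server into a dated directory.
    mongodump --host localhost --port 27017 --out /backups/mongo-$(date +%F)

    # Dump a single database only ("mydb" is a placeholder name).
    mongodump --host localhost --port 27017 --db mydb --out /backups/mydb

    # On a replica set member, include the oplog to get a consistent,
    # point-in-time snapshot despite ongoing writes.
    mongodump --host localhost --port 27017 --oplog --out /backups/mongo-pit

    # Restore later from a dump directory.
    mongorestore --host localhost --port 27017 /backups/mongo-2015-01-01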

2. Shutdown and backup

This is a very simple approach: you shut down the mongod server and copy the data directory to some other backup location. This is the fastest way to take a backup. The only disadvantage is that you have to stop the server instance.
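
A minimal sketch of this cold-copy approach, assuming a standalone mongod whose data lives in /var/lib/mongodb (the data, backup, and log paths are assumptions; adjust them to your installation):

    # Stop the server cleanly so the data files are in a consistent state.
    mongod --dbpath /var/lib/mongodb --shutdown

    # Copy (or rsync) the whole data directory to the backup location;
    # for ~400GB this is usually much faster than mongodump.
    cp -a /var/lib/mongodb /backups/mongodb-$(date +%F)

    # Bring the server back up once the copy has finished.
    mongod --dbpath /var/lib/mongodb --fork --logpath /var/log/mongodb/mongod.log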

For more details, see the documentation: http://docs.mongodb.org/manual/core/backups/
