Question

I have the initial data from my old database, which takes around 6GB. I could "dumpdata" the old database without any problem, but when I attempted to restore it to the new database, I got a MemoryError:

    python manage.py loaddata fixtures/initial_data.json
    MemoryError: Problem installing fixture 'fixtures/initial_data.json': 

Is there any way to make loaddata work in chunks, or is it otherwise possible to load such a big file?


Solution 2

For a large database, use the database's native backup tools to dump the data instead of "django dumpdata", and the matching restore tools to load it instead of "django loaddata".
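For example, with PostgreSQL (a minimal sketch; the database names mydb and mydb_new and the file name mydb.dump are placeholders):

    # Dump in PostgreSQL's custom format, which pg_restore understands
    pg_dump -Fc -f mydb.dump mydb

    # Restore the dump into the new database
    pg_restore -d mydb_new mydb.dump

The MySQL equivalents would be mysqldump for the dump and the mysql client for the restore.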

Other Tips

I wrote this script, a fork of Django's dumpdata that dumps data in chunks to avoid the MemoryError. You can then load these chunks one by one.

Script is available at https://github.com/fastinetserver/django-dumpdata-chunks

Example usage:

1) Dump data into many files:

    mkdir some-folder

    ./manage.py dumpdata_chunks your-app-name \
        --output-folder=./some-folder --max-records-per-chunk=100000

2) Load data from the folder:

    find ./some-folder | egrep -o "([0-9]+_[0-9]+)" | xargs ./manage.py loaddata
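If the chunks need to load in the order they were dumped, inserting a sort before xargs keeps them in sequence — a small variation on the command above, assuming the chunk numbers in the file names are zero-padded so that lexical order matches numeric order:

    find ./some-folder | egrep -o "([0-9]+_[0-9]+)" | sort | xargs ./manage.py loaddata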

PS: I used it to move data from PostgreSQL to MySQL.
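One related gotcha when loading fixtures into a freshly created database: Django pre-populates tables such as django_content_type, and a full dump can collide with those rows. Excluding them at dump time is a common workaround; the sketch below uses the stock dumpdata flags (whether the chunked fork accepts the same flags is an assumption):

    # Exclude auto-created rows that the new database already contains
    python manage.py dumpdata --natural-foreign --natural-primary \
        -e contenttypes -e auth.permission > fixtures/initial_data.json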
