Problem

I have initial data from my old database, which is about 6 GB. I could "dumpdata" my old database without any problem, but when I attempted to restore it to the new database, I got a MemoryError:

    python manage.py loaddata fixtures/initial_data.json
    MemoryError: Problem installing fixture 'fixtures/initial_data.json': 

Is there any way to make loaddata work with chunks or is it possible to load that big file?


Solution 2

For a large database, use your database's own backup tools for dumping data instead of "django dumpdata", and use its restore tools instead of "django loaddata".
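As a minimal sketch, assuming the old database is PostgreSQL (`old_db` and `new_db` are placeholder names), the native tools stream rows to and from disk instead of building the whole dataset in memory:

    # Dump in PostgreSQL's custom format (compressed, suitable for pg_restore)
    pg_dump -Fc old_db > old_db.dump

    # Restore into the new database; rows are streamed rather than held in memory
    pg_restore --no-owner -d new_db old_db.dump

MySQL has analogous tools: mysqldump for dumping and the mysql client for restoring.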

Other tips

I wrote this script, a fork of Django's dumpdata, which dumps data in chunks to avoid MemoryError. You can then load these chunks one by one.

Script is available at https://github.com/fastinetserver/django-dumpdata-chunks

Example usage:

1) Dump data into many files:

    mkdir some-folder

    ./manage.py dumpdata_chunks your-app-name \
        --output-folder=./some-folder --max-records-per-chunk=100000

2) Load data from the folder:

    find ./some-folder | egrep -o "([0-9]+_[0-9]+)" | xargs ./manage.py loaddata
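If the chunk filenames do not match that regex, a plain loop works too (a sketch, assuming the chunks are `.json` fixture files directly inside `./some-folder`); it loads them one at a time in lexicographic order, so only one chunk's worth of data is in memory at once:

    # Load each chunk fixture individually; stop at the first failure
    for f in ./some-folder/*.json; do
        ./manage.py loaddata "$f" || break
    done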

P.S. I used it to move data from PostgreSQL to MySQL.
