Question

In MySQL 8.0, I have a table with over 10 million rows that I want to dump as bulk INSERT SQL.

I would like each INSERT statement to contain roughly 1k or 10k rows, so that the resulting SQL file executes quickly overall.

How can I produce such a dump?

Solution

That's what mysqldump does by default (via --extended-insert, which is included in --opt). Look at the output to confirm.

It builds INSERT statements with hundreds or thousands of rows each. This is optimal. (A single INSERT with 10 million rows is probably not possible; such a statement would likely exceed max_allowed_packet.)
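As a minimal sketch (the database name mydb and table name big_table are placeholders), you can steer how many rows land in each multi-row INSERT by raising mysqldump's --net-buffer-length, since extended INSERT statements are built up to roughly that many bytes:

# Dump one table with multi-row INSERTs of up to ~1 MB each
# (with ~100-byte rows that is on the order of 10k rows per statement).
mysqldump --single-transaction --extended-insert \
          --net-buffer-length=1048576 \
          mydb big_table > big_table.sql
# When reloading, the server's net_buffer_length should be at least as
# large as the value used here.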

Other Tips

mysqldump has a"WJERE" clause in which can specify which data or number you want

mysqldump --opt --where="1 limit 1000000" database table > yourdumplimited.sql

or you can limit by time:

 --where "your_date BETWEEN '2020-01-01' AND '2020-06-30'"

In this way you can divide your table into whatever fragments you need.
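A hedged sketch of that splitting idea, assuming the table has an auto-increment primary key named id (a placeholder) and about 10 million rows: dump the schema once, then dump the data in one-million-row slices that can be loaded, and retried, independently.

# Schema only, once.
mysqldump --no-data mydb big_table > big_table_schema.sql

# Data in ten slices of one million id values each.
for i in $(seq 0 9); do
  start=$(( i * 1000000 + 1 ))
  end=$(( (i + 1) * 1000000 ))
  mysqldump --single-transaction --no-create-info \
            --where="id BETWEEN ${start} AND ${end}" \
            mydb big_table > big_table_part_${i}.sql
done

Each part file then contains essentially only extended INSERT statements, so you can replay the parts with the mysql client in any order, or in parallel.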

Licensed under: CC-BY-SA with attribution