Question

I'm importing a 400 MB (uncompressed) MySQL database dump. I'm using BigDump, and I'm getting this error:

Stopped at the line 387.

At this place the current query includes more than 300 dump lines. That can happen if your dump file was created by some tool which doesn't place a semicolon followed by a linebreak at the end of each query, or if your dump contains extended inserts. Please read the BigDump FAQs for more infos.

I believe the file does contain extended inserts; however, I have no way to regenerate the dump, as the database has been deleted from the old server. How can I import this file or convert it so it can be imported?

Thanks for any help.

Best Nick

EDIT: It appears the only viable answer is to separate the extended inserts, but I still need help figuring out how to split the file as the answer below suggests. Please help. Thank you.


Solution 3

I was able to import the database successfully after increasing the memory limit and doing an ordinary mysql import over SSH.

I used the following command with the absolute path:

mysql -u <username> -p <database_name> < /home/dir/dumpfile.sql
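For reference, the "memory" increase that matters for large extended inserts is usually the packet size, which can be raised per-invocation on the client (a sketch; the 512M value, `<username>`, `<database_name>`, and the path are all placeholders, and the server's own `max_allowed_packet` setting may also need raising in my.cnf):

```shell
# Raise the client-side packet limit for this one import; very long
# extended-insert statements can otherwise exceed the default limit.
# The server's max_allowed_packet may need the same increase.
mysql -u <username> -p --max-allowed-packet=512M <database_name> < /home/dir/dumpfile.sql
```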

Best Nick

OTHER TIPS

You don't need to change the .sql file at all: just open bigdump.php in any text editor, find the line

define ('MAX_QUERY_LINES',300);

and replace 300 with a larger value.

Just edit the dump file so that after every 300 rows you start a new INSERT statement:

INSERT INTO `myTable` (`id`, `field1`, `field2`) VALUES
    (1, 'a', 'b'),
    (2, 'c', 'd'),
    (3, 'e', 'f'),
    -- snip --
    (300, 'w', 'x');
INSERT INTO `myTable` (`id`, `field1`, `field2`) VALUES
    (301, 'y', 'z'),
    (302, ... etc

You can do this by just opening your <backup>.sql file in any text editor; it's plain text. Sometimes a backup is a .gz or .bz2 file instead, but those are simply compressed text: decompress them and you'll end up with a single .sql file which you can edit easily.
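To illustrate that a compressed dump is just the plain-text .sql run through gzip, here is a round-trip on a stand-in file (the filename and contents are examples, not your actual backup; `bunzip2` works the same way for .bz2 files):

```shell
# Create a tiny stand-in for a real dump, compress it, then restore it.
echo "INSERT INTO t VALUES (1);" > backup.sql
gzip backup.sql        # creates backup.sql.gz, removes backup.sql
gunzip backup.sql.gz   # restores the editable backup.sql
cat backup.sql
```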

It probably wouldn't be too hard to write a script that splits each statement after an arbitrary number of rows. The basic program logic might look like this:

  1. find a line that starts with "INSERT".
  2. copy the line up until the end of the word "VALUES"
  3. step through all the following lines, counting up as you go.
  4. if your counter reaches 300, then add a semicolon to the end of the last line and paste in the "INSERT ... VALUES" statement from before. Reset your counter to 1 and go to step 3.
  5. if you reach a line that starts with "INSERT", reset the counter to 1 and go to step 2.
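The steps above can be sketched with awk on a tiny sample dump (a rough sketch, not BigDump's own logic; it assumes one `(...)` value tuple per line and the `INSERT ... VALUES` header on its own line, as mysqldump emits, and uses `rows=2` on sample data where you would use 300 on a real dump):

```shell
# Build a tiny sample dump with one extended INSERT.
printf '%s\n' 'INSERT INTO `t` (`id`) VALUES' '(1),' '(2),' '(3),' '(4);' > sample.sql

# Steps 1-5: remember the INSERT header, count value rows, and after
# every `rows` rows close the statement and re-emit the header.
awk -v rows=2 '
  /^INSERT/         { prefix = $0; n = 0; print; next }
  /^[[:space:]]*\(/ { n++
                      if (n >= rows && /,$/) {
                          sub(/,$/, ";"); print; print prefix; n = 0; next
                      } }
                    { print }
' sample.sql > sample_split.sql

cat sample_split.sql
```

Running it turns the single four-row INSERT into two two-row INSERTs; on a real dump you would point it at your .sql file with `rows=300`.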

Though, you could just do this one manually, and then make sure that all future backups don't use extended inserts. phpMyAdmin also lets you set a maximum statement size when creating an export, which effectively does the above for you.

That said, raising the memory limit as you did is probably the easier solution. For anyone else in a similar situation who can't change those limits, try the above.

For me, it worked as follows:

Search for

$max_query_lines = 300;

then simply replace it with

$max_query_lines = 3000;

Thanks for the 'max_query_lines' and 'INSERT' solutions!

"Both solutions have some drawbacks. Without extended insert, output dump will be many times larger than the dump with extended inserts. Increasing max_query_lines will increase the memory usage of bigdump script and your server might run out of the memory."

Source: http://www.etctips.com/error-at-this-place-the-current-query-includes-more-than-xxx-dump-lines/

Old question, but I found it via a web search, and it doesn't yet have the answer I was looking for: it seems phpMyAdmin has renamed the setting.

To disable 'extended inserts' in a phpMyAdmin export, you need to set the 'Syntax to use when inserting data' setting to the first option:

include column names in every INSERT statement 
Example: INSERT INTO tbl_name (col_A,col_B,col_C) VALUES (1,2,3)

Open bigdump.php and change the following line:

$max_query_lines = 300;

to this:

$max_query_lines = 300000;
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow