Question

I've previously asked a question about dumping a database into separate per-table files. I managed to do that with the mysqldump command, but when I try to restore the database locally to extract some data, the restore stops with an error at a certain point: a table that contains more than 2 GB of data. I also tried restoring each table individually to see whether any would succeed, but the result was the same; it is always that one table that fails.

Can anyone tell me whether there's a way to repair a downloaded dump, or to verify each downloaded table individually?
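
One quick sanity check, in case the file was corrupted during download, is to compare checksums of the dump on the server and on the local copy (a generic shell sketch; dump.sql stands in for the actual file name):

md5sum dump.sql    # run on the server
md5sum dump.sql    # run on the downloaded copy; the two hashes should match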

Thanks in advance.

EDIT: This is the error I got: "ERROR 2006 (HY000) at line 5855099: MySQL server has gone away"

EDIT 2: When I tried this on my VPS via SSH, it gave me this error: "ERROR 1153 (08S01) at line 4215: Got a packet bigger than 'max_allowed_packet' bytes"
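
For reference, the server's current limit can be checked with a standard MySQL statement (the value is reported in bytes; any single statement in the dump larger than this will fail):

SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';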

Solution

Finally, after some searching and a closer look at the errors, I found that the max_allowed_packet variable was, as usual, the cause of the issue. I'm no expert in this area, but I'm glad I figured it out, and I'd like to share how I fixed it.

This is what I did:

  1. First I logged into the MySQL server with mysql -u user -p and entered the password.
  2. Then I executed the following commands:

SET GLOBAL net_buffer_length=1000000;
SET GLOBAL max_allowed_packet=1000000000;

  3. Finally, I left that terminal open and started a new one, then ran the command below, which finished the restore without any interruptions:

mysql --max_allowed_packet=100M -u root -p database < dump.sql
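
Note that SET GLOBAL changes do not survive a server restart. To make the fix permanent, the same limits can be set in the MySQL configuration file. This is a sketch assuming a typical my.cnf location such as /etc/mysql/my.cnf; adjust the values as needed:

[mysqld]
# maximum size of a single packet/statement the server will accept
max_allowed_packet=1G
net_buffer_length=1000000

After editing, restart the MySQL server for the settings to take effect.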

I hope this helps anyone else facing similar issues.

Thanks.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow