Question

I am attempting to import a 40GB SQL backup on one of my servers and the software I used to automate the backups apparently included the "information_schema". Are there any tools/scripts/etc. that could be used to remove this data?

Due to the size of the SQL file, Notepad++ cannot open it (file too large), and the other text editors I tried make it very difficult to tell which statements belong to the information_schema.

I'm about at my wit's end and hoping there is something that can simplify removing this data from the SQL dump. I tried running the import with "-f" to force past the errors, but that made what appears to be a bit of a mess.

Was it helpful?

Solution

I have tried to figure this out, and the only idea I have is to use grep to remove the information_schema tables. For example, this removes three tables from dump.sql:

egrep -v '(GLOBAL_VARIABLES|CHARACTER_SETS|COLLATIONS)' dump.sql > new_dump.sql

In practice, all of the roughly 40 information_schema tables would need to be listed....
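Typing all of those names by hand is error-prone, so here is a minimal sketch of automating it: keep the table names in a file (one per line), join them into a single alternation pattern, and feed that to egrep. The file names (tables.txt, dump.sql, new_dump.sql) are placeholders, not from the original post; on a live server the name list could instead come from a query such as SELECT table_name FROM information_schema.tables WHERE table_schema = 'information_schema'.

```shell
# Table names to strip, one per line (only three shown here as a demo).
printf '%s\n' GLOBAL_VARIABLES CHARACTER_SETS COLLATIONS > tables.txt

# Tiny sample dump standing in for the real 40GB file.
cat > dump.sql <<'EOF'
CREATE TABLE `GLOBAL_VARIABLES` (x int);
CREATE TABLE `customers` (id int);
INSERT INTO `COLLATIONS` VALUES (1);
INSERT INTO `customers` VALUES (1);
EOF

# Join the names with "|" to build one pattern, then drop every
# matching line from the dump, exactly as the one-liner above does.
pattern=$(paste -sd'|' tables.txt)
egrep -v "($pattern)" dump.sql > new_dump.sql
```

Note this drops every *line* mentioning those names, so it assumes the dump keeps each statement on its own line; a multi-line INSERT would only be partially removed.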

I hope you succeed.

FYI: one database is 40GB? How many tables are in there? To restore a dump file quickly, separate big tables or databases from each other into their own dump files in the future, and load them concurrently.
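The concurrent-load idea above can be sketched as a shell fan-out: start one import per file in the background and wait for all of them. The import_one stub and file names are hypothetical stand-ins; in real use you would replace the stub body with the actual client call, e.g. mysql -u user -p "$db" < "$f".

```shell
# Stub standing in for the real mysql import of one per-database file.
import_one() {
  sleep 0.1                         # pretend the import takes time
  echo "loaded $1" >> restored.log  # record which file finished
}

# Launch all imports in parallel as background jobs...
for f in db1.sql db2.sql db3.sql; do
  import_one "$f" &
done

wait  # ...then block until every background import has finished
```

The speedup only materializes if the files hit different tables or databases, so the server is not serializing them on the same locks.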

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow