Question

I need to insert data collected by a data acquisition system into a MySQL database through a C program. I have 2 tables with 3 and 5 columns (and 1 foreign key on the second table). Currently, it takes a few minutes to insert around 4000 values (it could be a lot more in the end), despite my pretty good computer (and the program will eventually have to run on a much slower one).

Is there any way to make it faster?

I know similar questions were posted, but the answers were too advanced for me.


The solution

Are you perhaps using a separate query for each row? Don't forget you can add multiple rows with a single query. So for example, instead of

INSERT INTO `foo` VALUES (1, 2, 3);
INSERT INTO `foo` VALUES (4, 5, 6);
INSERT INTO `foo` VALUES (7, 8, 9);

you can issue a single query like this:

INSERT INTO `foo` VALUES (1, 2, 3), (4, 5, 6), (7, 8, 9);

Here are a couple of other optimizations that you could try:

Disable indexes

As you insert rows into a MySQL table, it updates the table's indexes after each insertion. For large batches of insertions, it's more efficient to disable indexing temporarily and let the indexes be rebuilt once at the end. To do this, send this query before you start the inserts:

ALTER TABLE `foo` DISABLE KEYS;

and then re-enable indexing when you're done:

ALTER TABLE `foo` ENABLE KEYS;

(Note that DISABLE KEYS affects only non-unique indexes, and only on MyISAM tables; InnoDB ignores it.)

Disable foreign key checks

Again, these can slow things down when working with large tables. Issue the command

SET FOREIGN_KEY_CHECKS=0;

to disable foreign key checks before you start modifying the table, and

SET FOREIGN_KEY_CHECKS=1;

to re-enable these checks when you're done.
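Putting both tips together, the overall load might look like the following sequence (the table name `foo` is a placeholder for your own schema):

```sql
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE `foo` DISABLE KEYS;

INSERT INTO `foo` VALUES (1, 2, 3), (4, 5, 6), (7, 8, 9);
-- ... more batched INSERTs ...

ALTER TABLE `foo` ENABLE KEYS;
SET FOREIGN_KEY_CHECKS=1;
```

Be aware that rows inserted while the checks are off are not validated, so make sure your program only disables them around data it trusts.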

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow