Question

I want to insert multiple rows into a MySQL table at once using Java. The number of rows is dynamic. In the past I was doing...

for (String[] element : array) {
    myStatement.setString(1, element[0]);
    myStatement.setString(2, element[1]);

    myStatement.executeUpdate();
}

I'd like to optimize this to use the MySQL-supported syntax:

INSERT INTO table (col1, col2) VALUES ('val1', 'val2'), ('val1', 'val2')[, ...]

but with a PreparedStatement I don't know of any way to do this since I don't know beforehand how many elements array will contain. If it's not possible with a PreparedStatement, how else can I do it (and still escape the values in the array)?

Solution

You can create a batch using PreparedStatement#addBatch() and execute it with PreparedStatement#executeBatch().

Here's a kickoff example:

public void save(List<Entity> entities) throws SQLException {
    try (
        Connection connection = database.getConnection();
        PreparedStatement statement = connection.prepareStatement(SQL_INSERT); // e.g. "INSERT INTO table (col1, col2) VALUES (?, ?)"
    ) {
        int i = 0;

        for (Entity entity : entities) {
            statement.setString(1, entity.getSomeProperty());
            // ...

            statement.addBatch();
            i++;

            if (i % 1000 == 0 || i == entities.size()) {
                statement.executeBatch(); // Execute every 1000 items and the final remainder.
            }
        }
    }
}

The batch is executed every 1000 items because some JDBC drivers and/or databases have a limit on batch length.
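
If you need to verify what happened to each batched insert, note that executeBatch() returns an array of update counts. A minimal sketch, reusing the statement variable from the example above:

int[] counts = statement.executeBatch();
for (int count : counts) {
    if (count == Statement.SUCCESS_NO_INFO) {
        // The driver executed the statement but does not report a row count.
    } else if (count == Statement.EXECUTE_FAILED) {
        // The statement failed; some drivers continue processing the rest of the batch.
    }
}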

OTHER TIPS

When the MySQL driver is used, you have to set the connection parameter rewriteBatchedStatements to true (jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true).

With this parameter the batched statements are rewritten into one bulk insert, so the table is locked only once and the indexes are updated only once. This makes it much faster.

Without this parameter the only advantage of batching is cleaner source code.
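
The same property can also be set programmatically instead of in the URL. A minimal sketch using java.sql.DriverManager and java.util.Properties; the credentials are placeholders:

Properties props = new Properties();
props.setProperty("user", "myUser");                   // placeholder credentials
props.setProperty("password", "myPassword");
props.setProperty("rewriteBatchedStatements", "true"); // let the driver rewrite batches into bulk inserts

Connection connection = DriverManager.getConnection(
        "jdbc:mysql://localhost:3306/TestDB", props);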

If you can create your SQL statement dynamically, you can use the following workaround:

    String myArray[][] = { { "1-1", "1-2" }, { "2-1", "2-2" },
            { "3-1", "3-2" } };

    StringBuffer mySql = new StringBuffer(
            "insert into MyTable (col1, col2) values (?, ?)");

    for (int i = 0; i < myArray.length - 1; i++) {
        mySql.append(", (?, ?)");
    }

    myStatement = myConnection.prepareStatement(mySql.toString());

    for (int i = 0; i < myArray.length; i++) {
        myStatement.setString(i, myArray[i][1]);
        myStatement.setString(i, myArray[i][2]);
    }
    myStatement.executeUpdate();

In case you have an auto-increment column in the table and need to access the generated keys, you can use the following approach. Do test it before using it, because getGeneratedKeys() behavior with batch execution depends on the driver used. The code below was tested on MariaDB 10.0.12 and the MariaDB JDBC driver 1.2.

Remember that increasing the batch size improves performance only to a certain extent; in my setup, increasing the batch size above 500 actually degraded performance.

public Connection getConnection(boolean autoCommit) throws SQLException {
    Connection conn = dataSource.getConnection();
    conn.setAutoCommit(autoCommit);
    return conn;
}

private void testBatchInsert(int count, int maxBatchSize) {
    String querySql = "insert into batch_test(keyword) values(?)";
    try {
        Connection connection = getConnection(false);
        PreparedStatement pstmt = null;
        ResultSet rs = null;
        boolean success = true;
        int[] executeResult = null;
        try {
            pstmt = connection.prepareStatement(querySql, Statement.RETURN_GENERATED_KEYS);
            for (int i = 0; i < count; i++) {
                pstmt.setString(1, UUID.randomUUID().toString());
                pstmt.addBatch();
                if ((i + 1) % maxBatchSize == 0 || (i + 1) == count) {
                    executeResult = pstmt.executeBatch();
                }
            }
            // Reuse the rs variable so the finally block closes this ResultSet as well.
            rs = pstmt.getGeneratedKeys();
            for (int i = 0; i < executeResult.length; i++) {
                rs.next();
                if (executeResult[i] == 1) {
                    System.out.println("Execute Result: " + i + ", Update Count: " + executeResult[i] + ", id: "
                            + rs.getLong(1));
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
            success = false;
        } finally {
            if (rs != null) {
                rs.close();
            }
            if (pstmt != null) {
                pstmt.close();
            }
            if (connection != null) {
                if (success) {
                    connection.commit();
                } else {
                    connection.rollback();
                }
                connection.close();
            }
        }
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
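
For example, to insert 10,000 rows while flushing every 500 rows (the batch size that worked best in the setup described above), the method above could be called as:

testBatchInsert(10000, 500);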

@Ali Shakiba, your code needs some modification. The erroneous part:

for (int i = 0; i < myArray.length; i++) {
    myStatement.setString(i, myArray[i][1]);
    myStatement.setString(i, myArray[i][2]);
}

Updated code:

String myArray[][] = {
    {"1-1", "1-2"},
    {"2-1", "2-2"},
    {"3-1", "3-2"}
};

StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");

for (int i = 0; i < myArray.length - 1; i++) {
    mySql.append(", (?, ?)");
}

mySql.append(";"); // also add the terminator at the end of the SQL statement
myStatement = myConnection.prepareStatement(mySql.toString());

for (int i = 0; i < myArray.length; i++) {
    myStatement.setString((2 * i) + 1, myArray[i][0]);
    myStatement.setString((2 * i) + 2, myArray[i][1]);
}

myStatement.executeUpdate();

We can submit multiple updates together in JDBC as a batch update.

We can use Statement, PreparedStatement, and CallableStatement objects for batch updates, with auto-commit disabled.

The addBatch() and executeBatch() methods are available on all of these statement objects for batch updates.

Here, addBatch() adds a statement or a set of parameters to the current batch.
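
A minimal sketch with a plain Statement and auto-commit disabled; the url, user, and password variables and the batch_test table (with its keyword column, borrowed from the earlier example) are placeholders:

try (Connection conn = DriverManager.getConnection(url, user, password)) {
    conn.setAutoCommit(false); // group the whole batch into a single transaction

    try (Statement stmt = conn.createStatement()) {
        stmt.addBatch("INSERT INTO batch_test (keyword) VALUES ('first')");
        stmt.addBatch("INSERT INTO batch_test (keyword) VALUES ('second')");

        int[] updateCounts = stmt.executeBatch(); // one update count per batched statement
        conn.commit();
    } catch (SQLException e) {
        conn.rollback(); // undo the whole batch if any statement fails
        throw e;
    }
}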

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow