"PHP Fatal error: Allowed memory size of 134217728 bytes exhausted" while generating database backup

StackOverflow https://stackoverflow.com/questions/21751541

11-10-2022
Question

I get the error above when I try to back up my database to a text file.

function backup_tables($backup_filename, $tables = '*')
{
    $conf = new JConfig();
    $dbhost = $conf->host;
    $dbuser = $conf->user;
    $dbpassword = $conf->password;
    $dbname = $conf->db;
    $link = mysql_connect($dbhost, $dbuser, $dbpassword);
    mysql_select_db($dbname, $link) or die(mysql_error());
    $return = "drop database if exists `$dbname`;\n\ncreate database `$dbname`;\n\nuse `$dbname`;\n\n";
    $return .= "/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;\n\n";
    $return .= "/*!40101 SET @OLD_CHARACTER_SET_RESULTS=@@CHARACTER_SET_RESULTS */;\n\n";
    $return .= "/*!40101 SET @OLD_COLLATION_CONNECTION=@@COLLATION_CONNECTION */;\n\n";
    $return .= "/*!40101 SET NAMES utf8 */;\n\n";

    $handle = fopen($backup_filename, 'w+');
    fwrite($handle, $return); $return = "";

    // get all of the tables
    if ($tables == '*') {
        $tables = array();
        $result = mysql_query('SHOW TABLES');
        while ($row = mysql_fetch_row($result)) {
            $tables[] = $row[0];
        }
    } else {
        $tables = is_array($tables) ? $tables : explode(',', $tables);
    }

    // cycle through
    foreach ($tables as $table) {
        $result = mysql_query('SELECT * FROM ' . $table);
        $num_fields = mysql_num_fields($result);
        $return .= 'DROP TABLE IF EXISTS `' . $table . '`;';
        $return .= "\n\n" . mysql_fetch_row(mysql_query('SHOW CREATE TABLE `' . $table . '`;'))[1] . " DEFAULT CHARSET=cp1251;\n\n";

        while ($row = mysql_fetch_row($result)) {
            $return .= 'INSERT INTO ' . $table . ' VALUES(';
            for ($i = 0; $i < $num_fields; $i++) {
                $row[$i] = str_replace("\n", "\\n", addslashes($row[$i]));
                $return .= '"' . (isset($row[$i])? $row[$i] : '') . '"';
                if ($num_fields - $i - 1) {
                    $return .= ',';
                }
            }
            $return .= ");\n";

            fwrite($handle, $return); $return = "";
        }
        if($return) {
            fwrite($handle, $return);
            $return .= "\n\n\n";
        }
    }

    fclose($handle);
}

This function works well, except that there is a memory leak somewhere. It creates a file of ~30 MiB and then dies with the mentioned error. Memory usage of the httpd process increases steadily while the file is being generated. One more thing: generation hangs at a large table (containing a log), but I think this doesn't matter, because the data is written row by row.

Solution

"One more thing: generation hangs at a large table (containing a log), but I think this doesn't matter, because the data is written row by row."

Actually, this is the cause: I should use mysql_unbuffered_query instead of mysql_query. Now the function looks like this:

function backup_tables($backup_filename, $tables = '*')
{
    $conf = new JConfig();
    $dbhost = $conf->host;
    $dbuser = $conf->user;
    $dbpassword = $conf->password;
    $dbname = $conf->db;
    $link = mysql_connect($dbhost, $dbuser, $dbpassword);
    mysql_select_db($dbname, $link) or die(mysql_error());
    $return = "drop database if exists `$dbname`;\n\ncreate database `$dbname`;\n\nuse `$dbname`;\n\n";
    $return .= "/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;\n\n";
    $return .= "/*!40101 SET @OLD_CHARACTER_SET_RESULTS=@@CHARACTER_SET_RESULTS */;\n\n";
    $return .= "/*!40101 SET @OLD_COLLATION_CONNECTION=@@COLLATION_CONNECTION */;\n\n";
    $return .= "/*!40101 SET NAMES utf8 */;\n\n";

    $handle = fopen($backup_filename, 'w+');
    fwrite($handle, $return); $return = "";

    // get all of the tables
    if ($tables == '*') {
        $tables = array();
        $result = mysql_query("SHOW TABLES");
        while ($row = mysql_fetch_row($result)) {
            $tables[] = $row[0];
        }
    } else {
        $tables = is_array($tables) ? $tables : explode(',', $tables);
    }

    // cycle through
    foreach ($tables as $table) {
        $return .= "DROP TABLE IF EXISTS `$table`;";
        $return .= "\n\n" . mysql_fetch_row(mysql_query("SHOW CREATE TABLE `$table`;"))[1] . " DEFAULT CHARSET=cp1251;\n\n";

        $result = mysql_unbuffered_query("SELECT * FROM `$table`");
        $num_fields = mysql_num_fields($result);

        while ($row = mysql_fetch_row($result)) {
            $return .= "INSERT INTO `$table` VALUES(";
            for ($i = 0; $i < $num_fields; $i++) {
                $row[$i] = str_replace("\n", "\\n", addslashes($row[$i]));
                $return .= '"' . (isset($row[$i])? $row[$i] : '') . '"';
                if ($num_fields - $i - 1) {
                    $return .= ',';
                }
            }
            $return .= ");\n";

            fwrite($handle, $return); $return = "";
        }
        if($return)
            fwrite($handle, $return);

        $return = "\n\n\n";
    }

    fclose($handle);
}

OTHER TIPS

The quick PHP answer here is to increase your memory limit, and possibly your max execution time along with it.
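As a sketch of that tip: the 134217728 bytes in the error message is the default 128M limit, so raising it in php.ini (the 256M/300 values below are illustrative, not recommendations) looks like this:

```ini
; php.ini -- raise the limit above the default 134217728 bytes (128M)
memory_limit = 256M
; give long-running backups more time as well
max_execution_time = 300
```

The same can be done at runtime with ini_set('memory_limit', '256M'), but note this only postpones the problem if the script buffers the whole dump in memory.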

Outside of this being an exercise in re-creating the mysqldump command, is there a reason to perform this from within PHP code?

You might be better off using mysqldump or something like Holland http://hollandbackup.org/ to go through and dump each table individually.
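For comparison, a typical mysqldump invocation covering this case might look like the following (host, user, and database name are placeholders; the charset matches the cp1251 used in the question's code):

```shell
# Hypothetical host/credentials -- substitute your own.
# mysqldump streams rows to the output file as it reads them,
# so memory use stays flat regardless of table size.
mysqldump --host=localhost --user=my_user -p \
  --default-character-set=cp1251 --single-transaction \
  my_db > backup.sql
```

The --single-transaction flag gives a consistent snapshot of InnoDB tables without locking them for the duration of the dump.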

The current answer uses the deprecated mysql_* extension (removed in PHP 7). The modern way to do this is mysqli::use_result.

In my case I ran into the exhausted-memory error trying to write a large SQL table to a file. Here's how I used it:

$conn = new mysqli("localhost", "my_user", "my_password", "my_db");

$sql = 'SELECT row1, row2 from table';

$fp = fopen('output.json', 'w');
if ($conn->multi_query($sql)) {
    do {
        if ($result = $conn->use_result()) {
            while ($row = $result->fetch_row()) {
                $row1 = $row[0];
                $row2 = $row[1];
                $item = array('row1' => $row1, 'row2' => $row2);
                fwrite($fp, json_encode($item));
            }
            $result->close();
        }
    } while ($conn->more_results() && $conn->next_result());
}

fclose($fp);
$conn->close();
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow