Question

I need to parse large XML files (500,000 lines of XML) into MySQL with PHP, but with the following code it takes hours per file. How can this be optimized? (I'm thinking of perhaps building an array and inserting it into MySQL at once instead of writing each value one at a time.)

foreach ($data as $dat) {
    $object = $dat;
    $UID = $object['id'];

    // Check whether this UID already exists before inserting.
    $test = mysql_query("SELECT * FROM reports WHERE UID = '$UID'");
    $row  = mysql_fetch_assoc($test);
    if ($row['UNIQUEID'] == null) {

        $temp = array("MEDIA" => "{$name}");
        foreach ($object as $obj) {
            mysql_query("INSERT INTO reports
                        (MEDIA, UID)
                        VALUES
                        ('$name', '$UID')");

            // One UPDATE per attribute: a separate query for every column.
            foreach ($obj as $ats) {
                $attname = $ats['name'];

                mysql_query("UPDATE reports
                            SET $attname = '$ats'
                            WHERE UID = '$UID'");
            }
        }
    }
    echo "Done";
}

Edit: XML:

    <object id="382177">
      <attributes>
        <attribute kind="number" name="REVNO">1</attribute>
        <attribute kind="string" name="UNIQUEID">XXX</attribute>
        <attribute kind="number" name="EVENTVERSION">1</attribute>
        <attribute kind="string" name="EVENTASSOCID">4568190</attribute>
        <attribute kind="number" name="EVENTASSOCRELNO">2</attribute>
        <attribute kind="string" name="EVENTTYPE">PageFlow</attribute>
        <attribute format="%Y-%m-%d %H:%M:%S" kind="time" name="EVENTTIME">2014-02-09 09:40:52</attribute>
        <attribute kind="string" name="EVENTMSG">PageLocked=1</attribute>
        <attribute kind="string" name="EVENTUSER">XXX</attribute>
        <attribute kind="string" name="EVENTAPPL">XXX</attribute>
        <attribute kind="string" name="NAME">XXX</attribute>
        <attribute kind="string" name="NEWSROOM">XXX</attribute>
        <attribute kind="string" name="PRODUCT">XXX</attribute>
        <attribute kind="string" name="PUBDATE">11-02-2014</attribute>
        <attribute kind="string" name="ZONE">XXX</attribute>
        <attribute kind="string" name="EDITION">1</attribute>
        <attribute kind="string" name="PAGENAME">XXX</attribute>
        <attribute kind="number" name="PAGENO">1</attribute>
        <attribute kind="string" name="ARTICLE"></attribute>
      </attributes>
    </object>

Edit 2: Thanks to Mike, this code drastically improved performance:

    foreach ($data as $key) {

        $lat = $key->attributes;
        $UID = $key['id'];

        mysql_query("INSERT INTO reports
                    (MEDIA, UID, REVNO, UNIQUEID, EVENTVERSION, EVENTASSOCID, EVENTASSOCRELNO, EVENTTYPE, EVENTTIME, EVENTMSG, EVENTUSER, EVENTAPPL, NAME, NEWSROOM, PRODUCT, PUBDATE, ZONE, EDITION, PAGENAME, PAGENO, ARTICLE, LAYOUTDESK, LAYOUTSTATE, RUNNINGPAGENO, SECTIONNAME, SECTIONNO, LASTOPERATOR, LASTREV, LASTDATAOPERATOR, LASTDATAREV, TYPE, SUBTYPE, LAYOUTTEMPLATE, EDITORIALSOURCEUID)
                    VALUES
                    ('$name', '$UID', '{$lat->attribute[0]}', '{$lat->attribute[1]}', '{$lat->attribute[2]}', '{$lat->attribute[3]}', '{$lat->attribute[4]}', '{$lat->attribute[5]}', '{$lat->attribute[6]}', '{$lat->attribute[7]}', '{$lat->attribute[8]}', '{$lat->attribute[9]}', '{$lat->attribute[10]}', '{$lat->attribute[11]}', '{$lat->attribute[12]}', '{$lat->attribute[13]}', '{$lat->attribute[14]}', '{$lat->attribute[15]}', '{$lat->attribute[16]}', '{$lat->attribute[17]}', '{$lat->attribute[18]}', '{$lat->attribute[19]}', '{$lat->attribute[20]}', '{$lat->attribute[21]}', '{$lat->attribute[22]}', '{$lat->attribute[23]}', '{$lat->attribute[24]}', '{$lat->attribute[25]}', '{$lat->attribute[26]}', '{$lat->attribute[27]}', '{$lat->attribute[28]}', '{$lat->attribute[29]}', '{$lat->attribute[30]}', '{$lat->attribute[31]}')");
    }

The solution

Inserting a record and then updating its columns one at a time will give dire performance. You should parse all of the data for each record out of your XML and insert it with a single INSERT. That alone will give you a dramatic improvement in performance.
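For the XML shown above, a minimal sketch of that single-INSERT approach might look like this. It assumes the same SimpleXML $data and $name from the question, keeps the deprecated mysql_* extension for consistency with the original code, and keys each value by the attribute's name rather than its position:

    foreach ($data as $object) {
        // Collect every attribute value keyed by its "name" attribute,
        // instead of relying on a fixed positional order.
        $values = array(
            'MEDIA' => $name,
            'UID'   => (string) $object['id'],
        );
        foreach ($object->attributes->attribute as $attr) {
            $values[(string) $attr['name']] = (string) $attr;
        }

        // Build one INSERT from the collected columns and values.
        $cols = implode(', ', array_keys($values));
        $vals = implode(', ', array_map(function ($v) {
            return "'" . mysql_real_escape_string($v) . "'";
        }, $values));

        mysql_query("INSERT INTO reports ($cols) VALUES ($vals)");
    }

Keying on the name attribute also protects you if the attributes ever appear in a different order than the fixed indexes used in Edit 2 assume.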

Further, if you're checking each record for uniqueness before inserting it, you should instead add a unique index to your schema and use INSERT ... IGNORE, checking the number of affected rows if you need to know whether a given row was actually inserted.
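As a sketch, assuming a unique index has been added on the UID column (the index name idx_uid is illustrative):

    // One-time schema change so MySQL itself rejects duplicate UIDs:
    //   ALTER TABLE reports ADD UNIQUE INDEX idx_uid (UID);

    mysql_query("INSERT IGNORE INTO reports (MEDIA, UID)
                 VALUES ('$name', '$UID')");

    // mysql_affected_rows() is 1 if the row went in, 0 if the UID
    // already existed and the INSERT was ignored.
    if (mysql_affected_rows() === 0) {
        // Duplicate: skip or log as needed.
    }

This replaces one SELECT plus one INSERT per record with a single query, and lets the database enforce uniqueness instead of the application.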

You could also use the batch form of INSERT to insert a batch of records all at once.
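For example, the following sketch accumulates rows and flushes them in groups; the batch size of 500 is an arbitrary illustrative value, and only the MEDIA and UID columns are shown:

    $batch = array();

    foreach ($data as $object) {
        $uid = mysql_real_escape_string((string) $object['id']);
        $batch[] = "('" . mysql_real_escape_string($name) . "', '$uid')";

        // Flush every 500 rows so each query round-trip carries many records.
        if (count($batch) >= 500) {
            mysql_query('INSERT INTO reports (MEDIA, UID) VALUES ' . implode(', ', $batch));
            $batch = array();
        }
    }

    // Flush whatever is left over.
    if (count($batch) > 0) {
        mysql_query('INSERT INTO reports (MEDIA, UID) VALUES ' . implode(', ', $batch));
    }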

If that's still not fast enough, try parsing the XML to CSV first and then using LOAD DATA INFILE to insert all the data in one batch.
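A sketch of that two-step approach, where the path /tmp/reports.csv is illustrative and the MySQL server needs read access to it (otherwise use LOAD DATA LOCAL INFILE):

    // Step 1: write the parsed records to a CSV file.
    $fh = fopen('/tmp/reports.csv', 'w');
    foreach ($data as $object) {
        fputcsv($fh, array($name, (string) $object['id'] /* ...other columns... */));
    }
    fclose($fh);

    // Step 2: let MySQL bulk-load the whole file in a single statement.
    mysql_query("LOAD DATA INFILE '/tmp/reports.csv'
                 INTO TABLE reports
                 FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
                 LINES TERMINATED BY '\\n'
                 (MEDIA, UID)");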

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow