Question

I need to get online-user data from an XML file, but simplexml_load_file seems slow. The file is big because there are a lot of online users. I am trying to paginate them, 20 per page. I have...

<?php $xml = simplexml_load_file('http://domain.tld/data.xml'); ?>

Then

<?php echo $xml->who_is_online[1]->thumbnail_image; ?>
<?php echo $xml->who_is_online[1]->display_name; ?>
<?php echo $xml->who_is_online[1]->display_age; ?>

But there should be about 20 of these per page.

<?php echo $xml->who_is_online[2]->thumbnail_image; ?>
<?php echo $xml->who_is_online[2]->display_name; ?>
<?php echo $xml->who_is_online[2]->display_age; ?>
<?php echo $xml->who_is_online[3]->thumbnail_image; ?>
<?php echo $xml->who_is_online[3]->display_name; ?>
<?php echo $xml->who_is_online[3]->display_age; ?>

etc. (to 20)

What is a better way to get only the data I need from the XML without reading the whole file on each page load? That is, get data for users 1 to 20 on the 1st page, users 21 to 40 on the 2nd page, and so on. On the 2nd page I have the same code, but reading [21], [22], [23], etc.


Solution 2

SimpleXML can't partially read a document. Even with an XPath filter, the whole document is still parsed behind the scenes.

A possible solution is to use a cronjob or scheduled task that regularly runs a PHP script which downloads the XML, parses it, and inserts the data into an SQL database (don't forget to delete the old data first).
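A sketch of such an import script, with heavy caveats: the online_users table, its columns, and the SQLite storage are all assumptions for illustration, and a small feed is inlined here so the snippet is self-contained. The real cron job would call simplexml_load_file('http://domain.tld/data.xml') instead.

```php
<?php
// Hypothetical cron-driven import: wipe the old snapshot, insert the new one.
function refresh_online_users(PDO $db, SimpleXMLElement $xml): int
{
    $db->exec('CREATE TABLE IF NOT EXISTS online_users (
        thumbnail_image TEXT, display_name TEXT, display_age TEXT)');

    $db->beginTransaction();
    $db->exec('DELETE FROM online_users');          // delete the old data first
    $stmt = $db->prepare('INSERT INTO online_users
        (thumbnail_image, display_name, display_age) VALUES (?, ?, ?)');
    $count = 0;
    foreach ($xml->who_is_online as $user) {
        $stmt->execute([
            (string) $user->thumbnail_image,
            (string) $user->display_name,
            (string) $user->display_age,
        ]);
        $count++;
    }
    $db->commit();
    return $count;
}

// Inline sample feed (the real script would fetch the remote XML instead).
$xml = simplexml_load_string(
    '<users>
        <who_is_online><thumbnail_image>a.jpg</thumbnail_image>
            <display_name>Alice</display_name><display_age>30</display_age></who_is_online>
        <who_is_online><thumbnail_image>b.jpg</thumbnail_image>
            <display_name>Bob</display_name><display_age>25</display_age></who_is_online>
    </users>'
);
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
echo refresh_online_users($db, $xml), " users imported\n";
```

Wrapping the delete and inserts in one transaction keeps visitors from ever seeing a half-empty table while the cron job runs.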

When a visitor requests a page, just paginate the data using SQL queries on the database.
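On the page itself, LIMIT/OFFSET does the pagination. A minimal sketch, again assuming the hypothetical online_users table; the demo fills an in-memory database standing in for the cron-maintained one.

```php
<?php
// Fetch one 20-user page from the assumed online_users table.
function fetch_page(PDO $db, int $page, int $perPage = 20): array
{
    $offset = (max(1, $page) - 1) * $perPage;   // page 1 -> 0, page 2 -> 20, ...
    $stmt = $db->prepare('SELECT thumbnail_image, display_name, display_age
                          FROM online_users
                          ORDER BY rowid        -- stable order; pick a real sort column in practice
                          LIMIT :limit OFFSET :offset');
    $stmt->bindValue(':limit',  $perPage, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset,  PDO::PARAM_INT);
    $stmt->execute();
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}

// Demo: 45 sample users in an in-memory database.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE online_users (
    thumbnail_image TEXT, display_name TEXT, display_age TEXT)');
$ins = $db->prepare('INSERT INTO online_users VALUES (?, ?, ?)');
for ($i = 1; $i <= 45; $i++) {
    $ins->execute(["u$i.jpg", "User $i", (string) (18 + $i % 40)]);
}

echo count(fetch_page($db, 1)), "\n"; // 20 users on page 1
echo count(fetch_page($db, 3)), "\n"; // 5 users left on page 3
```

In a real page handler the page number would come from something like (int) ($_GET['page'] ?? 1); the query itself only ever touches the 20 rows it needs.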

OTHER TIPS

A DOM parser (such as SimpleXML) can only load the entire document.

See: Can SimpleXML load only a partial of a XML?

But you can iterate through your 'who_is_online' elements:

for ($i = 0; $i < 20; $i++) {
    echo $xml->who_is_online[$i]->thumbnail_image;
    echo $xml->who_is_online[$i]->display_name;
    echo $xml->who_is_online[$i]->display_age;
}
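The loop above can be generalized to any page by computing the start index from the requested page number. A sketch; the helper name and the inline sample feed are illustrative, and in the real page $xml would still come from simplexml_load_file('http://domain.tld/data.xml').

```php
<?php
// Print one 20-user page from the feed; returns how many users were shown.
function print_page(SimpleXMLElement $xml, int $page, int $perPage = 20): int
{
    $start = (max(1, $page) - 1) * $perPage;    // page 1 -> 0, page 2 -> 20, ...
    $shown = 0;
    for ($i = $start; $i < $start + $perPage; $i++) {
        $user = $xml->who_is_online[$i];
        if ($user === null) {
            break;                              // ran past the last online user
        }
        echo $user->thumbnail_image, ' ', $user->display_name, ' ',
             $user->display_age, "\n";
        $shown++;
    }
    return $shown;
}

// Demo with a small inline feed of 25 users.
$items = '';
for ($i = 1; $i <= 25; $i++) {
    $items .= "<who_is_online><thumbnail_image>u$i.jpg</thumbnail_image>"
            . "<display_name>User $i</display_name>"
            . "<display_age>21</display_age></who_is_online>";
}
$xml = simplexml_load_string("<users>$items</users>");

print_page($xml, 2); // prints users 21..25 (the 5 rows on the last page)
```

Note that the whole feed is still downloaded and parsed on every request; this only limits what gets printed, which is why the database approach in Solution 2 scales better.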
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow