Question

I have a script which reads a text file which contains sku information.

Then in a loop the script loads each product's ID then creates the product object:

$id = Mage::getModel('catalog/product')->getIdBySku($isbn);
$product = Mage::getModel('catalog/product')->load($id);

Then the stock data is updated:

$product->setStockData(array(
    'manage_stock' => 1,
    'use_config_manage_stock' => 0,
    'is_in_stock' => 1,
    'qty' => $qty
));

$product->save();

This works, however with a larger number of products to loop through, the script suddenly stops. For example, one text file had 14,000 lines, and at approximately the 7,000th iteration the script simply stopped with no error message.

The same thing happens if we change the text file.

I have checked the max_execution_time on the server and it looks fine.

I'm unsure whether there's a Magento-specific reason the script breaks, or something else.

Any help is much appreciated.

Thanks.


Solution

If you have a CSV file of the format:

sku,qty
IDE123,12

you can just use Import/Export to update your qty. It is faster than your script and fires one bulk query against the database instead of 14,000 individual queries, one per product.
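If you do stay with a PHP loop instead of Import/Export, a lighter-weight variant is to update only the stock item rather than saving the whole product. A minimal sketch for Magento 1 (untested; the `app/Mage.php` path and the `$rows` array of parsed sku/qty pairs are assumptions):

```php
<?php
// Hypothetical sketch: update only cataloginventory/stock_item,
// avoiding the much heavier full catalog/product save.
require 'app/Mage.php'; // adjust to your Magento root
Mage::app('admin');

foreach ($rows as $row) { // $rows = array of array('sku' => ..., 'qty' => ...)
    $productId = Mage::getModel('catalog/product')->getIdBySku($row['sku']);
    if (!$productId) {
        continue; // unknown SKU, skip it
    }
    $stockItem = Mage::getModel('cataloginventory/stock_item')
        ->loadByProduct($productId); // accepts a product id in Magento 1
    $stockItem->setQty($row['qty'])
        ->setIsInStock((int) ($row['qty'] > 0))
        ->save();
}
```

This skips reloading and re-saving every product attribute on each iteration, which is where most of the time and memory goes.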

Which version of Magento are you using?

We built a plugin that uses bulk queries based on ImportExport but goes through the Magento product API; you can find it here: https://github.com/magento-hackathon/cutesave

Other tips

If it's not the server's PHP max execution time, you are probably running out of memory: calling Mage::getModel('catalog/product')->load($id) leaks memory, and over many iterations the leak can become quite large. How does your script run? In the CLI?

You should still get an error message stating you ran out of memory, though. Increasing the memory limit is not really a good solution if you have tens of thousands of products, but I think 14,000 is OK.
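If the loop has to keep using load()/save(), explicitly releasing each model after the save can keep the leak in check. A hedged sketch of that mitigation (clearInstance() on Mage_Core_Model_Abstract discards the model's loaded data):

```php
// Inside the import loop: free the product model after each save.
$product = Mage::getModel('catalog/product')->load($id);
$product->setStockData(array(
    'manage_stock' => 1,
    'use_config_manage_stock' => 0,
    'is_in_stock' => 1,
    'qty' => $qty,
));
$product->save();
$product->clearInstance(); // drop the loaded data so it can be collected
unset($product);
```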

If you suspect a memory problem and can't afford any more memory, try loading a collection and saving the product model from it. In my experience that does not leak memory. Something like this (untested):

$productCollection = Mage::getModel('catalog/product')->getCollection()
    ->addAttributeToSelect('*')
    ->addAttributeToFilter('entity_id', $id)
    ->load();
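To complete the picture, the loaded collection can then supply the model to update; a sketch, untested like the above:

```php
// The filter matched a single entity_id, so take the first (only) item.
$product = $productCollection->getFirstItem();
$product->setStockData(array(
    'is_in_stock' => 1,
    'qty' => $qty,
));
$product->save();
$productCollection->clear(); // release the collection's items between iterations
```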

"This works, however with a larger number of products to loop through, the script suddenly stops. For example, one text file had 14,000 lines, and at approximately the 7,000th iteration the script simply stopped with no error message."

It looks like you have a problem with the server configuration. Try setting a bigger value for max_execution_time and also for memory_limit.

And if that doesn't help, check your server log files.

Licensed under: CC-BY-SA with attribution
Not affiliated with magento.stackexchange