I have a product feed from a vendor that arrives as a 100 MB .gz file. I use PHP to unpack it into a 1 GB .csv file. My server has no problem with the unpacking; however, I want to grab the information from the .csv and drop it into a MySQL database.

The .csv contains about 240,000 rows with roughly 20 to 25 comma-delimited fields each. The trouble is that my hosting provider doesn't allow me to load a file this large into memory for processing.

Does anyone know of a way to split the .csv into smaller files (maybe 100 MB each), either while unpacking from the original .gz or afterwards? Or is there a way to read the information contained in the .csv without loading the whole file into memory?


Solution

You should be able to do this with fgetcsv. It reads one CSV row at a time from an open file handle, so the whole file never has to be loaded into memory.
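A minimal sketch of the idea, assuming the unpacked file is named products.csv and the target table/columns (products, sku, name, price) and database credentials are placeholders you'd replace with your own:

```php
<?php
// Placeholder connection details - adjust for your host.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Prepared statement reused for every row (columns are examples).
$stmt = $pdo->prepare('INSERT INTO products (sku, name, price) VALUES (?, ?, ?)');

$handle = fopen('products.csv', 'r');
if ($handle === false) {
    die('Could not open products.csv');
}

// fgetcsv reads a single line per call, so only one row is in memory at a time.
while (($row = fgetcsv($handle)) !== false) {
    // Map whichever of the 20-25 fields you actually need; indexes here are examples.
    $stmt->execute([$row[0], $row[1], $row[2]]);
}

fclose($handle);
```

Wrapping the loop in a transaction (or batching inserts) should speed up 240,000 single-row inserts considerably. And if you'd rather avoid writing the 1 GB intermediate file at all, fgetcsv can read straight from the compressed feed via PHP's compress.zlib:// stream wrapper, e.g. `fopen('compress.zlib://feed.csv.gz', 'r')`.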
