Question

I have a product feed from a vendor; the file comes as a 100 MB .gz file. I use PHP to unpack it into a 1 GB .csv file. My server has no problem with the unpacking; however, I want to grab the information from the .csv and drop it into a MySQL database.

The .csv contains about 240,000 rows, each with roughly 20 to 25 comma-delimited fields. The trouble is that my hosting provider doesn't allow me to load a file this large into memory for processing.

Does anyone know of a way I can either split the .csv into smaller files (maybe 100 MB each), perhaps while unpacking from the original .gz, or read the information contained in the .csv without loading the whole file into memory?


Solution

You should be able to do this with fgetcsv(). It reads the file one line at a time, so you never need to load the whole .csv into memory.
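
Here is a minimal sketch of that approach: loop over the file with fgetcsv() and insert each row with a prepared statement. The database credentials, the table name (`products`), and the column mapping are assumptions for illustration; adjust them to match the actual feed.

```php
<?php
// Connect to MySQL. Credentials and database name are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Hypothetical target table using three of the feed's fields.
$stmt = $pdo->prepare('INSERT INTO products (sku, name, price) VALUES (?, ?, ?)');

$handle = fopen('feed.csv', 'r');
if ($handle === false) {
    die('Unable to open feed.csv');
}

$pdo->beginTransaction();
$rowCount = 0;

// fgetcsv() reads one line per call, so memory usage stays flat
// no matter how large the file is.
while (($fields = fgetcsv($handle)) !== false) {
    // Map the comma-delimited fields to the columns you care about.
    $stmt->execute([$fields[0], $fields[1], $fields[2]]);

    // Commit in batches to keep transactions a reasonable size.
    if (++$rowCount % 1000 === 0) {
        $pdo->commit();
        $pdo->beginTransaction();
    }
}

$pdo->commit();
fclose($handle);
```

If writing out the intermediate 1 GB file is also a concern, the same loop can read straight from the compressed feed by opening it through PHP's compress.zlib:// stream wrapper, e.g. `fopen('compress.zlib://feed.csv.gz', 'r')`, since fgetcsv() works on any readable stream.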

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow