The best way to read large files in PHP?
-
06-07-2019
Question
I have to read CSV files line by line which can be 10 to 20 MB. file() is useless ;-) and I have to find the quickest way.
I have tried fgets(), which works fine, but I don't know whether it reads a small block on each call, or whether it caches a bigger one and optimizes the file I/O. Do I have to try the fread() way, parsing the EOLs myself?
Thanks Cedric
Solution
You ought to be using fgetcsv() if possible.
Otherwise, there is always fgets().
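To illustrate the suggested approach, here is a minimal sketch of reading a CSV line by line with fgetcsv(), so only one row is held in memory at a time (the filename and the processing step are placeholders):

```php
<?php
// Stream a large CSV row by row; fgetcsv() reads one line per call
// and parses it into an array of fields.
$handle = fopen('data.csv', 'r'); // placeholder filename
if ($handle === false) {
    die('Unable to open file');
}
while (($row = fgetcsv($handle)) !== false) {
    // $row is an array of fields for the current line,
    // e.g. $row[0] is the first column. Process it here.
}
fclose($handle);
```

Because the loop keeps only the current row in memory, the total file size does not matter the way it does with file(), which loads everything at once.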
OTHER TIPS
stream_get_line() is apparently more efficient than fgets() for large files. If you specify a sensible maximum length for the read, I don't see any reason why PHP would have to 'read ahead' to fetch a line, as you seem to be worrying.
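A sketch of that tip, with an explicit maximum line length (the 8192-byte bound is an assumption; pick one that safely exceeds your longest row). Note that stream_get_line() strips the delimiter from the returned string, so the fields still need to be parsed, e.g. with str_getcsv():

```php
<?php
// Read lines with stream_get_line(), bounding each read at 8192 bytes
// (an assumed limit) and splitting on "\n".
$handle = fopen('data.csv', 'r'); // placeholder filename
while (($line = stream_get_line($handle, 8192, "\n")) !== false) {
    $fields = str_getcsv($line); // parse the CSV fields yourself
    // process $fields here
}
fclose($handle);
```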
If you want to use CSVs then fgetcsv will return results in a slightly more sensible format.
You should have a look at fgetcsv(); it automatically parses the comma-separated line into an array.
As for runtime efficiency, I have no idea. You will have to run a quick test, preferably with a file of the size you are expecting to handle later on. But I would be surprised if the fget??? and fput??? functions were not I/O optimised.
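The quick test suggested above could look something like this rough timing harness: point it at a sample file of roughly the size you expect (the filename and the 8192-byte limit are placeholders) and compare the two read strategies.

```php
<?php
// Rough microbenchmark: time fgets() vs stream_get_line() over the
// same file. Run it on a 10-20 MB sample to get meaningful numbers.
$file = 'test.csv'; // placeholder: a sample file of realistic size

$start = microtime(true);
$h = fopen($file, 'r');
while (fgets($h) !== false) {
    // read only; no per-line work, so we measure pure I/O
}
fclose($h);
printf("fgets:           %.3fs\n", microtime(true) - $start);

$start = microtime(true);
$h = fopen($file, 'r');
while (stream_get_line($h, 8192, "\n") !== false) {
    // same loop with an explicit maximum read length
}
fclose($h);
printf("stream_get_line: %.3fs\n", microtime(true) - $start);
```

Keeping the loop bodies empty isolates the read cost itself; add your actual parsing afterwards to see how much it dominates.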