There's no way to sequentially 'read' a string that's already loaded into memory, and splitting it up won't make anything more efficient; holding the pieces in separate variables will actually use more memory than the single string. Ideally you would load the string into a stream, but PHP doesn't have a dedicated string stream; the closest thing is the php://temp wrapper described below.
If you just want to deal with the string in chunks, you can loop over substrings of it; each substr() call copies only $chunkSize bytes at a time:
$data = getData(); // the full string, already in memory
$pointer = 0;
$size = strlen($data);
$chunkSize = 1048576; // 1MB per chunk
while ($pointer < $size)
{
    $chunk = substr($data, $pointer, $chunkSize); // copies at most $chunkSize bytes
    doSomethingWithChunk($chunk);
    $pointer += $chunkSize;
}
I'm not sure how PHP handles large strings internally, but according to the string documentation, a string can only be "as large as up to 2GB (2147483647 bytes maximum)". If your file is about 10MB, it shouldn't be a problem for PHP.
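If you want to sanity-check the footprint on your own setup, here is a quick sketch using memory_get_usage(); the str_repeat() string is just a stand-in for your real data:
$before = memory_get_usage();
$data = str_repeat("x", 10 * 1024 * 1024); // stand-in for a ~10MB string
echo (memory_get_usage() - $before), " bytes\n"; // roughly 10485760 plus a little overhead
unset($data);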
Another option (and probably the better one) is to load $data into a memory or temporary stream. If you want to spare the environment from excessive memory use, you can use the php://temp stream wrapper: it keeps the data in memory until it exceeds 2MB (by default), then transparently moves it to a temporary file. Load the string into the stream as soon as possible so the original can be freed, and then use the file stream functions on it:
$dataStream = fopen("php://temp", "w+b");
fwrite($dataStream, funcThatGetsData()); // write directly; avoid keeping a string variable around
rewind($dataStream); // fwrite() leaves the pointer at the end, so seek back before reading
while (!feof($dataStream))
{
    $chunk = fread($dataStream, 1048576); // read 1MB at a time
    doSomethingWithChunk($chunk);
}
fclose($dataStream);
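The 2MB in-memory threshold can be changed through the wrapper path itself. A minimal sketch, assuming you want a 5MB threshold (the php://temp/maxmemory: syntax is the documented form; the figure is just an example):
$fiveMB = 5 * 1024 * 1024;
$dataStream = fopen("php://temp/maxmemory:$fiveMB", "w+b"); // stays in memory up to 5MB, then spills to a temp file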
If you get $data from another function you could pass around $dataStream instead. If you must have $data in a string beforehand, be sure to call unset() on it to free the memory:
$data = getData(); // string from some other function
$dataStream = fopen("php://temp", "w+b");
fwrite($dataStream, $data);
unset($data); // free 10MB of memory!
...
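If the data originates in another stream (a file or a network response, say), you can avoid building the string at all. A minimal sketch, assuming a hypothetical $sourceStream you already have open:
$dataStream = fopen("php://temp", "w+b");
stream_copy_to_stream($sourceStream, $dataStream); // copy stream-to-stream, no userland string needed
rewind($dataStream);
// ... then read $dataStream in 1MB chunks as above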
If you want to keep it all in memory you can use php://memory, but you might as well just use a string in that case.