Question

So, I'm writing a chunked file transfer script that copies files, small and large, to a remote server. It almost works fantastically (and did with a 26-byte file I tested, haha), but once I move to larger files it stops working correctly. For example, I uploaded a 96,489,231-byte file, but the final file was 95,504,152 bytes. I tested it with a 928,670,754-byte file, and the copied file only had 927,902,792 bytes.

Has anyone else experienced this? I'm guessing feof() may be doing something wonky, but I have no idea how to replace it or how to test that theory. I've commented the code for your convenience. :)

<?php

// FTP credentials
$server = CENSORED;
$username = CENSORED;
$password = CENSORED;

// Destination file (where the copied file should go)
$destination = "ftp://$username:$password@$server/ftp/final.mp4";

// The file on my server that we're copying (in chunks) to $destination.
$read = 'grr.mp4';

// If the file we're trying to copy exists...
if (file_exists($read))
{
    // Set a chunk size
    $chunk_size = 4194304;

    // For reading through the file we want to copy to the FTP server.
    $read_handle = fopen($read, 'rb');

    // For appending to the destination file.
    $destination_handle = fopen($destination, 'ab');

    echo '<span style="font-size:20px;">';
    echo 'Uploading.....';

    // Loop through $read until we reach the end of the file.
    while (!feof($read_handle))
    {
        // So Rackspace doesn't think nothing's happening.
        echo PHP_EOL;
        flush();

        // Read a chunk of the file we're copying.
        $chunk = fread($read_handle, $chunk_size);

        // Write the chunk to the destination file.
        fwrite($destination_handle, $chunk);

        sleep(1);
    }
    echo 'Done!';
    echo '</span>';
    fclose($read_handle);
    fclose($destination_handle);
}
?>

EDIT

I (may have) confirmed that the script is dying near the end somehow, rather than corrupting the file's contents. I created a simple file with each line containing its own line number, up to 10,000, then ran my script. The copy stopped at line 6253. However, the script still echoes "Done!" at the end, so I can't imagine it's a timeout issue. Strange!

EDIT 2

I have confirmed that the problem lies somewhere in fwrite(). Echoing $chunk inside the loop reproduces the complete file without fail, yet the written file still does not match.
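One way to check this (a minimal sketch I'm adding here, not part of the original script): fwrite() returns the number of bytes it actually wrote, and on network streams such as the ftp:// wrapper that can be less than the size of the buffer passed in. Comparing the return value against the chunk length would show whether bytes are being silently dropped.

```php
<?php
// Minimal sketch: compare fwrite()'s return value against the chunk
// size. php://temp stands in for the destination stream here; with
// the ftp:// wrapper, $written can come back smaller than the chunk.
$handle = fopen('php://temp', 'r+b');
$chunk  = str_repeat('A', 4096);

$written = fwrite($handle, $chunk);
if ($written === false) {
    die('fwrite() failed outright');
} elseif ($written < strlen($chunk)) {
    echo "Short write: $written of " . strlen($chunk) . " bytes\n";
}
fclose($handle);
```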

EDIT 3

It appears to work if I add sleep(1) immediately after the fwrite(). However, that makes the script take a million years to run. Is it possible that PHP's append mode has some inherent flaw?

EDIT 4

Alright, I've further isolated the problem to FTP, somehow. When I run this file copy locally, it works fine. However, when I write over the ftp:// wrapper (the $destination URL) the bytes go missing. This occurs despite the binary flags in both fopen() calls. What could possibly be causing this?

EDIT 5

I found a fix. The modified code is above; I'll post it as an answer as soon as I'm able.


Solution

I found a fix, though I'm not sure exactly why it works. Simply sleeping after writing each chunk fixes the problem, and I upped the chunk size quite a bit to speed things up. Though this is arguably a bad solution, it should work for my uses. Thanks anyway, guys!
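The sleep() most likely masks the symptom rather than curing it. The usual fix for short writes, sketched below on the assumption that fwrite() is only partially flushing each chunk over the ftp:// wrapper, is to loop until the whole chunk has been written (fwrite_all() is a hypothetical helper name, not something from the original script):

```php
<?php
// Hypothetical helper: keep calling fwrite() until every byte of
// $data has been written. fwrite() may accept only part of a buffer
// on network streams (e.g. the ftp:// wrapper), so one call per
// chunk is not guaranteed to write the whole chunk.
function fwrite_all($handle, $data)
{
    $total  = 0;
    $length = strlen($data);
    while ($total < $length) {
        $written = fwrite($handle, substr($data, $total));
        if ($written === false || $written === 0) {
            return false; // genuine write error, give up
        }
        $total += $written;
    }
    return $total;
}
```

Swapping fwrite($destination_handle, $chunk) for fwrite_all($destination_handle, $chunk) inside the copy loop should then remove the need for the per-chunk sleep(1).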

Licensed under: CC-BY-SA with attribution