Question

I have several laptops in the field that need to pull information from our server daily. Each laptop has a server2go installation (basically Apache, PHP, MySQL running as an executable) that launches a local webpage. The webpage calls a URL on our server using the following code:

$handle = fopen( $downloadURL , "rb");
$contents = stream_get_contents( $handle );
fclose( $handle );

The $downloadURL fetches a ton of information from a MySQL database on our server and returns the results as output to the device. I am currently returning the results as ready-made SQL statements (e.g., if I query the database with "SELECT name FROM names", I might return the text string "INSERT INTO names SET names='JOHN SMITH'" to the device). This takes the info from the online database and hands it to the device as SQL statements ready for insertion into the laptop's database.
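
The server-side script behind $downloadURL presumably looks something like this (a hypothetical sketch; the connection details are placeholders):

// Hypothetical sketch of the server-side script behind $downloadURL;
// the connection credentials are placeholders.
$db = new mysqli('localhost', 'db_user', 'db_pass', 'our_db');
$result = $db->query("SELECT name FROM names");
while ($row = $result->fetch_assoc()) {
    // one INSERT per row, in the same form the question describes
    echo "INSERT INTO names SET names='" . $db->real_escape_string($row['name']) . "';\n";
}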

The problem I am running into is that the amount of data is too large. The laptop webpage keeps timing out when retrieving info from the server. I have set the PHP timeout limits very high, but still run into problems. Can anyone think of a better way to do this? Will stream_get_contents stay connected to the server if I flush the data to the device in smaller chunks?

Thanks for any input.

Solution

What if you just send over the data and generate the SQL on the receiving side? That would save you a lot of bytes in transit.
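
For example, the server could emit one JSON-encoded row per line and the laptop could build the INSERTs locally with a prepared statement (a sketch; the table/column names and the local PDO connection details are assumptions):

// Server side (sketch): send raw rows, one JSON object per line, instead of SQL text.
$db = new mysqli('localhost', 'db_user', 'db_pass', 'our_db');   // placeholder credentials
$result = $db->query("SELECT name FROM names");
while ($row = $result->fetch_assoc()) {
    echo json_encode($row) . "\n";
}

// Laptop side (sketch): read the stream line by line and generate the SQL locally.
$pdo = new PDO('mysql:host=localhost;dbname=local_db', 'local_user', 'local_pass');
$stmt = $pdo->prepare('INSERT INTO names (name) VALUES (?)');
$handle = fopen($downloadURL, 'rb');
while (($line = fgets($handle)) !== false) {
    $row = json_decode($line, true);
    if ($row !== null) {
        $stmt->execute(array($row['name']));
    }
}
fclose($handle);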

Is the data update incremental? That is, can you send over just the changes since the last update?
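
For instance, if the rows carry a last-modified timestamp (an assumption about the schema), the laptop can remember when it last synced successfully and ask the server only for newer rows:

// Laptop side (sketch): $lastSync is saved locally after each successful run (hypothetical).
$downloadURL = 'http://www.example.com/export.php?since=' . urlencode($lastSync);

// Server side (sketch): return only the rows changed since that time.
$db = new mysqli('localhost', 'db_user', 'db_pass', 'our_db');   // placeholder credentials
$stmt = $db->prepare("SELECT name FROM names WHERE updated_at > ?");
$stmt->bind_param('s', $_GET['since']);
$stmt->execute();
// ...then emit the matching rows exactly as before.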

If you do have to send over a huge chunk of data, you might want to look at ways to compress or zip it and then unzip it on the other side. (I haven't looked at how to do that, but it's achievable in PHP.)
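
PHP's zlib functions can handle this, e.g. gzcompress() on the server and gzuncompress() on the laptop (a sketch; build_output() is a hypothetical stand-in for whatever generates the payload):

// Server side (sketch): compress the generated output before sending it.
$payload = build_output();        // hypothetical helper that builds the SQL/data dump
echo gzcompress($payload, 9);     // 9 = maximum compression level

// Laptop side (sketch): decompress before using it.
$compressed = file_get_contents($downloadURL);
$payload = gzuncompress($compressed);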

OTHER TIPS

Write a script that compiles a text file from the database on the server, and download that file.
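
For example, a nightly cron job on the server could write the export to a static file once, so the laptops just download that file instead of hitting the database on every request (a sketch; the file path and query are assumptions):

// Server cron job (sketch): dump the query results to a static file once a day.
$db = new mysqli('localhost', 'db_user', 'db_pass', 'our_db');    // placeholder credentials
$out = fopen('/var/www/exports/daily_update.sql', 'wb');          // placeholder path
$result = $db->query("SELECT name FROM names");
while ($row = $result->fetch_assoc()) {
    fwrite($out, "INSERT INTO names SET names='" . $db->real_escape_string($row['name']) . "';\n");
}
fclose($out);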

You might want to consider using third-party file synchronization services, like Windows Live Sync or Dropbox to get the latest file synchronized across all the machines. Then, just have a daemon that loads up the file into the database whenever the file is changed. This way, you avoid having to deal with the synchronization piece altogether.

You are using stream_get_contents (you could even use file_get_contents without the extra line to open the stream), but if the amount of text is really large, as the title says, you'll fill up your memory.

I ran into this problem when writing a script for a remote server where memory is limited, so that approach wouldn't work. The solution I found was to use stream_copy_to_stream instead and copy the file directly to disk rather than into memory.

That piece of functionality boils down to opening both streams and copying one into the other.
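
A minimal sketch of that (the local file path is a placeholder):

// Open the remote source and a local file, then copy stream to stream
// so the whole payload never has to sit in memory at once.
$src  = fopen($downloadURL, 'rb');
$dest = fopen('/tmp/daily_update.sql', 'wb');   // placeholder local path
stream_copy_to_stream($src, $dest);
fclose($src);
fclose($dest);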

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow