Question

I want to use the cURL extension for PHP to create some sort of download manager, and I was wondering whether cURL allows me to implement these two features:

1) Multiple connections or multi-part downloads, just like a normal desktop download manager.

2) Constantly update the download progress on screen (text or graphical, doesn't matter).

Does cURL for PHP allow any of this? If so, care to provide some hints?


Solution

The curl_multi_xyz() functions, like curl_multi_exec(), allow you to process multiple requests at the same time. Also take a look at CURLOPT_RANGE if you want to download multiple segments of the same file in parallel. And the callback functions you can set with CURLOPT_READFUNCTION and CURLOPT_WRITEFUNCTION would allow you to send some kind of progress data to the client.
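
For the multi-part piece, here is a minimal sketch, assuming the server honours HTTP range requests; the URL, byte ranges and part file names are placeholders, and it uses CURLOPT_PROGRESSFUNCTION (with CURLOPT_NOPROGRESS disabled) rather than CURLOPT_WRITEFUNCTION as one way to report per-segment progress:

<?php
// Sketch: download one file in two parallel segments with curl_multi and CURLOPT_RANGE.
$url      = 'http://example.com/big.file';          // placeholder URL
$segments = array('0-4999999', '5000000-9999999');  // placeholder byte ranges

$mh      = curl_multi_init();
$handles = array();

foreach ($segments as $i => $range) {
    $fp = fopen("part_$i.tmp", 'wb');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RANGE, $range);      // fetch only this byte range
    curl_setopt($ch, CURLOPT_FILE, $fp);          // write the body straight to disk
    curl_setopt($ch, CURLOPT_NOPROGRESS, false);  // enable the progress callback
    curl_setopt($ch, CURLOPT_PROGRESSFUNCTION,
        function ($ch, $dlTotal, $dlNow, $ulTotal, $ulNow) use ($i) {
            echo "segment $i: $dlNow / $dlTotal bytes\n";  // crude progress output
        });
    curl_multi_add_handle($mh, $ch);
    $handles[] = array($ch, $fp);
}

do {  // drive all transfers until every handle has finished
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

foreach ($handles as $pair) {
    curl_multi_remove_handle($mh, $pair[0]);
    curl_close($pair[0]);
    fclose($pair[1]);
}
curl_multi_close($mh);
// Concatenate part_0.tmp, part_1.tmp, ... afterwards to rebuild the file.

Each segment gets its own easy handle and its own temporary file; stitching the parts back together is left to a final concatenation step.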

OTHER TIPS

To all the "PHP isn't great for multi-tasking" critics:

Take a step back and consider that you have an awesome multithreading framework at your disposal if you're in a LAMP environment. Use this base architecture to your advantage: Apache is the multithreading manager, and a damn good one at that.

It is very easy to set up PHP to work in this environment:

  1. Set max_execution_time = 0 to allow scripts to run indefinitely
  2. Set ignore_user_abort = true to allow scripts to run even after the client has aborted (per-script equivalents are sketched below)
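
If you would rather not change php.ini globally, a minimal per-script sketch using the standard runtime equivalents:

<?php
// Per-script equivalents of the two php.ini settings above.
set_time_limit(0);        // same effect as max_execution_time = 0
ignore_user_abort(true);  // keep running after the client disconnects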

Design lightweight, single-task REST web services. Design them in such a way that you don't care when they return, as in a queue-type system. Writing to the queue is thread-safe, and removing from the queue is thread-safe if done with some basic OS-level mutexes (a minimal sketch of such a queue follows).
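
Here is a minimal sketch of that idea, assuming a plain text file as the backing store and flock() for the OS-level locking; the queue file path is a placeholder:

<?php
// File-backed queue sketch: flock() serializes concurrent writers and readers.
define('QUEUE_FILE', '/tmp/jobs.queue');   // placeholder path

function queue_push($job) {
    $fp = fopen(QUEUE_FILE, 'ab');
    flock($fp, LOCK_EX);                   // exclusive lock while appending
    fwrite($fp, $job . PHP_EOL);
    flock($fp, LOCK_UN);
    fclose($fp);
}

function queue_pop() {
    $fp = fopen(QUEUE_FILE, 'c+b');
    flock($fp, LOCK_EX);                   // exclusive lock while rewriting the file
    $lines = array();
    while (($line = fgets($fp)) !== false) {
        $lines[] = rtrim($line, "\r\n");
    }
    $job = array_shift($lines);            // null when the queue is empty
    ftruncate($fp, 0);                     // rewrite the file without the popped entry
    rewind($fp);
    foreach ($lines as $line) {
        fwrite($fp, $line . PHP_EOL);
    }
    flock($fp, LOCK_UN);
    fclose($fp);
    return $job;
}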

"forking" the web services is as simple as opening a file:

fclose(fopen("http://somewebservice....php?a1=v1&a2=v2&....")); // Launch a web service and continue...

Not only is this approach multi-threaded, but it is inherently distributed as well. The web service can be local or located across the world. PHP certainly doesn't care.

For a basic system, the only thing that holds you back is the number of threads that Apache allows. Otherwise your code is ready to take advantage of load balancing and all the other neat tricks that advanced Apache implementations have to offer.

Too often, when developers think "multi-threaded" they think "OMG, I have to handle forks and execs and waits and PIDs". And if you design your system that way, you're right: it gets very complicated very quickly. Step back and use what is given. You've got access to directories? Boom - you've got queues. You can issue web calls? Boom - you've got a multi-threaded (distributed) app. Now just merge the concepts together as your app dictates.

PHP is not multi-threaded, and if you try to force it to be by means of multiple file calls or forking, the results are usually sub-optimal. I would suggest against this. HOWEVER, it would be possible to do something like this with a mix of JS, PHP (probably not cURL, though, but a custom PHP file stream), and long polling.
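
As a sketch of that "custom PHP file stream" idea (assuming allow_url_fopen is enabled; the URL and file names are placeholders): the script below downloads in chunks and writes the running byte count to a status file, which a separate long-polled endpoint (not shown) could return to the browser.

<?php
// Chunked download that records progress for a polling endpoint to read.
$src  = fopen('http://example.com/big.file', 'rb');  // placeholder URL, needs allow_url_fopen
$dst  = fopen('download.bin', 'wb');                  // placeholder output file
$done = 0;

while (!feof($src)) {
    $chunk = fread($src, 8192);
    $done += strlen($chunk);
    fwrite($dst, $chunk);
    file_put_contents('progress.txt', $done);         // polled by the browser via JS
}

fclose($src);
fclose($dst);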

It's possible; take a look at curl_multi_init().

No, that is not the case. It is not possible, because the download manager calls the class that handles the download 5 times - that is a PHP class instance.

This is a sample class call:

$tr = new teConnections();
$data = $tr->downloadManager(array('http', 'host', 'path', 'login', 'pass', 'port'), 'file name, compression, streaming');