Question

I've started experimenting with the Web in Perl, using ActivePerl on Windows. I wrote a script that downloads mp3 files (it's an archive of a radio talk show, all legal, in case you're wondering :) ). In theory, it

  • parses a website
  • collects all links pointing to an mp3 file to an array
  • iterates through the array and downloads all files

However, while the script finds all the links and the array looks fine, it downloads only one file. It must be a basic error, but I can't find it. Please point out my mistake. Thank you.

use strict;
use warnings;
use WWW::Mechanize;
use LWP::Simple;
use File::Basename;

# autocheck => 1 makes Mechanize die on any failed request
my $mech = WWW::Mechanize->new( autocheck => 1 );
$mech->get("http://something_or_other.html");

# Collect every <a> link whose URL ends in .mp3
my @mp3links = $mech->find_all_links(
    tag       => "a",
    url_regex => qr/\.mp3$/,
);
print "\n";

foreach my $link (@mp3links) {
    my $url       = $link->url();
    my $localfile = basename($url);              # filename part of the URL
    my $localpath = "d:\\Downloads\\$localfile";

    print "$localfile\n";
    getstore($url, $localpath);                  # download to the local path
}

Solution 2

Oh, sorry guys. This had nothing to do with Perl. My internet connection was rather sh***y, and the connection kept being lost while downloading the first file.
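For an unreliable connection like that, one option is to retry each download a few times before giving up. A minimal sketch, assuming `LWP::Simple` as in the original script (the helper name `getstore_with_retry` is hypothetical):

```perl
use strict;
use warnings;
use LWP::Simple qw(getstore is_success);

# Hypothetical helper: retry getstore() a few times before giving up.
# Returns the HTTP status on success, or undef if all attempts fail.
sub getstore_with_retry {
    my ($url, $path, $tries) = @_;
    $tries ||= 3;
    for my $attempt (1 .. $tries) {
        my $status = getstore($url, $path);
        return $status if is_success($status);
        warn "Attempt $attempt for $url failed with status $status\n";
        sleep 2;    # brief pause before retrying
    }
    return;         # every attempt failed
}
```

Inside the loop, `getstore($url, $localpath)` would then become `getstore_with_retry($url, $localpath)`.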

OTHER TIPS

Try this:

# getstore() returns the HTTP status; is_success() is also exported by LWP::Simple
my $status = getstore($url, $localpath);
die "Error $status on $url" unless is_success($status);

Another possible culprit: if a link contains a query string (a ?-mark plus parameters), basename() will not chop it off. The problem is that this makes the result an invalid filename on Windows.
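One way to handle that is to take the path component of the URL before calling basename(), so the query string never reaches the filename. A sketch using the URI module (installed alongside LWP; the example URL is made up):

```perl
use strict;
use warnings;
use URI;
use File::Basename;

# Hypothetical URL with a query string attached
my $url = "http://example.com/shows/episode1.mp3?session=abc123";

# basename($url) alone would keep "?session=abc123" in the name.
# URI->path returns just "/shows/episode1.mp3", so basename() of
# that yields a clean, Windows-safe filename.
my $localfile = basename( URI->new($url)->path );
print "$localfile\n";    # episode1.mp3
```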

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow