Question

I want to set up a pipeline of processes from within Perl (running on Linux), consisting of two parts run at separate times.

E.g.:

Start the consumer process:

open( OUT, "| tar xvf - " ) || die "Failed: tar: $!";

then much later start the producer process:

open( IN, "gpg -d $file |" ) || die "Failed: gpg: $!";

but then somehow redirect the output from gpg into the input to tar.

I can do this by building a loop:

while (<IN>) {
  print OUT;
}

But I would like to know if I can somehow glue the two processes together with redirection.


Solution

Add

pipe( IN, OUT );

before the two open statements. That's it!
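If that shortcut doesn't fit, the same effect can be spelled out by hand with pipe and fork. Here is a minimal sketch for the gpg/tar case from the question (the filename is hypothetical, and error handling is abbreviated):

```perl
use strict;
use warnings;

# Create a connected pair of handles: whatever is written to
# $writer can be read back from $reader.
pipe(my $reader, my $writer) or die "pipe failed: $!";

# Start the consumer: tar reads its archive from the pipe.
my $tar_pid = fork() // die "fork failed: $!";
if ($tar_pid == 0) {
    close $writer;
    open(STDIN, '<&', $reader) or die "dup failed: $!";
    exec 'tar', 'xvf', '-' or die "exec tar failed: $!";
}

# ...much later, start the producer: gpg writes into the pipe.
my $file = 'archive.tar.gpg';    # hypothetical input file
my $gpg_pid = fork() // die "fork failed: $!";
if ($gpg_pid == 0) {
    close $reader;
    open(STDOUT, '>&', $writer) or die "dup failed: $!";
    exec 'gpg', '-d', $file or die "exec gpg failed: $!";
}

# The parent must close both ends, or tar never sees EOF.
close $reader;
close $writer;
waitpid($_, 0) for $tar_pid, $gpg_pid;
```

Because the kernel buffers the pipe, the two children can be started at different times; tar simply blocks until gpg produces data.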

If you want to do anything more complicated, I would recommend the IPC::Run CPAN module:

http://search.cpan.org/dist/IPC-Run/

It lets you start processes, tie their input and outputs together, and add logging or redirection at any point in the chain.
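For the question's gpg-to-tar pipeline, an IPC::Run version might look like this (a sketch using the commands from the question; the filename is hypothetical):

```perl
use strict;
use warnings;
use IPC::Run qw(run);

my $file = 'archive.tar.gpg';    # hypothetical input file

# Decrypt with gpg and feed the plaintext straight into tar.
run [ 'gpg', '-d', $file ], '|', [ 'tar', 'xvf', '-' ]
    or die "pipeline failed: $?";
```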

OTHER TIPS

If the two processes are completely unrelated, use a FIFO.

use POSIX qw(mkfifo);
mkfifo($path, 0700) or die "mkfifo $path failed: $!";

This creates a FIFO at $path. Now have one process write to that file, and the other process read from it.
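Once the FIFO exists, the producer and consumer can be started at entirely different times; opening a FIFO blocks until the other end is opened too. A hypothetical pair of halves, which must run in two separate processes (running both in one script would deadlock on the first open):

```perl
use strict;
use warnings;

my $path = '/tmp/pipeline.fifo';    # hypothetical FIFO path

# --- producer process ---
open(my $out, '>', $path) or die "open $path for write: $!";
print {$out} "hello through the FIFO\n";
close $out;

# --- consumer process (run separately) ---
open(my $in, '<', $path) or die "open $path for read: $!";
print while <$in>;
close $in;
```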

I like Proc::SafeExec; it makes it easy to tie processes and file handles together in almost arbitrary ways. Here's an example:

use strict;
use warnings;

use Proc::SafeExec;

open(my $ls, "-|", "ls", "-l") or die "Err: $!";
open(my $fh, ">", "tmp.txt") or die "Err: $!";

my $p = Proc::SafeExec->new({
  exec => [qw(sed -e s/a/b/)],
  stdin => $ls,
  stdout => $fh,
});
$p->wait();

After looking at IPC::Run, it seems a lot simpler. Here's the same example using IPC::Run instead:

use IPC::Run qw(run);

run [qw(ls -l)], "|", [qw(sed -e s/a/b/)], ">", "tmp.txt";
Licensed under: CC-BY-SA with attribution