Problem

I have a variable that contains a long string (specifically, a few kilobytes of JavaScript code).

I want to pass this string through an external command, in this case a JavaScript compressor, and capture the output of the external command (the compressed JavaScript) in PHP, assigning it to a variable.

I'm aware that there are classes for compressing JavaScript in PHP, but this is merely one example of a general problem.

Originally we used:

$newvar = passthru("echo $oldvar | compressor");

This works for small strings, but it is insecure: if $oldvar contains characters with special meaning to the shell, anything could happen.

Escaping with escapeshellarg fixes that, but the solution breaks for longer strings because of OS limits on the maximum allowable argument length.
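
For illustration, a minimal sketch of that escaped variant (using shell_exec so the output is actually captured; "compressor" again stands in for whatever command you are piping through):

$newvar = shell_exec('echo ' . escapeshellarg($oldvar) . ' | compressor');

This is safe against shell injection, but it still fails once $oldvar grows past the operating system's argument-length limit.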

I tried using popen("command", "w") and writing to the command. This works, but the output from the command silently disappears into the void.
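
That attempt looked roughly like this (a sketch, not the exact code). popen only gives you one end of the pipe, so the command's output goes to wherever PHP's own stdout points instead of into a variable:

$fp = popen('compressor', 'w');   // write-only handle: there is no way to read the command's output
fwrite($fp, $oldvar);
pclose($fp);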

Conceptually, I just want to do the equivalent of:

$newvar = external_command($oldvar);

Solution

Using the proc_open function you can get handles to both the stdin and stdout of the process, and thus write your data to it and read the result.

Other tips

Using rumpel's suggestion, I was able to devise the following solution, which seems to work well. Posting it here for the benefit of anyone else interested in the question.

public static function extFilter($command, $content){
    $fds = array(
        0 => array("pipe", "r"),  // stdin: the child reads its input from this pipe
        1 => array("pipe", "w"),  // stdout: the child writes its output to this pipe
        2 => array("pipe", "w")   // stderr: the child writes its errors to this pipe
    );
    $stdout = NULL;
    $process = proc_open($command, $fds, $pipes, NULL, NULL);
    if (is_resource($process)) {
        fwrite($pipes[0], $content);  // feed the input, then close stdin to signal EOF
        fclose($pipes[0]);
        $stdout = stream_get_contents($pipes[1]);
        fclose($pipes[1]);
        $stderr = stream_get_contents($pipes[2]);
        fclose($pipes[2]);
        $return_value = proc_close($process);
        // Do whatever you want with $stderr and the command's exit code.
    } else {
        // Do whatever you want if the command fails to start.
    }
    return $stdout;
}

There may be deadlock issues: if the data you send is larger than the combined sizes of the pipe buffers, the external command will block, waiting for someone to read from its stdout, while PHP is blocked, waiting for its stdin to be read to make room for more input.

Possibly PHP takes care of this issue somehow, but it's worth testing out if you plan to send (or receive) more data than fits in the pipes.
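
One way to avoid that (not part of the original answer, just a sketch) is to put the pipes into non-blocking mode and interleave writing and reading with stream_select, so neither side can stall waiting for the other. The function name extFilterSafe and the 64 KiB chunk size below are arbitrary illustrative choices, and stream_select on proc_open pipes is only reliable on POSIX systems:

function extFilterSafe($command, $content) {
    $fds = array(
        0 => array("pipe", "r"),  // child's stdin
        1 => array("pipe", "w"),  // child's stdout
        2 => array("pipe", "w")   // child's stderr
    );
    $process = proc_open($command, $fds, $pipes);
    if (!is_resource($process)) {
        return NULL;
    }
    stream_set_blocking($pipes[0], false);
    stream_set_blocking($pipes[1], false);
    stream_set_blocking($pipes[2], false);
    $stdout = '';
    $stderr = '';
    $offset = 0;
    $length = strlen($content);
    while (true) {
        $read   = array($pipes[1], $pipes[2]);
        $write  = ($offset < $length) ? array($pipes[0]) : array();
        $except = NULL;
        if (stream_select($read, $write, $except, 1) === false) {
            break;  // select failed; bail out
        }
        if (!empty($write)) {
            // Write only as much input as the pipe will accept right now.
            $written = fwrite($pipes[0], substr($content, $offset, 65536));
            if ($written) {
                $offset += $written;
            }
            if ($offset >= $length) {
                fclose($pipes[0]);  // all input sent: signal EOF so the command can finish
            }
        }
        foreach ($read as $r) {
            // Drain whatever the command has produced so far.
            $chunk = fread($r, 65536);
            if ($r === $pipes[1]) {
                $stdout .= $chunk;
            } else {
                $stderr .= $chunk;
            }
        }
        if (feof($pipes[1]) && feof($pipes[2])) {
            break;
        }
    }
    if (is_resource($pipes[0])) {
        fclose($pipes[0]);  // in case the command exited before consuming all input
    }
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
    return $stdout;  // $stderr and the exit code could be returned as well
}

The structure is the same as extFilter above, except that the input is fed in chunks and stdout/stderr are drained continuously instead of all at once at the end.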
