Question

The title pretty much says it all.

Suppose I have a bash script:

#!/bin/bash

# do some magic here, perhaps fetch something with wget, and then:
if [ "$VAR1" = "foo" ]; then
    export CASEVARA=1
fi
export CASEVARB=2

# and potentially many other vars...

How can I run this script from Python and check which environment variables it set? Ideally, I'd like to "reverse-inherit" them into the main environment that is running Python.

So that I can access them like this:

import os

# run the bash script somehow

print(os.environ['CASEVARA'])

Solution

Certainly! It just requires some hacks:

import subprocess

# Source the script in a child bash; the EXIT trap dumps the child's
# final environment to stdout once the script has finished.
variables = subprocess.Popen(
    ["bash", "-c", "trap 'env' exit; source \"$1\" > /dev/null 2>&1",
     "_", "yourscript"],
    shell=False, stdout=subprocess.PIPE).communicate()[0]

This runs your script unmodified and gives you all exported variables, one per line, in the form foo=bar.
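To finish the "reverse-inherit" step from the question, you can parse that output and copy the pairs into os.environ. Here is a minimal Python 3 sketch, where "yourscript" is a placeholder for the real script path:

import os
import subprocess

variables = subprocess.Popen(
    ["bash", "-c", "trap 'env' exit; source \"$1\" > /dev/null 2>&1",
     "_", "yourscript"],
    stdout=subprocess.PIPE).communicate()[0]

# Each output line is KEY=value; copy the pairs into this process.
# Caveat: this naive line split breaks if a value contains a newline.
for line in variables.decode().splitlines():
    key, _, value = line.partition("=")
    if key:
        os.environ[key] = value

print(os.environ.get("CASEVARA"))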

On systems with GNU coreutils you can trap 'env -0' exit instead, which separates the variables with \0 bytes and therefore supports multiline values.
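The same idea with NUL separators, sketched under the same placeholder assumption; splitting on b"\x00" keeps multiline values intact:

import os
import subprocess

variables = subprocess.check_output(
    ["bash", "-c", "trap 'env -0' exit; source \"$1\" > /dev/null 2>&1",
     "_", "yourscript"])

# NUL-separated entries survive values with embedded newlines.
for entry in variables.split(b"\x00"):
    if entry:
        key, _, value = entry.partition(b"=")
        os.environ[key.decode()] = value.decode()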

OTHER TIPS

Use the subprocess module:

  • Echo the variables out in your shell script
  • Run the bash script from Python using subprocess.Popen with stdout=subprocess.PIPE set
  • Pick them up with Popen.communicate()
  • Save them to Python variables (see the sketch after this list)
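Here is a minimal sketch of that recipe, assuming the script is saved as "yourscript" (a placeholder) and that you care about CASEVARA and CASEVARB:

import os
import subprocess

# Source the script, then echo the values we want, one per line.
proc = subprocess.Popen(
    ["bash", "-c",
     'source yourscript > /dev/null 2>&1; echo "$CASEVARA"; echo "$CASEVARB"'],
    stdout=subprocess.PIPE)
out, _ = proc.communicate()

# One value per line, in the order they were echoed; an unset
# variable simply comes back as an empty string.
casevara, casevarb = out.decode().splitlines()
os.environ["CASEVARA"] = casevara
os.environ["CASEVARB"] = casevarb

The downside is that this wrapper has to name every variable you want, which is why the trap trick above is more transparent to the bash script.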

To make this transparent to the bash script, you could subclass Popen from subprocess and have it expose the environ of its process; a rough sketch follows this tip. You could also just grab the contents of its environ off the disk:

# pid is the child's process ID, e.g. proc.pid from a Popen object
with open('/proc/%s/environ' % (pid,)) as f:
    env = f.read().replace('\x00', '\n')

This would work, but you are racing against the exit and cleanup of the process. Note also that /proc/<pid>/environ reflects the environment the process was started with, so variables the script exports afterwards will not appear there.
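For the subclassing idea, a rough sketch; EnvPopen and read_environ are invented names, not part of subprocess, and the /proc caveat above applies:

import subprocess

class EnvPopen(subprocess.Popen):
    # Hypothetical helper; subprocess offers nothing like this built in.
    def read_environ(self):
        # /proc/<pid>/environ holds NUL-separated KEY=value entries,
        # reflecting the environment the child was *started* with.
        with open('/proc/%d/environ' % self.pid, 'rb') as f:
            raw = f.read()
        return dict(entry.split(b'=', 1)
                    for entry in raw.split(b'\x00') if b'=' in entry)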
