Modify launch_command to support environment variables

We use the launch_command script to run Spark and Storm jobs on the
guest. Sometimes, as in the case of the Spark Shell job, environment
variables have to be set on the guest before the job starts for it to
execute properly. launch_command runs the command through the
subprocess module, and subprocess.Popen already supports environment
variables via its "env" parameter, so all we have to do is populate
that parameter when needed.

This patch uses the same format the shell uses to pass environment
variables to a command:

FILE="a" DIR="b" test.sh

In this case launch_command parses the command line, separates the
environment variables FILE and DIR from the command itself, and passes
them to Popen via the "env" parameter.
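
To illustrate with a hypothetical run (assuming the shell has already
stripped the quotes from the values), the parse_env_vars() helper added
in the diff below splits that command line as follows:

# sys.argv == ['launch_command', 'FILE=a', 'DIR=b', 'test.sh']
env, args = parse_env_vars()
# env  == {'FILE': 'a', 'DIR': 'b'}
# args == ['test.sh']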

Change-Id: If110f00e95f0479b12e2d26cc30e9cbe4513e361
Author: Sergey Gotliv
Date:   2015-06-28 01:45:16 +03:00
parent a1c95bca60
commit f0325c4643


@@ -34,6 +34,24 @@ def make_handler(a):
         log.info("Sent SIGINT to subprocess")
     return handle_signal
 
+
+def parse_env_vars():
+    index = 1
+    env_vars = {}
+    for var in sys.argv[1:]:
+        # All environment variables should be listed before the
+        # executable, which is not supposed to contain the "=" sign
+        # in its name.
+        kv_pair = var.split("=")
+        if len(kv_pair) == 2:
+            key, value = kv_pair
+            env_vars[key.strip()] = value.strip()
+            index += 1
+        else:
+            break
+    return env_vars, sys.argv[index:]
+
+
 log.info("Running %s" % ' '.join(sys.argv[1:]))
 
 try:
@@ -42,8 +60,13 @@ try:
     # (background processes ignore SIGINT)
     signal.signal(signal.SIGINT, signal.SIG_DFL)
 
+    # Separate the environment variables from the command
+    # and its arguments
+    env, args = parse_env_vars()
     # Interpret all command line args as the command to run
-    a = subprocess.Popen(sys.argv[1:],
+    a = subprocess.Popen(args,
+                         env=env,
                          stdout=open("stdout", "w"),
                          stderr=open("stderr", "w"))