Question

I wanted to write a Python script that runs several C++ programs from a list of jobs.

I think it works fine, but I have a problem with the output files, which seem to be damaged or something. In any case, I can't open them. Maybe you can help me.

import multiprocessing
import psutil
import subprocess
import time

jobs2Do = True

while (jobs2Do):    #Ever-running program

cpuLoad = cpuLoad + psutil.cpu_percent()

if (cpuLoad < 90):
    with open('JobQueue.txt', 'r') as f:
        first_line = f.readline()
        path = f.readline()
        f.close()
    if (first_line != ""):
        print "Opening " + first_line
        process = subprocess.Popen( first_line, shell=True, stdout=open(path,"w"),stderr=open("error.log","w"))

        with open('JobQueue.txt', 'r') as fin:
            data = fin.read().splitlines(True)
        with open('JobQueue.txt', 'w') as fout:
            fout.writelines(data[2:])
        with open("JobsDone.txt", "a") as myfile:
            myfile.write(first_line)
            myfile.write(path)
        myfile.close()

    else:
        print "No more jobs... waiting for new jobs...."
        time.sleep(10)

Okay, so what I want to do is this: I check whether the CPU has some free capacity; if so, I open the file with the jobs and get the command, which is in the first line of that file, and the path where all the output of the program should be saved. That path is always in the second line of the file.

Then I want to start that subprocess, send its stdout to my preferred path and send the error stream wherever. Finally, I delete the job from the list and start from the beginning.

My problem now is that the file created by stdout=open(path,"w") seems to be corrupt somehow: I can't access or even delete it, although I can see it in the folder.

Maybe you have an idea of what I did wrong.

Thanks for your effort!

NonStopAggroPop

PS: Maybe I should also add that the C++ programs run for a longer period of time. So what I actually want to do is just have the script execute nohup ./c++ [arguments] and stream the output to a specific file, as it would if I typed the command into the console.

PPS: And I want to be able to start multiple C++ programs while the others are still running, until my CPU reaches 100% usage.


Solution

My problem now is that the file created by stdout=open(path,"w") seems to be corrupt somehow: I can't access or even delete it, although I can see it in the folder.

The issue is that path = f.readline() returns a string with a newline at the end. On some systems open(path, 'w') doesn't care and creates a file whose name ends with that newline. Try print(os.listdir('.')) to see the \n in the filename.

To fix it, just remove the newline: path = path.strip().
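
To see the effect, here is a minimal sketch of the failure and of the fix (the file name output.txt is made up for illustration, and the behaviour shown is that of Linux-like filesystems, which happily accept a newline in a file name):

    import os

    path = "output.txt\n"          # what f.readline() hands back
    with open(path, "w") as f:     # silently creates a file literally named "output.txt\n"
        f.write("hello")
    print(os.listdir("."))         # the listing shows the embedded \n

    path = path.strip()            # "output.txt" -- the name you actually wanted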


There are other unrelated issues in your code:

  • you are mixing tabs and spaces for indentation. Never do that: it makes the visual indentation differ from the one Python sees, as it currently does in your question. Use either tabs or spaces consistently, never both. You can configure your editor to insert 4 spaces when you hit the Tab key.

  • you probably meant cpuLoad = psutil.cpu_percent() instead of cpuLoad = cpuLoad + psutil.cpu_percent(), which grows on every iteration of the loop for no reason

  • remove the newline from the command and drop shell=True. Don't use the shell unless it is necessary (avoiding it is a good habit that may occasionally be broken if you know what you are doing):

    import shlex
    from subprocess import Popen

    process = Popen(shlex.split(command),
                    stdout=output_file, stderr=stderr_file)
    
  • use with-statements for the code to be compatible with other Python implementations such as Jython and PyPy:

    with open(path, 'w') as output_file:
        with open('error.log', 'w') as stderr_file:
            process = Popen(shlex.split(command),
                            stdout=output_file, stderr=stderr_file)
    

    otherwise the files may remain open in the parent process.

  • remove f.close() and myfile.close(): the with-statement closes the file by itself, even if an exception occurs; that is its purpose, its raison d'être. An extra .close() is harmless but pointless here.

  • use if not first_line.strip(): to test whether the first line is blank (contains only whitespace)

  • the whole idea of manually editing JobQueue.txt is fragile. Nothing stops the Python process from reading only partial input. You could provide a dedicated command for adding new jobs, e.g., an at-like utility, and implement it however you like, e.g., have the main script listen on a port for new jobs and submit them with a small at-like client, as in the minimal socket sketch below. Or use other IPC methods.
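
As a rough illustration of the socket suggestion in the last bullet, here is a minimal sketch, not a full job server: the main script would consume jobs from receive_jobs() instead of reading JobQueue.txt, and the port number 50007 and the command-plus-path message format are arbitrary choices made for this example:

    import socket

    # Server side: this would live in the main script. Each connection
    # delivers one job as the command, a newline, then the output path.
    def receive_jobs(port=50007):
        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.bind(('localhost', port))
        server.listen(5)
        while True:
            conn, _addr = server.accept()
            message = conn.recv(4096).decode('utf-8')
            conn.close()
            command, path = message.split('\n', 1)
            yield command.strip(), path.strip()

    # Client side: a tiny at-like submit utility.
    def submit_job(command, path, port=50007):
        client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        client.connect(('localhost', port))
        client.sendall((command + '\n' + path).encode('utf-8'))
        client.close()

A real queue would still have to handle concurrent submissions and partial reads, but at least the hand-off no longer depends on two processes editing the same text file.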
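
For completeness, here is one way the smaller fixes above could fit together. It is a sketch rather than a drop-in replacement: it keeps the question's file names, the two-line job format and the 90% threshold, still uses JobQueue.txt instead of the socket approach, and samples the CPU for one second per iteration:

    import shlex
    import subprocess
    import time

    import psutil

    while True:                                        # ever-running job runner
        cpu_load = psutil.cpu_percent(interval=1)      # current load, not a running sum
        if cpu_load < 90:
            with open('JobQueue.txt', 'r') as f:
                lines = f.read().splitlines()
            if len(lines) >= 2 and lines[0].strip():
                command = lines[0].strip()             # drop the trailing newline
                path = lines[1].strip()
                print("Opening " + command)
                with open(path, 'w') as out, open('error.log', 'w') as err:
                    subprocess.Popen(shlex.split(command), stdout=out, stderr=err)
                remaining = lines[2:]                  # drop the two lines just consumed
                with open('JobQueue.txt', 'w') as f:
                    if remaining:
                        f.write('\n'.join(remaining) + '\n')
                with open('JobsDone.txt', 'a') as done:
                    done.write(command + '\n' + path + '\n')
            else:
                print("No more jobs... waiting for new jobs....")
                time.sleep(10)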

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow