Question

I am trying to set up a Python script which runs on a scientific cluster.

It splits an input file into chunks, submits them to the cluster one at a time, sorts and evaluates the output, and then submits the next chunk.

However, I have a strange problem.

I create the input file for the subprocess directly before starting the process. However, it never worked and showed me a "No data to process" error.

I finally ran this fragment:

tmp = open ("individual_list.txt","w")
for line in working:
    tmp.write (line)
tmp.close
tmp.flush
os.fsync
time.sleep(60)
command=["srun"]
command.append ("--cpus-per-task=1")
command.append ("--chdir="+cwd)
command.append ("-o")
command.append (uniqueID+"#"+str(loop)+"_mut.out")
command.append (EXEC)
#command=[EXEC]
command.append ("-runfile")
command.append (CMD1)
out= open (uniqueID+"#"+str(loop)+"_mut.out","w")
p1=subprocess.Popen (command, cwd=cwd)
p1.wait()
out.close

You probably noticed that I have already gone paranoid about buffered output. But still, during the minute of waiting, the file individual_list.txt is created in the filesystem but remains empty. It only gets filled after the subprocess has finished. Is this a Python problem, or do I have to ask our cluster admins for help?

Best, Jan


Solution

You are not actually calling the close and flush methods. You need to place () after them to call them:

tmp.close()
tmp.flush()
etc.

Otherwise, you just have references to those methods. Below is a demonstration:

>>> def foo():
...     return 'hi'
...
>>> foo
<function foo at 0x020B2540>
>>> foo()
'hi'
>>>
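For completeness, here is a minimal sketch of what the corrected write step from the question could look like (it reuses the names from the question, so treat it as an assumption about the intent). Note that flush() only makes sense before close(), that os.fsync() expects a file descriptor rather than being called bare, and that a with block closes the file for you:

import os

# 'working' is the list of lines from the question.
# Write the chunk and make sure it reaches the filesystem before srun starts.
with open("individual_list.txt", "w") as tmp:
    for line in working:
        tmp.write(line)
    tmp.flush()              # push Python's internal buffer to the OS
    os.fsync(tmp.fileno())   # ask the OS to commit the data to disk
# leaving the with block closes the file

With the data flushed and the file closed before subprocess.Popen is called, the time.sleep(60) workaround should no longer be necessary.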
Licensed under: CC-BY-SA with attribution