Although I'm still not exactly sure why you don't have the server itself handle this, my suggestion to handle it from the Python script would be to use the multiprocessing module. Here's a really basic way to handle this using a single worker process:
    from multiprocessing import Event, Process

    def worker(e):
        while True:
            e.wait()   # Block until we're told the file has changed
            e.clear()  # Clear the flag so it can be set again if changes happen while we process
            # Send request to server, retrieve data
            # Write to HTML files

    def watch_file_for_changes(e):
        while True:
            if file_changed:  # Use whatever watching mechanism you already have for this; inotify, etc.
                e.set()       # Tell the worker to process the file; this unblocks e.wait()

    if __name__ == "__main__":
        e = Event()
        # Start a worker process. You can start more than one if you need to.
        p = Process(target=worker, args=(e,))
        p.start()
        watch_file_for_changes(e)
This is completely untested, and needs some cleaning up, but should give you the general idea. This also assumes your worker is smart enough to figure out if more than one new entry has been added to the text file.
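To show the Event-based handoff actually working end to end, here's a minimal, runnable sketch of the same pattern. The file watcher is replaced by a direct `e.set()` call, a second `done` event lets the worker exit cleanly, and a `Queue` stands in for "fetch from server, write HTML" so the result is observable; those names are illustrative, not part of your setup.

```python
from multiprocessing import Event, Process, Queue

def worker(e, done, results):
    # Loop until the main process signals shutdown via `done`.
    while not done.is_set():
        # Wait (with a timeout so the shutdown flag gets re-checked).
        if e.wait(timeout=0.1):
            e.clear()
            # Stand-in for "send request to server, write HTML files".
            results.put("processed")

if __name__ == "__main__":
    e = Event()
    done = Event()
    results = Queue()
    p = Process(target=worker, args=(e, done, results))
    p.start()
    e.set()  # Simulate one file-change notification
    print(results.get(timeout=5))  # → processed
    done.set()
    p.join()
```

The timeout on `e.wait()` is only there so the worker can notice `done` and shut down; in your long-running script a plain blocking `e.wait()` like the original is fine.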