Question

I've done a little bit of research, but I cannot seem to find a way to handle if two users have sent data in, but the Python script can only handle one.

This is typically how the script works:

1) User enters data '123'
2) Python listener executes on the data:
  • Sends requests to the server and retrieves data (typically ~1 min)
  • Script writes to HTML files
3) Finishes writing to the files, waits for more user input

Now the problem is that if another user enters data during the step 2–3 stage, the script is no longer listening and will not do anything with this data.

Is there any way I can have it always listen for a change and, once one occurs, pass it on to a class or another instance of itself so it can continue listening for further asynchronous changes?

EDIT:

  • The user enters the data on a website, which is consequently written to a text file.
  • The Python script currently checks the last modified line in this file to see if it differs from the previous check. If it does, it executes the class with the modified line.
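The polling approach described above can be sketched roughly like this — the file path, the `handler` callable, and the poll interval are all illustrative placeholders, not names from the actual script. Note that `handler()` runs inline, so the loop stops checking the file while it works, which is exactly the problem being asked about:

```python
import time

def last_line(path):
    """Return the last line of the file, or None if it is empty."""
    with open(path) as f:
        lines = f.read().splitlines()
    return lines[-1] if lines else None

def watch_file(path, handler, interval=1.0):
    """Poll `path`; call handler(line) whenever the last line changes."""
    last_seen = last_line(path)
    while True:
        time.sleep(interval)
        current = last_line(path)
        if current is not None and current != last_seen:
            last_seen = current
            handler(current)  # Blocks the loop until processing finishes
```

Any entry written while `handler()` is busy is only noticed on the next poll, and intermediate entries can be skipped entirely.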

Solution

Although I'm still not exactly sure why you don't have the server itself handle this, my suggestion to handle it from the Python script would be to use the multiprocessing module. Here's a really basic way to handle this using a single worker process:

from multiprocessing import Event, Process

def worker(e):
    while True:
        e.wait()   # Wait to be told the file has changed. This will block.
        e.clear()  # Clear the flag so the watcher can set it again if changes happen while we process
        # Send request to server, retrieve data
        # Write to HTML files

def watch_file_for_changes(e):
    while True:
        if file_changed:  # Use whatever watching mechanism you already have for this; inotify, etc.
            e.set()  # Tell the worker to process the file. This unblocks e.wait()

if __name__ == "__main__":
    e = Event()
    # Start a worker process. You can start more than one if you want.
    p = Process(target=worker, args=(e,))
    p.start()
    watch_file_for_changes(e)

This is completely untested, and needs some cleaning up, but should give you the general idea. This also assumes your worker is smart enough to figure out if more than one new entry has been added to the text file.
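Since an `Event` only records that *something* changed, entries can still be lost if several arrive while the worker is busy. A `Queue`-based variant along these lines would hand every new line to the worker individually — the names and the sentinel convention here are illustrative, not part of the original answer:

```python
from multiprocessing import Process, Queue

def worker(q):
    while True:
        line = q.get()    # Blocks until the watcher enqueues a new entry
        if line is None:  # Sentinel value: shut the worker down cleanly
            break
        # Send request to server, retrieve data for `line`
        # Write to HTML files

def watcher(q, new_lines):
    # Stand-in for the real file-watching loop: enqueue every new entry
    for line in new_lines:
        q.put(line)
    q.put(None)  # Tell the worker there is nothing more to process

if __name__ == "__main__":
    q = Queue()
    p = Process(target=worker, args=(q,))
    p.start()
    watcher(q, ["123", "456"])  # hypothetical user entries
    p.join()
```

With this shape the worker never needs to figure out how many entries arrived while it was busy; each one sits in the queue until it is processed.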

Other tips

You are describing a client-server architecture.

Since you expect multiple clients to use the same service, you have to allow for some sort of concurrent processing.

A web server is a typical example, but there are many others.

In your task, the following roles are expected:

  • client: makes a request and expects a response
  • server: accepts requests and manages their processing, sending responses back to clients
  • worker: a component on the server that does the "real work"

What you describe seems like a mixture of all of that together. When you write the code, you usually start thinking in terms of a script, which is what later ends up as the worker.

When designing your solution, you have to decide on a communication technology. There are many options, some being:

  • http - typical with web servers, Python offers many frameworks
  • TCP sockets - rather low level, but also well supported in Python
  • zeromq - based on TCP or unix sockets, supported by pyzmq package

You will have to write all three parts: the client, the server and the worker.
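To illustrate the three roles using only the standard library's TCP support (the second option above), here is a minimal sketch — the echo-style "work" and the helper names are placeholders, and a real deployment would need error handling and framing for messages longer than one `recv`:

```python
import socket
import socketserver

def do_work(data):
    # Placeholder for the real work (server requests, HTML writing)
    return b"processed: " + data

class Handler(socketserver.BaseRequestHandler):
    def handle(self):
        data = self.request.recv(1024)       # Request from one client
        self.request.sendall(do_work(data))  # Response back to that client

def run_server(host="127.0.0.1", port=0):
    # ThreadingTCPServer gives each client its own thread, so one slow
    # request does not block other clients -- the original problem.
    return socketserver.ThreadingTCPServer((host, port), Handler)

def client_request(host, port, payload):
    with socket.create_connection((host, port)) as s:
        s.sendall(payload)
        return s.recv(1024)
```

Here `Handler` plays the worker role, `ThreadingTCPServer` the server role, and `client_request` the client role; with `port=0` the OS picks a free port, which the caller can read back from `server.server_address` before calling `serve_forever()`.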

One quick example of a client-server solution based on zeromq is in my answer to Distributed lock manager.

License: CC-BY-SA with attribution
Not affiliated with Stack Overflow