Problem

The author here states that:

Share state between processes. Run a long-running batch job in one Python interpreter (say loading a few million lines of CSV into a Redis key/value lookup table) and run another interpreter to play with the data that's already been collected, even as the first process is streaming data in. You can quit and restart your interpreters without losing any data.

  • If the interpreter is stopped (quit), how does the sharing still work?

What is the concept behind this? How would you explain it in simple terms, maybe with an example?

Solution

State is always kept in isolated places. If you have concurrent states, you can hold each of them independently. But then, as you noticed, if one of them dies you need to track that somehow. So you introduce a global state machine to handle this, and you have to model the sub-states and the death of either one.
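
In the quoted scenario, the isolated keeper of the shared state is the Redis server: it is a separate, long-running process, so either Python interpreter can quit and restart without the data being lost. A minimal sketch of the two processes, assuming a local Redis server and the redis-py client (the file names, CSV columns, and "person:<id>" key scheme are made up for illustration):

    # loader.py -- long-running batch job: stream CSV rows into Redis.
    # Assumes a Redis server on localhost:6379 and `pip install redis`.
    import csv
    import redis

    r = redis.Redis(host="localhost", port=6379, db=0)

    with open("people.csv", newline="") as f:
        for row in csv.DictReader(f):
            # one Redis hash per record, under a made-up "person:<id>" key
            r.hset(f"person:{row['id']}", mapping=row)

A second interpreter can read the same keys while the first one is still streaming data in, and either script can be stopped and restarted freely; only the Redis server needs to keep running for the lookup table to stay available:

    # explorer.py -- a second interpreter exploring the data as it arrives.
    import redis

    r = redis.Redis(host="localhost", port=6379, db=0)

    print(r.hgetall("person:42"))                    # look up one record
    print(sum(1 for _ in r.scan_iter("person:*")))   # rows loaded so far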

Examples? Hmm. A telephone (the old coin-operated kind): one state watches whether there is enough money, and the other watches whether the handset has been hung up.
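
A toy sketch of that idea in Python (the state names, events, and coin amounts are invented for illustration): a single global machine combines the two independent conditions and reacts when either one "dies", i.e. the credit runs out or the caller hangs up.

    from enum import Enum, auto

    class PhoneState(Enum):
        IDLE = auto()           # handset down, no call
        TALKING = auto()        # handset up and credit left
        OUT_OF_CREDIT = auto()  # handset up but no money

    class CoinPhone:
        def __init__(self):
            self.state = PhoneState.IDLE
            self.credit = 0
            self.handset_up = False

        def insert_coin(self, amount):
            self.credit += amount
            self._update()

        def lift_handset(self):
            self.handset_up = True
            self._update()

        def hang_up(self):
            # one sub-state "dies": the caller hangs up, the call ends
            self.handset_up = False
            self.state = PhoneState.IDLE

        def tick(self, cost=1):
            # charge per time unit; the other sub-state can "die" here
            if self.state is PhoneState.TALKING:
                self.credit -= cost
                if self.credit <= 0:
                    self.state = PhoneState.OUT_OF_CREDIT

        def _update(self):
            # the global state machine combines both independent conditions
            if self.handset_up and self.credit > 0:
                self.state = PhoneState.TALKING
            elif self.handset_up:
                self.state = PhoneState.OUT_OF_CREDIT
            else:
                self.state = PhoneState.IDLE

    phone = CoinPhone()
    phone.lift_handset()
    phone.insert_coin(3)
    for _ in range(3):
        phone.tick()
    print(phone.state)   # PhoneState.OUT_OF_CREDIT -- the money watcher fired
    phone.hang_up()
    print(phone.state)   # PhoneState.IDLE -- the hang-up watcher fired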

License: CC-BY-SA with attribution
Not affiliated with softwareengineering.stackexchange