Question

I have a situation where I plan to update a single file on S3 many times. To be more specific, imagine I need to add about 50 lines of data (roughly 200 bytes each) per second using 10 different machines (so 5 lines each). My questions are:

  1. Is this possible to do?
  2. If yes, are there any limits on how many operations I can perform on a single file? If such limits exist, what is the cost of exceeding them?
  3. Will it be possible for a user, or ideally multiple users, to download this file while it is being updated?
  4. If this is not realistic, could you recommend another way to do it? (I'm not looking to use a database.)

Thank you.


Solution

What you're trying to do is, to the best of my knowledge, not possible. There is no "append" operation on S3 objects — once an object has been uploaded, it is immutable; changing it means re-uploading the entire object under the same key.

From what I'm reading, the task you're describing would be best accomplished with a database. You could approximate it on S3 by creating a separate object with a unique key for each batch of lines you want to append, but managing all of those objects will quickly become unwieldy.
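To illustrate the "one object per batch" workaround, here is a minimal sketch. It assumes boto3 is installed and that a bucket named `my-log-bucket` exists — both the bucket name and the `batches/` key prefix are hypothetical, and the actual `put_object` call is left commented out since it only works against a real AWS account:

```python
# Sketch of the "one S3 object per append batch" workaround.
# The bucket name "my-log-bucket" and the "batches/" prefix are
# hypothetical examples, not anything from the original answer.
import time
import uuid


def batch_key(prefix: str, machine_id: int) -> str:
    """Build a unique S3 key for one batch of lines.

    A nanosecond timestamp keeps keys roughly sortable by time, and the
    UUID suffix guarantees uniqueness even when 10 machines write
    batches within the same second.
    """
    return f"{prefix}/{time.time_ns()}-m{machine_id}-{uuid.uuid4().hex}.log"


def upload_batch(lines: list[str], machine_id: int) -> str:
    """Upload one batch of lines as its own immutable S3 object."""
    key = batch_key("batches", machine_id)
    body = "\n".join(lines).encode("utf-8")
    # With boto3 available and credentials configured, the upload would be:
    #   import boto3
    #   boto3.client("s3").put_object(
    #       Bucket="my-log-bucket", Key=key, Body=body
    #   )
    return key
```

Readers can then list everything under the prefix (e.g. with `list_objects_v2`) and concatenate the objects in key order, which is exactly the bookkeeping burden the answer warns about.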

License: CC BY-SA with attribution; not affiliated with Stack Overflow.