Question

Say I have a page with 100 objects, and each object is around 700 bytes when converted to JSON.

To save the objects to the PHP-based controller, I have the following options.

Option 1

For each of the 100 objects, do the following:

  1. Take the object definition and convert it to JSON.
  2. Do an HTTP POST to the PHP controller.
  3. The PHP controller saves it to a file or database.
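The per-object loop above might be sketched like this (`/controller/save.php` is a hypothetical endpoint name, and the `post` parameter is an injectable transport added here purely so the loop is easy to illustrate and test):

```javascript
// Option 1 sketch: one HTTP POST per object.
async function saveObjectsOneByOne(objects, post = browserPost) {
  for (const obj of objects) {
    const json = JSON.stringify(obj);         // step 1: convert to JSON
    await post('/controller/save.php', json); // step 2: one POST per object
  }
}

// In a browser, the transport would typically be the fetch API.
function browserPost(url, body) {
  return fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body,
  });
}
```

Note that `await` serializes the requests one after another; firing all 100 without awaiting would instead open many requests concurrently, which changes the load pattern on the server.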

Option 2

Variable bigJsonString;

For each of the 100 objects, do the following:

  1. Take the object definition and convert it to JSON.
  2. Append the JSON to the string variable "bigJsonString", with a delimiter marking the end of each object.

After the full "bigJsonString" is constructed:

  1. Do an HTTP POST to the PHP controller, sending "bigJsonString".
  2. The PHP controller saves it to a file or database.
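A sketch of the single-POST approach follows. One simplification worth considering: serializing the whole array as one JSON value removes the need for a custom delimiter, since PHP can recover the objects with `json_decode($body, true)`. As before, the endpoint name is hypothetical and `post` is an injectable transport used for illustration:

```javascript
// Option 2 sketch: build one payload and POST it once.
function buildBigJsonString(objects) {
  return JSON.stringify(objects); // a single valid JSON array, no delimiter needed
}

async function saveObjectsInOnePost(objects, post) {
  const bigJsonString = buildBigJsonString(objects);
  return post('/controller/save.php', bigJsonString); // hypothetical endpoint
}
```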

In option 1, I am doing 100 HTTP POSTs one after another. Does this raise any alarms? Is this normal for web applications doing AJAX posts?

The second option seems safer, but my concern is what happens when the 100 objects become, say, 500, to the point where "bigJsonString" grows to several megabytes.

A third option is a hybrid of options 1 and 2: start constructing "bigJsonString", and once its length reaches a certain limit, do an AJAX POST, flush the string, and rebuild it for the remaining objects.
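The hybrid could be sketched as below. `maxBytes` is an assumed tuning knob (it would presumably be chosen to stay under the server's POST size limit), the endpoint name is hypothetical, and `post` is again an injectable transport:

```javascript
// Option 3 sketch: accumulate objects and flush a batch whenever the
// serialized payload would exceed maxBytes.
async function saveObjectsInBatches(objects, maxBytes, post) {
  let batch = [];
  let size = 2; // account for the enclosing '[' and ']'
  for (const obj of objects) {
    const json = JSON.stringify(obj);
    // Flush first if adding this object would push the payload past the limit.
    if (batch.length > 0 && size + json.length + 1 > maxBytes) {
      await post('/controller/save.php', JSON.stringify(batch));
      batch = [];
      size = 2;
    }
    batch.push(obj);
    size += json.length + 1; // +1 for a separating comma
  }
  if (batch.length > 0) {
    await post('/controller/save.php', JSON.stringify(batch)); // final partial batch
  }
}
```

With 100 objects at ~700 bytes each, a limit of, say, 64 KB would send the whole page in one or two requests instead of 100.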

What are the pitfalls, and what is the normal or standard practice? If someone can point to resources where this has already been analysed, that would be great.

Thanks very much.

Licensed under: CC-BY-SA with attribution