Question

Currently I am running this in a http handler:

err := mongoCollection.Find(bson.M{"name": vars["name"]}).One(&result)
data, err := json.Marshal(result)
w.Write(data)

How can I begin serving the result before the full BSON data is in?

Edit: the answer needs to go beyond mgo and into bson itself. As far as I can see, mgo will only deliver full documents, if I'm not mistaken. And I have one, possibly large, document, as my code example shows.


Solution

In order for this to be possible, you would need these things:

  1. Access to a Reader for the incoming bson stream
  2. A datatype for generic document parts
  3. A streaming decoder for bson which reads from the Reader and produces document parts
  4. A streaming encoder for json that consumes document parts and writes to a Writer

mgo does not provide number 1. encoding/json does not provide number 2 or 4. mgo/bson does not provide number 3. A bit of googling doesn't turn up any help for any of those points in Go, though there are streaming json parsers in other languages (see answers for Is there a streaming API for JSON?).

Your desire to do this is reasonable, but the support just doesn't exist yet. Fortunately, json and bson are simple enough and all the components you're using are open source, so in theory you could write the tools you need.

OTHER TIPS

I don't think there's anything you can do to avoid unmarshalling the whole BSON (and therefore not serving the result until the BSON has been fully delivered by mgo), short of hacking on mgo itself. Its API only deals in whole, unmarshalled documents, with no access to any BSON-encoded []byte or Reader that you could decode element by element and re-encode as JSON as the data comes in.

Take a look at chanson; it lets you easily construct and stream JSON. There's an example that reads data from channels to add elements to a list. You could probably do something similar.

Take a look at json.Encoder. It writes JSON values directly to an output stream, whereas json.Marshal produces the whole []byte in one shot and provides no stream.

On the MongoDB side, take a look at mgo.Iter. If your result set contains a large number of documents, you can serialize them as you iterate, in batches, and make your application more memory efficient.

Sample of using json.Encoder:

data := map[string]int{"apple": 5, "lettuce": 7}
enc := json.NewEncoder(w)
if err := enc.Encode(data); err != nil {
    log.Println(err)
}


Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow