Question

I wrote a daemon in Go that handles around 800k documents, and I'm running into an out-of-memory problem.

From what I can see, memory usage increases with every loop iteration while fetching the documents from MongoDB.

func main() {
    session, err := mgo.Dial("localhost")
    if err != nil {
        panic(err)
    }
    defer session.Close()

    subscriptionsC = session.DB("sm").C("subscriptions")
    subscriptions := []Subscription{}

    for {
        subscriptions = GetSubscriptions()
        // ... process subscriptions ...
    }
}

And the other function is:

func GetSubscriptions() []Subscription {
    result := []Subscription{}
    err := subscriptionsC.Find(nil).Prefetch(0.0).All(&result)
    if err != nil {
        Log("signups_err", err.Error()+"\n")
    }
    return result
}

I don't know whether it's the array being redeclared on each loop iteration or what exactly is happening.

Any help would be greatly appreciated.

Solution 2

The array is definitely being initialized on every loop iteration, because each call to GetSubscriptions() creates a new slice with result := []Subscription{}, but I don't think that's the source of the problem.

The problem could be coming from your global session; see Database connections in web applications. The proper way would be to use a session pool.

Edit: also see How do I call mongoDB CRUD method from handler?
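
For illustration, here is a minimal sketch of the session-handling pattern those links describe. It assumes a package-level master session, and withSession is a hypothetical helper name: each unit of work copies the master session (which hands out a connection from mgo's internal pool) and closes the copy when it is done.

package main

import "gopkg.in/mgo.v2"

// masterSession is dialed once at startup; each unit of work copies it.
var masterSession *mgo.Session

func main() {
    var err error
    masterSession, err = mgo.Dial("localhost")
    if err != nil {
        panic(err)
    }
    defer masterSession.Close()

    // ... start the daemon loop here ...
}

// withSession (hypothetical helper) runs fn against a fresh copy of the
// master session and closes the copy afterwards, returning its socket
// to mgo's internal pool.
func withSession(fn func(s *mgo.Session) error) error {
    s := masterSession.Copy()
    defer s.Close()
    return fn(s)
}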

OTHER TIPS

Author of mgo here.

There's nothing wrong with your code, but it's incomplete, so it's always possible that something you're not showing is in fact leaking memory.

Can you provide a full example that leaks memory?

There's no point in caching/pooling sessions yourself, by the way, because mgo internally handles pooling of resources for you. What you must do is make sure you close the sessions you create, which the sample code does.

Update after OP's comment below:

"Seems that the problem is with a high amount of docs. pastebin.com/jUDmbS4z this will crash once every 10-15 mins (around 4-5 loops). It's getting around 600k docs from mongo in one loop."

Yeah, running queries that load a ridiculous amount of data into memory at once can easily create trouble for a number of reasons unrelated to mgo: memory fragmentation, the non-precise garbage collector, and so on. Just iterate over the documents as they arrive instead; it's convenient, fast, and will dramatically reduce the amount of memory used, as you already figured out.
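
As a rough sketch of that streaming approach (reusing the Subscription type and collection from the question; handleSubscription is a hypothetical placeholder for whatever work the daemon does per document), Iter walks the result set one document at a time instead of loading everything into a slice with All:

// ProcessSubscriptions streams the collection one document at a time
// instead of materializing every subscription in memory with All.
func ProcessSubscriptions(c *mgo.Collection) error {
    var sub Subscription
    iter := c.Find(nil).Iter()
    for iter.Next(&sub) {
        handleSubscription(sub) // hypothetical per-document handler
    }
    return iter.Close() // Close reports any error hit while iterating
}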

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow