Problem

How do I avoid pulling all the data into memory when iterating over a cursor object in pymongo?

Example:

import pymongo

def iter():
    # Connection() is the client class in older pymongo releases
    c = pymongo.Connection()
    cursor = c.db.media.find().skip(0).limit(50000)
    for item in cursor:
        yield item

Before the for loop starts there is a pause of about 2 minutes. For some reason it loads all the data into memory before it starts iterating. Can I somehow avoid this?
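
One way to see where the time goes is to measure how long it takes to get the first document compared to the rest. A minimal sketch, assuming a local mongod and the same media collection as above (MongoClient is the client class in current pymongo; older releases used Connection):

import time
import pymongo

c = pymongo.MongoClient()
cursor = c.db.media.find().skip(0).limit(50000)

start = time.time()
first = next(cursor)                      # the first batch is fetched here
print("first document after %.1fs" % (time.time() - start))

rest = sum(1 for _ in cursor)             # drain the remaining documents
print("%d more documents after %.1fs" % (rest, time.time() - start))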

If I do the same thing in the mongodb shell, everything is fine.


Solution

Do you know whether that is actually what happens? If c.db.media.find() returns everything instead of an iterator, I'm not sure there's much you can do.
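
For what it's worth, find() in pymongo returns a Cursor object rather than a list, and the query is only sent to the server once you start iterating. A minimal sketch to check this, assuming a local mongod and the same media collection (MongoClient is the client class in current pymongo):

import pymongo

c = pymongo.MongoClient()
cursor = c.db.media.find()
print(type(cursor))                # pymongo.cursor.Cursor, not a list
first = next(cursor, None)         # the query is actually executed here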

Other tips

Look at the cursor's batch_size method. With it you should be able to set how much the driver reads in advance. I say should, because I'm facing some problems with it myself right now (Getting StopIteration exception on next(cursor) when modifying batch_size in pymongo), but I'm probably making some mistake. batch_size should solve your problem.
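
A minimal sketch of how that would look, assuming a local mongod and the same media collection as in the question; batch_size() controls how many documents the server returns per round trip, so the driver only holds one batch in memory at a time:

import pymongo

c = pymongo.MongoClient()          # Connection() in older pymongo releases
cursor = c.db.media.find().skip(0).limit(50000).batch_size(100)
for item in cursor:
    print(item["_id"])             # handle one document at a time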

License: CC-BY-SA with attribution
Not affiliated with StackOverflow