Question

My code is leaking memory. After a couple of hours, it fills up the entire memory and crashes. I've simplified my code here, would anybody be able to tell if this looks like it leaks? Thanks.

var request = require('request').forever(), // as per [1]
    async = require('async'),
    kue = require('kue'),
    jobs = kue.createQueue(),
    pool = { maxSockets: 1 };



function main (job, done) {
    async.series(
        [function (callback) {
            var req = request({url: job.data.URL1, pool: pool}, function (err, resp, body) {
                //stuff...
                callback(err);
            });
        },
        function (callback) {
            var req = request({url: job.data.URL2}, function (err, resp, body) {
                //stuff...
                callback(err);
            });
        }
        ],
        function (err) {
            //stuff...
            done();
        }
    );

}

jobs.process('job_name', function (job, done) {  //many jobs with 'job_name' in the queue
    main (job, done);
});

[1] https://groups.google.com/d/msg/nodejs/ZI6WnDgwwV0/sFm4QKK7ODEJ


Solution

I don't think your code is to blame. I had the very same issue using kue. To be sure I wasn't doing anything wrong, I made a super simple worker like this:

var Redis       = require('redis'),
    kue         = require('kue'),
    config      = require("../../config/local.js"),
    redisClient = Redis.createClient(config.redis),
    jobs        = kue.createQueue({ redis : config.redis });

jobs.process('testjobs', function processJob(job, done) {
    console.log(job.data);
    done();
});
    done();
});

Running this code made me realize that kue itself is the one that leaks. The workaround is to use pm2, which will run your program and restart it whenever memory usage goes through the roof. I'm using the JSON App Declaration to configure the maximum amount of memory allowed before the process is restarted:

{
  "apps" : [
    {
      "name": "test_worker",
      "script": "test.js",
      "instances": 1,
      "max_restarts": 10,
      "max_memory_restart" : "10M",
      "ignore_watch": [
        "[\\/\\\\]\\./",
        "node_modules"
      ],
      "merge_logs": true,
      "exec_interpreter": "node",
      "exec_mode": "fork_mode"
    }
  ]
}
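Assuming the JSON above is saved as `process.json` (the filename is my choice, not from the answer), you would hand it to pm2 like this:

```shell
# Start the worker under pm2 supervision using the app declaration.
pm2 start process.json

# Watch memory usage per process; pm2 restarts the app once it
# exceeds the max_memory_restart threshold ("10M" above).
pm2 monit

# Inspect restart counts and status.
pm2 list
```

With `max_restarts` set to 10, pm2 gives up after ten crashes in quick succession, so a hard cap like "10M" is mainly useful for demonstrating the leak; in production you would pick a threshold well above the worker's normal working set.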

Hope this helps.

Other tips

If, as it sounds, jobs are added to the queue faster than they are pulled off, you'll see your memory usage grow. This isn't exactly a memory leak. It's part of how Kue makes job-level events available.

By default Kue hangs onto the job in memory until the job has been completed or failed. It does this so that it can trigger job-level events (e.g. start, progress, complete, failed) in the process that created the job.

This means that all jobs currently on the queue also live in the memory of the process that created them (assuming no application restarts). As long as the queue doesn't get backed up you won't see memory grow. However, if the queue backs up memory grows, sometimes alarmingly so.
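The mechanics can be sketched in plain JavaScript. This is a simplified model of the behavior described above, not kue's actual implementation: the producer process keeps a reference to every job until its completion event fires, so a backed-up queue means an ever-growing map.

```javascript
// Simplified model: the producer retains each job object so it can
// emit job-level events later, and only releases it on completion.
var pending = {};   // jobs held in the producer's memory
var nextId = 1;

function enqueue(data) {
    var id = nextId++;
    pending[id] = { id: id, data: data };  // retained until completed
    return id;
}

function complete(id) {
    delete pending[id];  // only now can the job be garbage-collected
}

// Producer enqueues faster than the worker completes:
for (var i = 0; i < 1000; i++) enqueue({ n: i });
complete(1);
console.log(Object.keys(pending).length);  // 999 jobs still held
```

If the worker keeps pace, `pending` stays small; if it falls behind, the map (and the producer's memory) grows without bound, which is exactly the "not exactly a leak" growth described above.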

What to do? If you turn off job-level events Kue won't hang onto the job after it has been enqueued. You can do this globally using the jobEvents flag:

kue.createQueue({jobEvents: false});

Or you can turn job-level events on or off per job using the job's events method:

var job = queue.create('test').events(false).save();

These work if you don't need to respond to job events at all. However, if you do need to handle events for the job, you can use queue-level events. Since the job isn't held in memory, you need to grab the job from redis in order to do anything with it:

queue.on('job complete', function(id, result){
  kue.Job.get(id, function(err, job){
    // do something with the job
  });
});
License: CC-BY-SA with attribution
Not affiliated with Stack Overflow