Question

We are building a real-time GPS tracking app on Node.js with a MySQL database (node-mysql2 driver).

The server is a simple TCP server that accepts incoming data from GPS devices, parses it and writes it into MySQL tables. The GPS devices log in to the server, hold a socket open and then start sending location (and other) data packets. Each device sends data every 10 seconds, and each data packet requires 6 queries to be executed.
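
A rough sketch of this setup (the actual protocol parser and packet handler are not shown here, so parsePacket, handlePacket and the port number are just placeholders):

var net = require('net');

var server = net.createServer(function(socket) {
    // each device logs in, keeps its socket open and sends a packet every ~10 seconds
    socket.on('data', function(chunk) {
        var decoded_data = parsePacket(chunk);   // hypothetical parser
        handlePacket(decoded_data);              // runs the 6 queries for this packet
    });

    socket.on('error', function(err) {
        console.log("socket error " + err);
    });
});

server.listen(5000, function() {
    console.log("TCP server listening");
});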

It works fine for up to 100 devices or so, but when we try with 1,000-odd devices it starts failing silently. At 1,000 devices, 6 queries per device every 10 seconds works out to around 600 queries per second.

After about 700 devices have logged in and started sending data, our system abruptly stops inserting data into the MySQL tables. It does not throw any error, and on the surface the server appears to be working fine: it keeps accepting data from devices and keeps parsing it. Our console.log statements execute and we can see that the code flow runs to the end without throwing any error at all. There are just no MySQL inserts after a certain point in time (in our case around 4-5 minutes from start, when about 700 devices have logged in).

We are using this method to connect to MySQL:

var mysql = require('mysql2');

var connection = mysql.createConnection({
    host        : 'localhost',
    port        : 3306,
    database    : 'sample',
    user        : 'sample',
    password    : 'pwd'
});

connection.connect(function(err) {
    if (err) { console.log("couldn't connect to mysql " + err); return; }
    console.log("connected to mysql");
});
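
Since no error surfaces anywhere, we could also attach a connection-level error listener (just a sketch, not in our current code; it assumes the mysql2 connection emits 'error' events the way node-mysql does):

connection.on('error', function(err) {
    // connection-level errors that are not delivered to a query callback end up here
    console.log("mysql connection error " + err);
});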


var sql_query1 = "SELECT `dt_tracker` FROM gs_tracker_data_"
            + imei + "  ORDER BY dt_tracker DESC LIMIT 1" ;

connection.query(sql_query1, function(err,rows,fields)
{
   if (err) { console.log("sql_query1 " + err); return; }
   if(rows.length != 0)
   {

    //processing
   }

});

var sql_query4 = "INSERT INTO gs_tracker_data_"
                                + imei
                                + " (`dt_server`, `dt_tracker`, `lat`, `lng`, " +
                                "`altitude`, `angle`, `speed`, `signal_gsm`, " +
                                "`signal_gps`, `offset`, `params`) VALUES ('"
                                + decoded_data.dt_server + "','"
                                + decoded_data.dt_tracker + "'," + decoded_data.lat
                                + "," + decoded_data.lng + ","
                                + 0 + "," + decoded_data.angle + ","
                                + decoded_data.speed + "," + decoded_data.signal_gsm
                                + "," + decoded_data.signal_gps + ","  + 0 + ",'"
                                + params + "')";

connection.query(sql_query4, function(err,rows)
{
  if (err) { console.log("sql_query4 " + err); return; }

});
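
For comparison, the same insert could also be written with ? placeholders so the driver handles the quoting (a sketch only; the table name still has to be concatenated because identifiers cannot be bound as parameters, so imei is assumed to be validated elsewhere):

var sql_query4 = "INSERT INTO gs_tracker_data_" + imei
    + " (`dt_server`, `dt_tracker`, `lat`, `lng`, `altitude`, `angle`, `speed`,"
    + " `signal_gsm`, `signal_gps`, `offset`, `params`)"
    + " VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)";

var values = [decoded_data.dt_server, decoded_data.dt_tracker, decoded_data.lat,
              decoded_data.lng, 0, decoded_data.angle, decoded_data.speed,
              decoded_data.signal_gsm, decoded_data.signal_gps, 0, params];

connection.query(sql_query4, values, function(err, result)
{
  if (err) { console.log("sql_query4 " + err); return; }
});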

The MySQL database is of medium size (the mysqldump is around 35 GB now). It has a very simple structure, with 30 small tables (<1,000 rows) and 1,000 large tables (>5 million rows). From what we can tell, neither Node.js nor MySQL should fail under this kind of load - 600-700 simple inserts per second.

We are also using redis.io for storing frequently needed static data. This part seems to be working without a hitch.

Any thoughts on what we should check?

PS: We are running this on Windows Server 2008.


Solution

I struggled with a similar problem. The mistake was just a misspelled variable name in the code (activeDB was spelled wrongly inside the logger.info call), and that produced a silent failure of the MySQL connect.

    try {
        connection = mysql.createConnection(activeDB);
        connection.connect(error => {
            if (error) {
                logger.error("Database error: " + error);
                res.status(503).send(error);
                return;
            }
            logger.info("Database " + activeDB.server);
            res.status(200).send("Connected to " + activeDB.server);
        });
    } catch (error) {
        logger.error("Database error ", error);
        res.status(503).send("Database connection problem " + error);
    }
Licensed under: CC-BY-SA with attribution