Question

Currently I have a problem displaying 'chunks' of responses that I am sending from my Web Service Node.js server (localhost:3000) to a simulated client running on a Node.js server (localhost:3001).

  • Edit: the current implementation just uses Angular's $http as the transport, without websockets

The logic goes as follows:

1. Create an array of 'Cities' on the client side and POST it (from the AngularJS controller) to the Web Service located at: localhost:3000/getMatrix

$http({
    method: 'POST',
    url: 'http://localhost:3000/getMatrix',
    data: cityArray
}).
success(function (data,status,headers,config){
    // binding of $scope variables

    // calling a local MongoDB to store each data item received
    for(var key in data){
        $http.post('/saveRoutes', data[key])
        .success(function (data, status){
            // Data stored
        })
        .error(function (data, status){
            // error prints in console
        });
    }

}).
error(function (data,status,headers,config){
    alert("Something went wrong!!");
});


2. The Web Service then runs through its process to build a matrix of 'Cities' (e.g. if it is passed 5 cities, it returns a JSON matrix of 5×5, i.e. 25 items). The catch is that it passes the data back in 'chunks' thanks to Node's > response.write( data )

Side note - Node.js automatically sets 'Transfer-Encoding':'chunked' in the header

* Other code before (routing/variable creation/etc.) *

res.set({
     'Content-Type':'application/json; charset=utf-8',
});
res.write("[\n");

* Other code to process loops and pass arguments *

// query.findOne to MongoDB and if there are no errors
res.write(JSON.stringify(docs)+",\n");


* insert more code to run loops to write more chunks *

// at the end of all loops
res.end("]");

// Final JSON looks like such
[
    { *data* : *data* },
    { *data* : *data* },
    ......
    { *data* : *data* }
]
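For reference, the chunk-writing steps above can be consolidated into one runnable sketch. This is only an assumption standing in for the elided code: `findCity` is a made-up placeholder for the `query.findOne` MongoDB lookup, and the comma handling avoids the trailing comma that an unconditional `res.write(JSON.stringify(docs) + ",\n")` would leave before the closing `]` (which would make the final JSON invalid):

```javascript
// Sketch of the chunked-write pattern above. `findCity` is a placeholder
// for the real MongoDB query.findOne lookup.
function writeMatrix(res, cities) {
  res.setHeader('Content-Type', 'application/json; charset=utf-8');
  res.write('[\n');                        // open the JSON array
  cities.forEach(function (city, i) {
    var doc = findCity(city);              // stands in for query.findOne
    // write a comma after every item except the last, so the
    // concatenated chunks parse as valid JSON
    res.write(JSON.stringify(doc) + (i < cities.length - 1 ? ',\n' : '\n'));
  });
  res.end(']');                            // close the array
}

function findCity(name) {
  return { city: name };                   // placeholder document
}
```

`res` here can be an Express response or a plain Node `http.ServerResponse`; both expose `setHeader`, `write`, and `end`.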


Currently the problem is not that the 'chunked' response fails to reach its destination, but that I do not know of a way to start processing the data as the chunks come in.

This is a problem because I am trying to build a 250×250 matrix, and waiting for the full response overloads Angular's ability to display the results: it tries to render everything at once, which blows up the page.

This is also a problem because I am trying to save the response to MongoDB, which can only handle data up to a certain size before it is 'too large' to process.

I have tried looking into Angular's $q and the promise/deferred API, but I am a bit confused about how to implement it and have not found a way to start processing data chunks as they come in.

This question on SO about dealing with chunks did not seem to help much either.

Any help or tips on trying to display chunked data as it comes back to AngularJS would be greatly appreciated.

If the responses could be informative code snippets demonstrating the technique, I would greatly appreciate it since seeing an example helps me learn more than a 'text' description.

-- Thanks


Solution

No example, because I am not sure what you are using in terms of transport code, or whether you have a websocket available:

$http does not fire any of its callbacks until a success code comes back at the end of the request: it listens for onreadystatechange with a 200-like status.

If you want to do a stream like this, one option is to wrap $http in a transport layer that makes multiple $http calls, each of which completes and returns a success header.
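A minimal sketch of that first option, assuming a hypothetical server endpoint that accepts an offset/limit pair and returns one complete JSON page per request (so every `$http` call finishes with a normal success status):

```javascript
// Fetch the matrix in pages rather than one streamed response.
// fetchPage(offset, limit) must return a Promise for one page (an array);
// onPage(rows, offset) is invoked as soon as each page arrives, so the UI
// (or a MongoDB save) can process it incrementally instead of all at once.
function fetchMatrixInPages(fetchPage, totalRows, pageSize, onPage) {
  function next(offset) {
    if (offset >= totalRows) return Promise.resolve(); // all pages done
    return fetchPage(offset, pageSize).then(function (rows) {
      onPage(rows, offset);           // process this chunk immediately
      return next(offset + pageSize); // then request the next page
    });
  }
  return next(0);
}
```

In AngularJS, `fetchPage` would wrap something like `$http.post('http://localhost:3000/getMatrix', { cities: cityArray, offset: offset, limit: limit })` and return `response.data` (the offset/limit parameters are an assumption about how you would extend the endpoint); since `$http` returns promises, it plugs straight into this helper.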

You could also use websockets, and instead of calling $http, emit an event in the socket.

Then, to get the chunks back to the client, have the server emit each chunk as a new event on the backend, and have the front-end listen for that event and process each one.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow