Question

I have a CSV file with about 10k records. I need to retrieve them one at a time in my Node.js app.

The scenario: when the user clicks button "X" the first time, an async request is sent to the Node.js app, which returns the data from the first row of the CSV file. When he clicks again, it shows the data from the second row, and so on.

I tried using fast-csv and lazy, but both of them read the complete file. Is there a way I can achieve this?

Solution

Node comes with a readline module in its core, allowing you to process a readable stream line by line.

var fs = require("fs"),
    readline = require("readline");

var file = "something.csv";

var rl = readline.createInterface({
    input: fs.createReadStream(file),
    output: null,
    terminal: false
});

rl.on("line", function(line) {
    console.log("Got line: " + line);
});

rl.on("close", function() {
    console.log("All data processed.");
});

Other tips

I think the module 'split' by Dominic Tarr will suffice. It breaks up the stream line by line. https://npmjs.org/package/split

var fs = require("fs"),
    split = require("split");

var file = "something.csv";

fs.createReadStream(file)
    .pipe(split())
    .on("data", function (line) {
        // each chunk is now a separate line!
    });
License: CC-BY-SA with attribution
Not affiliated with StackOverflow