Question

I'm after some guidance on a new project I'm working on that requires low latency and high concurrency. The project involves receiving live data from a third-party feed and, after some basic processing and storage, sending those values to all users currently active on the website.

The data arrives via HTTP push, and my current plan is to use Node.js to receive it, run it through an algorithm, and then update related data in a database of some kind. Finally, updates are sent to all connected users of the website via WebSocket.
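
As a rough illustration of that pipeline (not a finished design), here is a minimal Node.js sketch, assuming the feed POSTs JSON to a hypothetical `/ingest` endpoint and using the `ws` npm package for the websockets; `processTick()` is a made-up stand-in for the algorithm and database write:

```js
const http = require('http');
const { WebSocketServer, WebSocket } = require('ws');

const server = http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/ingest') {
    let body = '';
    req.on('data', (chunk) => { body += chunk; });
    req.on('end', () => {
      // Hypothetical processing step: algorithm + database update.
      const update = processTick(JSON.parse(body));
      broadcast(update);
      res.writeHead(204);
      res.end();
    });
  } else {
    res.writeHead(404);
    res.end();
  }
});

// The websocket server shares the HTTP server; browsers connect here.
const wss = new WebSocketServer({ server });

// Serialize once, then fan out the same payload to every open socket.
function broadcast(update) {
  const payload = JSON.stringify(update);
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(payload);
  }
}

// Placeholder for the real processing described above.
function processTick(tick) {
  return { ...tick, receivedAt: Date.now() };
}

server.listen(8080);
```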

Now, I'm trying to make this scalable enough to handle over 10,000 users connected at once, all via WebSocket and sent updates approximately once every 3 seconds. Since each user can also interact with the web app during this, it will result in many requests back and forth.
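
One way to hold the per-client rate at roughly one message every 3 seconds is to coalesce whatever arrived during the interval into a single batched message, rather than sending one message per incoming tick. A self-contained sketch under that assumption (again using `ws`; `queueUpdate()` is a hypothetical entry point the ingest step would call):

```js
const { WebSocketServer, WebSocket } = require('ws');

const wss = new WebSocketServer({ port: 8080 });
let pending = []; // values accumulated since the last flush

// Called by the ingest/processing step for each new value.
function queueUpdate(update) {
  pending.push(update);
}

// Every ~3 seconds, send one coalesced message to all open sockets.
setInterval(() => {
  if (pending.length === 0) return;
  const payload = JSON.stringify({ type: 'batch', updates: pending });
  pending = [];
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(payload);
  }
}, 3000);
```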

Now, beyond that high-level idea, and the decision to use Ruby on Rails as the website framework and Node.js to handle the 'liveness' of it all, I'm a little stuck. I don't know what kind of database to use (I imagine it will be a non-relational database for quick storage), and I don't know the specifics of how to architect such a setup or how to implement the logic.
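
Purely to make the "quick storage" idea concrete, and not as a recommendation, this is what a write might look like with Redis (one common non-relational candidate) via the `redis` npm client; the key names and payload are made up for illustration:

```js
const { createClient } = require('redis');

async function main() {
  const db = createClient(); // defaults to localhost:6379
  await db.connect();

  // Store the latest processed value under an illustrative key, and
  // keep a short history list for users who have just connected.
  const update = { symbol: 'EXAMPLE', value: 42, at: Date.now() };
  await db.set('latest:EXAMPLE', JSON.stringify(update));
  await db.lPush('history:EXAMPLE', JSON.stringify(update));
  await db.lTrim('history:EXAMPLE', 0, 99); // cap history at 100 entries

  await db.quit();
}

main().catch(console.error);
```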

So my question is: given my goal, how do I go about structuring such an application, and what do I need to know to make it scalable and real-time to the level I desire?

Thanks greatly for any help.
