Question

Why aren't there any Javascript distributed computing frameworks / projects? The idea seems absolutely awesome to me because:

  • The Client is the Browser
  • Iteration can be done with AJAX (see the sketch after this list)
  • Webmasters could help projects by linking the respective Javascript
  • Millions or even billions of users would help DC projects without even noticing

Please share your views on this subject.

EDIT: Also, what kind of problems do you think would be suitable for JSDC?

GIMPS for instance would be impossible to implement.


Solution

The first thing that comes to my mind is security. Almost all distributed protocols I know of use encryption, which is how they prevent security risks. That said, this subject is not entirely new:

http://www.igvita.com/2009/03/03/collaborative-map-reduce-in-the-browser/

Wuala is also a distributed system, implemented using a Java applet.

OTHER TIPS

I think that Web Workers will soon be used to create distributed computing frameworks; there are already some early attempts at the concept. Non-blocking code execution could previously be simulated with setTimeout, but it made little sense, since browser vendors have only recently focused on optimizing their JS engines. Now that we have faster code execution and new features, running some tasks unnoticed in the background as people browse the web is probably just a matter of months ;)
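
A minimal sketch of the pattern, assuming a hypothetical worker.js file; the worker runs off the main thread, so the page stays responsive:

```javascript
// main page — spawn a background worker and hand it a chunk of work
const worker = new Worker('worker.js');             // worker.js is a hypothetical file
worker.onmessage = (e) => console.log('result:', e.data);
worker.postMessage({ from: 0, to: 1e7 });

// worker.js — runs off the main thread; sums a range as a stand-in for real work
self.onmessage = (e) => {
  let sum = 0;
  for (let i = e.data.from; i < e.data.to; i++) sum += i;
  self.postMessage(sum);
};
```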

There is something to be said for 'user rights' here. It sounds like you're describing a situation where the webmaster for Foo.com includes the script for, say, Folding@Home on their site. As a result, all visitors to Foo.com have some fraction of their CPU "donated" to Folding@Home until they navigate away from Foo.com. Without some sort of disclaimer or opt-in, I would consider that a form of malware and avoid visiting any site that did that.

That's not to say you couldn't build a system that asked for confirmation or permission, but there is definite potential for abuse.

I have pondered this myself in the context of item recommendation.

First, there is no problem with speed! JIT-compiled JavaScript can be as fast as unoptimized C, especially for numeric code.

The bigger problem is that running javascript in the background will slow down the browser and therefore users may not like your website because it runs slowly.

There is obviously an issue of security: how can you verify the results?

And privacy: can you ensure sensitive data isn't compromised?

On top of this, it's quite a difficult thing to do. Can the number of visits you receive justify the effort you'll have to put into it? It would be better if you could run the code transparently on either the server or the client side. Compiling other languages to JavaScript can help here.

In summary, the reason it's not widespread is that developers' time is more valuable than server time. The risk of losing user data and the inconvenience to users outweigh the potential gains.

I know of pluraprocessing.com doing something similar. I'm not sure it's exactly JavaScript, but they run Java through the browser, entirely in memory and with strict security.

They have a grid of 50,000 computers on which they have successfully run applications such as web crawling (80legs).

I think we can verify results for some kinds of problems.

Let's say we have n items that need to be sorted. We hand them to worker-1, and worker-1 gives us back the result. We can verify it in O(n) time, whereas producing the result takes at least O(n*log(n)) time. We should also consider how large n is (a concern for network speed).
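
A sketch of that O(n) check (for simplicity it only checks ordering; confirming the output is a permutation of the input can also be done in linear time with a frequency map):

```javascript
// O(n) verification that a worker's output is in non-decreasing order.
function isSorted(arr) {
  for (let i = 1; i < arr.length; i++) {
    if (arr[i - 1] > arr[i]) return false;
  }
  return true;
}
```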

Another example: f(x) = 12345, where the function f is given and the goal is to find the value of x. We can test a worker's answer simply by substituting it back into f. I think problems that are not verifiable like this are difficult to hand out to someone else.
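
A sketch of that kind of check; the function f and the target value are arbitrary stand-ins:

```javascript
// Accept a worker's answer x only if plugging it back into f reproduces the target.
const f = x => x * x * x - 3 * x;                    // arbitrary example function
const target = 12345;
const verify = x => Math.abs(f(x) - target) < 1e-9;  // tolerance for floating-point answers
```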

The whole idea of JavaScript distributed computing has a number of disadvantages:

  • single point of failure - there is no direct way to communicate between nodes
  • natural failure of nodes - a node only works for as long as the browser stays on the page
  • no guarantee that a sent message will ever be received - a consequence of natural node failures
  • no guarantee that a received message was ever actually sent - an attacker can interpose
  • annoying load on the client side
  • ethical problems

while there is only one (but very tempting) advantage:

  • easy and free access to millions of nodes - almost every device has a JS-supporting browser nowadays

However, the biggest problem is the correlation between scalability and annoyance. Let's say you offer some attractive web service and run computations on the client side. The more people you use for computing, the more people are annoyed; the more people are annoyed, the fewer people use your service. You can limit the annoyance (the computing), limit the scalability, or try something in between.
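
One way to "try something in between" is to duty-cycle the client-side work, computing in short slices and idling in between. A rough sketch, where computeChunk and the 10% duty cycle are arbitrary assumptions:

```javascript
// Run computeChunk() in short slices so work occupies only ~dutyCycle of wall-clock time.
// computeChunk is a hypothetical function that performs a small, bounded piece of work.
function throttledLoop(computeChunk, dutyCycle = 0.1, sliceMs = 50) {
  (function tick() {
    const start = Date.now();
    while (Date.now() - start < sliceMs) computeChunk();
    setTimeout(tick, sliceMs * (1 - dutyCycle) / dutyCycle); // idle to hit the duty cycle
  })();
}
```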

Consider Google, for example. If Google ran computations on the client side, some people would start using Bing. How many? That depends on the annoyance level.

The only hope for JavaScript distributed computing may be multimedia services. Since they already consume lots of CPU, nobody would notice any additional load.

I think the no. 1 problem is JavaScript's inefficiency at computation. It just wouldn't be worth it, because an application in pure C/C++ could be 100 times faster.

I found a question similar to this a while back, so I built a small library that does this. It uses Web Workers and fetches scripts dynamically (but no eval!). Web Workers sandbox the scripts so they cannot access the window or the DOM. You can see the code here, and the main website here.

The library has a consent popup on first load, so the user knows what's going on in the background.
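
The general pattern of loading job code dynamically inside a worker without eval can be done with importScripts. This is a generic sketch rather than that library's actual code; the scriptUrl field and runJob entry point are assumptions:

```javascript
// worker.js — load job code inside the worker sandbox (no eval, no DOM access)
self.onmessage = (e) => {
  importScripts(e.data.scriptUrl);        // the fetched script is expected to define runJob()
  const result = runJob(e.data.payload);  // runJob is a hypothetical entry point
  self.postMessage(result);
};
```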

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow