Question

I recently noticed that Rich desktop applications (I define them as applications running inside a virtual machine embedded in a web browser, based on tools like Java Web Start, Adobe Flash or the Unity Web Player) seem to be becoming more and more rare. I only see them when I play some plain old Flash video games online. They have been replaced by HTML5/JavaScript applications, which have the advantage of being compatible with all browsers without plugins. But compatibility alone does not seem to explain it; I suspect I am missing some other reasons.

Technologically speaking, I see important advantages to RDAs, especially the possibility of downloading the application once and being able to run it offline through an application manager. At first sight, that seems better than downloading JavaScript code embedded in a web page. It looked promising.

So here are my questions:

  • What are the main reasons (technology-related or not) why this architecture failed?
  • Regarding compatibility, why didn't the main browsers end up embedding some of those plugins by default?

Solution

I think you have partly answered your own question. People are designing applications for browsers and mobile devices, and in order to target both and share code, HTML5/JS is a very quick approach.

Technologically speaking, I see important advantages to RDAs, especially the possibility of downloading the application once and being able to run it offline through an application manager.

Oddly enough, some of the applications you might use right now are written with HTML5/JS and you might not even notice it. Electron, nw.js, and other similar containers allow HTML5/JS applications to access the filesystem and the OS. These containers essentially remove the limitations of the web, letting application developers create cross-platform solutions that run on every device. (They also keep all the advantages of HTML, such as simple theming and layouts.)
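To make that concrete, here is a minimal sketch of an Electron main process that opens a desktop window backed by an ordinary HTML/JS page. The file names (main.ts, preload.js, index.html) and window dimensions are just assumptions for the example, not anything from the answer above:

```typescript
// main.ts — minimal Electron main process (sketch; file names are assumptions)
import { app, BrowserWindow } from 'electron';
import * as path from 'path';

function createWindow(): void {
  // An ordinary OS-level window whose content is a web page.
  const win = new BrowserWindow({
    width: 1024,
    height: 768,
    webPreferences: {
      // Expose privileged APIs (filesystem, OS) to the page through a
      // preload script rather than the page itself.
      preload: path.join(__dirname, 'preload.js'),
    },
  });

  // The "application" is just HTML5/JS loaded from disk, not from a server.
  win.loadFile(path.join(__dirname, 'index.html'));
}

app.whenReady().then(createWindow);

// Quit when all windows are closed (except on macOS, by convention).
app.on('window-all-closed', () => {
  if (process.platform !== 'darwin') app.quit();
});
```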

Also, just as a side note, you can create a web page that functions offline in the browser. A few sites do work this way, but offline support is usually a low-priority feature that gets ignored, especially now that always-connected phones are so prevalent.
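For example, modern browsers let a page cache its own assets with a Service Worker so it keeps working without a network connection. A minimal cache-first sketch, where the cache name and asset list are placeholders I made up for illustration:

```typescript
// sw.ts — minimal cache-first Service Worker (sketch; asset list is an assumption)
declare const self: ServiceWorkerGlobalScope;

const CACHE_NAME = 'offline-demo-v1';
const ASSETS = ['/', '/index.html', '/app.js', '/styles.css'];

// Pre-cache the application shell at install time.
self.addEventListener('install', (event: ExtendableEvent) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(ASSETS))
  );
});

// Serve cached responses first, falling back to the network.
self.addEventListener('fetch', (event: FetchEvent) => {
  event.respondWith(
    caches.match(event.request).then((cached) => cached ?? fetch(event.request))
  );
});
```

The page itself only needs to register the worker once, with navigator.serviceWorker.register('/sw.js').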

What are the main reasons (technology-related or not) why this architecture failed?

Things like Flash were created merely to fill holes that the HTML (and web application) standards of the time didn't cover. There was a time when the web was very simple: things like video and audio were not really considered, so plugins were required to fill in the gaps. Plugins can still exist as long as they fill a perceived hole; the Unity Web Player, for example, probably still fills a performance niche compared to some WebGL alternatives. Using Flash for video or audio, though, not so much.
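As a small illustration of that last point, playback that once needed a Flash plugin is now handled by the native HTML5 media elements; a sketch, where the media URL is a placeholder:

```typescript
// Playing media with the built-in HTML5 <video> element — no plugin involved.
// The source URL below is a placeholder for this sketch.
const video = document.createElement('video');
video.src = '/media/clip.mp4';
video.controls = true;            // native playback controls
document.body.appendChild(video);

video.play().catch(() => {
  // Autoplay may be blocked until the user interacts with the page.
});
```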

Plugins are also generally proprietary, which plays into another reason they have failed: security. A lot of people distrust Flash and Java because they have been exploited numerous times.

Regarding compatibility, why didn't the main browsers end up embedding some of those plugins by default?

They are not part of the specs. Web browsers are extremely complicated, with thousands of individual pieces spread across numerous specs. Supporting something like Flash, Java, or Unity, which merely duplicates most of that functionality, would be a waste of time. Most companies, especially Microsoft, have discovered it's far easier to work with the specs and suggest changes over time when features are missing. A big part of this is that browsers are now controlled by large corporations with the manpower to implement complex features and organize development. That has made proprietary plugins unnecessary unless the specs can't be modified to include a feature. (A good recent example is Encrypted Media Extensions, which covers functionality that used to require plugins.)
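As an illustration of the EME point, a page can now ask the browser directly whether a DRM key system is available, where this previously required a plugin. A hedged sketch, where the key system name and codec string are example values, not something prescribed above:

```typescript
// Probe for an Encrypted Media Extensions key system in the browser itself.
// The key system identifier and codec string below are example values.
const config: MediaKeySystemConfiguration[] = [{
  initDataTypes: ['cenc'],
  videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
}];

navigator
  .requestMediaKeySystemAccess('com.widevine.alpha', config)
  .then((access) => {
    // The browser supports this key system natively — no plugin needed.
    console.log('EME key system available:', access.keySystem);
  })
  .catch(() => {
    console.log('Key system not supported in this browser.');
  });
```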

Licensed under: CC-BY-SA with attribution