Question

I'm currently developing a web application for government land planning. The application runs mostly in the browser, using ajax to load and save data.

I will do the initial development, and then graduate (it's a student job). After this, the rest of the team will add the occasional feature as needed. They know how to code, but they're mostly land-planning experts.

Considering the pace at which Javascript technologies change, how can I write code that will still work 20 years from now? Specifically, which libraries, technologies, and design ideas should I use (or avoid) to future-proof my code?


Solution

Planning software for such a lifespan is difficult, because we don't know what the future holds. A bit of context: Java was released in 1995, 21 years ago. XMLHttpRequest first became available as a proprietary extension for Internet Explorer 5, released in 1999, 17 years ago. It took about 5 years until it became available across all major browsers. The 20 years you are trying to look ahead are just about the time rich web applications have even existed.

Some things have certainly stayed the same since then. There has been a strong standardization effort, and most browsers conform well to the various standards involved. A web site that worked across browsers 15 years ago will still work the same, provided that it worked because it targeted the common subset of all browsers, not because it used workarounds for each browser.

Other things came and went – most prominently Flash. Flash had a variety of problems that led to its demise. Most importantly, it was controlled by a single company. Instead of competition inside the Flash platform, there was competition between Flash and HTML5 – and HTML5 won.

From this history, we can gather a couple of clues:

  • Keep it simple: Do what works right now, without having to use any workarounds. This behaviour will likely stay available long into the future for backwards-compatibility reasons.

  • Avoid reliance on proprietary technologies, and prefer open standards.

The JavaScript world today is relatively volatile, with a high flux of libraries and frameworks. However, nearly none of them will matter in 20 years – the only “framework” I'm certain will still be in use by then is Vanilla JS.

If you want to use a library or tool because it really makes development a lot easier, first make sure that it's built on today's well-supported standards. You must then download the library or tool and include it with your source code. Your code repository should include everything needed to get the system runnable. Anything external is a dependency that could break in the future. An interesting way to test this is to copy your code to a thumb drive, go to a new computer with a different operating system, disconnect it from the internet, and see whether you can get your frontend to work. As long as your project consists of plain HTML+CSS+JavaScript plus perhaps some libraries, you're likely going to pass.
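
As a minimal sketch of what this looks like in practice, here is dependency-free AJAX in the spirit of this answer: plain XMLHttpRequest and JSON, both stable standards for well over a decade. The /api/parcels endpoint is an invented example.

    // ajax.js -- load and save JSON using only long-stable standards:
    // XMLHttpRequest and JSON. No framework, no external dependency.
    // The '/api/parcels' endpoint below is an invented example.
    function requestJSON(method, url, body, onSuccess, onError) {
      var xhr = new XMLHttpRequest();
      xhr.open(method, url);
      if (body !== null) {
        xhr.setRequestHeader('Content-Type', 'application/json');
      }
      xhr.onload = function () {
        if (xhr.status >= 200 && xhr.status < 300) {
          onSuccess(xhr.responseText ? JSON.parse(xhr.responseText) : null);
        } else {
          onError(new Error('HTTP ' + xhr.status));
        }
      };
      xhr.onerror = function () { onError(new Error('network error')); };
      xhr.send(body === null ? null : JSON.stringify(body));
    }

    // Usage:
    requestJSON('GET', '/api/parcels', null, function (parcels) {
      console.log('loaded ' + parcels.length + ' parcels');
    }, function (err) { console.error(err); });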

Other tips

What is even more important than your code surviving for 20 years is that your data survives for 20 years. Chances are, that's the thing worth preserving. If your data is easy to work with, building an alternate system on top of it with newer technology will be easy.

  • So start with a clear and well documented data model (see the sketch after this list).
  • Use an established, well supported database system, such as Oracle[1] or SQL Server.
  • Use basic features, don't try to squeeze in flashy new ones.
  • Prefer simple over clever.
  • Accept that future maintainability can come at the expense of aspects like performance. For instance, you might be tempted to use stored procedures, but these might limit future maintainability if they prevent someone from migrating the system to a simpler storage solution.
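
To make the first bullet concrete, here is a minimal sketch of what a clear, documented data model can look like: commented DDL kept in a version-controlled setup script. SQLite via the better-sqlite3 Node package is used purely for illustration (the answer recommends Oracle or SQL Server; the DDL style carries over), and every table and column name is an invented land-planning example.

    // setup-schema.js -- a version-controlled, commented definition of the
    // data model. better-sqlite3 and all names are illustrative only.
    const Database = require('better-sqlite3');
    const db = new Database('planning.db');

    db.exec(`
      -- A zone groups parcels under a single land-use designation.
      CREATE TABLE zone (
        zone_id   INTEGER PRIMARY KEY,
        name      TEXT NOT NULL,
        land_use  TEXT NOT NULL            -- e.g. 'residential', 'agricultural'
      );

      -- A parcel is one registered plot of land.
      CREATE TABLE parcel (
        parcel_id     INTEGER PRIMARY KEY,
        cadastral_ref TEXT NOT NULL UNIQUE, -- official registry reference
        area_m2       REAL NOT NULL,        -- surface area in square meters
        zone_id       INTEGER NOT NULL REFERENCES zone (zone_id)
      );
    `);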

Once you have that, future-proofing the app itself is simpler, because it's a wrapper around the data model and can be replaced if, for instance, no one uses JavaScript anymore in 10 years and you need to migrate the app to WASM or something. Keeping things modular and loosely coupled allows for easier future maintenance.


[1] Most comments on this answer take a strong stance against using Oracle as a DB, citing a lot of perfectly legitimate reasons: Oracle is a pain to work with, and it comes with a steep learning curve and installation overhead. These are entirely valid concerns when choosing Oracle as a DB, but in our case we're not looking for a general-purpose DB; we're looking for one where the primary concern is maintainability. Oracle has been around since the late 1970s and will probably be supported for many years to come, and there's a huge ecosystem of consultants and support options that can help you keep it running. Is this an overpriced mess for many companies? Sure. But will it keep your database running for 20 years? Quite likely.

The previous answer by amon is great, but there are two additional points which weren't mentioned:

  • It's not just about browsers; devices matter too.

    amon mentions the fact that a “web site that worked across browsers 15 years ago will still work the same”, which is true. However, look at websites created not fifteen but ten years ago, which, when created, worked in most browsers for most users. Today, a large share of users can't use those websites at all, not because browsers changed, but because devices did. Those websites look terrible on the small screens of mobile devices, and may stop working entirely if the developers relied on mouse-only JavaScript events without realizing that touch input matters too (see the sketch after this list).

  • You're focusing on the wrong subject.

    Technology changes are one thing, but a more important one is changing requirements. The product may need to scale, gain additional features, or have its current features changed.

    It doesn't matter what will happen to browsers, or devices, or W3C, or... whatever.

    If you write your code in a way that it can be refactored, the product will evolve with technology.

    If you write your code in a way nobody can understand and maintain, technology doesn't matter: any environmental change will bring your application down anyway, be it a migration to a different operating system or something as simple as natural data growth.

    As an example: I've worked in software development for ten years. Among the dozens and dozens of projects, there were only two I decided to change because of technology, more precisely because PHP evolved a lot over the last ten years. It wasn't even the customer's decision: the customer couldn't care less whether the site uses PHP's namespaces or closures. Changes related to new requirements and scalability, however, were plentiful!
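
Here is the sketch referred to in the first bullet: device-neutral input handling in plain JavaScript. The standard click event fires for taps as well as mouse clicks, and Pointer Events unify input types. The element ID and the saveCurrentPlan function are invented for the example.

    // Device-neutral event handling: browsers fire 'click' for both mouse
    // clicks and touch taps, so preferring it over mouse-only events
    // (mousedown, hover menus, double-click) keeps the UI usable on
    // phones and tablets. '#save-button' is a made-up example element.
    var button = document.querySelector('#save-button');

    button.addEventListener('click', function () {
      saveCurrentPlan(); // hypothetical application function
    });

    // Avoid hover-only affordances; touch devices have no hover state.
    // If you must distinguish input types, Pointer Events unify them:
    button.addEventListener('pointerdown', function (event) {
      console.log('input came from a ' + event.pointerType); // 'mouse', 'touch' or 'pen'
    });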

You do not plan for 20 years, plain and simple. Instead, you shift your goal to compartmentalization.

Is your app database-agnostic? If you had to switch databases right now, could you? Is your logic language-agnostic? If you had to rewrite the app in a totally new language right now, could you? Are you following good design guidelines like SRP and DRY?

I have had projects live for longer than 20 years, and I can tell you that things change. Take pop-ups: 20 years ago you could rely on a pop-up; today you cannot. XSS wasn't a thing 20 years ago; now you have to account for cross-origin restrictions like CORS.

So what you do is make sure your logic is nicely separated, and that you avoid using ANY technology that locks you in to a specific vendor.

This can be very tricky at times. .NET, for example, is great at exposing logic and methods for its MSSQL database adapter that have no equivalents in other adapters. MSSQL might seem like a good plan today, but will it remain so for 20 years? Who knows. One way to get around this is to have a data layer totally separate from the other parts of the application (a sketch follows below). Then, in the worst case, you only have to rewrite the data layer; the rest of your application stays unaffected.
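
For instance, here is a minimal sketch of such a data layer on a Node.js backend (an assumption; the question doesn't specify the stack). Everything else in the application calls these exported functions and never touches the database directly, so swapping vendors means rewriting this one module. The library and all names are illustrative.

    // parcelRepository.js -- the only module that knows which database is
    // in use. better-sqlite3 and all names are illustrative assumptions.
    const Database = require('better-sqlite3');
    const db = new Database('planning.db');

    exports.findById = function (id) {
      return db.prepare('SELECT * FROM parcel WHERE parcel_id = ?').get(id);
    };

    exports.save = function (parcel) {
      db.prepare('INSERT INTO parcel (cadastral_ref, area_m2) VALUES (?, ?)')
        .run(parcel.cadastralRef, parcel.areaM2);
    };

    // The rest of the app depends only on this interface:
    //   const parcels = require('./parcelRepository');
    //   const p = parcels.findById(42);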

In other words, think of it like a car. Your car is not going to make it 20 years. But with new tires, a new engine, a new transmission, new windows, new electronics, and so on, that same car can be on the road for a very long time.

The answers by @amon and some others are great, but I wanted to suggest you look at this from another perspective.

I've worked with Large Manufacturers and Government Agencies who were relying on programs or code-bases that had been used for well over 20 years, and they all had one thing in common -- the company controlled the hardware. Having something running and extensible for 20+ years isn't difficult when you control what it runs on. The employees at these groups developed code on modern machines that were hundreds of times faster than the deployment machines... but the deployment machines were frozen in time.

Your situation is complicated, because a website means you need to plan for two environments -- the server and the browser.

When it comes to the server, you have two general choices:

  • Rely on the operating system for various support functions, which may be much faster but means the OS may need to be "frozen in time". If that's the case, prepare backups of the OS installation for the server. If something crashes in 10 years, you don't want someone going crazy trying to reinstall the OS or rewrite the code for a different environment.

  • Use versioned libraries within a given language/framework, which are slower, but can be packaged in a virtual environment and likely run on different operating systems or architectures.
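
To illustrate the second option, assuming a Node.js server (the question doesn't specify the stack): pin exact dependency versions (no ^ or ~ ranges) in package.json, commit the lockfile, and consider vendoring the node_modules directory into the repository. The package names and version numbers below are purely illustrative.

    {
      "name": "land-planning-app",
      "private": true,
      "dependencies": {
        "express": "4.18.2",
        "better-sqlite3": "9.4.3"
      }
    }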

When it comes to the browser, you'll need to host everything on the server (i.e. you can't rely on a global CDN to host files). We can assume that future browsers will still run HTML and JavaScript (at least for compatibility), but that's really a guess/assumption and you can't control it.

The core of most applications is the data. Data is forever; code is more expendable, changeable, malleable. The data must be preserved, though. So focus on creating a really solid data model. Keep the schema and the data clean, and anticipate that a fresh application might be built on top of the same database.

Pick a database that is capable of enforcing integrity constraints. Unenforced constraints tend to be violated as time passes. Nobody notices. Make maximum use of facilities such as foreign keys, unique constraints, check constraints and possibly triggers for validation. There are some tricks to abuse indexed views to enforce cross-table uniqueness constraints.
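
As a small illustration of constraints doing their job, here is a sketch (again using better-sqlite3 and invented names purely for demonstration; the DDL carries over to other SQL databases) where the database itself rejects bad data:

    // Constraints the database itself enforces reject bad data even when a
    // future application forgets to validate.
    const Database = require('better-sqlite3');
    const db = new Database(':memory:');
    db.pragma('foreign_keys = ON'); // SQLite enforces FKs only when asked

    db.exec(`
      CREATE TABLE zone (
        zone_id INTEGER PRIMARY KEY,
        name    TEXT NOT NULL UNIQUE
      );
      CREATE TABLE parcel (
        parcel_id INTEGER PRIMARY KEY,
        area_m2   REAL NOT NULL CHECK (area_m2 > 0), -- no nonsense areas
        zone_id   INTEGER NOT NULL REFERENCES zone (zone_id)
      );
    `);

    try {
      // Fails immediately: zone 99 does not exist. Better a loud error now
      // than a silent inconsistency discovered years later.
      db.prepare('INSERT INTO parcel (area_m2, zone_id) VALUES (?, ?)')
        .run(120.5, 99);
    } catch (e) {
      console.error('Rejected by the database:', e.message);
    }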

So maybe you need to accept that the application will be rewritten at some time. If the database is clean there will be little migration work. Migrations are extremely expensive in terms of labor and defects caused.

From a technology perspective it might be a good idea to put most of the application on the server rather than in JavaScript on the client. Thanks to virtualization, you'll probably be able to run the same application in the same OS instance for an extremely long time. That's not really nice, but it means the app will still work 20 years from now without expensive maintenance or new hardware. This way you at least have the safe and cheap fallback of continuing to run old, working code.

Also, I find that some technology stacks are more stable than others. I'd say that .NET has the best possible backwards compatibility story currently. Microsoft is dead serious about it. Java and C/C++ are really stable as well. Python has proven that it is very unstable with the Python 3 breaking changes. JavaScript actually seems quite stable to me because breaking the web is not an option for any browser vendor. You probably should not rely on anything experimental or funky, though. ("Funky" being defined as "I know it when I see it").

The other answers do make sense. However, I feel the comments on the client technology are overcomplicating things. I've been working as a developer for the past 16 years. In my experience, as long as you keep your client code intuitive, you should be fine. So no "hacks" with frames/iframes, etc. Use only well-defined browser features.

You can always use compatibility modes in browsers to keep them working.

To prove my point: only a few months ago I fixed a millennium bug in the JavaScript code of a customer who has been running their web app for 17 years. It still works on recent machines, a recent database, a recent operating system.
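
A classic culprit for this kind of millennium bug (an assumption on my part, since the answer doesn't name it) is the long-deprecated getYear() method:

    // Date.prototype.getYear() returns the year minus 1900, so code from
    // the 1990s that prepended '19' broke in the year 2000.
    var now = new Date();                 // suppose it is 2016
    console.log(now.getYear());           // 116  (2016 - 1900)
    console.log('19' + now.getYear());    // '19116' -- the millennium bug
    console.log(now.getFullYear());       // 2016 -- the correct, stable API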

Conclusion: keep it simple and clean and you should be fine.

A few axioms:

  • Truth survives. In this context, that means algorithms and data models: whatever truthfully represents the "what" and the "how" of your problem space. Although there is always the potential for refinement and improvement, or an evolution of the problem itself.
  • Languages evolve. This is as true for computer languages as it is for natural languages.
  • All technology is vulnerable to obsolescence. It just may take longer for some technologies than others.

The most stable technologies and standards (those least vulnerable to obsolescence) tend to be those which are non-proprietary and have been most widely adopted. The wider the adoption, the greater the inertia against almost any form of change. Proprietary "standards" are always vulnerable to the fortunes and whims of their owner and competitive forces.

Twenty years is a very long time in the computer industry. Five years is a more realistic target. In five years' time, the whole problem your application is meant to solve could be completely redefined.

A few examples to illustrate:

C and C++ have been around for a long time. They have implementations on just about every platform. C++ continues to evolve, but "universal" features (those available on all platforms) are pretty much guaranteed to never be deprecated.

Flash almost became a universal standard, but it is proprietary. Corporate decisions to not support it on popular mobile platforms have basically doomed it everywhere - if you're authoring for the web, you want your content available on all platforms; you don't want to miss the major market mobile has become.

WinTel (Windows/x86), despite being proprietary to Microsoft and Intel, having started out on a less-than-optimal platform (the 16-bit-internal/8-bit-external 8088, versus the contemporaneous Apple Macintosh's 32-bit-internal/16-bit-external 68000), and despite erosion to Apple in the consumer market, remains a de facto choice for business platforms. In all that time (25 years), a commitment to backward compatibility has both hobbled future development and inspired considerable confidence that what worked on the old box will still work on the new one.

Final thoughts

JavaScript might not be the best choice for implementing business logic. For reasons of data integrity and security, business logic should be performed on the server, so client-side JavaScript should be limited to UI behavior. Even on the server, JavaScript might not be the best choice. Although easier to work with than other stacks (Java or C#) for small projects, it lacks the formality which can help you write better, more organized solutions when things get more complex.
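
As a final sketch of that last point, assuming a Node.js server with Express (neither is mandated by the question): keep the business rules behind an HTTP endpoint, so the browser code stays limited to UI. The route and the validation rule are invented examples.

    // server.js -- business rules live on the server, not in the browser.
    // Express and the validation rule are illustrative assumptions.
    const express = require('express');
    const app = express();
    app.use(express.json());

    app.post('/api/parcels', (req, res) => {
      const { cadastralRef, areaM2 } = req.body;
      // The client UI may hint at these rules, but only the server,
      // which we control, actually enforces them.
      if (typeof cadastralRef !== 'string' || !(areaM2 > 0)) {
        return res.status(400).json({ error: 'invalid parcel data' });
      }
      // ...persist via the data layer, then confirm...
      res.status(201).json({ ok: true });
    });

    app.listen(3000);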

Licensed under: CC-BY-SA with attribution