Question

Web pages are usually tested by refreshing the page, clicking some UI component, and then either writing to a debug log or setting breakpoints in the IDE... in larger applications, unit tests are written to guarantee the output of the program, etc.

It seems like Java is an awful server-side language for a web page. You have to recompile every time you modify any line of code and pass the built EAR or WAR to the application server... something like Python or PHP (or any other interpreted language) would execute at run-time, removing the need for compilation before testing.

My question: what is the justification for using Java as a server-side language? Why is it better than an interpreted language?

Solution

There's the title of your question, which is a valid one, and then there's the content of the question, which contains some poor assumptions and incorrect statements. I'll address the body first:

You have to recompile every time you modify any line of code and pass the built EAR or WAR to the application server...

Not true. First of all, technology exists for Java that allows you to modify code and have the change take effect in a running application without even restarting it. I link to the product not to endorse it but rather as proof: I have used it and can verify that it works. Secondly, Python must also be compiled; the difference is when that compilation occurs. In Java you do it up front; in Python it happens at runtime (the details may vary depending on the Python implementation).

As far as why one would use a pre-compiled language, here are a few reasons off the top of my head:

  • Bytecode files are smaller than source files, and servers (not being human) don't need source.
  • Compiling ahead of time lets you catch syntax errors early. Why wait until you've loaded your file onto the server to find out that you forgot a colon?
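To make that second point concrete, the JDK itself exposes the compiler as an API (`javax.tools`), which makes it easy to demonstrate a syntax error being rejected before anything reaches a server. This is a minimal sketch; the helper name and the two sample sources are made up for illustration:

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.nio.file.Files;
import java.nio.file.Path;

public class SyntaxCheckDemo {
    // Writes the source to a temp file and returns true if javac accepts it.
    static boolean compiles(String className, String source) throws Exception {
        Path dir = Files.createTempDirectory("syntax-check");
        Path file = dir.resolve(className + ".java");
        Files.writeString(file, source);
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        // run() returns 0 on success, nonzero when compilation fails;
        // diagnostics go to stderr because we pass null streams.
        int result = javac.run(null, null, null,
                "-d", dir.toString(), file.toString());
        return result == 0;
    }

    public static void main(String[] args) throws Exception {
        String good = "class Ok { int x = 1; }";
        String bad  = "class Broken { int x = ; }"; // missing expression
        System.out.println(compiles("Ok", good));     // true
        System.out.println(compiles("Broken", bad));  // false
    }
}
```

An interpreted deployment would only surface the equivalent mistake when the broken file is first loaded or executed on the server.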

The rest of the reasons for choosing one over the other aren't really about when you compile. They tend to center on dynamic versus static typing and the advantages and disadvantages of each approach. I would expect well-written Java code to be faster than well-written Python, though both the actual speed difference and whether it matters can change over time.

Other tips

First, your hyperbole is a little off:

You have to recompile every time you modify any line of code and pass the built EAR or WAR to the application server...

That depends entirely on whether you are using JSP or some other template library to build your application. JSPs and most alternative template engines allow you to edit in place and simply reload the page. You just have to remember to copy those changes back to your repository. But there's more; see below.

Why Java?

The biggest advantage that Java has over several other languages is the ecosystem surrounding it. Whether you are building a Spring based application or doing your own thing, typically there is some API out there that is ready to support what you want to do. The Java ecosystem has largely specialized on the server side. Knowing you don't have to reinvent the wheel is a big boon.

Does Java have its downsides? Absolutely, but that applies to every computer language ever invented. If you don't think so, you probably haven't built anything substantial with that language.

Blaming Java for a slow build is unfair. Take some time to see whether a better build tool can improve your build times. If your build server is anemic, figure out how to improve how it builds. For example, I have a project that takes 2.5 minutes on my local machine to compile and create a deployment package from all the Java, C#, and Python pieces. The same process on the build server takes 14 minutes. A lot of that has to do with disk speed and it being an older server. Fix the build.

Why not an interpreted language?

That really depends on the platform you are building and the team you have. It's one thing to say you are using Ruby on Rails or PHP, etc. and another to find competent developers to support your app. To be honest, the personnel issue is the one reason we migrated away from Ruby on Rails in a project.

That said, there are very few technical reasons not to use an interpreted language. Typically it depends on if you can find all the support libraries you need for your particular domain.

  • Pick the language that serves your needs best, including the availability of people.
  • Be careful to work in a way where you can commit working code to version control.
  • Be objective when selecting your platform. An app is bigger than the framework or language it's built in.

They aren't mutually exclusive

Teams that use Java for web services and build Single Page Apps in JavaScript get the best of both worlds. The user interface is built using one of the many Single Page App frameworks, with the advantage of being able to experiment quickly. Meanwhile, the web service part, which typically doesn't change all that much, can simply be consumed.
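The Java side of that split can be sketched with nothing but the JDK's built-in `com.sun.net.httpserver` package; a real service would more likely sit behind Spring or JAX-RS, and the endpoint path and JSON payload here are invented for illustration:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class TinyService {
    // Starts a JSON endpoint a JavaScript front end could poll.
    // Pass port 0 to let the OS pick a free port.
    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/api/status", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = start(8080);
        System.out.println("Serving JSON on http://localhost:"
                + server.getAddress().getPort() + "/api/status");
    }
}
```

The front end iterates quickly against this stable contract, while the Java service only changes when the contract does.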

In this world of microservices, it's becoming increasingly common to have a heterogeneous set of technologies. For example, you might have a Python-based service that takes advantage of natural language processing libraries for part of your app, mixed in with Java-based infrastructure pieces.
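From the Java side, talking to such a service is just an HTTP call. This sketch only builds the request with the JDK's `java.net.http` client; the service URL, endpoint, and JSON shape are hypothetical, not a real API:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.time.Duration;

public class SentimentClient {
    // Builds a POST to a hypothetical Python NLP microservice.
    // The host, path, and payload format are assumptions for illustration.
    static HttpRequest buildRequest(String text) {
        String json = "{\"text\":\"" + text + "\"}";
        return HttpRequest.newBuilder()
                .uri(URI.create("http://nlp-service.internal/analyze"))
                .timeout(Duration.ofSeconds(5))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();
    }
}
```

The calling code neither knows nor cares that the implementation behind `/analyze` is Python; the HTTP contract is the only coupling between the two services.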

The existing answers haven't touched on the real reason Java became popular as a server side language.

When improvements in CPU performance started coming from additional cores rather than increasing clock speeds, applications needed to become multi-threaded to realize those performance gains. Java was one of the first languages in common use to offer high-level multithreading constructs that made writing highly concurrent applications relatively easy. Despite Java's (no longer deserved) reputation for being slow, it was evident that you could write multithreaded server applications more quickly and easily in Java than in nearly any other language. So that's what people did.
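A rough illustration of those high-level constructs, using `java.util.concurrent` (added in Java 5): a thread pool and futures replace hand-rolled `Thread` and `synchronized` bookkeeping. The task bodies here are trivial stand-ins for real request handlers:

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelRequests {
    // Fans three "requests" out to a thread pool and sums the results.
    static int handleAll() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<Callable<Integer>> tasks = List.of(
                    () -> 1 + 2,   // stand-ins for real request handlers
                    () -> 3 + 4,
                    () -> 5 + 6);
            int total = 0;
            // invokeAll blocks until every task has completed
            for (Future<Integer> f : pool.invokeAll(tasks)) {
                total += f.get();
            }
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(handleAll()); // 21
    }
}
```

Thread scheduling, work distribution, and result collection are all handled by the library; the application code never touches a lock.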

You have to recompile every time you modify any line of code and pass the built EAR or WAR to the application server

This is a process issue, and one that I didn't see answered above (including the voluminous comments).

Yes, for a production or QA deployment, you need to build a deployable artifact. This is a Good Thing, if only because it prevents well-meaning developers from logging into the server to make code changes.

But for development, you should be using an IDE (such as Eclipse or IntelliJ) to run the server locally. You can change a single line of code and the IDE will incrementally compile and hot-swap the class into the running server. There are some cases where you will need to redeploy the WAR/EAR, but you should never need a full build to do that.

If you don't like IDEs, you can get much the same effect by working with "exploded" WAR/EAR files. This is painful, so I don't recommend it, but it definitely works: Java will let you recompile a single class (which was one of the things that I enjoyed about Java vs C++ in the late 90s), and both Tomcat and Jetty will automatically detect changes to JSPs.

Licensed under: CC-BY-SA with attribution